Report recommends the initiative involve researchers from national labs, academia, and industry.

A recommendation by a US Department of Energy (DoE) working group to create an AI test bed charged with finding ways to make data centers more energy efficient than they are today represents a good step forward, but much remains to be done, an analyst said today.

Lucas Beran, a research director at Dell’Oro Group whose primary focus is power and cooling technologies that enable sustainable data centers, said changes are needed because the “compute that runs AI workloads is very different from general purpose computing based on CPUs. This is significantly changing the trajectory of how you power this infrastructure, how you cool this infrastructure, and just how much energy in general could be consumed from these types of deployment in the future.”

A report submitted to Secretary of Energy Jennifer Granholm in late July by a department advisory board working group, tasked with finding alternative ways to generate energy, stated that “connection requests for hyperscale facilities of 300-1000MW or larger with lead times of one to three years are stretching the capacity of local grids to deliver and supply power at that pace.”

It noted, “a significant factor today and in the medium-term (2030+) is expanding power demand of AI applications. Advancements in both hardware and software have enabled development of large language models (LLMs) that now approach human capabilities on a wide range of valuable tasks. As these models have grown larger, so have concerns about sizeable future increases in the energy to deploy LLMs as AI tools become more deeply woven into society.”

The report added that the scale of the potential growth of both the electricity and information technology sectors due to AI is extraordinary, and represents the leading edge of projected electricity demand growth.
Purpose of the test bed

The report stated that the AI test bed “can allow researchers from the national labs, academia, and industry to collaborate in development and assessment of algorithms for energy-efficient and/or energy-flexible AI training and inference, advancing the nation’s AI capabilities and building on the success of comparable public-private efforts that have accelerated advances in high-performance computing.”

For immediate impact, the report’s authors wrote, “the Secretary should convene energy utilities, data center developers and operators, and other key stakeholders to start active dialog on how to address current electricity supply bottlenecks,” as well as “to develop strategies for how to generate and deliver the power needed to sustain AI leadership into the future.”

The recommendation to create a test bed, said Beran, “is really step one for the DoE in terms of understanding what infrastructure is being used, and how much energy it consumes. And then, once we have this starting data point, how do we improve from there? This really made me think of the old saying, ‘you can’t improve what you can’t measure.’ They have to start somewhere and set a base case, and that, to me, is what this is.”

Developing solutions

Beran said the hyperscalers, whose views the working group solicited, face “unsolved problems in how to manage power demands from AI workloads. It is not like the industry has solved the problems or challenges, it is more like, ‘we have identified challenges with the AI workload and energy profile requirements, and now we need to start to develop some solutions for them.’”

Those solutions, said Beran, range from changing how data center facilities are architected to making system changes to accommodate the workload profile. Of particular note, he said, is the need to improve the energy efficiency factor.
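The energy-efficiency factor Beran refers to is most commonly tracked in the data center industry as power usage effectiveness (PUE): the ratio of total facility energy to the energy consumed by IT equipment alone. The DoE report does not prescribe a specific metric, so the following is only a minimal illustrative sketch; the function name and the sample figures are hypothetical.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by
    IT equipment energy. 1.0 is the theoretical ideal; real facilities
    run above that because of cooling and power-conversion overhead."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: a facility drawing 1.5 GWh in total while its
# IT load consumes 1.0 GWh has a PUE of 1.5, meaning a third of its
# energy goes to cooling, power distribution, and other overhead.
print(round(pue(1_500_000, 1_000_000), 2))  # 1.5
```

Lowering this ratio — for example through liquid cooling of dense AI racks — is one of the architectural changes Beran describes.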
He added that while sustainability has “been such a critical factor the past couple of years, it has really started to take a backseat to some of the AI growth requirements. Trying to manage both is important.”

In addition, Thomas Randall, director of AI market research at Info-Tech Research Group, said via email, “as AI models get larger and require more compute power, the amount of energy required to support this market will equally increase. Without a broader energy strategy, countries that house the data centers AI companies are using will face ongoing CO2 emission issues, limitations on growth, and opportunity costs for energy use elsewhere.”