The US Department of Energy is funding 15 projects aimed at developing energy-efficient cooling technologies for data centers.

The Department of Energy has awarded $40 million to 15 vendors and university labs as part of a government program that aims to reduce the portion of data centers’ power usage devoted to cooling to just 5% of their total energy consumption.

The DOE’s Advanced Research Projects Agency–Energy (ARPA-E) is providing the funding to jumpstart a program called COOLERCHIPS, an acronym for Cooling Operations Optimized for Leaps in Energy, Reliability, and Carbon Hyperefficiency for Information Processing Systems.

Chip cooling accounting for just 5% of total energy consumption would translate to a PUE of 1.05. (Power usage effectiveness, or PUE, is a metric for measuring data center efficiency: the ratio of the total amount of energy used by a data center facility to the energy delivered to its computing equipment.) While some extremely advanced data centers use liquid cooling and immersion cooling to get down to that level of power consumption, the average PUE for an enterprise data center is around 1.5, according to the Uptime Institute.

US Secretary of Energy Jennifer Granholm says the motivation behind the program is to bring down the power draw of data centers and minimize their environmental impact. “The DOE is funding projects that will ensure the continued operation of these facilities while reducing the associated carbon emissions to beat climate change and reach our clean energy future,” Granholm said in a statement.

The 15 recipients were awarded between $1.2 million and $5 million each to pursue a variety of cooling technologies, mostly centered on liquid chip cooling but also including modular data-center design.

For example, Nvidia will receive $5 million to develop its “Green Refrigerant Compact Hybrid System for Ultra-Efficient and Sustainable HPC Cooling,” a cooling system that combines direct-to-chip cooling with single- and two-phase immersion in a rack manifold featuring built-in pumps and a liquid-vapor separator.

The University of California at Davis was awarded $3.5 million to develop “Holistic Modular Energy-efficient Directed Cooling Solutions (HoMEDiCS) for Edge Computing.” Its design extracts heat from CPUs and GPUs with a liquid-cooled loop and high-efficiency, low-cost heat exchangers.

Flexnode was awarded $3.5 million to develop a prefabricated, modularly designed edge data center that can be assembled like Lego bricks.

The $40 million is peanuts compared with the $52.7 billion in subsidies and grants going to the US semiconductor manufacturing industry under the CHIPS Act. But every bit helps, says Jim McGregor, principal analyst with TIRIAS Research.

“It is not surprising to see the investment by the DOE into data center cooling solutions, and the department appears to be spreading the funding rather wide. From my standpoint, this all plays into the US government’s investment in technology, which includes the CHIPS Act,” he said.

“The technology value chain is very complex. To be globally competitive, the US must have competitive solutions for the entire value chain. And, it is good to have state-of-the-art technology for US government and military applications,” McGregor added.
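For readers who want the PUE arithmetic spelled out, here is a minimal sketch. The figures and the simplifying assumption that cooling is the only overhead beyond the IT load are illustrative, not drawn from the DOE program itself:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical numbers; assume cooling is the only overhead beyond the IT load.
it_load = 100.0                       # energy delivered to computing equipment (kWh)
cooling_at_5_percent = 0.05 * it_load # cooling overhead at the COOLERCHIPS target
cooling_at_50_percent = 0.50 * it_load # cooling overhead at roughly today's enterprise average

print(pue(it_load + cooling_at_5_percent, it_load))   # ~1.05
print(pue(it_load + cooling_at_50_percent, it_load))  # ~1.5
```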