Nvidia's testing of data-processing units in servers indicates that the DPUs free up CPUs to do more of the work they are designed for and also reduce the servers' overall power draw.

The chip maker says tests of its BlueField-2 data-processing units (DPUs) in servers showed significant power savings compared with servers that don't use the specialized chips to offload tasks from the CPUs. The DPUs, or SmartNICs, take on certain workloads, such as packet routing, encryption, and real-time data analysis, leaving the CPU free to process data. But Nvidia says they can also reduce power consumption.

The four tests involved running similar workloads on servers with and without DPUs, and Nvidia concluded that even with the additional power draw of the DPUs, overall power consumption by the servers dropped. For example, one test found that when a DPU took on IPsec encryption processing, the server used 21% less power for the task than when the CPU handled it alone: 525 W with the DPU versus 665 W without.

"I can't speak for others," said Ami Badani, vice president of marketing and developer ecosystem strategy at Nvidia. "But for the workloads that we've tested, if you run those same workloads with a DPU in those servers, you would ultimately need fewer servers to run those same workloads."

Nvidia is not the only DPU maker; competitors Intel, AMD, and Marvell also sell them. (Nvidia gained its BlueField-2 DPU line through its acquisition of Mellanox in 2019.) The tests were run in cooperation with Ericsson, VMware, and an unnamed North American wireless carrier.

In the best case, Nvidia says, offloading specific networking tasks to a BlueField DPU reduced power consumption by as much as 34%, or up to 247 watts per server. That could reduce the number of servers needed in certain data centers, the company says.

How much that translates into dollar savings depends on the price of electricity and the power usage effectiveness (PUE) of the data center, Nvidia says. PUE is the ratio of the total power drawn by a data center to the amount used to power the IT equipment within it. (A rough way to estimate those savings is sketched at the end of this article.)

However, data centers are unlikely to cash in by simply getting rid of servers, Badani said. "In reality, what will happen is instead of most enterprises saying, 'I'm just going to return five servers that I didn't need,' most folks will repurpose those servers for other workloads," she said.

Still, the power savings, and any reduction in server count, could help enterprises with their environmental, social, and governance (ESG) initiatives, Badani said. "Saving cores ultimately means saving servers, so you don't need the capacity that you originally needed for those same workloads," she said.
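To put the PUE point in concrete terms, here is a minimal back-of-the-envelope sketch of how per-server power savings could be turned into a dollar figure. It is not from Nvidia; the server count, PUE value, and electricity price below are illustrative assumptions, and only the 247 W figure comes from the tests described above.

```python
# Back-of-the-envelope estimate of annual electricity-cost savings from
# DPU offload. Illustrative sketch only; parameter values are assumptions.

def annual_savings_usd(watts_saved_per_server: float,
                       num_servers: int,
                       pue: float,
                       price_per_kwh: float) -> float:
    """Estimate yearly electricity-cost savings in dollars.

    watts_saved_per_server: reduction in per-server power draw
        (Nvidia's best-case test showed up to 247 W).
    pue: power usage effectiveness; facility-level savings scale with PUE
        because cooling and power-distribution overhead shrink as well.
    price_per_kwh: electricity price in dollars per kilowatt-hour.
    """
    hours_per_year = 24 * 365
    kwh_saved = watts_saved_per_server * num_servers * hours_per_year / 1000.0
    return kwh_saved * pue * price_per_kwh


if __name__ == "__main__":
    # Assumed example: 1,000 servers, 247 W saved per server, PUE of 1.5,
    # $0.10 per kWh -- roughly $325,000 per year.
    print(f"${annual_savings_usd(247, 1000, 1.5, 0.10):,.0f} per year")
```

Whether applying PUE as a simple multiplier is appropriate depends on how closely a facility's cooling and distribution overhead tracks its IT load, so the result is best read as a rough estimate rather than a forecast.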