Supermicro announced the launch of a new storage system optimized for AI workloads that pairs multiple Nvidia BlueField-3 data processing units (DPUs) with an all-flash array.

The new Just a Bunch of Flash (JBOF) system comes in a 2U chassis that can house up to four BlueField-3 DPUs. These units support 400Gb/s Ethernet or InfiniBand networking and provide hardware acceleration for demanding storage and networking workloads.

The Supermicro JBOF replaces the traditional storage CPU and memory subsystem with the BlueField-3 DPU and runs the storage application on the DPU’s 16 Arm cores. BlueField-3 accelerates network traffic through hardware support for RoCE (RDMA over Converged Ethernet), GPUDirect Storage, and GPU-initiated storage.

The dual-port JBOF architecture is designed for active-active clustering, providing high availability for scale-up storage applications as well as scale-out storage such as object storage and parallel file systems. The system supports 24 or 36 SSDs, for a maximum of 1.105PB of raw capacity using 30.71TB SSDs.

“Our balanced network and storage I/O design can saturate the full 400 Gb/s BlueField-3 line-rate realizing more than 250GB/s bandwidth of the Gen 5 SSDs,” said Charles Liang, president and CEO of Supermicro, in a statement.

As part of its collaboration with Nvidia, Supermicro is building a JBOF ecosystem that includes data platform company Hammerspace and object storage provider Cloudian, enabling these storage infrastructure software platforms to run natively on the BlueField-3 DPU.
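As a quick sanity check on the capacity figure above (a back-of-the-envelope calculation, not from Supermicro's materials; it assumes the 36-drive configuration fully populated with 30.71TB SSDs and uses decimal units, where 1PB = 1,000TB):

```python
# Rough check of the quoted ~1.105PB raw capacity figure.
drives = 36               # maximum SSD count in the 2U JBOF
capacity_tb = 30.71       # per-SSD raw capacity, decimal TB

total_tb = drives * capacity_tb
total_pb = total_tb / 1000

print(f"{total_tb:.2f} TB = {total_pb:.3f} PB")  # 1105.56 TB = 1.106 PB
```

That works out to roughly 1.105PB, consistent with the raw capacity Supermicro quotes for the 36-drive configuration.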