Nvidia, along with AMD and other leading ASIC chip manufacturers, is poised to drive the supply. However, ongoing restrictions on exports to China, the shift toward proprietary in-house technologies, and intensified competition may present hurdles.

Microsoft, Google, AWS, and Meta will collectively account for over 60% of global demand for high-end AI servers in 2024, according to a report from TrendForce. Microsoft will account for 20.2% of the demand, followed by Google at 16.6%, AWS at 16%, and Meta at 10.8%, for a combined 63.6%.

Challenges amid global demand surge

While demand from the cloud-tech giants could rise exponentially, AI server makers could face significant challenges amid geopolitical tensions, supply chain constraints, and competition.

A major hurdle is the US ban on technology exports, which is prompting China to pursue self-reliance in AI chip development and elevating Huawei as a formidable competitor. This move could undermine the appeal of Nvidia’s China-tailored H20 series due to cost-effectiveness issues. During the company’s latest earnings call, Nvidia said that the US restrictions had forced it to suspend its operations in China and revise its product offerings for the Chinese market.

The company also said that its latest Hopper GPU, the main driver of its data center revenue, presents a potential concern due to expected supply constraints, as demand substantially outstrips supply.

The Nvidia vs AMD debate

The TrendForce report also highlighted an increasing move toward proprietary ASIC development by cloud giants such as Google, AWS, Microsoft, and Meta, driven by scalability and cost efficiency.

Additionally, AMD is intensifying competition by offering products at 60%–70% of the cost of comparable Nvidia models, adopting a more cost-effective strategy. “This allows AMD to penetrate the market more aggressively, especially with flagship clients,” the report said. “Microsoft is expected to be the most enthusiastic adopter of AMD’s high-end GPU MI300 solutions in 2024.”

During a recent investor conference, tech giants Meta, OpenAI, and Microsoft announced their plans to adopt AMD’s latest AI chip, the Instinct MI300X, according to a report by CNBC. This move is indicative of a broader industry trend in which technology firms are exploring cost-effective alternatives to Nvidia’s high-priced graphics processors.

“AMD is under tremendous pressure, primarily because of two things at work here,” said Sanchit Vir Gogia, chief analyst and CEO at Greyhound Research. “Firstly, the way Nvidia is sprucing up its product line, and secondly, the way it’s increasing the efficiency in each new product. At the same time, it’s pursuing a very aggressive pricing strategy as well, which is clearly adding a lot of pressure on its competitors.”

Nvidia’s goals amid the persisting concerns

Given the challenges, Nvidia is strategically overhauling its product range to set a robust course for its future. “Estimating Nvidia’s recent response to competition, it is expected that they will introduce different product lines (such as the H100/H200/B series) and adopt a more aggressive pricing strategy,” said Frank Kung, senior analyst at TrendForce.
“In anticipation of fierce competition from main rivals like AMD, who are aiming to capture North American CSP (cloud service provider) customers and penetrate the CSP AI market where Nvidia has traditionally held a dominant position, it is predicted that AMD will offer comparable products (e.g., MI300 vs H100) with lower pricing to gain market share.”

Meanwhile, analysts also point out that the market remains large and open to anyone who can provide the right solutions.

“This space needs more active competition,” Gogia said. “One vendor is just not enough to fulfill the demand in the coming times. There’s already a shortage in the market and a wait. Also, not everybody needs an advanced chip, because a lot of these use cases will be limited in the kinds of elements they will use. So, you need an entire range of products to do justice to different use cases.”