Microsoft, Google, AWS, and Meta will collectively account for over 60% of global demand for high-end AI servers in 2024, according to a report from TrendForce. Microsoft will account for 20.2% of the demand, followed by Google at 16.6%, AWS at 16%, and Meta at 10.8%, for a combined 63.6%.

Nvidia, along with AMD and other leading ASIC chip manufacturers, is poised to drive the supply. However, ongoing restrictions on exports to China, the rise of proprietary technologies, and intensified competition may present hurdles.

Challenges amid global demand surge

While demand from the cloud-tech giants could rise exponentially, AI server makers could face significant challenges amid geopolitical tensions, supply chain constraints, and competition.

A major hurdle is the US ban on technology exports, which has prompted China to pursue self-reliance in AI chip development and positioned Huawei as a formidable competitor. The push could also undermine the appeal of Nvidia's China-tailored H20 series on cost-effectiveness grounds.

During the company's latest earnings call, Nvidia said that the US restrictions had forced it to suspend its operations in China and revise its product offerings for the Chinese market. The company also said that its latest Hopper GPU, the main driver of its data center revenue, faces expected supply constraints, with demand substantially outstripping supply.

The Nvidia vs AMD debate

The TrendForce report also highlighted an increasing move toward proprietary ASIC development by cloud giants such as Google, AWS, Microsoft, and Meta, driven by scalability and cost efficiency.

Additionally, AMD is intensifying competition by offering products at 60%–70% of the cost of comparable Nvidia models, a deliberately cost-effective strategy. "This allows AMD to penetrate the market more aggressively, especially with flagship clients," the report said. "Microsoft is expected to be the most enthusiastic adopter of AMD's high-end GPU MI300 solutions in 2024."

During a recent investor conference, tech giants Meta, OpenAI, and Microsoft announced plans to adopt AMD's latest AI chip, the Instinct MI300X, according to a report by CNBC. The move reflects a broader industry trend of technology firms exploring cost-effective alternatives to Nvidia's high-priced graphics processors.

"AMD is under tremendous pressure, primarily because of two things at work here," said Sanchit Vir Gogia, chief analyst and CEO at Greyhound Research. "Firstly, the way Nvidia is sprucing up its product line, and secondly, the way it's increasing the efficiency in each new product. At the same time, it's pursuing a very aggressive pricing strategy as well, which is clearly adding a lot of pressure on its competitors."

Nvidia's goals amid the persisting concerns

Given the challenges, Nvidia is strategically overhauling its product range to set a robust course for its future. "Estimating Nvidia's recent response to competition, it is expected that they will introduce different product lines (such as the H100/H200/B series) and adopt a more aggressive pricing strategy," said Frank Kung, senior analyst at TrendForce.
"In anticipation of fierce competition from main rivals like AMD, who are aiming to capture North American CSP (cloud service provider) customers and penetrate the CSP AI market, where Nvidia has traditionally held a dominant position, it is predicted that AMD will offer comparable products (e.g., the MI300 versus the H100) at lower prices to gain market share."

Meanwhile, analysts also point out that the market remains large and open to anyone who can provide the right solutions.

"This space needs more active competition," Gogia said. "One vendor is just not enough to fulfill the demand in the coming times. There's already a shortage in the market and a wait. Also, not everybody needs an advanced chip, because a lot of these use cases will be limited in the kinds of elements they will use. So, you need an entire range of products to do justice to different use cases."