Release date: 2024-10-17 | Source: Kinghelm
On October 16, at the "AI Era Semiconductor Global Outlook 2025" seminar hosted by market research firm TrendForce, senior analyst Gong Mingde shared his predictions for the AI server market. With robust demand for AI infrastructure from cloud service providers (CSPs) and major brand clients, global shipments of AI servers (including systems equipped with GPUs, FPGAs, and ASICs) are set to grow significantly.
Expected Growth in AI Server Shipments
According to Gong, AI server shipments are projected to increase by 42% year-over-year in 2024, driven largely by demand from CSPs and sovereign clouds. In 2025, shipments are expected to rise by a further 28%, lifting AI servers' share of the overall server market to nearly 15%. Looking further ahead, AI servers could account for close to 19% of the total market by 2027, thanks to advances in cloud-based AI training and inference applications.
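As a rough illustration of how those year-over-year rates compound, the short Python sketch below multiplies them out; only the 42% and 28% growth rates come from the forecast above, while the normalized 2023 baseline is a placeholder used for illustration, not a TrendForce figure.

```python
# Compound the forecast growth rates quoted above.
# Only the 42% (2024) and 28% (2025) rates come from the article;
# the normalized 2023 baseline of 1.00 is a placeholder, not a real shipment count.
baseline_2023 = 1.00                     # normalized 2023 AI server shipments
shipments_2024 = baseline_2023 * 1.42    # +42% year-over-year in 2024
shipments_2025 = shipments_2024 * 1.28   # +28% year-over-year in 2025

print(f"2024: {shipments_2024:.2f}x the 2023 level")  # 1.42x
print(f"2025: {shipments_2025:.2f}x the 2023 level")  # ~1.82x, i.e. shipments
                                                      # nearly double over two years
```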
Nvidia’s Dominance in the AI GPU Market
Nvidia remains the standout leader in this booming market. Gong noted that Nvidia's high-end GPU shipments are expected to grow by more than 150%, with the H200 and other H-series GPUs serving as the main drivers and pushing Nvidia's share of the AI GPU market to nearly 90% in 2024. The upcoming Blackwell GPU platform, set to launch in the first half of next year, is anticipated to see its share of shipments ramp from 4% to 84%.
Rising Competitors in AI Chips
While Nvidia is currently leading the pack, other suppliers such as AMD and Intel are also making strides in the AI server chip market. These companies are actively developing their own AI chip solutions, contributing to growth momentum in AI chip shipments, which are expected to see a significant increase by 2025. This demand is anticipated to drive the adoption of advanced packaging technologies like CoWoS, as well as high-bandwidth memory (HBM), with shipment volumes expected to double.
The Shift to Liquid Cooling Solutions
As computing power increases, effective cooling in data centers is becoming ever more important. TrendForce analyst Qiu Peiwen highlighted that with the thermal design power (TDP) of Nvidia's GB200 NVL72 cabinet solution reaching approximately 140 kW, liquid cooling will be essential to manage heat effectively. Initially, the preferred method will likely be Liquid-to-Air (L2A) cooling.
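For a sense of scale, the back-of-envelope sketch below compares that cabinet-level figure against a conventional air-cooled rack. The 140 kW figure is taken from the forecast above; the roughly 20 kW air-cooled rack budget and the 72-GPU count per cabinet are assumptions used only for illustration.

```python
# Back-of-envelope heat-load comparison for a GB200 NVL72 cabinet.
# The 140 kW cabinet TDP is from the article; the ~20 kW air-cooled rack budget
# and the 72 GPUs per cabinet are illustrative assumptions, not TrendForce figures.
cabinet_tdp_kw = 140           # approximate TDP of one GB200 NVL72 cabinet
air_cooled_budget_kw = 20      # assumed practical ceiling for an air-cooled rack
gpus_per_cabinet = 72          # assumed GPU count implied by the "NVL72" name

print(f"Heat per GPU slot: ~{cabinet_tdp_kw / gpus_per_cabinet:.1f} kW")                     # ~1.9 kW
print(f"Cabinet load vs. air-cooled budget: ~{cabinet_tdp_kw / air_cooled_budget_kw:.0f}x")  # ~7x
# A heat load several times what air cooling comfortably handles is why liquid
# cooling, starting with Liquid-to-Air (L2A), becomes the practical choice.
```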
In addition to Nvidia, major U.S. cloud providers like Google, Amazon Web Services (AWS), and Microsoft are also exploring liquid cooling solutions. Google has been particularly proactive, using a combination of air and liquid cooling for its TPU chips. In contrast, other CSPs still primarily rely on air cooling.
The ESG Factor
As global awareness of Environmental, Social, and Governance (ESG) issues rises, there will likely be a shift from air cooling to liquid cooling solutions. This transition is expected to accelerate, with the penetration rate of liquid cooling for AI chips projected to increase from 11% in 2024 to 24% in 2025.
Conclusion
The landscape of AI servers is evolving rapidly, driven by strong demand for AI infrastructure and cutting-edge technology advancements. With key players like Nvidia leading the charge and a growing emphasis on efficient cooling solutions, the future of AI servers looks promising. As these trends unfold, businesses and consumers alike will benefit from the enhanced performance and capabilities of AI technology in the years to come.