Global demand for AI servers is expected to push the market value to $187 billion in 2024, accounting for 65% of the total server market, says TrendForce

TrendForce’s latest AI server industry report shows that strong demand for advanced AI servers from major CSPs and brand customers is expected to continue in 2024. Meanwhile, the gradual capacity expansion of TSMC, SK hynix, Samsung, and Micron significantly alleviated supply bottlenecks in Q2 2024. Consequently, the lead time for NVIDIA’s flagship H100 solution has shortened from the previous 40-50 weeks to less than 16 weeks.

TrendForce expects AI server shipments to increase nearly 20% quarter-on-quarter in the second quarter and has revised its annual shipment forecast upward to 1.67 million units, representing year-on-year growth of 41.5%.

TrendForce notes that major CSPs continue to focus their budgets on AI server procurement this year, crowding out the growth momentum of general servers. Compared with the high growth rate of AI servers, the annual growth rate of general server shipments is only 1.9%. The proportion of AI servers in total server shipments is expected to reach 12.2%, an increase of about 3.4 percentage points from 2023.

In terms of market value, AI servers contribute significantly more to revenue growth than general servers. The market value of AI servers is expected to exceed $187 billion in 2024, with a growth rate of 69%, accounting for 65% of the total server market value.

North American CSPs (e.g., AWS, Meta) are continuously expanding their proprietary ASICs, and Chinese companies such as Alibaba, Baidu, and Huawei are actively expanding their own ASIC AI solutions. As a result, the share of ASIC servers in the overall AI server market is expected to rise to 26% in 2024, while mainstream GPU-equipped AI servers will account for about 71%.

Among AI chip suppliers for AI servers, NVIDIA holds the highest market share—nearly 90% for GPU-equipped AI servers—while AMD’s market share is only about 8%. However, if you include all AI chips (GPU, ASIC, FPGA) used in AI servers, NVIDIA’s market share this year is about 64%.

TrendForce observes that demand for advanced AI servers is expected to remain strong through 2025, especially as NVIDIA’s next-generation Blackwell platform (including GB200 and B100/B200) replaces the Hopper platform as the market mainstream. This will also drive demand for CoWoS and HBM. NVIDIA’s B100 will have a die size twice that of the H100, consuming more CoWoS capacity. Major supplier TSMC’s CoWoS production capacity is estimated to reach 550,000-600,000 units by the end of 2025, a growth rate of nearly 80%.

The mainstream H100 will be equipped with 80GB of HBM3 in 2024. By 2025, mainstream chips such as NVIDIA’s Blackwell Ultra or AMD’s MI350 are expected to be equipped with up to 288GB of HBM3e, roughly tripling HBM usage per device. Total HBM supply is expected to double by 2025 as demand in the AI server market remains strong.

TrendForce analysts will be giving presentations at FMS24 – Future of Memory and Storage from August 6th to 8th. The presentations will cover topics such as HBM, memory (DRAM/NAND Flash), servers, AI servers, storage, and developments in technology and capacity. Analysts will also be available for meetings at booth #956. If these topics interest you, we invite you to contact us; feel free to schedule an appointment or visit our booth!
