
Nvidia's mass-production and delivery wave is approaching: multiple GPUs and servers across the supply chain are in preparation


Nvidia is about to ship multiple AI GPUs and servers.
According to Taiwan Business Times, upstream chip production for Nvidia's H200 AI GPU will enter mass production in late Q2, with large-volume deliveries expected from Q3 onward.
A supply-chain source noted that the H200 units to be mass-produced and delivered in the third quarter are mainly for Nvidia's DGX H200 systems; for now, however, most pending orders remain concentrated on the H100, and the H200's share is still limited.
Nvidia has said that the H200 may remain in short supply, with demand outstripping supply, into 2025. In fact, in March of this year TSMC's 4nm capacity was nearly fully loaded with H200 production, directly reversing the start-of-year expectation that 4nm capacity utilization would decline.
At the same time, the early launch of Nvidia's Blackwell generation has dampened the willingness of end customers such as Gigabyte and Asus to purchase the H200, and they have shifted to the next-generation B100/B200 series. The B100 reportedly already has some order visibility, with shipments scheduled for the first half of next year.
The B100 and B200 are GPUs based on the Blackwell architecture; the B200 is a key component of the GB200 "superchip" (two B200 GPUs plus an Arm-based Grace CPU). In practice, this chip is usually delivered to customers in AI servers of the GB200 NVL series.
Relatedly, MoneyDJ reported today that Nvidia's GB200 NVL36 servers will also enter mass production in Q4, and some component makers say they have been asked to complete mass-production preparations in Q3.
The news of GB200 server mass production corroborates a recent statement by Zheng Hongmeng, Chairman of Industrial Fulian. He said that both the 36- and 72-GPU versions of the GB200 are progressing smoothly and are expected to launch officially this year, and that in 2024 AI is expected to contribute 40% of the company's cloud computing revenue, with its AI servers taking a 40% share of the global market.
TrendForce's latest research report shows server demand growing from the second quarter into the third. Beyond the already strong AI server orders, shipments have recently been boosted by tender projects and by AI-driven demand for storage servers, lifting second-quarter shipment performance. The momentum is expected to carry into the third quarter, with sequential growth estimated at about 4-5%.
TrendForce also forecasts that global AI server shipments (covering both AI training and AI inference servers) will exceed 1.6 million units in 2024, an annual growth rate of 40%.
According to a June 24 research report from Zhejiang Securities, server products equipped with the GB200 are expected to be mass-produced and shipped in September this year. Demand for related OEM orders is strong, which should help companies across the AI server industry chain deliver on earnings. Demand remains robust across every link of the AI computing-power chain, and the coming GB200 volume ramp is expected to add further growth momentum. At the same time, the ramp-up of device-side AI products is expected to promote AI applications and further strengthen demand for cloud computing power. The report suggests watching investment opportunities in optical modules, liquid cooling, and power supply.