
After Nvidia's blockbuster earnings report and soaring stock price, which giants are still fighting the AI chip battle?

六月清晨搅

What constrains the development of AI as it becomes a new growth engine for global technological breakthroughs? Beyond the large models themselves, the other key factor is computing power. Today, most of the cost of AI training and inference goes to GPU chips. AI startups around the world, large and small, rely on Nvidia's H100 and other chips to train their models. A single H100 sells for $35,000 and is still in short supply.
On February 21st local time, Nvidia announced its fourth-quarter and fiscal-year 2024 results. Revenue for the quarter ended January 28th was $22.1 billion, above analysts' expectations of $20.41 billion, up 22% quarter-on-quarter and a 265% surge year-on-year. Fourth-quarter net profit was $12.3 billion, up 765% year-on-year. Annual revenue reached a record $60.9 billion, up 126%. On the eve of the report, Nvidia's stock had fallen for four consecutive trading days; the better-than-expected results sent the stock up more than 9% in after-hours trading.
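The reported growth rates imply rough prior-period revenues, which can be backed out with simple arithmetic. This is an illustrative sketch only: the implied figures are computed from the rounded numbers above, so they may differ slightly from the actual filings.

```python
# Back out the implied prior-period revenues from the reported
# Q4 FY2024 figures (all values in billions of dollars).
q4_revenue = 22.1    # quarter ended January 28th
qoq_growth = 0.22    # +22% quarter-on-quarter
yoy_growth = 2.65    # +265% year-on-year

# Implied revenue one quarter earlier and one year earlier.
prior_quarter = q4_revenue / (1 + qoq_growth)
year_ago_quarter = q4_revenue / (1 + yoy_growth)

print(round(prior_quarter, 1))     # 18.1
print(round(year_ago_quarter, 1))  # 6.1
```

In other words, the same quarter a year earlier brought in roughly $6 billion, which is what makes a 265% jump possible.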
Nvidia's ascent rode the wave of large AI models. In November 2022, OpenAI released ChatGPT, a chatbot trained on Nvidia's A100 chips. As that news spread, Nvidia went on to post one of the largest single-day gains in stock market history.
In a report, The New Yorker quoted a Wall Street analyst as saying: "There is a war going on in artificial intelligence, and Nvidia is the only arms dealer."
But the scorching battlefield has attracted a crowd of players, with major chip makers and cloud providers all vying for the new AI computing-power track: in April 2023, Amazon released the Inferentia 2 AI inference chip; in May, Meta announced its MTIA AI chip project; in August, Google released the TPU v5e; in November, Microsoft unveiled its Maia 100 AI chip. Beyond these, figures such as Masayoshi Son and Sam Altman are also eyeing the AI chip field.
Can Nvidia still maintain its monopoly position in the field of AI chips and remain the only arms dealer?
Nvidia: A patient monopolist
On February 16th, OpenAI released Sora, a text-to-video model that took the world by storm; by the close of that day, Nvidia's market value had reached $1.79 trillion. In 2023, Nvidia's market value more than doubled, and in just a month and a half this year its stock has risen more than 50% again.
Currently, Nvidia is a monopolist in the field of AI chips. How profitable are its chips? The material cost of an H100 is $3,000; it officially sells for $35,000, and trades on second-hand platforms for $50,000, more than ten times its cost, yet it remains in short supply. As Elon Musk put it, "H100s are harder to buy than drugs."
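The "ten times" figure follows directly from the numbers cited above. A minimal sketch of the markup arithmetic, using the article's reported material cost, list price, and second-hand price (these ratios are illustrative, not gross margins, since R&D and software-ecosystem costs are excluded):

```python
# Markup multiples for the H100, per the figures cited in the article.
material_cost = 3_000   # $ material cost per H100
list_price = 35_000     # official selling price
resale_price = 50_000   # second-hand platform price

# How many times above material cost each price sits.
list_markup = (list_price - material_cost) / material_cost
resale_markup = (resale_price - material_cost) / material_cost

print(round(list_markup, 1))    # 10.7 -> "more than ten times its cost"
print(round(resale_markup, 1))  # 15.7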
How hard currency is Nvidia's H100? It can be used as collateral to borrow money, just like houses, land, and currency. According to the Yuanchuan Research Institute, in August 2023, American startup CoreWeave secured $2.3 billion in debt financing from Wall Street by pledging its H100 chip. Moreover, at that time, the number of H100 chips in CoreWeave's hands was not enough to reach $2.3 billion. However, with Nvidia's commitment to supply the H100 to CoreWeave, a huge loan could be borrowed from the bank, similar to how real estate developers could borrow from banks through land acquisition contracts and land approvals back then.
Global AI startups, big and small, rely on Nvidia's H100 chip for training their models. According to consulting firm Omidia, Nvidia sold nearly 500000 H100 chips in the third quarter of 2023, with Meta and Microsoft being the largest buyers of Nvidia's H100 GPUs, each purchasing up to 150000, far exceeding the number of H100 chips purchased by Google, Amazon, Oracle, and Tencent (each 50000).
Monopoly is not built in a day, it takes time. As early as the 2000s, when AI was stagnant in basic tasks such as image recognition and speech recognition, and its prospects were still uncertain, many AI training sessions were conducted on Nvidia graphics cards. In 2009, Geoffrey Hinton, known as the father of artificial intelligence, used Nvidia's CUDA platform to train neural networks for recognizing human speech. In 2012, his students Alex Krizhevsky and Ilya Sutskever (later became the chief scientist of OpenAI) purchased two GeForce graphics cards from Amazon due to budget constraints. In 2016, Nvidia delivered the first dedicated artificial intelligence supercomputer DGX-1 to a research team at OpenAI. Huang Renxun personally brought this computer to OpenAI's office, and Elon Musk, then the chairman of OpenAI, opened the packaging with a paper cutter.
In March 2023, OpenAI released GPT-4, adding another spark to the entire artificial intelligence industry. According to information obtained by chip research firm SemiAnalysis, OpenAI trained with 25000 Nvidia A100 GPUs for over three months before creating the GPT-4 model.
But as Huang Renxun himself said, "We don't need to pretend that the company is always in danger, we are always in danger.". The field of AI chips is already surrounded by multiple giants: Sun Zhengyi of SoftBank Group is still preparing to invest heavily in AI chips worth $100 billion, with 70% of the funds coming from the Middle East; Microsoft is collaborating with Intel to manufacture chips to reduce dependence on Nvidia; More rookie teams have innovated the underlying logic of AI chips, and the Groq team, which jumped out of Google's early TPU team, has adopted a new technological route to design chips.
SoftBank's Sun Zhengyi's Money Making Ability for Chip Manufacturing
According to Bloomberg on February 15th, SoftBank founder Sun Zhengyi is considering creating a $100 billion chip company to develop AI chips. According to insiders, one scenario that Sun Zhengyi is considering is that SoftBank will provide $30 billion, with an additional $70 billion possibly coming from Middle Eastern institutions. A few days after the news was disclosed, SoftBank Group's stock rose by 3%.
The code name for this chip project is Izanagi, which in Japanese refers to the god who created heaven and earth in Japanese mythology, demonstrating Sun Zhengyi's ambition. If the Izanagi chip project is successful, it will occupy about one-fifth of the global semiconductor market share.
According to insiders, this new company is aimed at complementing Arm, a chip design company under SoftBank. After a series of setbacks in SoftBank Group's investments a few years ago, Arm's chip design business is currently one of SoftBank Group's key areas of focus. Arm does not manufacture chips, but designs instruction sets for modern chips, with customers including Qualcomm, Apple, and Samsung. SoftBank Group holds approximately 90% of Arm's shares, and in February of this year, Arm's stock price has risen by 67%.
Sun Zhengyi also likes to gamble heavily, and at multiple historical nodes, he sees the trend of the times and places his bets widely. In 2000, when Alibaba was still a nobody and the Internet economy had not yet taken off in the mobile era, Sun Zhengyi invested 20 million dollars in Alibaba after chatting with Ma Yun for several decades. Because he thought the Internet was promising, Sun Zhengyi invested heavily in Yahoo in 1995. Because he believed that technology had prospects, Sun Zhengyi purchased 80% of Kingston's shares for $1.5 billion in 1996. Of course, Sun Zhengyi's wide-ranging investment has both gains and losses.
Sun Zhengyi, who has a keen sense of the times, naturally won't miss out on the new track of AI today. According to Bloomberg, Sun Zhengyi said, "AGI is the goal pursued by every artificial intelligence expert, but when you ask them for detailed definitions, numbers, time, computing power, and how much AGI is smarter than human intelligence, most of them have no answer. I have my own answer: I believe AGI will become a reality within 10 years."
It is worth mentioning that Middle East Capital is a major player behind the chip market and also one of Sun Zhengyi's main sources of funding. 70% of the funding for the Izanagi chip project comes from Middle Eastern institutions. Previously, the Vision Fund established by Sun Zhengyi had $100 billion in funding, which is one of the largest technology investment pools in the world, with $45 billion coming from the Saudi Arabian National Fund and $15 billion from the UAE National Fund.
Microsoft's chip maker who doesn't want to be controlled by others
In the new arena of AI, Microsoft, Google, and Amazon have also made a profit by selling cloud services.
Microsoft is one of Nvidia's largest customers, but it is also reducing its reliance on Nvidia chips through various means. In terms of self-developed network cards and chips, according to The Information on February 20th, Microsoft is developing a network card to ensure fast data movement between its servers as a replacement for the network cards provided by Nvidia.
According to reports, Microsoft is developing its own Maia AI server chip, which will be installed in data centers this year. The new network card it is developing can also improve the performance of Maia chips.
Microsoft is also partnering with Intel to make chips. According to a report by The Wall Street Journal on February 22nd, Microsoft CEO Satya Nadella said at an event at Intel that Microsoft is designing chips and will manufacture them at one of Intel's most advanced factories. Nadella did not specify which chip Intel would produce for it, but in recent months Microsoft has been seeking to strengthen its chip design capabilities, including a new chip launched last year for artificial intelligence computing.
Previously, Microsoft had purchased a large number of AMD chips. In December 2023, AMD launched the Instinct MI300X chip, designed for pure GPUs, which can provide breakthrough performance for AI work execution. According to a report by foreign media Seeking Alpha, Citigroup's analysis report indicates that Microsoft's data center department is the largest buyer of AMD Instinct MI300X chips and has started working on large language models (LLMs) such as the GPT-4.
The emerging new force Groq
Just two days before Nvidia's financial report was released, Nvidia suddenly faced a formidable rival.
On February 21st, a startup company named Groq exploded in the AI industry. In traditional generative AI, generating answers through question answering takes time, with characters popping out one by one. According to Tencent Technology, on Groq's cloud service experience platform, when a model receives a prompt, it can almost immediately generate an answer, one screen per second.
Groq leads the team. Groq official website
According to the Science and Technology Innovation Board Daily, 8 members of Groq's founding team came from Google's early TPU core design team, which had only 10 members. Although the team originated from Google TPU, Groq did not choose the TPU route, nor did it prioritize routes such as GPU and CPU. Groq has chosen a brand new system route - LPU (Language Processing Unit).
"We are not doing big models," Groq said. "Our LPU inference engine is a new type of end-to-end processing unit system that can provide the fastest inference speed for computationally intensive applications such as AI big models."
At present, the price of Groq's LPU card is around $20000, which is much cheaper than Nvidia's H100 GPU. The Groq team is expected to become a new force in the field of AI chips, overtaking on a new technological path.
There are more players who layout AI chips than just these.
Just last month, Sam Altman visited South Korean chip giants Samsung and SK to explore the possibility of collaborating in chip manufacturing. Prior to this, he also had contact with Intel and TSMC. Sam Altman even proposed using $5-7 trillion to build AI chips. Many people think Sam Altman is crazy, but he sees the real bottleneck in the development of AI - computing power.
At present, the tenfold premium of Nvidia's AI chips comes from both the huge research and development costs of the chip and software ecosystem, as well as from monopolies.
According to Li Bojie, a domestic AI entrepreneur and founder of Logenic AI, there is a saying in the artificial intelligence industry that as long as the demand and production of chips are large enough, the chips themselves are the price of sand.
So, if the AI race advances rapidly and Nvidia chips break their monopoly, more countries and regions will welcome chip freedom, but what we need to face will be even more brand new problems.
CandyLake.com 系信息发布平台,仅提供信息存储空间服务。
声明:该文观点仅代表作者本人,本文不代表CandyLake.com立场,且不构成建议,请谨慎对待。
您需要登录后才可以回帖 登录 | 立即注册

本版积分规则

  •   过去一周的时间里,有关苹果微信“二选一”的话题持续霸占各个平台热搜,甚至有媒体还在微博发起了“如果苹果微信二选一,你选择iPhone还是微信?”的投票,当然结果是微信取得了压倒性的胜利。   从最新的 ...
    lub_pig
    昨天 17:05
    支持
    反对
    回复
    收藏
  •   今日,特斯拉AI团队发布产品路线图,其中,预计2025年第一季度在中国和欧洲推出完全自动驾驶(FSD),但仍有待监管批准。   自2016年以来,马斯克一直在探索特斯拉的FSD自动驾驶方案。2024年,特斯拉FSD V12 ...
    seisei
    前天 16:32
    支持
    反对
    回复
    收藏
  • 【全球市场】1、道指跌0.54%,纳指涨0.25%,标普跌0.30%。2、特斯拉涨近5%,亚马逊涨超2%。3、纳斯达克中国金龙指数涨0.88%,蔚来涨超14%。
    wishii
    昨天 22:03
    支持
    反对
    回复
    收藏
  • 【ASML CEO回应对华出口限制:会有更多应对措施】当地时间9月4日,荷兰计算机芯片设备供应商ASML首席执行官Christophe Fouquet在花旗银行的一场会议上表示,美国限制ASML对华出口是出于“经济动机”。他预计该公司应 ...
    mbgg2797
    前天 09:15
    支持
    反对
    回复
    收藏
六月清晨搅 注册会员
  • 粉丝

    0

  • 关注

    0

  • 主题

    30