
Apple's AI models team up with Google's TPU: should Nvidia feel threatened?

因醉鞭名马幌

Customers that are also competitors: this has long been a recurring concern whenever outside observers look at Nvidia.
Microsoft, Google, Amazon and other cloud computing giants have bought large numbers of Nvidia GPUs over the past year, yet these same major customers are also pursuing their own chip development plans. Jensen Huang has acknowledged this before, saying that Nvidia faces more competition than "anyone else on Earth".
Even now, with Nvidia in the limelight, the pattern continues. On July 29, Reuters reported that Apple disclosed in a research paper that the AI models it is developing run on Google's TPU (Tensor Processing Unit) as the underlying hardware, rather than the Nvidia GPUs (graphics processing units) that dominate the industry.
As the last of the tech giants to enter the fray, Apple has no publicly reported record of large-scale Nvidia GPU purchases, and the market has long speculated about whose chips would win Apple's AI business. According to the paper, Apple used 2,048 TPUv5p and 8,192 TPUv4 chips to support the training and inference of the AI models that run on iPhones and other devices.
Nvidia has so far not commented on or responded to the news.
The TPU was originally a specialized chip Google designed for its TensorFlow machine learning framework. Since its launch in 2015, the TPU has advanced to its sixth generation, keeping up a cadence of roughly one new iteration per year. The chip was initially reserved for Google's internal use; although it was later made available externally through Google Cloud, Google has made no major push to expand it beyond its own services.
Last year's explosion of large AI models set off a scramble for AI chips, with high-end GPUs in short supply. Nvidia holds more than 80% of the market in this field and is the undisputed leader. At the same time, Google has been quietly building out the TPU. According to TechInsights, Google deployed an estimated 2 million-plus TPU chips for its own use last year, a volume second only to Nvidia and Intel, making it the world's third-largest data-center chip designer.
Despite having its own TPU, Google remains one of the world's largest buyers of Nvidia GPUs. Market research firm Omdia compiled a list of the biggest buyers of Nvidia's sought-after H100 GPUs last year: Meta and Microsoft tied for first place with roughly 150,000 units each, while Google, Amazon, Oracle, and Tencent each purchased about 50,000, tying for second place.
Google Cloud also worked closely with Nvidia last year. Google not only uses Nvidia GPUs internally, but also offers Nvidia GPU-based services on its cloud platform to meet customers' demand for high-performance computing and AI applications.
Beyond Google, cloud giants such as Amazon Web Services and Microsoft are developing their own chips based on the Arm architecture. These in-house chip efforts have long been viewed by outsiders as a threat to Nvidia. Nvidia, however, has consistently insisted that it holds a unique advantage in the face of such competition.
As early as 2017, when Google launched its second-generation TPU, Jensen Huang said in an interview with CNBC that he was "not worried about competition from Google's TPU". In his view, even if some major cloud customers develop their own AI server chips to reduce their dependence on Nvidia, Nvidia can still hold its lead in AI on the strength of its GPUs' performance.
Cost has always been Nvidia's trump card for its products. Jensen Huang's pitch of "the more you buy, the more you save" rests on economies of scale: average cost falls as volume rises. So although the upfront outlay for a large Nvidia GPU purchase is high, the argument goes, high-performance GPUs last longer, cost less to maintain, and deliver a lower total cost of ownership (TCO) over time. As Huang has put it, even if a competitor's chips were free, they would still not end up cheap enough.
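To make that TCO argument concrete, here is a minimal back-of-the-envelope sketch in Python. All figures (purchase prices, power draw, electricity rates, lifetimes, throughput ratios) are hypothetical placeholders, not actual Nvidia or Google numbers; the point is only that a higher sticker price can still yield a lower cost per unit of work once performance, lifetime, and operating costs are factored in.

```python
# Hypothetical total-cost-of-ownership (TCO) comparison between a
# higher-priced, higher-performance accelerator and a cheaper alternative.
# All numbers below are illustrative placeholders, not real chip specs.

def tco_per_unit_of_work(purchase_price, power_kw, electricity_per_kwh,
                         lifetime_years, relative_throughput):
    """Rough cost per unit of training throughput over the chip's lifetime."""
    hours = lifetime_years * 365 * 24
    energy_cost = power_kw * hours * electricity_per_kwh
    total_cost = purchase_price + energy_cost
    total_work = relative_throughput * hours  # arbitrary "work units"
    return total_cost / total_work

# Chip A: expensive but fast (stand-in for a high-end GPU).
a = tco_per_unit_of_work(purchase_price=30_000, power_kw=0.7,
                         electricity_per_kwh=0.10, lifetime_years=5,
                         relative_throughput=2.0)

# Chip B: cheaper but slower (stand-in for a lower-cost alternative).
b = tco_per_unit_of_work(purchase_price=15_000, power_kw=0.6,
                         electricity_per_kwh=0.10, lifetime_years=5,
                         relative_throughput=0.8)

print(f"Chip A cost per work unit: {a:.4f}")
print(f"Chip B cost per work unit: {b:.4f}")
# With these placeholder numbers, Chip A comes out cheaper per unit of work
# despite the higher purchase price -- the gist of "the more you buy,
# the more you save".
```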