Google opens up Gemma, a lightweight large model: is the era of general-purpose AI arriving?
hughmini
Posted on 2024-02-22 11:24:19
Google released Gemma, a new artificial-intelligence "open model," on February 21. Opening the model means external developers can build it into their own products. With this move, Google becomes another major technology company, after Meta, to take the open route with large models, accelerating the arrival of the general-purpose AI era.
Google said Gemma is a family of "lightweight," state-of-the-art open models built from the same research and technology used to create the Gemini models. Developers can use the Gemma open models to build artificial-intelligence software for free. The company said it is publicly releasing key technical assets, such as the model weights.
Google CEO Sundar Pichai said, "Gemma demonstrates strong performance, is available globally starting today, and can run on a laptop or on Google Cloud."
Market analysts suggest that by opening its large models, Google may attract software engineers to build on its technology stack and encourage use of its recently profitable cloud division. Google said the models have also been optimized for Google Cloud.
However, Gemma is not fully open source, meaning the company can still set terms of use and retain ownership of the model.
Compared with the previously released Gemini models, the Gemma models have smaller parameter counts, with 2-billion- and 7-billion-parameter versions available. Google has not disclosed the parameter count of its largest Gemini model.
Google said, "Gemini is the largest and most capable AI model in wide use today. The Gemma models share technology and infrastructure components with Gemini, and can run directly on a developer's laptop or desktop."
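A rough back-of-envelope sketch (an illustration added here, not from the article) of why the 2-billion- and 7-billion-parameter Gemma sizes count as laptop-class: the memory needed just to hold a model's weights scales with the parameter count times the bytes stored per parameter.

```python
# Back-of-envelope estimate of the memory needed to hold model weights.
# This counts weights only; activations, KV cache, and runtime overhead are extra.

def weight_memory_gib(num_params: float, bytes_per_param: float) -> float:
    """Approximate GiB required just to store the weights."""
    return num_params * bytes_per_param / 1024**3

# Parameter counts are from the article; the 16-bit and 4-bit storage sizes
# are common conventions, assumed here purely for illustration.
for name, params in [("Gemma 2B", 2e9), ("Gemma 7B", 7e9), ("Llama 2 70B", 70e9)]:
    fp16 = weight_memory_gib(params, 2)    # 16-bit weights
    int4 = weight_memory_gib(params, 0.5)  # 4-bit quantized weights
    print(f"{name}: ~{fp16:.1f} GiB at fp16, ~{int4:.1f} GiB at 4-bit")
```

By this estimate, a 2B or 7B model (especially quantized) fits comfortably in a modern laptop's memory, while a 70B-class model generally does not.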
The company also emphasized that Gemma surpasses significantly larger models on key benchmarks while adhering to strict standards for safe and responsible outputs.
Meta's open-source Llama 2 models top out at 70 billion parameters; by comparison, OpenAI's GPT-3 has 175 billion.
In an accompanying technical report, Google compared the 7-billion-parameter Gemma against several models, including Llama 2 7B, Llama 2 13B, and Mistral 7B, across multiple dimensions. Gemma outperformed these competitors on benchmarks covering question answering, reasoning, mathematics/science, and code.
At Gemma's launch, Nvidia said it had partnered with Google to ensure the Gemma models run smoothly on its chips, and that it will soon release chatbot software designed to work with Gemma.
Opening up smaller-parameter AI models is also a business strategy for Google. iFlytek previously chose to open-source its smaller models as well.
Liu Qingfeng, chairman of iFlytek, explained to a First Financial News reporter: "For general-purpose large models, the key is who delivers the best performance, while open-sourcing large models is about building an ecosystem. So, from a technical standpoint, open-source large models generally sit slightly below the general-purpose flagships."
"We have also observed that many companies may hold back their largest models, still hoping to maintain barriers for commercialization," a researcher working on large AI models told a First Financial News reporter.
Views on open-sourcing large models are currently divided. Some experts worry that open AI models could be abused; others back the open-source approach, arguing that it advances the technology and broadens who benefits from it.
CandyLake.com is an information publishing platform and provides only information storage services.
Disclaimer: the views in this article are solely the author's, do not represent CandyLake.com's position, and do not constitute advice; please treat them with caution.