Tongyi Qianwen open-sources its 32-billion-parameter model, completing open-source releases of seven large language models
芊芊551
Posted on 2024-4-7 17:04:49
Alibaba Cloud's Tongyi Qianwen has open-sourced Qwen1.5-32B, a 32-billion-parameter model designed to strike the best possible balance among performance, efficiency, and memory usage, giving enterprises and developers a more cost-effective option. To date, Tongyi Qianwen has open-sourced seven large language models, with cumulative downloads across Chinese and international open-source communities exceeding 3 million.
Tongyi Qianwen had previously open-sourced six large language models, with 0.5 billion, 1.8 billion, 4 billion, 7 billion, 14 billion, and 72 billion parameters, all of which have been upgraded to version 1.5. The smaller models can be deployed easily on edge devices, while the 72B model offers industry-leading performance and has repeatedly appeared on leaderboards such as those hosted on HuggingFace. The newly open-sourced 32B model aims for a more favorable balance among performance, efficiency, and memory usage: compared with the 14B model, it is stronger in intelligent-agent scenarios; compared with the 72B model, it has lower inference costs. The Tongyi Qianwen team hopes the 32B open-source model will provide a better solution for downstream applications.
In terms of base-model capabilities, Tongyi Qianwen's 32B model performs well on multiple benchmarks, including MMLU, GSM8K, HumanEval, and BBH. Its results approach those of Tongyi Qianwen's 72B model and far exceed those of other models in the ~30B class.
On the chat side, Qwen1.5-32B-Chat scores above 8 on the MT-Bench evaluation, with only a small gap to Qwen1.5-72B-Chat.
For multilingual capability, the Tongyi Qianwen team selected 12 languages, including Arabic, Spanish, French, Japanese, and Korean, and ran evaluations across domains such as exams, comprehension, mathematics, and translation. Qwen1.5-32B's multilingual ability is only slightly below that of Tongyi Qianwen's 72B model.
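As a brief illustration of what "open-source" means in practice here: the Qwen1.5 chat models use a ChatML-style prompt layout with `<|im_start|>`/`<|im_end|>` delimiters, which the Hugging Face tokenizer's `apply_chat_template` normally produces for you. The sketch below hand-builds that layout purely to show its shape; the helper function name is my own and not part of any Qwen API.

```python
# Minimal sketch of the ChatML-style prompt format used by Qwen1.5 chat
# models. In real use, tokenizer.apply_chat_template from Hugging Face
# transformers generates this; build_chatml_prompt is a hypothetical
# helper written only for illustration.

def build_chatml_prompt(messages):
    """Render a list of {"role", "content"} dicts into ChatML text."""
    parts = []
    for m in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|>.
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Trailing generation prompt: an assistant header with no content,
    # signalling the model to continue from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Introduce Qwen1.5-32B briefly."},
]
print(build_chatml_prompt(messages))
```

With the actual model, one would load `Qwen/Qwen1.5-32B-Chat` through `transformers` and let the tokenizer apply this template automatically; note the 32B weights require substantial GPU memory.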
CandyLake.com is an information-publishing platform and provides only information-storage services.
Disclaimer: The views in this article are the author's own, do not represent the position of CandyLake.com, and do not constitute advice; please treat them with caution.