
Tongyi Qianwen Open-Sources a 32-Billion-Parameter Model, Completing the Open-Source Release of 7 Major Language Models

芊芊551

Alibaba Cloud's Tongyi Qianwen has open-sourced Qwen1.5-32B, a 32-billion-parameter model designed to strike a balance among performance, efficiency, and memory usage, giving enterprises and developers a more cost-effective model choice. To date, Tongyi Qianwen has open-sourced seven major language models, with cumulative downloads exceeding 3 million across open-source communities in China and abroad.
Tongyi Qianwen had previously open-sourced six large language models, with 0.5 billion, 1.8 billion, 4 billion, 7 billion, 14 billion, and 72 billion parameters, all of which have been upgraded to version 1.5. Several of the smaller models can be easily deployed on edge devices, while the 72-billion-parameter model delivers industry-leading performance and has repeatedly appeared on leaderboards such as HuggingFace's. The newly open-sourced 32-billion-parameter model strikes a more favorable balance among performance, efficiency, and memory usage: compared with the 14B model, the 32B model is stronger in agent scenarios, and compared with the 72B model, it has lower inference costs. The Tongyi Qianwen team hopes the open-source 32B model will offer a better solution for downstream applications.
In terms of base capabilities, Tongyi Qianwen's 32-billion-parameter model performs well on multiple benchmarks, including MMLU, GSM8K, HumanEval, and BBH. Its performance approaches that of Tongyi Qianwen's 72-billion-parameter model and far exceeds that of other models at the 30-billion-parameter scale.
As for the chat model, Qwen1.5-32B-Chat scored above 8 on the MT-Bench evaluation, with a relatively small gap to Qwen1.5-72B-Chat.
In terms of multilingual capability, the Tongyi Qianwen team selected 12 languages, including Arabic, Spanish, French, Japanese, and Korean, and ran evaluations across domains such as exams, comprehension, mathematics, and translation. Qwen1.5-32B's multilingual ability is only slightly inferior to that of Tongyi Qianwen's 72-billion-parameter model.
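For developers who want to try the open weights, Qwen1.5 chat models follow the ChatML prompt convention. The sketch below illustrates that format; `build_chatml_prompt` is a hypothetical helper for illustration only, and in practice the `transformers` tokenizer's `apply_chat_template` method renders this for you when loading a model such as Qwen1.5-32B-Chat from HuggingFace.

```python
# Minimal sketch of the ChatML prompt format used by Qwen1.5 chat models.
# build_chatml_prompt is a hypothetical helper shown for illustration;
# in real use, transformers' tokenizer.apply_chat_template does this.
def build_chatml_prompt(messages):
    """Render a list of {role, content} messages into a ChatML string."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
        for m in messages
    ]
    # Open an assistant turn to cue the model to generate a reply.
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Introduce Qwen1.5-32B in one sentence."},
])
print(prompt)
```

The trailing unclosed `<|im_start|>assistant` turn is what prompts the model to produce the assistant's response, which ends at the next `<|im_end|>` token.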
CandyLake.com is an information publishing platform and provides information storage services only.
Disclaimer: The views in this article are solely the author's; they do not represent the position of CandyLake.com, do not constitute advice, and should be treated with caution.