Introduction to Tsinghua THUDM ChatGLM
We are very pleased to introduce the newest member of our large-model family: the conversational chatbot ChatGLM (alpha invite-only test version: QAGLM). It is a 100-billion-parameter Chinese-English language model with initial question-answering and dialogue capabilities, optimized for Chinese. An invitation-only internal beta is now open, and access will be expanded gradually.
At the same time, following the open-source release of the GLM-130B 100-billion-parameter base model, we are officially open-sourcing our latest Chinese-English bilingual dialogue GLM model: ChatGLM-6B. Combined with model quantization, it can be deployed locally on consumer-grade graphics cards (as little as 6 GB of VRAM at the INT4 quantization level). Trained on roughly 1T tokens of Chinese and English text, and further refined with supervised fine-tuning, feedback bootstrapping, and reinforcement learning from human feedback (RLHF), the 6.2-billion-parameter ChatGLM-6B, while smaller than 100-billion-scale models, greatly lowers the barrier to local deployment and can already generate answers that align well with human preferences.
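To see why INT4 quantization brings a 6.2-billion-parameter model within reach of a 6 GB consumer GPU, a rough back-of-the-envelope estimate of the weight storage at each precision is helpful. This is a minimal sketch assuming weights dominate the footprint; the remaining headroom in the quoted 6 GB goes to activations, the KV cache, and framework overhead, which this estimate deliberately ignores.

```python
# Rough weight-memory estimate for a 6.2B-parameter model at different
# quantization levels. Bytes per parameter: FP16 = 2, INT8 = 1, INT4 = 0.5.
PARAMS = 6.2e9  # parameter count of ChatGLM-6B

def weight_memory_gib(bytes_per_param: float) -> float:
    """Return the storage needed for the weights alone, in GiB."""
    return PARAMS * bytes_per_param / 2**30

for name, bpp in [("FP16", 2.0), ("INT8", 1.0), ("INT4", 0.5)]:
    print(f"{name}: {weight_memory_gib(bpp):.1f} GiB")
```

At INT4 the weights alone come to roughly 2.9 GiB, versus about 11.5 GiB at FP16, which is why full-precision inference does not fit on a 6 GB card but the INT4-quantized model does.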
The information on Tsinghua THUDM ChatGLM provided by this site's website directory is collected from the web; the accuracy and completeness of external links are not guaranteed, and the destinations of those links are not controlled by the directory. When this page was indexed at 23:17 on March 29, 2023, its content was lawful and compliant; if the page's content later becomes non-compliant, you may contact the site administrator directly to have it removed. The directory assumes no liability.