
OpenBMB: How Tsinghua University's Open-Source Community Advances Efficient Computation and Parameter-Efficient Fine-Tuning for Large Language Models
OpenBMB is an open-source community and toolset initiated at Tsinghua University in 2018, focused on building efficient computational tools for large-scale pre-trained language models. Its core contributions include parameter-efficient fine-tuning methods, and it has released notable projects such as UltraRAG 2.1, UltraEval-Audio v1.1.0, and the 4-billion-parameter AgentCPM-Explore model, which perform strongly on multiple benchmarks.
AI Large Models · 2026/1/24
