Large language models (LLMs) have been shown to perform new tasks from a few demonstrations or natural language instructions. While these capabilities have driven widespread adoption, most LLMs are developed by resource-rich organizations and are frequently withheld from the public. As a step towards democratizing this powerful technology, we present BLOOM, a 176B-parameter open-access language model designed and built through a collaboration of hundreds of researchers. BLOOM is a decoder-only Transformer language model trained on the ROOTS corpus, a dataset comprising hundreds of sources in 46 natural and 13 programming languages (59 in total). We find that BLOOM achieves competitive performance on a wide variety of benchmarks, with stronger results after undergoing multitask prompted finetuning. To facilitate future research and applications using LLMs, we publicly release our models and code under the Responsible AI License.