The Allen Institute for AI (AI2) has developed OLMo 2, an open-source family of language models. OLMo 2 consists of 7 billion (7B) and 13 billion (13B) parameter configurations and was trained on up to 5 trillion tokens. The models bridge the performance gap with proprietary systems and outperform both their predecessors and competitors. Advancements in training stability, staged training, evaluation frameworks, and dataset diversity contribute to OLMo 2's success.