The Allen Institute for AI (AI2) has introduced OLMo 2 32B, a fully open model that surpasses GPT-3.5 Turbo and GPT-4o mini on a suite of multi-skill benchmarks. With 32 billion parameters, OLMo 2 32B reaches performance comparable to leading open-weight models while using fewer computational resources. It posts strong results on benchmarks such as MMLU, MATH, and IFEval, demonstrating breadth across knowledge, mathematical reasoning, and instruction-following tasks. By promoting openness and collaboration, AI2 enables researchers worldwide to build on this work and contributes to the evolving landscape of AI.
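Because the weights are openly released, researchers can load the model directly with the Hugging Face transformers library. The sketch below is a minimal illustration under stated assumptions: the model ID `allenai/OLMo-2-0325-32B`, the prompt, and the generation settings are illustrative choices, not details from the announcement.

```python
# Minimal sketch: loading OLMo 2 32B with Hugging Face transformers.
# Assumes the model ID below and that `accelerate` is installed for device_map="auto".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-0325-32B"  # assumed Hugging Face model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # spread the 32B parameters across available devices
)

# Illustrative prompt and generation settings.
inputs = tokenizer("Open language models matter because", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

A 32-billion-parameter model requires substantial GPU memory, so in practice quantized or sharded loading may be needed on smaller hardware.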