Marktechpost

The Allen Institute for AI (AI2) Releases OLMo 2: A New Family of Open-Sourced 7B and 13B Language Models Trained on up to 5T Tokens

  • The Allen Institute for AI (AI2) has developed OLMo 2, an open-source family of language models.
  • OLMo 2 comes in 7 billion (7B) and 13 billion (13B) parameter configurations and was trained on up to 5 trillion tokens; a minimal loading sketch follows this summary.
  • The models narrow the performance gap with proprietary systems and outperform both the earlier OLMo models and comparable open models.
  • Advancements in training stability, staged training, evaluation frameworks, and dataset diversity contribute to OLMo 2's success.
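Since the summary only names the released configurations, here is a minimal, hypothetical sketch of loading one of them with the Hugging Face transformers library. The repository ID allenai/OLMo-2-1124-7B, the prompt, and the generation settings are assumptions for illustration, not details taken from the article.

```python
# Hypothetical usage sketch: the article does not give model IDs, so the
# Hugging Face repo name below is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-1124-7B"  # assumed ID; a 13B variant would follow the same pattern

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize a short prompt and generate a continuation.
inputs = tokenizer("Open language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```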

