Marktechpost

PRIME Intellect Releases INTELLECT-1 (Instruct + Base): The First 10B Parameter Language Model Collaboratively Trained Across the Globe

  • PRIME Intellect has released INTELLECT-1 (Instruct + Base), the first 10-billion-parameter language model collaboratively trained across the globe (a brief loading example appears below the list).
  • The INTELLECT-1 project highlights the feasibility of using decentralized, community-driven resources for training advanced LLMs.
  • The PRIME framework utilized up to 112 H100 GPUs across three continents and achieved a compute utilization rate of up to 96% under optimal conditions.
  • The INTELLECT-1 model was trained on 1 trillion tokens, giving it broad coverage of diverse domains.
  • The PRIME framework addressed the constraints of geographically distributed nodes with innovations such as live checkpointing, fault-tolerant communication, and an 8-bit quantization strategy to reduce communication volume (a sketch of this idea follows the list).
  • INTELLECT-1 achieved 37.5% accuracy on the MMLU benchmark and 72.26% on HellaSwag, and it outperformed several other open-source models on specific benchmarks, including scoring 65.82% on the WinoGrande challenge.
  • PRIME Intellect and its collaborators have demonstrated that advanced AI development need not be limited to a few elite corporations.
  • The INTELLECT-1 release sets a new standard for what is possible in open and inclusive AI research.
  • By leveraging decentralized resources, it paves the way for further developments in community-led AI projects.
  • The global network of 30 independent compute contributors demonstrated that such efforts can scale and highlighted the accessibility of advanced AI technology.
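
The 8-bit quantization mentioned in the list above is the kind of trick used to shrink the updates that geographically distributed nodes exchange. The snippet below is a minimal, single-process sketch of that idea, assuming symmetric per-tensor int8 quantization of pseudo-gradients; it is illustrative only and not the actual PRIME implementation.

```python
# Illustrative sketch (not the PRIME codebase): int8 quantization of a
# pseudo-gradient, the sort of payload a node could send instead of fp32.
import torch

def quantize_int8(t: torch.Tensor):
    """Symmetric per-tensor int8 quantization: returns (int8 values, scale)."""
    scale = t.abs().max().clamp(min=1e-8) / 127.0
    q = torch.clamp(torch.round(t / scale), -127, 127).to(torch.int8)
    return q, scale

def dequantize_int8(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    """Recover an approximate fp32 tensor from the int8 payload."""
    return q.to(torch.float32) * scale

# Pseudo-gradient: difference between the last globally synced weights and the
# locally updated weights after a round of inner optimizer steps (simulated here).
global_weights = torch.randn(4096)
local_weights = global_weights + 0.01 * torch.randn(4096)
pseudo_grad = global_weights - local_weights

q, scale = quantize_int8(pseudo_grad)   # 4x smaller on the wire than fp32
recovered = dequantize_int8(q, scale)   # what a receiving peer reconstructs
print("max abs quantization error:", (pseudo_grad - recovered).abs().max().item())
```

Sent as int8 values plus a single scale per tensor, each synchronization round costs roughly a quarter of the bandwidth of raw fp32 updates, which matters when contributors sit on ordinary internet links across continents.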
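
For readers who want to try the released checkpoints, here is a hypothetical usage sketch with the Hugging Face transformers library. The repository id is an assumption, so confirm it against the official release page before running.

```python
# Hypothetical loading sketch; the repo id below is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PrimeIntellect/INTELLECT-1-Instruct"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # requires `accelerate`; places layers on available devices
)

prompt = "Explain decentralized training in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```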
