Source: Marktechpost

Qwen 2.5 Models Released: Featuring Qwen2.5, Qwen2.5-Coder, and Qwen2.5-Math with 72B Parameters and 128K Context Support

  • The Qwen team from Alibaba has recently made waves in the AI/ML community by releasing their latest series of large language models (LLMs), Qwen2.5.
  • The series spans a wide range of model sizes, from 0.5 billion to 72 billion parameters.
  • Qwen2.5 has introduced notable improvements across several key areas, including coding, mathematics, instruction-following, and multilingual support.
  • The release includes specialized models, such as Qwen2.5-Coder and Qwen2.5-Math, further diversifying the range of applications for which these models can be optimized.
  • Its versatility and performance allow it to challenge some of the most powerful models on the market, including Llama 3.1 and Mistral Large 2.
  • Qwen2.5’s benchmark results show substantial improvements over its predecessor, Qwen2, across several key metrics.
  • Qwen2.5’s ability to support context lengths of up to 128,000 tokens is crucial for tasks requiring extensive and complex inputs.
  • The Qwen2.5 series supports 29 languages, making it a robust tool for multilingual applications.
  • Alongside the base models, Alibaba has also released specialized variants: Qwen2.5-Coder and Qwen2.5-Math.
  • The Qwen2.5-72B model represents the top-tier variant with 72 billion parameters.
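The practical value of the 128,000-token context window mentioned above can be illustrated with a minimal budget check. This is only a sketch: the characters-per-token ratio is a rough heuristic for English prose, not the output of Qwen2.5's actual tokenizer, and the constant names are illustrative.

```python
# Minimal sketch: estimating whether an input fits Qwen2.5's advertised
# 128K-token context window. AVG_CHARS_PER_TOKEN is a rough heuristic,
# not the model's real tokenizer; for exact counts you would tokenize
# the text with the model's own tokenizer.
QWEN25_CONTEXT_TOKENS = 128_000
AVG_CHARS_PER_TOKEN = 4  # assumed average for English prose

def fits_in_context(text: str, reserve_for_output: int = 8_000) -> bool:
    """Roughly check that `text` plus a generation budget fits the window."""
    estimated_tokens = len(text) / AVG_CHARS_PER_TOKEN
    return estimated_tokens + reserve_for_output <= QWEN25_CONTEXT_TOKENS

print(fits_in_context("Summarize this short note."))  # short prompt fits
print(fits_in_context("x" * 1_000_000))  # ~250K tokens, exceeds the window
```

A window this large means entire codebases, long contracts, or multi-document inputs can be passed in one request, which is what the bullet about "extensive and complex inputs" refers to.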
