techminis

A naukri.com initiative

Image Credit: Siliconangle

Akamai distributes AI inference across the globe, promising lower latency and higher throughput

  • Akamai Technologies Inc. launches Akamai Cloud Inference, a distributed platform for running large language models closer to users.
  • The service is built on the highly distributed Akamai Cloud to address issues with centralized cloud-based models.
  • Running language models on centralized platforms introduces latency because user data must travel to distant data centers.
  • Akamai focuses on moving AI data closer to users at the edge for more efficient inference.
  • The company's extensive network from its CDN roots gives it a distributed advantage.
  • Akamai's cloud infrastructure spans over 4,100 points of presence in more than 130 countries.
  • Akamai Cloud Inference promises triple the throughput, reduced latency, and cost savings for AI inference.
  • The service offers flexible compute resources, including GPUs and ASICs, for various AI workloads.
  • Advanced data management capabilities are provided through a partnership with Vast Data Inc.
  • Edge compute capabilities enable serverless infrastructure for latency-sensitive AI applications.
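The latency argument behind edge inference can be illustrated with a minimal sketch: route each request to the point of presence (PoP) with the lowest measured round-trip time. The PoP names and latency figures below are purely illustrative, and this is not Akamai's actual API.

```python
# Hypothetical sketch: choose an inference endpoint by measured
# round-trip time (RTT). PoP names and RTT values are invented
# for illustration and are not Akamai data.

def pick_nearest_pop(latencies_ms: dict) -> str:
    """Return the PoP name with the lowest measured RTT in milliseconds."""
    return min(latencies_ms, key=latencies_ms.get)

# A user in Frankfurt would see very different RTTs to a distant
# centralized region versus a nearby edge PoP.
measured = {
    "us-central (centralized)": 120.0,
    "eu-frankfurt (edge)": 8.0,
    "eu-london (edge)": 22.0,
}

print(pick_nearest_pop(measured))  # -> eu-frankfurt (edge)
```

Real routing decisions would also weigh GPU availability and load at each PoP, but the core idea is the same: serving inference from the nearest location cuts the network round trip that dominates perceived latency.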
