Source: The New Stack

Kubernetes + LLMs: Cast AI Solves the Cost Puzzle

  • Cast AI has launched an AI Optimizer service that automatically cuts the cost of deploying large language models (LLMs) by integrating with any OpenAI-compatible API endpoint (see the client sketch after this list).
  • Cast AI has also launched AI Enabler, which leverages the vendor’s Kubernetes infrastructure optimisation capabilities to intelligently route queries from organisations and DevOps teams to the most cost-efficient LLM (see the routing sketch after this list).
  • DevOps teams can compare LLMs side-by-side for performance and cost with the Cast AI Playground, another tool from the provider.
  • Cast AI is helping developers manage the costs of AI operations with its AI Optimizer and now AI Enabler.
  • The costs of running LLMs can grow quickly, with a single hosted LLM instance costing upwards of $20,000 per month.
  • MLOps teams responsible for building and maintaining the infrastructure for generative AI workloads struggle to determine the best model for their specific needs.
  • Cast AI’s various tools help developers get their arms around the growing number of LLMs and the costs to run them.
  • AI Enabler compares LLMs and creates benchmarks, helping developers to optimise performance and cost.
  • Cast AI has launched Commercially Supported Container Live Migration, which enables automatic and uninterrupted migration of stateful and uninterruptible workloads in Kubernetes.
  • Cast AI is integrating the Container Live Migration feature with its other automation tools, including Bin-Packing and Eviction, Cluster and Node Rebalancing, Spot Fallback, Spot Interruption ML Prediction, and Spot Instance Price Drift Rebalancing.
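Because the article only says the service integrates with "any OpenAI-compatible API endpoint," the sketch below shows the general pattern such an integration implies: pointing an existing OpenAI SDK client at a proxy URL instead of api.openai.com. The base URL, API key, and model name are placeholders, not Cast AI's actual values.

```python
# Minimal sketch: sending existing OpenAI-style calls through a proxy endpoint.
# The base_url, api_key, and model below are hypothetical placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://llm-proxy.example.com/v1",  # hypothetical OpenAI-compatible proxy
    api_key="YOUR_PROXY_API_KEY",                 # issued by the proxy, not by OpenAI
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # the proxy can transparently substitute a cheaper model
    messages=[{"role": "user", "content": "Summarise this incident report."}],
)
print(response.choices[0].message.content)
```

Because the client code is unchanged apart from the base URL, an optimisation layer sitting behind that endpoint can swap models or providers without touching application code.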
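The routing sketch below illustrates the general idea behind cost-aware query routing: choose the cheapest model whose benchmark quality clears a threshold. The model names, prices, and quality scores are made-up placeholders; Cast AI's actual routing logic is not described in the article.

```python
# Illustrative sketch of cost-aware routing, with hypothetical models and numbers.
MODELS = [
    {"name": "large-model",  "usd_per_1m_tokens": 15.00, "quality": 0.95},
    {"name": "medium-model", "usd_per_1m_tokens": 3.00,  "quality": 0.88},
    {"name": "small-model",  "usd_per_1m_tokens": 0.50,  "quality": 0.78},
]

def pick_model(min_quality: float) -> str:
    """Return the cheapest model that meets the required quality score."""
    candidates = [m for m in MODELS if m["quality"] >= min_quality]
    if not candidates:
        raise ValueError("no model meets the requested quality bar")
    return min(candidates, key=lambda m: m["usd_per_1m_tokens"])["name"]

print(pick_model(min_quality=0.85))  # -> "medium-model"
```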
