techminis

A naukri.com initiative


The New Stack

Image Credit: The New Stack

Cloud Native and Open Source Help Scale Agentic AI Workflows

  • Enterprise automation increasingly relies on AI-powered intelligent agent workflows, but large language models bring high complexity and cost.
  • Small language models (SLMs), paired with cloud native platforms such as Kubernetes and Functions-as-a-Service (FaaS), are emerging as an agile foundation for agentic AI applications.
  • SLMs offer advantages in responsiveness, scalability, and cost-efficiency compared to large language models (LLMs).
  • Microsoft’s Phi-3-mini and other SLMs are designed for efficiency, making them suitable for deployment in varied settings, including edge devices.
  • Cloud native architectures utilizing Kubernetes and FaaS provide scalable AI deployments with efficient resource allocation.
  • Knative complements Kubernetes, enabling automatic scaling based on demand for serverless workloads.
  • The Cloud Native Computing Foundation (CNCF) ecosystem offers tools such as Prometheus and Istio, commonly paired with Grafana, for monitoring and optimization.
  • Practical implementation involves using OCI Kubernetes Engine (OKE) to deploy small language models via Knative Serving for serverless capabilities.
  • Deployment steps include provisioning a Kubernetes cluster, deploying Knative Serving, containerizing the SLM, and monitoring performance metrics.
  • SLM agents prove effective in use cases such as real-time customer support, DevOps automation, and financial services, offering agility and cost-efficiency.
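The Knative deployment described above can be sketched as a Service manifest. This is a minimal illustration, not taken from the article: the service name, namespace, and container image path are hypothetical, and the resource figures are placeholders; the `autoscaling.knative.dev` annotations are real Knative knobs that enable scale-to-zero and demand-based scaling.

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: phi3-mini-agent        # hypothetical service name
  namespace: agents            # hypothetical namespace
spec:
  template:
    metadata:
      annotations:
        # Scale to zero when idle; scale out with demand
        autoscaling.knative.dev/min-scale: "0"
        autoscaling.knative.dev/max-scale: "5"
        # Target concurrent requests per replica before scaling out
        autoscaling.knative.dev/target: "10"
    spec:
      containers:
        - image: region.ocir.io/tenancy/phi3-mini:latest  # hypothetical OCIR image
          ports:
            - containerPort: 8080
          resources:
            requests:
              cpu: "2"
              memory: 8Gi
            limits:
              cpu: "4"
              memory: 16Gi
```

Applied with `kubectl apply -f`, Knative Serving creates a route to the container and scales replicas (including down to zero) based on request concurrency, which is what makes serverless SLM hosting cost-efficient.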
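An agent built on such a deployment would call the SLM over HTTP. The sketch below assumes an OpenAI-style chat-completions interface, which many model servers expose; the endpoint URL, model name, and response shape are assumptions for illustration, not details from the article.

```python
import json
from urllib import request

# Hypothetical Knative route for the deployed SLM (not from the article).
SLM_URL = "http://phi3-mini-agent.agents.example.com/v1/chat/completions"

def build_payload(user_message: str,
                  system_prompt: str = "You are a DevOps assistant.") -> bytes:
    """Build an OpenAI-style chat payload, a format many SLM servers accept."""
    return json.dumps({
        "model": "phi-3-mini",
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "max_tokens": 256,
    }).encode("utf-8")

def extract_reply(response_body: bytes) -> str:
    """Pull the assistant's text out of a chat-completion response."""
    data = json.loads(response_body)
    return data["choices"][0]["message"]["content"]

def ask_slm(user_message: str) -> str:
    """POST the prompt to the SLM service and return its reply."""
    req = request.Request(
        SLM_URL,
        data=build_payload(user_message),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return extract_reply(resp.read())
```

Because the service scales to zero, the first request after an idle period pays a cold-start cost; the `min-scale` annotation can be raised when latency matters more than idle cost.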

