Image Credit: Analyticsindiamag

Generative AI Cost Optimisation Strategies

  • Generative AI’s potential for organisations raises questions regarding its cost implications. AI implementation is a complex ecosystem of decisions, each affecting the final price tag. Optimising costs throughout the AI lifecycle involves various strategies for model selection, fine-tuning, data management and operations culture.
  • To optimise costs, start by clearly defining your use case and its requirements, and balance performance, accuracy and cost.
  • Try experimenting with different model sizes: smaller models may be more effective and economical for specific tasks (see the cost sketch after this list).
  • For customisation, the choice between retrieval-augmented generation (RAG), fine-tuning and prompt engineering affects cost.
  • RAG enriches a foundation model's (FM) responses with data retrieved from the organisation's own sources. This approach improves accuracy and relevance without significant model retraining, balancing performance and cost efficiency (a minimal retrieval sketch follows this list).
  • Fine-tuning excels at complex operations beyond simple information retrieval. Phasing in fine-tuning after starting with RAG can be a more cost-effective approach.
  • More precise prompts reduce the cost of repeated interactions and can make smaller, more cost-effective models viable.
  • Good data management includes data governance for regulatory compliance, preventing costly legal issues.
  • Organisational culture and practices matter: encouraging a prove-the-value approach and fostering a frugal AI culture that rewards innovation helps identify cost-saving opportunities.
  • FinOps, a practice bringing financial accountability to the variable spend model of cloud computing, can help organisations efficiently manage resources for training, running, and customising their AI models.
  • FinOps balances a centralised organisational and technical platform, which applies the core FinOps principles of visibility, optimisation and governance, with decentralised teams that justify their AI spending, make informed decisions about model selection and continuously optimise AI processes for cost efficiency (a simple cost-attribution sketch follows below).
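
To make the model-size trade-off concrete, here is a minimal sketch of how a team might compare monthly inference cost for a large versus a small model under the same workload. All prices, model names and workload figures are hypothetical assumptions for illustration, not published rates.

```python
# Illustrative sketch: comparing monthly inference cost across model sizes.
# The per-token prices and workload figures are hypothetical assumptions,
# not quotes from any provider; substitute your provider's actual pricing.

WORKLOAD = {
    "requests_per_day": 50_000,          # assumed daily request volume
    "input_tokens_per_request": 800,     # assumed prompt length
    "output_tokens_per_request": 200,    # assumed completion length
}

# Hypothetical price points (USD per 1K tokens) for a large vs. a small model.
MODELS = {
    "large-model": {"input_per_1k": 0.0100, "output_per_1k": 0.0300},
    "small-model": {"input_per_1k": 0.0005, "output_per_1k": 0.0015},
}

def monthly_cost(pricing: dict, workload: dict, days: int = 30) -> float:
    """Estimate monthly inference cost for one model under the given workload."""
    requests = workload["requests_per_day"] * days
    input_cost = requests * workload["input_tokens_per_request"] / 1000 * pricing["input_per_1k"]
    output_cost = requests * workload["output_tokens_per_request"] / 1000 * pricing["output_per_1k"]
    return input_cost + output_cost

for name, pricing in MODELS.items():
    print(f"{name}: ~${monthly_cost(pricing, WORKLOAD):,.0f}/month")
```

With these assumed figures the small model comes out at a small fraction of the large model's bill, which is the kind of comparison worth running before committing to a model size.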
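
The RAG point can be illustrated with a toy retrieval loop: fetch the most relevant internal snippets and prepend them to the prompt, so the FM answers from organisational data without retraining. The keyword-overlap scorer and the `call_foundation_model` placeholder below are assumptions standing in for a real embedding-based retriever and a real model API.

```python
# Minimal RAG sketch (assumptions labelled): retrieve relevant internal snippets
# and prepend them to the prompt before calling the foundation model.

def score(question: str, document: str) -> int:
    """Crude relevance score: count of shared lowercase words (stand-in for embeddings)."""
    return len(set(question.lower().split()) & set(document.lower().split()))

def retrieve(question: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k documents most relevant to the question."""
    return sorted(documents, key=lambda d: score(question, d), reverse=True)[:top_k]

def build_prompt(question: str, documents: list[str]) -> str:
    """Augment the user question with retrieved organisational context."""
    context = "\n".join(f"- {d}" for d in retrieve(question, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

def call_foundation_model(prompt: str) -> str:
    """Placeholder for a call to the chosen foundation model's API (not a real API)."""
    return f"[model response to a {len(prompt)}-character prompt]"

internal_docs = [
    "Refunds are processed within 5 business days of approval.",
    "The travel policy caps hotel spend at 150 USD per night.",
    "Support tickets are triaged twice daily by the platform team.",
]

print(call_foundation_model(build_prompt("How long do refunds take?", internal_docs)))
```

The cost lever here is that only a few retrieved snippets enter the prompt, so the organisation pays for a slightly longer prompt rather than for retraining or fine-tuning the model on its data.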
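
As a rough illustration of the FinOps visibility and accountability principles, the sketch below rolls up tagged AI usage records per team and flags overruns against a central budget view. The record fields, team names and budget figures are all invented for the example.

```python
# Hypothetical FinOps-style sketch: aggregate tagged AI spend per team so that
# decentralised teams can justify their usage against a central budget view.
from collections import defaultdict

usage_records = [
    {"team": "support-bots", "model": "large-model", "cost_usd": 1800.0},
    {"team": "support-bots", "model": "small-model", "cost_usd": 240.0},
    {"team": "search",       "model": "small-model", "cost_usd": 95.0},
]

monthly_budgets_usd = {"support-bots": 1500.0, "search": 500.0}

def spend_by_team(records: list[dict]) -> dict[str, float]:
    """Aggregate cost per team tag (the 'visibility' step)."""
    totals: dict[str, float] = defaultdict(float)
    for record in records:
        totals[record["team"]] += record["cost_usd"]
    return dict(totals)

for team, spent in spend_by_team(usage_records).items():
    budget = monthly_budgets_usd.get(team, 0.0)
    status = "OVER BUDGET" if spent > budget else "within budget"
    print(f"{team}: ${spent:,.2f} of ${budget:,.2f} ({status})")
```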
