Generalizing Large Language Model Usability Across Resource-Constrained

  • Large Language Models (LLMs) have been successful across a wide range of natural language tasks and are now being extended to multimodal domains and resource-constrained environments.
  • This dissertation focuses on making LLMs usable under real-world constraints, introducing a text-centric alignment framework for integrating diverse modalities and an adversarial prompting technique for robustness against noisy inputs (both sketched after this list).
  • It also explores inference-time optimization strategies that combine prompt search with uncertainty quantification to improve LLM performance without additional training (see the sketch below).
  • Finally, the work addresses low-resource domains such as Verilog code generation, using synthetic data pipelines and logic-enhanced reasoning models to reach state-of-the-art performance with minimal data (a toy pipeline sketch follows below).
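
The summary above does not give the framework's details; the following is a minimal sketch of the text-centric alignment idea, assuming the common pattern of rendering each non-text modality as text before prompting a frozen, text-only LLM. The helper names (describe_image, serialize_table, build_text_centric_prompt) are hypothetical, not the dissertation's API.

```python
# Minimal sketch of a text-centric alignment idea: every non-text modality is
# first rendered as text, so a single text-only LLM prompt can carry all of them.
# The captioner and the final LLM call are hypothetical stand-ins.

from typing import Dict


def describe_image(image_path: str) -> str:
    """Hypothetical captioner; in practice a vision model would produce this."""
    return f"A photo (from {image_path}) showing a street scene at night."


def serialize_table(row: Dict[str, object]) -> str:
    """Flatten tabular features into a readable 'key: value' string."""
    return "; ".join(f"{k}: {v}" for k, v in row.items())


def build_text_centric_prompt(question: str, image_path: str, row: Dict[str, object]) -> str:
    """Align image and table modalities into one text prompt for the LLM."""
    return (
        "Image description: " + describe_image(image_path) + "\n"
        "Structured record: " + serialize_table(row) + "\n"
        f"Question: {question}\nAnswer:"
    )


if __name__ == "__main__":
    prompt = build_text_centric_prompt(
        question="Is the scene consistent with the sensor reading?",
        image_path="frame_001.jpg",
        row={"lux": 12, "time": "22:40", "location": "intersection 4"},
    )
    print(prompt)  # This string would be sent to any text-only LLM.
```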
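For the adversarial prompting point, a hedged illustration of the underlying robustness idea: perturb the prompt with character-level noise and check whether the model's answer stays stable. The perturbation scheme and the stub model below are assumptions for illustration, not the dissertation's exact technique.

```python
# Sketch of an adversarial-prompting style robustness check: compare a model's
# answers on a clean prompt and a noise-perturbed copy of it.

import random
from typing import Callable


def perturb(text: str, rate: float = 0.05, seed: int = 0) -> str:
    """Randomly replace a fraction of alphabetic characters to simulate noisy input."""
    rng = random.Random(seed)
    chars = list(text)
    for i in range(len(chars)):
        if chars[i].isalpha() and rng.random() < rate:
            chars[i] = rng.choice("abcdefghijklmnopqrstuvwxyz")
    return "".join(chars)


def answer_changed(model: Callable[[str], str], prompt: str) -> bool:
    """True if the model's answer flips under the perturbed prompt."""
    return model(prompt) != model(perturb(prompt))


if __name__ == "__main__":
    def toy_model(p: str) -> str:  # stub standing in for an LLM call
        return "positive" if "good" in p else "negative"

    print(answer_changed(toy_model, "The movie was good. Classify the sentiment:"))
```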
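The inference-time strategy is likewise only named in the summary; one common way to combine prompt search with uncertainty quantification is to sample several completions per candidate prompt and keep the prompt whose majority answer shows the highest agreement. The sketch below assumes that setup; llm_sample is a stand-in for a real sampler.

```python
# Sketch of inference-time prompt search with a simple uncertainty proxy:
# sample completions per candidate prompt and keep the prompt whose majority
# answer has the highest empirical agreement (i.e., the lowest uncertainty).

import random
from collections import Counter
from typing import List, Tuple


def llm_sample(prompt: str, n: int = 5) -> List[str]:
    """Hypothetical sampler; a real system would call an LLM with temperature > 0."""
    biased = ["yes"] * (7 if "step by step" in prompt else 4) + ["no"] * 3
    return [random.choice(biased) for _ in range(n)]


def agreement_score(samples: List[str]) -> Tuple[str, float]:
    """Majority answer and its agreement ratio (1.0 = all samples agree)."""
    answer, count = Counter(samples).most_common(1)[0]
    return answer, count / len(samples)


def search_prompts(question: str, templates: List[str]) -> Tuple[str, str, float]:
    """Pick the template whose majority answer is least uncertain."""
    best = ("", "", -1.0)
    for template in templates:
        answer, score = agreement_score(llm_sample(template.format(q=question)))
        if score > best[2]:
            best = (template, answer, score)
    return best


if __name__ == "__main__":
    templates = ["Q: {q}\nA:", "Think step by step. Q: {q}\nA:"]
    template, answer, score = search_prompts("Is 17 prime?", templates)
    print(f"chosen template={template!r} answer={answer} agreement={score:.2f}")
```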
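Finally, for the Verilog point, a toy sketch of what a synthetic (spec, code) data pipeline can look like: templated generators produce paired natural-language specs and modules, and a cheap filter discards malformed samples. The templates and filter here are illustrative; the dissertation's actual pipeline and logic-enhanced reasoning models are not described in this summary.

```python
# Toy synthetic data pipeline for Verilog: generate (spec, code) pairs from a
# parameterized template, sanity-filter them, and emit fine-tuning examples.

import json
from typing import Dict, List


def make_adder(width: int) -> Dict[str, str]:
    """One templated example: an unsigned adder of a given bit width."""
    spec = f"Write a Verilog module 'adder{width}' that adds two {width}-bit unsigned inputs."
    code = (
        f"module adder{width}(input [{width-1}:0] a, input [{width-1}:0] b,\n"
        f"                    output [{width}:0] sum);\n"
        f"  assign sum = a + b;\n"
        f"endmodule\n"
    )
    return {"spec": spec, "code": code}


def sanity_ok(sample: Dict[str, str]) -> bool:
    """Cheap filter; a real pipeline would lint or simulate the generated code."""
    return "module" in sample["code"] and "endmodule" in sample["code"]


def build_dataset(widths: List[int]) -> List[Dict[str, str]]:
    return [s for s in (make_adder(w) for w in widths) if sanity_ok(s)]


if __name__ == "__main__":
    for sample in build_dataset([4, 8, 16]):
        print(json.dumps(sample)[:80] + "...")
```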
