At the core of LLM development lies a pre-training phase that demands enormous amounts of compute, data, and time.
SALT addresses this cost by bringing small language models (SLMs) into the picture as "intelligent curators" that select data and guide the larger model, reducing training time and computational load.
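To make the "curator" idea concrete, here is a minimal sketch of how an SLM could score candidate training sequences, assuming Hugging Face-style causal LMs and tokenizers; the moderate-loss band (the low/high thresholds) is an illustrative heuristic for "learnable but non-trivial" data, not the exact criterion used by SALT.

```python
import torch

def select_learnable_examples(slm, texts, tokenizer, low=2.0, high=6.0, device="cuda"):
    """Keep sequences whose SLM loss falls in a moderate band, i.e. examples the
    small model finds neither trivial nor hopeless (illustrative thresholds)."""
    slm.eval().to(device)
    selected = []
    with torch.no_grad():
        for text in texts:
            ids = tokenizer(text, return_tensors="pt", truncation=True).input_ids.to(device)
            # Per-token cross-entropy of the small model on this sequence.
            loss = slm(input_ids=ids, labels=ids).loss.item()
            if low <= loss <= high:
                selected.append(text)
    return selected
```

The thresholds would in practice be tuned (or replaced by a ranking over the corpus), but the design point stands: a cheap model does one forward pass per sequence so the expensive model never trains on data that adds little.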
SALT introduces a two-stage training approach in which an SLM steps in as a guide during the early stage of LLM training, after which the LLM continues with standard training on its own.
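The sketch below illustrates the two-stage idea, assuming Hugging Face-style model outputs and a shared vocabulary between the SLM and the LLM: in stage 1 the usual next-token loss is blended with a distillation term computed from the frozen SLM, and in stage 2 the SLM is dropped and training proceeds as plain self-supervised learning. The training_step helper and the alpha and temperature knobs are hypothetical names for illustration, not SALT's actual implementation.

```python
import torch
import torch.nn.functional as F

def training_step(llm, slm, input_ids, stage, alpha=0.5, temperature=1.0):
    """One step of a two-stage schedule: stage 1 mixes next-token loss with
    distillation from the small model; stage 2 is standard pre-training."""
    logits = llm(input_ids=input_ids).logits[:, :-1]          # predict token t+1 from prefix
    labels = input_ids[:, 1:]
    vocab = logits.size(-1)
    ce = F.cross_entropy(logits.reshape(-1, vocab), labels.reshape(-1))

    if stage == 1:
        with torch.no_grad():                                  # the SLM teacher is frozen
            teacher_logits = slm(input_ids=input_ids).logits[:, :-1]
        student = F.log_softmax(logits / temperature, dim=-1).reshape(-1, vocab)
        teacher = F.softmax(teacher_logits / temperature, dim=-1).reshape(-1, vocab)
        kd = F.kl_div(student, teacher, reduction="batchmean") * temperature ** 2
        loss = alpha * kd + (1.0 - alpha) * ce                 # SLM guides the early stage
    else:
        loss = ce                                              # stage 2: standard pre-training
    return loss
```

A trainer would call this with stage=1 for the first portion of the token budget and stage=2 afterwards; the switch point and the weighting are exactly the kind of knobs the approach leaves to tuning.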
SALT not only reduced the required training time and compute but also outperformed conventionally trained models on several benchmarks.