Hardware choices and training time significantly impact energy, water, and carbon footprints during AI model training, while architecture-related factors have a minimal effect.
Energy efficiency in AI model training has improved only modestly over the years, at an estimated 0.13% per year.
Longer runs compound the problem: each additional hour of training is associated with an estimated 0.03% decrease in overall energy efficiency.
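To make these rates concrete, here is a small illustrative calculation. It is an assumption-laden sketch: it treats both effects as compounding multiplicatively, as they would in a log-linear model, and the time horizons chosen are arbitrary, not from the study.

```python
# Illustrative compounding of the reported rates. Assumption: the effects
# compound multiplicatively, as in a log-linear model; these are not
# figures from the study itself.

annual_gain = 0.0013  # 0.13% efficiency improvement per year
hourly_loss = 0.0003  # 0.03% efficiency decrease per training hour

# Efficiency gained over a decade of progress.
decade_gain = (1 + annual_gain) ** 10 - 1
print(f"10-year improvement: {decade_gain:.2%}")  # ~1.31%

# Efficiency lost over a 1,000-hour (~6-week) training run.
run_loss = 1 - (1 - hourly_loss) ** 1000
print(f"1,000-hour degradation: {run_loss:.2%}")  # ~25.92%
```

On these assumptions, the degradation over a single long training run can dwarf a decade of annual efficiency gains, which is consistent with the study's emphasis on training time.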
The study analyzed how architectural and hardware choices affect resource consumption during AI model training.
Estimates and analyses were based on data from Epoch AI's Notable AI Models dataset.
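The study's estimation pipeline is not reproduced here; the sketch below shows one plausible way to derive a training-energy estimate from hardware specifications. The column names, TDP values, and PUE figure are hypothetical placeholders, not fields of the actual dataset.

```python
import pandas as pd

# Minimal sketch of a training-energy estimate from hardware specs.
# Assumptions: the column names ("hardware", "chip_count", "training_hours")
# and the TDP/PUE values below are hypothetical, not the study's pipeline.
TDP_WATTS = {"V100": 300, "A100": 400, "H100": 700}  # per-chip thermal design power
PUE = 1.2  # assumed datacenter power usage effectiveness

def estimate_energy_kwh(row: pd.Series) -> float:
    """Energy = chips x per-chip power x hours x PUE, converted to kWh."""
    watts = TDP_WATTS.get(row["hardware"])
    if watts is None:
        return float("nan")  # unknown hardware: leave the estimate missing
    return row["chip_count"] * watts * row["training_hours"] * PUE / 1000.0

models = pd.DataFrame({
    "hardware": ["A100", "H100"],
    "chip_count": [1024, 2048],
    "training_hours": [500, 1200],
})
models["energy_kwh"] = models.apply(estimate_energy_kwh, axis=1)
print(models)
```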
Results indicated that hardware choices and training time were significant predictors of energy consumption during AI training.
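The study's exact model specification is not given here, but effects quoted as percentages per year or per hour are typically coefficients from a log-linear regression, where exp(β) − 1 ≈ β for small β. The sketch below fits such a model on synthetic data; all variable names, values, and the "energy intensity" outcome are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration of a log-linear model whose coefficients read as
# approximate percentage effects; this is not the study's specification.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "year": rng.integers(2012, 2025, n),
    "training_hours": rng.uniform(10, 2000, n),
    "hardware": rng.choice(["V100", "A100", "H100"], n),
})
hw_effect = df["hardware"].map({"V100": 0.4, "A100": 0.0, "H100": -0.3})

# Simulate log energy intensity (energy per unit of compute) with small
# per-year and per-hour effects plus noise.
df["log_energy_intensity"] = (
    5.0
    - 0.0013 * (df["year"] - 2012)    # ~0.13% efficiency gain per year
    + 0.0003 * df["training_hours"]   # ~0.03% efficiency loss per hour
    + hw_effect
    + rng.normal(0, 0.02, n)
)

fit = smf.ols(
    "log_energy_intensity ~ year + training_hours + C(hardware)", data=df
).fit()
# Coefficients on year / training_hours should land near -0.0013 / +0.0003.
print(fit.params)
```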
The study underscores the environmental costs of AI model training and the role of hardware choices and training practices in mitigating them.
Further research is recommended on the interactions between hardware types and training practices.