techminis

A naukri.com initiative

Image Credit: Arxiv

Energy Considerations for Large Pretrained Neural Networks

  • Increasingly complex neural network architectures achieve phenomenal performance, but they require massive computational resources and consume substantial electricity, raising environmental concerns.
  • Research shows that large pre-trained models contain redundancies; prior model-compression work focused on preserving performance rather than on reducing electricity consumption.
  • The study quantifies the energy usage of large pre-trained models in both uncompressed and compressed form, with the goal of lowering electricity consumption.
  • Compression via steganographic capacity reduction yields significant energy savings, while pruning and low-rank factorization offer no significant improvement.
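Two of the compression techniques evaluated in the study, pruning and low-rank factorization, can be sketched in a few lines. This is an illustrative numpy sketch under my own assumptions (function names and the 90%/rank-16 settings are hypothetical), not the paper's implementation:

```python
import numpy as np

def magnitude_prune(w, sparsity):
    # Unstructured pruning: zero out the smallest-magnitude weights.
    k = int(w.size * sparsity)
    thresh = np.sort(np.abs(w), axis=None)[k]
    return np.where(np.abs(w) < thresh, 0.0, w)

def low_rank_factorize(w, rank):
    # Approximate an (m x n) weight matrix as U @ V with U: (m x r), V: (r x n),
    # via truncated SVD; storage drops from m*n to r*(m + n) parameters.
    u, s, vt = np.linalg.svd(w, full_matrices=False)
    return u[:, :rank] * s[:rank], vt[:rank, :]

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 128))   # stand-in for one layer's weights

Wp = magnitude_prune(W, 0.9)          # ~90% of entries become zero
U, V = low_rank_factorize(W, 16)      # 32768 params -> 6144 params

print(f"sparsity: {np.mean(Wp == 0):.2f}")
print(f"params: {W.size} -> {U.size + V.size}")
```

Note that, as the study's findings suggest, a smaller parameter count does not automatically translate into lower energy usage: unstructured zeros and factorized matrices only save electricity if the hardware and runtime can exploit them.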

