techminis

A naukri.com initiative


Image Credit: Arxiv

HOT: Hadamard-based Optimized Training

  • Researchers introduce a novel method called Hadamard-based Optimized Training (HOT) to optimize backpropagation in deep learning.
  • HOT focuses on matrix multiplication, the most computationally expensive part of training, and applies Hadamard-based optimizations selectively.
  • The method achieves up to 75% memory savings and a 2.6× speedup on GPUs, with negligible accuracy loss relative to full FP32 training.
  • HOT includes techniques such as Hadamard quantization, Hadamard low-rank approximation, activation buffer compression, and layer-wise quantizer selection.
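The core idea behind Hadamard quantization is that an orthogonal Hadamard rotation spreads outlier values across all channels before quantizing, which shrinks the quantization error of heavy-tailed tensors. The sketch below is illustrative only, not the paper's implementation: the transform, the 4-bit symmetric quantizer, and the tensor shapes are all assumptions chosen to demonstrate the effect.

```python
import numpy as np

def hadamard_transform(x):
    """Orthonormal fast Walsh-Hadamard transform along the last axis.

    The last-axis length must be a power of two. With the 1/sqrt(n)
    scaling the transform is its own inverse.
    """
    x = x.copy()
    n = x.shape[-1]
    h = 1
    while h < n:
        for i in range(0, n, h * 2):
            a = x[..., i:i + h].copy()
            b = x[..., i + h:i + 2 * h].copy()
            x[..., i:i + h] = a + b
            x[..., i + h:i + 2 * h] = a - b
        h *= 2
    return x / np.sqrt(n)

def quantize_int4(x):
    """Symmetric 4-bit quantization: scale to [-7, 7] and round."""
    scale = np.abs(x).max() / 7.0
    q = np.round(x / scale).clip(-7, 7)
    return q, scale

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 64))
x[0, 0] = 50.0  # a single outlier that dominates the plain quantization scale

# Plain quantization: the outlier inflates the scale, crushing small values to zero.
q, s = quantize_int4(x)
err_plain = np.abs(q * s - x).mean()

# Hadamard-rotated quantization: rotate, quantize, dequantize, rotate back.
xh = hadamard_transform(x)
qh, sh = quantize_int4(xh)
err_had = np.abs(hadamard_transform(qh * sh) - x).mean()

print(err_plain, err_had)
```

On this toy input the rotated variant yields a visibly lower mean reconstruction error, because the outlier's energy is distributed over the whole row before the scale is chosen.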


