Source: Medium

How Iterative Magnitude Pruning works, part 4 (AI, 2024)

  • Sparse shrunk additive models and sparse random feature models were developed independently as methods for learning low-order functions, i.e. functions with few interactions between variables.
  • Inspired by the success of iterative magnitude pruning (IMP) in finding lottery tickets in neural networks, a new method, Sparser Random Feature Models via IMP (ShRIMP), is proposed to efficiently fit high-dimensional data with sparse variable dependencies.
  • ShRIMP unifies constructing sparse random feature models with finding sparse lottery tickets in two-layer dense networks.
  • Experiments show that ShRIMP matches or exceeds the test accuracy of other sparse-feature and additive methods, while providing feature selection at low computational cost.
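The IMP loop underlying ShRIMP can be sketched in a few lines: train, zero out the smallest-magnitude surviving weights, and repeat on the pruned network. Below is a minimal, generic sketch of that loop, not the paper's implementation; the training step is passed in as a stub, and names like `magnitude_prune` and `iterative_magnitude_pruning` are illustrative.

```python
import numpy as np

def magnitude_prune(weights, mask, prune_frac):
    """Zero out the smallest-magnitude fraction of the surviving weights."""
    surviving = np.abs(weights[mask])          # magnitudes of unpruned weights
    k = int(len(surviving) * prune_frac)       # how many to remove this round
    if k == 0:
        return mask
    threshold = np.sort(surviving)[k - 1]      # k-th smallest surviving magnitude
    return mask & (np.abs(weights) > threshold)

def iterative_magnitude_pruning(init_weights, train_fn, rounds=3, prune_frac=0.5):
    """Generic IMP loop: train with the current mask, prune, repeat.

    `train_fn(weights, mask)` stands in for whatever fitting procedure the
    model uses (e.g. least squares over random features); it returns trained
    weights of the same shape.
    """
    mask = np.ones_like(init_weights, dtype=bool)
    for _ in range(rounds):
        trained = train_fn(init_weights * mask, mask)
        mask = magnitude_prune(trained, mask, prune_frac)
    return mask

# Toy usage: with an identity "trainer", 3 rounds at 50% pruning
# shrink 16 weights down to the 2 largest.
init = np.arange(1.0, 17.0).reshape(4, 4)
final_mask = iterative_magnitude_pruning(init, lambda w, m: w)
print(final_mask.sum())      # 2 surviving weights
```

In ShRIMP the analogous pruning acts on random-feature coefficients, so the surviving features double as a variable-selection result.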
