Source: Arxiv

Foxtsage vs. Adam: Revolution or Evolution in Optimization?

  • Foxtsage is a hybrid optimization approach that integrates the hybrid FOX-TSA metaheuristic with Stochastic Gradient Descent (SGD) for training Multi-Layer Perceptron (MLP) models (a minimal sketch of this pattern follows the list).
  • Compared with the widely adopted Adam optimizer, Foxtsage achieves a 42.03% reduction in mean loss and a 42.19% improvement in loss standard deviation.
  • It also yields modest improvements in accuracy, precision, recall, and F1-score.
  • However, Foxtsage incurs a substantially higher computational cost, with a 330.87% increase in mean training time relative to Adam.
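The paper's exact FOX-TSA update rules are not given in this summary, so the following is a minimal, hypothetical sketch of the general pattern a metaheuristic-plus-SGD hybrid like Foxtsage follows: a population-based search phase proposes candidate weights, and each surviving candidate is then refined with a gradient step. All names here (hybrid_step, the perturbation scheme, the toy linear model standing in for an MLP) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(w, X, y):
    # Mean squared error of a linear model; stands in for the MLP loss.
    return np.mean((X @ w - y) ** 2)

def grad(w, X, y):
    # Analytic gradient of the MSE loss above.
    return 2 * X.T @ (X @ w - y) / len(y)

def hybrid_step(population, X, y, lr=0.01, sigma=0.1):
    # Metaheuristic phase (stand-in for FOX-TSA): perturb each candidate
    # toward the current best and keep whichever of (original, trial)
    # scores lower.
    scores = np.array([loss(w, X, y) for w in population])
    best = population[int(scores.argmin())]
    new_population = []
    for w in population:
        trial = w + sigma * rng.standard_normal(w.shape) + 0.1 * (best - w)
        cand = trial if loss(trial, X, y) < loss(w, X, y) else w
        # SGD refinement phase: one gradient step on the surviving candidate.
        new_population.append(cand - lr * grad(cand, X, y))
    return new_population

# Toy regression data: y = X @ w_true + noise.
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = X @ w_true + 0.01 * rng.standard_normal(200)

population = [rng.standard_normal(5) for _ in range(8)]
for _ in range(100):
    population = hybrid_step(population, X, y)

best = min(population, key=lambda w: loss(w, X, y))
print("final loss:", loss(best, X, y))
```

Note the cost structure: every step evaluates the loss for each population member (plus trial), multiplying the number of forward passes relative to a single-trajectory optimizer like Adam, which is consistent with the large increase in mean training time reported above.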
