How to use Sparse Categorical Crossentropy in Keras

  • Sparse categorical crossentropy is a loss function in Keras that works with integer targets directly, without requiring one-hot encoding.
  • The blog explains how to build a CNN using sparse categorical crossentropy with an example on the MNIST dataset.
  • In traditional multiclass classification with Keras, categorical crossentropy necessitates one-hot encoding of target vectors.
  • One-hot encoding involves converting integer targets into categorical format before using categorical crossentropy.
  • When one-hot encoding is impractical, for example because a large number of classes makes the encoded targets very large, sparse categorical crossentropy can be applied to the integer targets directly (a short comparison sketch follows this list).
  • The categorical crossentropy formula multiplies each actual target value by the natural logarithm of the corresponding class prediction and sums the negated products: CE = -Σ target_i · ln(prediction_i).
  • Sparse categorical crossentropy is an integer-based version of categorical crossentropy.
  • The blog provides code examples for creating a CNN with sparse categorical crossentropy on the MNIST dataset; a similar end-to-end sketch appears after this list.
  • The tutorial covers setting up the model configuration, loading and preparing the MNIST data, defining the model architecture, compiling with sparse categorical crossentropy, and fitting the model.
  • By following the tutorial, one can train a CNN with sparse categorical crossentropy in Keras for multiclass classification.
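
To make the one-hot versus integer-target distinction concrete, here is a minimal sketch (not taken from the blog, assuming TensorFlow 2.x) showing that categorical crossentropy on one-hot targets and sparse categorical crossentropy on the raw integer labels produce the same loss values:

```python
import numpy as np
import tensorflow as tf

# Integer targets for a 3-class problem and the model's softmax predictions.
y_int = np.array([2, 0, 1])
y_pred = np.array([[0.1, 0.2, 0.7],
                   [0.8, 0.1, 0.1],
                   [0.2, 0.6, 0.2]])

# Categorical crossentropy needs one-hot encoded targets ...
y_onehot = tf.keras.utils.to_categorical(y_int, num_classes=3)
cce = tf.keras.losses.categorical_crossentropy(y_onehot, y_pred)

# ... while sparse categorical crossentropy accepts the integers as they are.
scce = tf.keras.losses.sparse_categorical_crossentropy(y_int, y_pred)

print(cce.numpy())   # approx. [0.357, 0.223, 0.511]
print(scce.numpy())  # same values: the two losses are mathematically identical
```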

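And here is a compact end-to-end sketch of the workflow the tutorial describes; the layer sizes, optimizer, and training settings are illustrative assumptions rather than the blog's exact code:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Model configuration (illustrative values).
batch_size = 128
num_classes = 10
epochs = 5

# Load and prepare MNIST: scale pixels to [0, 1] and add a channel axis.
# The integer labels (0-9) are left as-is -- no one-hot encoding needed.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.astype("float32")[..., None] / 255.0
x_test = x_test.astype("float32")[..., None] / 255.0

# A small CNN ending in a softmax over the 10 digit classes.
model = models.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])

# Compile with sparse categorical crossentropy so integer labels work directly.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Fit on the integer-labelled data and evaluate on the held-out test set.
model.fit(x_train, y_train, batch_size=batch_size, epochs=epochs,
          validation_split=0.1)
model.evaluate(x_test, y_test)
```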