techminis

A naukri.com initiative

Medium · 2w read
How Integral Probability Metric works, part 4 (Machine Learning 2024)

  • This paper provides a unified view of the Kullback-Leibler (KL) divergence and the integral probability metrics (IPMs) through maximum likelihood density-ratio estimation.
  • The paper shows that the KL-divergence and IPMs can both be represented as maximized likelihoods, differing only in their sampling schemes.
  • The paper also introduces a new class of probability divergences, the Density Ratio Metrics (DRMs), which interpolate between the KL-divergence and IPMs.
  • The authors propose a new adversarial training scheme for learning fair representations (LFR) with a theoretical guarantee of fairness for the final prediction model.
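As a rough, hypothetical illustration of the density-ratio view the summary describes (the example data and function class are assumptions, not taken from the paper), both the KL-divergence and a simple IPM can be computed from the same two discrete distributions: the KL-divergence is an expectation of the log density ratio, while an IPM is a supremum of mean differences over a function class.

```python
import numpy as np

# Two hypothetical discrete distributions on the same 3-point support.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])

# Density ratio r(x) = p(x) / q(x): the shared object linking both families.
r = p / q

# KL-divergence written through the density ratio: KL(p||q) = E_p[log r(x)].
kl = np.sum(p * np.log(r))

# An IPM over the bounded class {f : |f(x)| <= 1}: the optimal witness
# function is f*(x) = sign(p(x) - q(x)), so the supremum equals
# sum_x |p(x) - q(x)| (twice the total variation distance).
ipm = np.sum(np.abs(p - q))

print(kl, ipm)
```

The point of the sketch is that both quantities are functionals of how p and q differ; the DRMs mentioned above are described as interpolating between these two regimes.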
