This paper provides a unified view of the Kullback-Leibler (KL) divergence and integral probability metrics (IPMs) through the lens of maximum likelihood density-ratio estimation.
The paper shows that the KL-divergence and IPMs can both be represented as maximized likelihoods of a density-ratio model, differing only in their sampling schemes.
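To make this claim concrete, the following is the standard maximum-likelihood density-ratio representation of the KL-divergence (the KLIEP-style objective); this is a well-known identity rather than a quote from the paper, and the paper's IPM counterpart is assumed to change only which distribution the likelihood's samples are drawn from:

$$
\mathrm{KL}(p \,\|\, q) \;=\; \max_{r \ge 0 \,:\; \mathbb{E}_{q}[r(X)] = 1} \; \mathbb{E}_{p}\!\left[\log r(X)\right],
$$

with the maximum attained at $r = p/q$: for any feasible $r$, the product $q \cdot r$ is itself a density, so $\mathbb{E}_{p}[\log r(X)] = \mathrm{KL}(p \,\|\, q) - \mathrm{KL}(p \,\|\, q r) \le \mathrm{KL}(p \,\|\, q)$.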
The paper also introduces a new class of probability divergences, the Density Ratio Metrics (DRMs), which interpolate between the KL-divergence and IPMs.
Finally, the authors propose a new adversarial training scheme for learning fair representations (LFR), with a theoretical guarantee of fairness for the final prediction model.
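Since the paper's exact objective is not reproduced here, the sketch below shows only the generic adversarial-LFR setup it builds on (the standard encoder-vs-adversary min-max game, as in adversarial debiasing), not the authors' DRM-based scheme; all module names, dimensions, and the `lam` trade-off weight are hypothetical.

```python
# Generic adversarial LFR sketch (hypothetical names; not the paper's scheme).
import torch
import torch.nn as nn

# The encoder maps inputs to a representation z; the adversary tries to predict
# the sensitive attribute s from z. Training is a min-max game: the encoder and
# task head minimize the task loss while maximizing the adversary's loss, so z
# retains little information about s.
dim_x, dim_z = 16, 8
encoder = nn.Sequential(nn.Linear(dim_x, dim_z), nn.ReLU())
task_head = nn.Linear(dim_z, 1)   # predicts the target label y
adversary = nn.Linear(dim_z, 1)   # predicts the sensitive attribute s

opt_main = torch.optim.Adam(
    list(encoder.parameters()) + list(task_head.parameters()), lr=1e-3)
opt_adv = torch.optim.Adam(adversary.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()
lam = 1.0  # accuracy/fairness trade-off (assumed hyperparameter)

def train_step(x, y, s):
    # 1) Update the adversary to predict s from the (detached) representation.
    z = encoder(x).detach()
    adv_loss = bce(adversary(z), s)
    opt_adv.zero_grad()
    adv_loss.backward()
    opt_adv.step()

    # 2) Update encoder + task head: fit y while fooling the adversary.
    z = encoder(x)
    main_loss = bce(task_head(z), y) - lam * bce(adversary(z), s)
    opt_main.zero_grad()
    main_loss.backward()
    opt_main.step()
    return main_loss.item()

# Toy usage with random data.
x = torch.randn(32, dim_x)
y = torch.randint(0, 2, (32, 1)).float()
s = torch.randint(0, 2, (32, 1)).float()
for _ in range(10):
    train_step(x, y, s)
```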