

Bounds on $L_p$ Errors in Density Ratio Estimation via $f$-Divergence Loss Functions

Yoshiaki Kitazawa

2025 Poster

Abstract: Density ratio estimation (DRE) is a core technique in machine learning used to capture relationships between two probability distributions. $f$-divergence loss functions, which are derived from variational representations of $f$-divergence, have become a standard choice in DRE for achieving cutting-edge performance. This study provides novel theoretical insights into DRE by deriving upper and lower bounds on the $L_p$ errors through $f$-divergence loss functions. These bounds apply to any estimator belonging to a class of Lipschitz continuous estimators, irrespective of the specific $f$-divergence loss function employed. The derived bounds are expressed as a product involving the data dimensionality and the expected value of the density ratio raised to the $p$-th power. Notably, the lower bound includes an exponential term that depends on the Kullback--Leibler (KL) divergence, revealing that the $L_p$ error increases significantly as the KL divergence grows when $p > 1$, and that this increase becomes more pronounced as $p$ grows. The theoretical insights are validated through numerical experiments.
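As background, the $f$-divergence losses referenced above are typically built on the variational (Fenchel dual) representation of an $f$-divergence; the sketch below states this standard representation and how it yields a density ratio estimate, and is not necessarily the exact formulation used in the paper. For distributions $P$ and $Q$ with densities $p$ and $q$, density ratio $r(x) = p(x)/q(x)$, and convex conjugate $f^*$,

$$D_f(P \,\|\, Q) = \sup_{T} \Big\{ \mathbb{E}_{x \sim P}[T(x)] - \mathbb{E}_{x \sim Q}[f^*(T(x))] \Big\},$$

with the supremum attained at $T^*(x) = f'(r(x))$. Maximizing the bracketed objective over a model class (e.g., Lipschitz continuous estimators, as considered in the paper) thus yields an estimate of $T^*$, and hence of the density ratio $r$.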
