Bias reduction and metric learning for nearest-neighbor estimation of Kullback-Leibler divergence

Asymptotically unbiased nearest-neighbor estimators for KL divergence have recently been proposed and demonstrated in a number of applications. With small sample sizes, however, these nonparametric methods typically suffer from high estimation bias due to the non-local statistics of empirical nearest-neighbor information. In this paper, we show that this non-local bias can be mitigated by changing the distance metric, and we propose a method for learning an optimal Mahalanobis-type metric based on global information provided by approximate parametric models of the underlying densities. In both simulations and experiments, we demonstrate that this interplay between parametric models and nonparametric estimation methods significantly improves the accuracy of the nearest-neighbor KL divergence estimator.
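As a rough illustration of the estimator discussed in the abstract, below is a minimal sketch of the standard k-nearest-neighbor KL divergence estimator (in the style of Wang, Kulkarni, and Verdú), extended to accept a user-supplied Mahalanobis matrix M. This is not the paper's implementation: the metric-learning step that the paper derives from approximate parametric models is not reproduced, and the function name and arguments are illustrative assumptions.

import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(X, Y, k=1, M=None):
    """k-NN estimate of KL(p || q) from samples X ~ p and Y ~ q.

    If a positive-definite matrix M is given, neighbors are found under
    the Mahalanobis distance sqrt((u - v)^T M (u - v)), implemented by
    mapping points through L, where M = L L^T (Cholesky factor).
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    n, d = X.shape
    m = Y.shape[0]

    if M is not None:
        # Euclidean distances between x @ L and y @ L equal the
        # Mahalanobis distances between x and y under M.
        L = np.linalg.cholesky(M)
        X = X @ L
        Y = Y @ L

    tree_X = cKDTree(X)
    tree_Y = cKDTree(Y)

    # rho: distance from each x_i to its k-th nearest neighbor among the
    # other X points (query k+1 because the nearest point is x_i itself).
    rho = tree_X.query(X, k=k + 1)[0][:, -1]

    # nu: distance from each x_i to its k-th nearest neighbor in Y.
    nu = tree_Y.query(X, k=k)[0]
    if nu.ndim > 1:
        nu = nu[:, -1]

    # Classical k-NN divergence estimate:
    # (d/n) * sum_i log(nu_i / rho_i) + log(m / (n - 1))
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

For example, calling knn_kl_divergence(X, Y, k=5, M=np.linalg.inv(np.cov(X.T))) uses the inverse sample covariance as a simple whitening metric; the paper instead learns the metric from approximate parametric models of the underlying densities so as to mitigate the estimator's non-local bias.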
Publisher
AISTATS Committee
Issue Date
2014-04
Language
English
Citation
17th International Conference on Artificial Intelligence and Statistics, AISTATS 2014, pp. 669-677
ISSN
1938-7288
URI
http://hdl.handle.net/10203/314516
Appears in Collection
RIMS Conference Papers