Original title: Evaluation of Kullback-Leibler Divergence
Authors: Homolová, Jitka; Kárný, Miroslav
Document type: Research reports
Year: 2015
Language: eng
Series: Research Report, volume: 2349
Abstract: Kullback-Leibler divergence is a leading measure of the similarity or dissimilarity of probability distributions. This technical paper collects its analytical and numerical expressions for a broad range of distributions. (An illustrative sketch of one such expression is given after the record details below.)
Keywords: Bayesian decision making; Bayesian learning and approximation; cross-entropy; Kullback-Leibler divergence
Project no.: GA13-13502S (CEP)
Funding provider: GA ČR
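
Illustrative note: as a point of reference for the abstract above, the following is a minimal sketch (not taken from the report itself) of one case where the Kullback-Leibler divergence D(p||q) = ∫ p(x) log(p(x)/q(x)) dx has a closed analytical form, namely two univariate normal distributions, together with a numerical check by quadrature. The function names kl_normal_analytic and kl_numeric and all parameter values are hypothetical and chosen only for illustration; the report itself is the authoritative source for expressions of this kind across the distribution families it covers.

    import numpy as np

    # Illustrative sketch (assumed, not from the report): KL divergence
    # D(p || q) = integral of p(x) * log(p(x) / q(x)) dx between two
    # univariate normal densities N(mu0, sig0^2) and N(mu1, sig1^2).

    def kl_normal_analytic(mu0, sig0, mu1, sig1):
        # Closed-form expression for KL(N(mu0, sig0^2) || N(mu1, sig1^2)).
        return (np.log(sig1 / sig0)
                + (sig0 ** 2 + (mu0 - mu1) ** 2) / (2.0 * sig1 ** 2)
                - 0.5)

    def kl_numeric(mu0, sig0, mu1, sig1, n=200_001, width=12.0):
        # Numerical check: Riemann sum of p * log(p / q) on a wide grid
        # that covers essentially all of the probability mass of p.
        x = np.linspace(mu0 - width * sig0, mu0 + width * sig0, n)
        p = np.exp(-0.5 * ((x - mu0) / sig0) ** 2) / (sig0 * np.sqrt(2 * np.pi))
        q = np.exp(-0.5 * ((x - mu1) / sig1) ** 2) / (sig1 * np.sqrt(2 * np.pi))
        return float(np.sum(p * np.log(p / q)) * (x[1] - x[0]))

    if __name__ == "__main__":
        args = (0.0, 1.0, 1.5, 2.0)  # mu0, sig0, mu1, sig1 (hypothetical values)
        print("analytic:", kl_normal_analytic(*args))  # approx. 0.5994
        print("numeric :", kl_numeric(*args))          # should agree to several digits
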

Institution: Institute of Information Theory and Automation AS ČR
Document availability information: Fulltext is available at an external website.
External URL: http://library.utia.cas.cz/separaty/2015/AS/homolova-0444191.pdf
Original record: http://hdl.handle.net/11104/0247115

Permalink: http://www.nusl.cz/ntk/nusl-187994


 Record created 2015-06-11, last modified 2023-12-06

