Matches in DBpedia 2016-04 for { <http://dbpedia.org/resource/Kullback–Leibler_divergence> ?p ?o }
- Kullback–Leibler_divergence abstract "In probability theory and information theory, the Kullback–Leibler divergence (also information divergence, information gain, relative entropy, KLIC, or KL divergence) is a measure of the difference between two probability distributions P and Q. It is not symmetric in P and Q. In applications, P typically represents the \"true\" distribution of data, observations, or a precisely calculated theoretical distribution, while Q typically represents a theory, model, description, or approximation of P. Specifically, the Kullback–Leibler divergence of Q from P, denoted DKL(P‖Q), is a measure of the information gained when one revises one's beliefs from the prior probability distribution Q to the posterior probability distribution P. In other words, it is the amount of information lost when Q is used to approximate P. The Kullback–Leibler divergence also measures the expected number of extra bits required to code samples from P using a code optimized for Q rather than the code optimized for P. Although it is often intuited as a way of measuring the distance between probability distributions, the Kullback–Leibler divergence is not a true metric. It does not obey the triangle inequality, and in general DKL(P‖Q) does not equal DKL(Q‖P). However, its infinitesimal form, specifically its Hessian, gives a metric tensor known as the Fisher information metric. The Kullback–Leibler divergence is a special case of a broader class of divergences called f-divergences, as well as of the class of Bregman divergences. It is the only such divergence over probabilities that is a member of both classes. The Kullback–Leibler divergence was originally introduced by Solomon Kullback and Richard Leibler in 1951 as the directed divergence between two distributions. It is discussed in Kullback's historic text, Information Theory and Statistics. The Kullback–Leibler divergence is sometimes also called the information gain achieved if P is used instead of Q. It is also called the relative entropy of P with respect to Q, and written H(P|Q).".
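The definition summarized in the abstract (the expected log-ratio of P against Q, measured in bits when the logarithm is base 2, and asymmetric in its arguments) can be sketched as follows. This is an illustrative implementation, not part of the DBpedia data; the function name and example distributions are assumptions.

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) in bits for discrete distributions p and q.

    Terms where p[i] == 0 contribute 0 (by the convention 0 * log 0 = 0).
    If q[i] == 0 while p[i] > 0, the divergence is infinite; this sketch
    assumes q is strictly positive wherever p is.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # drop zero-probability terms of P
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Example: a fair coin P versus a heavily biased coin Q.
p = [0.5, 0.5]
q = [0.9, 0.1]
d_pq = kl_divergence(p, q)  # information lost using Q to approximate P
d_qp = kl_divergence(q, p)  # generally a different value: D_KL is asymmetric
```

Here `d_pq != d_qp`, illustrating the abstract's point that DKL(P‖Q) generally differs from DKL(Q‖P), and `kl_divergence(p, p)` is 0, since no extra bits are needed when coding P with a code optimized for P itself.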
- Kullback–Leibler_divergence wikiPageExternalLink 1404.2000.
- Kullback–Leibler_divergence wikiPageExternalLink 0604246.
- Kullback–Leibler_divergence wikiPageExternalLink nips09_verdu_re.
- Kullback–Leibler_divergence wikiPageExternalLink loadFile.do?objectId=13089&objectType=file.
- Kullback–Leibler_divergence wikiPageExternalLink ite.
- Kullback–Leibler_divergence wikiPageExternalLink diverge.
- Kullback–Leibler_divergence wikiPageID "467527".
- Kullback–Leibler_divergence wikiPageLength "42234".
- Kullback–Leibler_divergence wikiPageOutDegree "142".
- Kullback–Leibler_divergence wikiPageRevisionID "708101070".
- Kullback–Leibler_divergence wikiPageWikiLink Absolute_continuity.
- Kullback–Leibler_divergence wikiPageWikiLink Akaike_information_criterion.
- Kullback–Leibler_divergence wikiPageWikiLink Alfréd_Rényi.
- Kullback–Leibler_divergence wikiPageWikiLink Almost_everywhere.
- Kullback–Leibler_divergence wikiPageWikiLink Bayes_theorem.
- Kullback–Leibler_divergence wikiPageWikiLink Bayesian_experimental_design.
- Kullback–Leibler_divergence wikiPageWikiLink Bayesian_information_criterion.
- Kullback–Leibler_divergence wikiPageWikiLink Bayesian_statistics.
- Kullback–Leibler_divergence wikiPageWikiLink Bit.
- Kullback–Leibler_divergence wikiPageWikiLink Bregman_divergence.
- Kullback–Leibler_divergence wikiPageWikiLink Category:Entropy_and_information.
- Kullback–Leibler_divergence wikiPageWikiLink Category:F-divergences.
- Kullback–Leibler_divergence wikiPageWikiLink Category:Statistical_theory.
- Kullback–Leibler_divergence wikiPageWikiLink Category:Thermodynamics.
- Kullback–Leibler_divergence wikiPageWikiLink Chi-squared_test.
- Kullback–Leibler_divergence wikiPageWikiLink Conditional_entropy.
- Kullback–Leibler_divergence wikiPageWikiLink Conference_on_Neural_Information_Processing_Systems.
- Kullback–Leibler_divergence wikiPageWikiLink Covariance_matrix.
- Kullback–Leibler_divergence wikiPageWikiLink Cross_entropy.
- Kullback–Leibler_divergence wikiPageWikiLink Data_compression.
- Kullback–Leibler_divergence wikiPageWikiLink Data_differencing.
- Kullback–Leibler_divergence wikiPageWikiLink Density_matrix.
- Kullback–Leibler_divergence wikiPageWikiLink Deviance_information_criterion.
- Kullback–Leibler_divergence wikiPageWikiLink Differential_entropy.
- Kullback–Leibler_divergence wikiPageWikiLink Dimensional_analysis.
- Kullback–Leibler_divergence wikiPageWikiLink Divergence.
- Kullback–Leibler_divergence wikiPageWikiLink E_(mathematical_constant).
- Kullback–Leibler_divergence wikiPageWikiLink Earth_movers_distance.
- Kullback–Leibler_divergence wikiPageWikiLink Edwin_Thompson_Jaynes.
- Kullback–Leibler_divergence wikiPageWikiLink Einstein_notation.
- Kullback–Leibler_divergence wikiPageWikiLink Entropic_value_at_risk.
- Kullback–Leibler_divergence wikiPageWikiLink Entropy.
- Kullback–Leibler_divergence wikiPageWikiLink Entropy_(information_theory).
- Kullback–Leibler_divergence wikiPageWikiLink Entropy_maximization.
- Kullback–Leibler_divergence wikiPageWikiLink Entropy_power_inequality.
- Kullback–Leibler_divergence wikiPageWikiLink Exergy.
- Kullback–Leibler_divergence wikiPageWikiLink F-divergence.
- Kullback–Leibler_divergence wikiPageWikiLink File:KL-Gauss-Example.png.
- Kullback–Leibler_divergence wikiPageWikiLink Fisher_information_metric.
- Kullback–Leibler_divergence wikiPageWikiLink Gibbs_free_energy.
- Kullback–Leibler_divergence wikiPageWikiLink Gibbs_inequality.
- Kullback–Leibler_divergence wikiPageWikiLink Hellinger_distance.
- Kullback–Leibler_divergence wikiPageWikiLink Helmholtz_free_energy.
- Kullback–Leibler_divergence wikiPageWikiLink Hessian_matrix.
- Kullback–Leibler_divergence wikiPageWikiLink Huffman_coding.
- Kullback–Leibler_divergence wikiPageWikiLink I._J._Good.
- Kullback–Leibler_divergence wikiPageWikiLink Information_gain_in_decision_trees.
- Kullback–Leibler_divergence wikiPageWikiLink Information_gain_ratio.
- Kullback–Leibler_divergence wikiPageWikiLink Information_theory.
- Kullback–Leibler_divergence wikiPageWikiLink Information_theory_and_measure_theory.
- Kullback–Leibler_divergence wikiPageWikiLink Jensen–Shannon_divergence.
- Kullback–Leibler_divergence wikiPageWikiLink Joint_probability_distribution.
- Kullback–Leibler_divergence wikiPageWikiLink Josiah_Willard_Gibbs.
- Kullback–Leibler_divergence wikiPageWikiLink Kolmogorov–Smirnov_test.
- Kullback–Leibler_divergence wikiPageWikiLink Krafts_inequality.
- Kullback–Leibler_divergence wikiPageWikiLink Kronecker_delta.
- Kullback–Leibler_divergence wikiPageWikiLink Large_deviations_theory.
- Kullback–Leibler_divergence wikiPageWikiLink List_of_weight-of-evidence_articles.
- Kullback–Leibler_divergence wikiPageWikiLink Logarithm.
- Kullback–Leibler_divergence wikiPageWikiLink Logit.
- Kullback–Leibler_divergence wikiPageWikiLink Loss_function.
- Kullback–Leibler_divergence wikiPageWikiLink Marginal_distribution.
- Kullback–Leibler_divergence wikiPageWikiLink Matching_distance.
- Kullback–Leibler_divergence wikiPageWikiLink Maximum_likelihood.
- Kullback–Leibler_divergence wikiPageWikiLink Maximum_spacing_estimation.
- Kullback–Leibler_divergence wikiPageWikiLink Measure_(mathematics).
- Kullback–Leibler_divergence wikiPageWikiLink Metric_(mathematics).
- Kullback–Leibler_divergence wikiPageWikiLink Metric_space.
- Kullback–Leibler_divergence wikiPageWikiLink Metric_tensor.
- Kullback–Leibler_divergence wikiPageWikiLink Multivariate_normal_distribution.
- Kullback–Leibler_divergence wikiPageWikiLink Mutual_information.
- Kullback–Leibler_divergence wikiPageWikiLink Nat_(unit).
- Kullback–Leibler_divergence wikiPageWikiLink Optimal_design.
- Kullback–Leibler_divergence wikiPageWikiLink Partition_function_(mathematics).
- Kullback–Leibler_divergence wikiPageWikiLink Patch_(computing).
- Kullback–Leibler_divergence wikiPageWikiLink Pierre-Simon_Laplace.
- Kullback–Leibler_divergence wikiPageWikiLink Pinskers_inequality.
- Kullback–Leibler_divergence wikiPageWikiLink Positive-definite_matrix.
- Kullback–Leibler_divergence wikiPageWikiLink Posterior_probability.
- Kullback–Leibler_divergence wikiPageWikiLink Principle_of_indifference.
- Kullback–Leibler_divergence wikiPageWikiLink Principle_of_maximum_entropy.
- Kullback–Leibler_divergence wikiPageWikiLink Prior_probability.
- Kullback–Leibler_divergence wikiPageWikiLink Probability_density_function.
- Kullback–Leibler_divergence wikiPageWikiLink Probability_distribution.
- Kullback–Leibler_divergence wikiPageWikiLink Probability_space.
- Kullback–Leibler_divergence wikiPageWikiLink Probability_theory.
- Kullback–Leibler_divergence wikiPageWikiLink Quantum_entanglement.
- Kullback–Leibler_divergence wikiPageWikiLink Quantum_information_science.
- Kullback–Leibler_divergence wikiPageWikiLink Quantum_relative_entropy.