KL Divergence: Measuring Information Difference

Foundational · Widely Applied · Mathematically Rigorous

Overview

Kullback-Leibler (KL) divergence, often called relative entropy, is a fundamental measure in information theory that quantifies how one probability distribution diverges from a second, reference probability distribution. Introduced by Solomon Kullback and Richard Leibler in 1951, it's not a true distance metric: it's asymmetric (D_KL(P||Q) ≠ D_KL(Q||P)) and doesn't satisfy the triangle inequality. Despite this, it's indispensable for tasks like model selection, hypothesis testing, and understanding information loss in data compression. Its applications span machine learning, signal processing, and statistical inference, providing a crucial lens for comparing and evaluating probabilistic models.
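For discrete distributions P and Q over the same support, the divergence is defined as

D_KL(P || Q) = Σ_x P(x) log(P(x) / Q(x))

with the convention 0 log 0 = 0, and Q(x) must be positive wherever P(x) is. As a minimal sketch of this formula (the distributions p and q below are hypothetical examples, represented as NumPy arrays with matching, strictly positive entries), the quantity can be computed directly:

import numpy as np

# Hypothetical example distributions over three outcomes (each sums to 1).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# D_KL(P || Q) = sum_x P(x) * log(P(x) / Q(x)), in nats (natural log).
kl_pq = np.sum(p * np.log(p / q))

# D_KL(Q || P) generally gives a different value, illustrating the
# asymmetry noted above.
kl_qp = np.sum(q * np.log(q / p))

print(kl_pq, kl_qp)

Using a base-2 logarithm instead of the natural log expresses the result in bits rather than nats.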

Key Facts

Year: 1951
Origin: Solomon Kullback and Richard Leibler
Category: Information Theory & Statistics
Type: Concept