Introduction to Natural Language Processing (600.465): Essential Information Theory II
Kullback-Leibler Distance (Relative Entropy)
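A sketch of the definition this title refers to, in the standard notation (base-2 logarithms, with the usual conventions 0·log(0/q) = 0 and p·log(p/0) = +∞):

```latex
D(p \,\|\, q) = \sum_{x \in \Omega} p(x) \log_2 \frac{p(x)}{q(x)}
```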
Comments on Relative Entropy
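Two comments that certainly apply, whatever the exact wording on the slide: relative entropy is not symmetric and does not obey the triangle inequality, so despite the word "distance" it is not a metric; it vanishes exactly when the two distributions coincide.

```latex
D(p \,\|\, q) \neq D(q \,\|\, p) \quad \text{in general}, \qquad D(p \,\|\, p) = 0
```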
Mutual Information (MI) in terms of relative entropy
Mutual Information: the Formula
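The definition the two titles above refer to: mutual information is the relative entropy between the joint distribution and the product of the marginals, which expands to the usual sum (standard notation):

```latex
I(X;Y) = D\bigl(p(x,y) \,\|\, p(x)\,p(y)\bigr)
       = \sum_{x,y} p(x,y)\, \log_2 \frac{p(x,y)}{p(x)\,p(y)}
```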
From Mutual Information to Entropy
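The identities this step establishes, each obtained by expanding the sum above and applying the chain rule H(X,Y) = H(X) + H(Y|X):

```latex
I(X;Y) = H(X) - H(X \mid Y)
       = H(Y) - H(Y \mid X)
       = H(X) + H(Y) - H(X,Y)
```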
Properties of MI vs. Entropy
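Properties that belong under this heading (all standard facts):

```latex
I(X;Y) = I(Y;X) \geq 0, \qquad
I(X;X) = H(X), \qquad
I(X;Y) = 0 \iff X \text{ and } Y \text{ are independent}
```

The second identity is why entropy is sometimes called self-information.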
Jensen’s Inequality
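The inequality itself, in the form needed for the next slide (for concave functions such as log, the inequality reverses):

```latex
f \text{ convex} \;\Longrightarrow\; f\bigl(\mathbb{E}[X]\bigr) \leq \mathbb{E}\bigl[f(X)\bigr]
```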
Information Inequality
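The statement, D(p‖q) ≥ 0 with equality iff p = q, follows from Jensen's inequality applied to the concave logarithm:

```latex
-D(p \,\|\, q) = \sum_x p(x) \log_2 \frac{q(x)}{p(x)}
\;\leq\; \log_2 \sum_x p(x)\,\frac{q(x)}{p(x)}
= \log_2 \sum_x q(x) \;\leq\; \log_2 1 = 0
```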
Other (In)Equalities and Facts
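The exact list cannot be recovered from the title alone; standard facts that typically appear under such a heading (all true in general) include:

```latex
H(X) \leq \log_2 |\Omega_X|, \qquad
H(X \mid Y) \leq H(X), \qquad
H(X,Y) \leq H(X) + H(Y)
```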
Cross-Entropy
Cross Entropy: The Formula
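The formula, writing p' for the true (or test) distribution and p for the model being evaluated; the decomposition into entropy plus relative entropy is a general identity (the p'/p notation here is an assumption):

```latex
H_{p'}(p) = -\sum_{x} p'(x) \log_2 p(x) = H(p') + D(p' \,\|\, p)
```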
Conditional Cross Entropy
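The conditional version: the expectation is still taken under the true joint distribution, but the model is scored on its conditional probabilities (same assumed p'/p notation):

```latex
H_{p'}(p) = -\sum_{y,x} p'(y,x) \log_2 p(y \mid x)
```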
Sample Space vs. Data
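The point behind this title: p' is unknown in practice, so the sum over the sample space is replaced by an average over observed test data, written here as pairs (y_i, x_i) for i = 1..|T| (the symbol T for the test set is an assumption):

```latex
H_{p'}(p) \;\approx\; -\frac{1}{|T|} \sum_{i=1}^{|T|} \log_2 p(y_i \mid x_i)
```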
Computation Example
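A minimal Python sketch of such a computation; the toy unigram model and test tokens below are made up for illustration, not the example from the slides:

```python
import math

# Hypothetical unigram model p(w); probabilities sum to 1.
model = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Hypothetical test data: a sequence of observed tokens.
test_data = ["a", "a", "b", "c", "a", "d", "b", "a"]

def cross_entropy(p, data):
    """Empirical cross entropy in bits per token:
    -(1/|T|) * sum_i log2 p(w_i).
    Returns infinity if the model gives zero probability
    to any observed token."""
    total = 0.0
    for w in data:
        prob = p.get(w, 0.0)
        if prob == 0.0:
            return float("inf")
        total += math.log2(prob)
    return -total / len(data)

h = cross_entropy(model, test_data)
print(f"cross entropy: {h:.3f} bits/token")   # 1.750
print(f"perplexity:    {2 ** h:.3f}")         # 3.364
```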
Cross Entropy: Some Observations
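Two observations that follow directly from the identities above: cross entropy can never fall below the entropy of the true distribution, and it becomes infinite as soon as the model assigns zero probability to an event the true distribution contains:

```latex
H_{p'}(p) = H(p') + D(p' \,\|\, p) \;\geq\; H(p'), \qquad
\exists x:\ p'(x) > 0 \wedge p(x) = 0 \;\Longrightarrow\; H_{p'}(p) = \infty
```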
Cross Entropy: Usage
Comparing Distributions
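A sketch of the usage suggested by the last two titles: evaluate candidate models by their cross entropy on the same held-out data and prefer the lower value. The models and data are again made up for illustration:

```python
import math

def cross_entropy(p, data):
    """-(1/|T|) * sum_i log2 p(w_i), in bits per token."""
    return -sum(math.log2(p[w]) for w in data) / len(data)

test_data = ["a", "a", "b", "c", "a", "d", "b", "a"]

# Two hypothetical candidate models evaluated on the same held-out data.
model_uniform = {w: 0.25 for w in "abcd"}                    # ignores frequencies
model_tuned = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # matches them better

for name, model in [("uniform", model_uniform), ("tuned", model_tuned)]:
    print(f"{name:8s} {cross_entropy(model, test_data):.3f} bits/token")
# uniform  2.000 bits/token
# tuned    1.750 bits/token
# The lower cross entropy means the model assigns higher probability
# to what was actually observed, so the tuned model is preferred.
```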