Introduction to Natural Language Processing (600.465): Essential Information Theory II

9/7/00


Table of Contents

Introduction to Natural Language Processing (600.465): Essential Information Theory II

Kullback-Leibler Distance (Relative Entropy)

Comments on Relative Entropy

Mutual Information (MI) in terms of relative entropy

Mutual Information: the Formula

From Mutual Information to Entropy

Properties of MI vs. Entropy

Jensen's Inequality

Information Inequality

Other (In)Equalities and Facts

Cross-Entropy

Cross Entropy: The Formula

Conditional Cross Entropy

Sample Space vs. Data

Computation Example

Cross Entropy: Some Observations

Cross Entropy: Usage

Comparing Distributions
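
For reference, the standard definitions behind the formula slides listed above can be sketched as follows. This is a reference sketch in common textbook notation (p, q are distributions over a sample space Omega; X, Y are random variables); the slides' own symbols and choice of log base may differ.

\[
D(p \,\|\, q) \;=\; \sum_{x \in \Omega} p(x) \log_2 \frac{p(x)}{q(x)}
\qquad \text{(Kullback-Leibler distance, relative entropy)}
\]

\[
I(X;Y) \;=\; D\bigl(p(x,y) \,\|\, p(x)\,p(y)\bigr)
\;=\; \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}
\qquad \text{(mutual information)}
\]

\[
I(X;Y) \;=\; H(X) - H(X \mid Y) \;=\; H(Y) - H(Y \mid X)
\qquad \text{(MI in terms of entropy)}
\]

\[
H(p, q) \;=\; -\sum_{x} p(x) \log_2 q(x) \;=\; H(p) + D(p \,\|\, q)
\qquad \text{(cross-entropy)}
\]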

Author: Jan Hajic

Email: hajic@cs.jhu.edu

Home Page: http://www.cs.jhu.edu/~hajic/courses/cs465/syllabus.html