Introduction to Natural Language Processing (600.465) Essential Information Theory I

9/7/00

Table of Contents

Introduction to Natural Language Processing (600.465) Essential Information Theory I

The Notion of Entropy

The Formula

Using the Formula: Example

Example: Book Availability

The Limits

Entropy and Expectation

Perplexity: Motivation

Perplexity

Joint Entropy and Conditional Entropy

Conditional Entropy (Using the Calculus)

Properties of Entropy I

Properties of Entropy II

“Coding” Interpretation of Entropy

Coding: Example

Entropy of a Language

Author: Jan Hajic

Email: hajic@cs.jhu.edu

Home Page: http://www.cs.jhu.edu/~hajic/courses/cs465/syllabus.html