
Natural Language Processing
Prof. Jason Eisner
Course # 600.465 — Fall 2013


Announcements


Vital Statistics

Course catalog entry: This course is an in-depth overview of techniques for processing human language. How should linguistic structure and meaning be represented? What algorithms can recover them from text? And crucially, how can we build statistical models to choose among the many legal answers?

The course covers methods for trees (parsing and semantic interpretation), sequences (finite-state transduction such as tagging and morphology), and words (sense and phrase induction), with applications to practical engineering tasks such as information retrieval and extraction, text classification, part-of-speech tagging, speech recognition, and machine translation. There are a number of structured but challenging programming assignments. Prerequisite: 600.226 or equivalent. [Eisner, Applications, Fall] 3 credits

More information: Welcome! This course is designed to introduce you to some of the problems and solutions of NLP, and their relation to linguistics and statistics. You need to know how to program (e.g., 600.120) and use common data structures (600.226). It might also be nice to have some previous familiarity with automata (600.271) and probabilities (600.475, 550.420, or 550.310). At the end you should agree (I hope!) that language is subtle and interesting, feel some ownership over some of NLP's formal and statistical techniques, and be able to understand research papers in the field.

Lectures: MWF 3-4 or 3-4:15, Maryland 109.
Prof: Jason Eisner
TA: Ryan Cotterell (ryan dot cotterell at gmail dot com)
CA: Kieran Magee (kbrantn1 at jhu dot edu)
Office hrs: For Prof: after class until 4:30, or by appt, in Hackerman 324C
For TA/CA: TBA
Discussion session: TA-led session (optional) for activities/discussion/questions/review: TBA
Discussion site: http://piazza.com/jhu/fall2013/600465 ... public questions, discussion, announcements
Web page: http://cs.jhu.edu/~jason/465
Textbooks: Jurafsky & Martin, 2nd ed. (semi-required - P98.J87 2009 in Science Ref section on C-Level)
Roark & Sproat (recommended - P98.R63 2007 in same section)
Manning & Schütze (recommended - free online PDF version here!)
Policies: Grading: homework 50%, participation 5%, midterm 15%, final 30%
Submission: TBA
Lateness: floating late days policy
Honesty: here's what it means
Intellectual engagement: much encouraged
Announcements: Read mailing list and this page!
Related sites: List of many courses - some may have useful material!

Schedule

This class is in the "flexible time slot" MWF 3-4:30. Please keep the entire slot open. Ordinarily we'll have lecture from 3-4, followed by office hours from 4-4:30 in the classroom, for those of you who have questions or are interested in further discussion. However, from time to time, lecture will run till 4:15 in order to keep up with the syllabus. I'll give advance notice of these occasional "long lectures," which among other things make up for no-class days when I'll be out of town.

We'll also schedule a once-per-week discussion session led by your TA. This optional session will focus on solving problems together. That's meant as an efficient and cooperative way to study for an hour: it reinforces the past week's class material without adding to your homework load. Also, if you come to discussion session as recommended, you won't be startled by the exam style — the discussion problems are taken from past exams and are generally interesting.

Warning: The schedule below may change. Links to future lectures and assignments may also change (they currently point to last year's versions).

Warning: I sometimes turn off the PDF links when they are not up to date with the PPT links. If they don't work, just click on "ppt" instead.

Week of 9/2
Mon: No class (Labor Day)
Wed: No class (Rosh Hashanah)
Fri: Introduction (ppt)
  • Why is NLP hard?
  • Levels of language
  • NLP applications
  • Random language via n-grams (see the sketch below)
  • Questionnaire
Reading: Intro: J&M chapter 1
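Side note: the random-babble demo is easy to reproduce. Below is a minimal sketch (my own toy corpus and function names, not course code) that trains a bigram model and samples a random sentence from it.

    import random
    from collections import defaultdict

    def train_bigrams(sentences):
        """Record each word's observed successors, with boundary symbols."""
        successors = defaultdict(list)
        for sent in sentences:
            words = ["<s>"] + sent.split() + ["</s>"]
            for prev, word in zip(words, words[1:]):
                successors[prev].append(word)
        return successors

    def babble(successors):
        """Sample a sentence by repeatedly picking a random successor."""
        word, output = "<s>", []
        while True:
            word = random.choice(successors[word])
            if word == "</s>":
                return " ".join(output)
            output.append(word)

    corpus = ["the cat sat on the mat",
              "the dog sat on the log",
              "the cat saw the dog"]
    print(babble(train_bigrams(corpus)))

Higher-order n-grams produce noticeably more fluent babble, which is the point of the demo.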
Week of 9/9
Mon: Assignment 1 given: Designing CFGs
  Chomsky hierarchy (ppt)
  • What's wrong with n-grams?
  • Regular expressions, CFGs, & more
  • Lists, trees, and vectors
Wed: Language models (ppt)
  • Language ID
  • Text categorization
  • Spelling correction
  • Segmentation
  • Speech recognition
  • Machine translation
Fri: Probability concepts (ppt; video lecture)
  • Joint & conditional prob
  • Chain rule and backoff
  • Modeling sequences
  • Cross-entropy and perplexity (see the sketch below)
Reading:
  • Homework: J&M 12, M&S 3
  • Chomsky hierarchy: J&M 16
  • Language models: M&S 6 (or R&S 6)
  • Prob/Bayes: M&S 2; slides from Martin or Moore
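To make cross-entropy and perplexity concrete, here is a toy sketch (the conditional probabilities are invented for illustration; a real model would be estimated from data) that scores a sentence under a bigram model via the chain rule.

    import math

    # Toy conditional probabilities p(word | prev) -- made-up numbers.
    P = {("<s>", "the"): 0.8, ("the", "cat"): 0.3, ("cat", "</s>"): 0.5}

    def cross_entropy(words, P):
        """Average negative log2-probability per token (chain rule)."""
        words = ["<s>"] + words + ["</s>"]
        logprob = sum(math.log2(P[(prev, w)])
                      for prev, w in zip(words, words[1:]))
        return -logprob / (len(words) - 1)

    H = cross_entropy(["the", "cat"], P)
    print(f"cross-entropy = {H:.3f} bits/word, perplexity = {2**H:.2f}")

Perplexity is just 2 raised to the cross-entropy: the model's effective branching factor per word.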
Week of 9/16
Mon: Bayes' Theorem (ppt)
  Smoothing n-grams (ppt)
  • Maximum likelihood estimation
  • Bias and variance
  • Add-one or add-lambda smoothing (see the sketch below)
  • Conditional log-linear models (interactive visualization)
  • Regularization
Wed: Assignment 1 due
  Assignment 2 given: Probabilities
  Limitations of CFG
  • Discussion of Asst. 1
Fri: Improving CFG with attributes (ppt)
  • Morphology
  • Lexicalization
  • Tenses
  • Gaps (slashes)
Reading:
  • Smoothing: M&S 6; J&M 4; Rosenfeld (2000)
  • Attributes: J&M 15
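A minimal sketch of add-lambda smoothing (the toy counts and vocabulary below are my own invention): add a pseudo-count of lambda to every word in the vocabulary, so unseen events get small but nonzero probability.

    from collections import Counter

    def add_lambda(word, context_counts, vocab, lam=0.5):
        """p(word | context) with add-lambda smoothing:
           (c(context, word) + lam) / (c(context) + lam * |V|)."""
        return ((context_counts[word] + lam) /
                (sum(context_counts.values()) + lam * len(vocab)))

    vocab = {"the", "cat", "dog", "sat"}
    counts_after_the = Counter({"cat": 2, "dog": 1})  # counts for context "the"
    print(add_lambda("cat", counts_after_the, vocab))  # seen: 0.5
    print(add_lambda("sat", counts_after_the, vocab))  # unseen: 0.1, not 0

Setting lam=1 gives add-one (Laplace) smoothing; smaller lambda trades less bias for more variance.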
Week of 9/23
Mon: Assignment 3 given: Language Models
  Context-free parsing (ppt)
  • What is parsing?
  • Why is it useful?
  • Brute-force algorithm
  • CKY and Earley algorithms (see the CKY sketch below)
Wed: Assignment 2 due
  Context-free parsing, continued
  • From recognition to parsing
  • Incremental strategy
  • Dotted rules
  • Sparse matrices
Fri: Earley's algorithm (ppt)
  • Top-down parsing
  • Earley's algorithm
Reading: Parsing: J&M 13
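Here is a minimal CKY recognizer over a toy grammar in Chomsky normal form (the grammar and sentence are my own example, not assignment code). chart[i][j] holds the nonterminals that derive words i..j; the three nested loops give the O(n^3) runtime discussed in lecture.

    from itertools import product

    # Toy CNF grammar: (B, C) -> set of parents A such that A -> B C.
    BINARY = {("NP", "VP"): {"S"}, ("Det", "N"): {"NP"}, ("V", "NP"): {"VP"}}
    LEXICAL = {"the": {"Det"}, "cat": {"N"}, "dog": {"N"}, "saw": {"V"}}

    def cky_recognize(words):
        """CKY recognizer: chart[i][j] = nonterminals deriving words[i:j]."""
        n = len(words)
        chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
        for i, w in enumerate(words):
            chart[i][i + 1] = set(LEXICAL.get(w, ()))
        for width in range(2, n + 1):
            for i in range(n - width + 1):
                j = i + width
                for k in range(i + 1, j):            # split point
                    for B, C in product(chart[i][k], chart[k][j]):
                        chart[i][j] |= BINARY.get((B, C), set())
        return "S" in chart[0][n]

    print(cky_recognize("the cat saw the dog".split()))  # True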
Week of 9/30
Mon: (not covered this year)
  Extending CFG (summary (ppt))
  • CCG
  • TSG
  • TAG (by Darcey Riley)
Wed: Probabilistic parsing (ppt)
  • PCFG parsing (see the sketch below)
  • Dependency grammar
  • Lexicalized PCFGs
Fri: Assignment 3 due
  Assignment 4 given: Parsing
  Parsing tricks (ppt)
  • Pruning; best-first
  • Rules as regexps
  • Left-corner strategy
  • Smoothing
  • Evaluation
Reading:
  • CCG: Steedman & Baldridge; more
  • TAG/TSG: Van Noord, Guo, Zhang 1/2/3
  • Prob. parsing: M&S 12, J&M 14
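PCFG parsing is the same chart computation as the CKY sketch above, with set union replaced by max-product weights (the Viterbi idea). A sketch with invented rule probabilities:

    from collections import defaultdict

    # Toy PCFG in CNF: rule -> probability (numbers are illustrative).
    BINARY = {("S", ("NP", "VP")): 1.0,
              ("NP", ("Det", "N")): 1.0,
              ("VP", ("V", "NP")): 1.0}
    LEXICAL = {("Det", "the"): 1.0, ("N", "cat"): 0.5,
               ("N", "dog"): 0.5, ("V", "saw"): 1.0}

    def viterbi_cky(words):
        """Probability of the best parse (CKY with max-product weights)."""
        n = len(words)
        best = defaultdict(float)          # (i, j, nonterminal) -> max prob
        for i, w in enumerate(words):
            for (A, word), p in LEXICAL.items():
                if word == w:
                    best[i, i + 1, A] = max(best[i, i + 1, A], p)
        for width in range(2, n + 1):
            for i in range(n - width + 1):
                j = i + width
                for k in range(i + 1, j):
                    for (A, (B, C)), p in BINARY.items():
                        score = p * best[i, k, B] * best[k, j, C]
                        best[i, j, A] = max(best[i, j, A], score)
        return best[0, n, "S"]

    print(viterbi_cky("the cat saw the dog".split()))  # 0.25

Swapping max for sum would instead give the total probability of all parses, a theme that returns with semirings later in the course.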
Week of 10/7
Mon: Catch-up day (we'll be behind schedule by now)
  • A song about parsing
Wed: (not covered this year)
  Human sentence processing (ppt)
  • Methodology
  • Frequency sensitivity
  • Incremental interpretation
  Unscrambling text (ppt)
Fri: No class (student NLP colloquium at UMBC)
Reading: Psycholinguistics: Tanenhaus & Trueswell (2006), Human Sentence Processing website
Week of 10/14
Mon: No class Monday 10/14 (fall break); class meets Tuesday 10/15, which follows a Monday schedule
  Semantics (ppt)
  • What is understanding?
  • Lambda terms
  • Semantic phenomena and representations
Wed: Semantics continued
  • More semantic phenomena and representations
Fri: Assignment 5 given: Semantics
  Semantics continued
  • Adding semantics to CFG rules
  • Compositional semantics (see the sketch below)
Reading: Semantics: J&M 17-18; this web page, up to but not including the "denotational semantics" section; try the Penn Lambda Calculator; lambda calculus for kids
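Lambda terms can be prototyped directly with Python's own lambda. A toy sketch of compositional, model-theoretic semantics (the two-entity model and word meanings are my own illustration, not the Penn Lambda Calculator):

    # Word meanings as lambda terms over a tiny model.
    entities = {"fido", "felix"}
    dog = lambda x: x == "fido"
    barks = lambda x: x in {"fido"}

    # "every" = lambda P. lambda Q. for all x: P(x) implies Q(x)
    every = lambda P: lambda Q: all(Q(x) for x in entities if P(x))
    # "some" = lambda P. lambda Q. there exists x: P(x) and Q(x)
    some = lambda P: lambda Q: any(Q(x) for x in entities if P(x))

    # Compositional semantics: [[every dog barks]] = every(dog)(barks)
    print(every(dog)(barks))                  # True: the only dog barks
    print(some(dog)(lambda x: not barks(x)))  # False

The meaning of the whole is computed by applying the meanings of the parts, which is exactly what attaching semantic attachments to CFG rules automates.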
Week of 10/21
Mon: Midterm exam (3-4:30 in classroom)
Wed: Forward-backward algorithm (ppt) (Excel spreadsheet; Viterbi version; lesson plan; video lecture)
  • Ice cream, weather, words and tags
  • Forward and backward probabilities (see the forward-pass sketch below)
  • Inferring hidden states
  • Controlling the smoothing effect
Fri: Forward-backward continued
  • Reestimation
  • Likelihood convergence
  • Symmetry breaking
  • Local maxima
  • Uses of states
Reading: Forward-backward: J&M 6 or perhaps Allen pp. 195-208 (handout)
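The forward pass on an ice-cream-style HMM, as a sketch (the transition and emission numbers below are invented for illustration, not the lecture's spreadsheet values):

    # Hidden weather states emit observed ice-cream counts (1, 2, or 3).
    states = ["Hot", "Cold"]
    start = {"Hot": 0.5, "Cold": 0.5}
    trans = {"Hot": {"Hot": 0.8, "Cold": 0.2},
             "Cold": {"Hot": 0.2, "Cold": 0.8}}
    emit = {"Hot": {1: 0.1, 2: 0.2, 3: 0.7},
            "Cold": {1: 0.7, 2: 0.2, 3: 0.1}}

    def forward(obs):
        """alpha[s] = p(obs so far, current state = s); returns p(obs)."""
        alpha = {s: start[s] * emit[s][obs[0]] for s in states}
        for o in obs[1:]:
            alpha = {s: sum(alpha[r] * trans[r][s] for r in states) * emit[s][o]
                     for s in states}
        return sum(alpha.values())

    print(forward([2, 3, 3, 1]))  # total probability of the ice-cream diary

The backward pass is the mirror image, and multiplying the two gives the posterior probability of each hidden state, which is what reestimation consumes.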
Week of 10/28
Mon: Assignment 4 due
  Assignment 6 given: Hidden Markov Models
  Expectation Maximization (ppt)
  • Generalizing the forward-backward strategy
  • Inside-outside algorithm
  • Posterior decoding
Wed: Finite-state algebra (ppt)
  • Regexp review
  • Properties
  • Functions, relations, composition
  • Simple applications
Fri: Finite-state machines
  • Acceptors
  • Expressive power
  • Weights and semirings (see the sketch below)
  • Lattice parsing
  • Transducers
Reading:
  • Inside-outside: John Lafferty's notes; M&S 11
  • Finite-state machines: R&S 1
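Weights and semirings in one sketch: the same path-combining code computes either the total probability of all accepting paths or the probability of the single best path, depending on which (plus, times) pair you plug in. The toy machine and weights below are my own.

    # Weighted FSA: state -> list of (symbol, next_state, weight).
    arcs = {0: [("a", 0, 0.5), ("a", 1, 0.2), ("b", 1, 0.5)],
            1: [("b", 1, 0.9)]}
    final = {1: 0.1}   # final state -> stopping weight

    def path_weight(string, plus, times, zero, one):
        """Combine all accepting paths for `string` under a semiring."""
        weights = {0: one}                     # start in state 0
        for sym in string:
            nxt = {}
            for state, w in weights.items():
                for s, q, wt in arcs.get(state, []):
                    if s == sym:
                        nxt[q] = plus(nxt.get(q, zero), times(w, wt))
            weights = nxt
        total = zero
        for q, w in weights.items():
            if q in final:
                total = plus(total, times(w, final[q]))
        return total

    # Probability semiring (+, *): total prob of all accepting paths.
    print(path_weight("abb", lambda a, b: a + b, lambda a, b: a * b, 0.0, 1.0))
    # Viterbi semiring (max, *): probability of the best single path.
    print(path_weight("abb", max, lambda a, b: a * b, 0.0, 1.0))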
Week of 11/4
Mon: Finite-state implementation (ppt)
  • Finite-state operators
  • Uses of composition
  • Implementing the operators
Wed: Finite-state tagging (ppt)
  • The task
  • Hidden Markov Models
  • Transformation-based
  • Constraint-based
Fri: Assignment 5 due
  Noisy channels and FSTs (ppt)
  • Regexps and segmentation
  • The noisy channel generalization (see the sketch below)
  • Applications of the noisy channel
  • Implementation using FSTs
Reading:
  • Finite-state operators: chaps 2-3 of XFST book draft
  • Finite-state NLP: Karttunen (1997)
  • Tagging: J&M 5 or M&S 10
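The noisy channel in one line: choose the source w maximizing p(w) * p(observed | w). A spelling-correction sketch with invented probabilities (the word list and channel numbers are illustrative assumptions):

    # Language model p(w) and channel model p(typo | w) -- toy numbers.
    prior = {"the": 0.05, "then": 0.002, "than": 0.001}
    channel = {("teh", "the"): 0.01,
               ("teh", "then"): 0.001,
               ("teh", "than"): 0.0005}

    def correct(observed):
        """Bayes: argmax over candidate sources of prior * channel."""
        candidates = [w for (o, w) in channel if o == observed]
        return max(candidates,
                   key=lambda w: prior[w] * channel[(observed, w)])

    print(correct("teh"))  # 'the'

Implemented with FSTs, the prior and the channel are each a weighted machine, and composition plus best-path search performs this argmax over all strings at once.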
Week of 11/11
Mon: More FST examples (ppt)
  • Baby talk
  • Edit distance (see the sketch below)
  • Back-transliteration
  • Machine translation
Wed: Programming with regexps (ppt)
  • Analogy to programming
  • Extended finite-state operators
  • Date parsing
  • FASTUS
Fri: Morphology and phonology (ppt)
  • English, Turkish, Arabic
  • Stemming
  • Compounds, segmentation
  • Two-level morphology
  • Punctuation
  • Rewrite rules
  • OT
Reading: Morphology: R&S 2
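Edit distance, which an FST can encode as a weighted transducer, is the classic dynamic program. A sketch with unit costs for insertion, deletion, and substitution:

    def edit_distance(x, y):
        """Levenshtein distance: d[i][j] = cheapest edit of x[:i] into y[:j]."""
        d = [[0] * (len(y) + 1) for _ in range(len(x) + 1)]
        for i in range(len(x) + 1):
            d[i][0] = i                          # i deletions
        for j in range(len(y) + 1):
            d[0][j] = j                          # j insertions
        for i in range(1, len(x) + 1):
            for j in range(1, len(y) + 1):
                sub = 0 if x[i-1] == y[j-1] else 1
                d[i][j] = min(d[i-1][j] + 1,     # delete x[i-1]
                              d[i][j-1] + 1,     # insert y[j-1]
                              d[i-1][j-1] + sub) # copy or substitute
        return d[len(x)][len(y)]

    print(edit_distance("intention", "execution"))  # 5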
Week of 11/18
Mon: Assignment 6 due
  Assignment 7 given: Finite-State Modeling
  Optimal paths in graphs
  • The Dyna perspective
Wed: Structured prediction (ppt)
  • Perceptrons (see the sketch below)
  • CRFs
  • Feature engineering
  • Generative vs. discriminative
Fri: Current NLP tasks and competitions (ppt)
  • The NLP research community
  • Text annotation tasks
  • Other types of tasks
Reading: Dyna tutorial and assignment
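A minimal mistake-driven perceptron, sketched on toy sentiment data of my own (a structured perceptron for tagging uses the same update rule, just over whole-sequence features):

    from collections import defaultdict

    def features(text):
        return {("word", w): 1.0 for w in text.split()}

    def predict(weights, feats, labels):
        return max(labels, key=lambda y: sum(weights[y, f] * v
                                             for f, v in feats.items()))

    def train(data, labels, epochs=5):
        weights = defaultdict(float)
        for _ in range(epochs):
            for text, gold in data:
                feats = features(text)
                guess = predict(weights, feats, labels)
                if guess != gold:            # mistake-driven update
                    for f, v in feats.items():
                        weights[gold, f] += v
                        weights[guess, f] -= v
        return weights

    data = [("good movie", "pos"), ("bad film", "neg"),
            ("great film", "pos"), ("awful movie", "neg")]
    w = train(data, {"pos", "neg"})
    print(predict(w, features("great movie"), {"pos", "neg"}))  # expect 'pos'

Because it only needs an argmax and a feature function, the same discriminative recipe scales from this toy classifier up to CRFs, which replace the hard update with gradient steps on a log-linear likelihood.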
Week of 11/25
Mon: Applied NLP continued (ppt)
Wed: No class (Thanksgiving break)
Fri: No class (Thanksgiving break)

Week of 12/2
Mon: Applied NLP continued (ppt)
Wed: Topic models
  • Guest lecture by Michael Paul
Fri: Assignment 7 due
  Machine translation
  • Guest lecture by Adam Lopez
Reading:
  • Topic models: intro readings/slides from Dave Blei, slides by Jason Eisner (video lecture part 1, part 2)
  • MT: J&M 25, M&S 13, statmt.org; tutorial (2003), workbook (1999), introductory essay (1997), technical paper (1993); tutorial (2006) focusing on more recent developments (slides, 3-hour video part 1, part 2)
Week of 12/10
Thu 12/12 is the absolute deadline for late assignments.
Final exam: Tue 12/17, 9am-noon.


Old Materials

Old assignments, and lectures from past years (some still useful).