
Natural Language Processing
Prof. Jason Eisner
Course # 600.465 — Fall 2012




Vital Statistics

Course catalog entry: This course is an in-depth overview of techniques for processing human language. How should linguistic structure and meaning be represented? What algorithms can recover them from text? And crucially, how can we build statistical models to choose among the many legal answers?

The course covers methods for trees (parsing and semantic interpretation), sequences (finite-state transduction such as tagging and morphology), and words (sense and phrase induction), with applications to practical engineering tasks such as information retrieval and extraction, text classification, part-of-speech tagging, speech recognition, and machine translation. There are a number of structured but challenging programming assignments. Prerequisite: 600.226 or equivalent. [Eisner, Applications, Fall] 3 credits

More information: Welcome! This course is designed to introduce you to some of the problems and solutions of NLP, and their relation to linguistics and statistics. You need to know how to program (e.g., 600.120) and use common data structures (600.226). It might also be nice to have some previous familiarity with automata (600.271) and probabilities (600.475, 550.420, or 550.310). At the end you should agree (I hope!) that language is subtle and interesting, feel some ownership over some of NLP's formal and statistical techniques, and be able to understand research papers in the field.

Lectures: MWF 3-4 or 3-4:15, Hodson 313
Prof: Jason Eisner
TA: Frank Ferraro (ferraro at cs dot jhu dot edu)
CA: Katherine Wu
Office hrs: For Prof: After class until 4:30, or by appt, in Hackerman 324C
For TA/CA: TBA
Discussion session: TA-led session (optional) for activities/discussion/questions/review: TBA
Discussion site: http://piazza.com/class#fall2012/600465 ... public questions, discussion, announcements
Web page: http://cs.jhu.edu/~jason/465
Textbook: Jurafsky & Martin, 2nd ed. (semi-required - P98.J87 2009 in Science Ref section on C-Level)
Roark & Sproat (recommended - P98.R63 2007 in same section)
Manning & Schütze (recommended - free online PDF version here!)
Policies: Grading: homework 50%, participation 5%, midterm 15%, final 30%
Submission: TBA
Lateness: floating late days policy
Honesty: here's what it means
Intellectual engagement: much encouraged
Announcements: Read mailing list and this page!
Related sites: List of many courses - some may have useful material!

Schedule

Note: This class is in the "flex time slot" from 3-4:30. We will use the time for a combination of lecture and discussion. Class will often run 3-4, followed by office hours from 4-4:30. However, class will sometimes run till 4:15 in order to keep up with the syllabus. I'll try to give advance notice of these "long classes," which among other things make up for days when the professor will be out of town.

Warning: The schedule may change. Links to future lectures and assignments may also change (they currently point to last year's versions).

Warning: I sometimes turn off the PDF links when they are not up to date with the PPT links. If they don't work, just click on "ppt" instead.


(Each week below lists the Monday, Wednesday, and Friday classes, followed by the suggested reading.)

Week of 9/3
Monday: No class (Labor Day)
Wednesday: Introduction (ppt)
  • Why is NLP hard?
  • Levels of language
  • NLP applications
  • Random language via n-grams (see the sketch below)
  • Questionnaire
  • Assignment 1 given: Designing CFGs
Friday: Chomsky hierarchy (ppt)
  • What's wrong with n-grams?
  • Regular expressions, CFGs, & more
  • Lists, trees, and vectors
Suggested reading:
  • Intro: J&M chapter 1
  • Chomsky hierarchy: J&M 16
  • Homework: J&M 12, M&S 3
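To make "random language via n-grams" concrete, here is a minimal sketch in Python (the toy corpus and the babble() helper are illustrative, not from the course materials): estimate a bigram table from text, then generate by repeatedly sampling a random successor.

    import random
    from collections import defaultdict

    # Toy corpus; a real model would train on far more text.
    corpus = "the cat sat on the mat . the dog sat on the log .".split()

    bigrams = defaultdict(list)
    for prev, word in zip(corpus, corpus[1:]):
        bigrams[prev].append(word)          # duplicates encode frequency

    def babble(start="the", maxlen=12):
        """Sample words by walking the bigram table at random."""
        out = [start]
        while out[-1] != "." and len(out) < maxlen:
            out.append(random.choice(bigrams[out[-1]]))
        return " ".join(out)

    print(babble())   # e.g. "the dog sat on the mat ."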
Week of 9/10
Monday: Language models (ppt)
  • Language ID
  • Text categorization
  • Spelling correction
  • Segmentation
  • Speech recognition
  • Machine translation
Wednesday: Probability concepts (ppt; video lecture); Bayes' Theorem (ppt)
  • Joint & conditional prob
  • Chain rule and backoff
  • Modeling sequences
  • Cross-entropy and perplexity
Friday: Smoothing n-grams (ppt)
  • Maximum likelihood estimation
  • Bias and variance
  • Add-one or add-lambda smoothing (see the sketch below)
  • Conditional log-linear models (interactive visualization)
  • Regularization
Suggested reading:
  • Language models: M&S 6 (or R&S 6)
  • Text cat: Demo
  • Prob/Bayes: M&S 2; slides from Martin or Moore
  • Smoothing: M&S 6; J&M 4; Rosenfeld (2000)
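As a companion to "add-one or add-lambda smoothing," here is a minimal unigram sketch (toy numbers; reserving one extra type for unseen words is an illustrative assumption): with counts c(w), total N, and vocabulary size V, the smoothed estimate is p(w) = (c(w) + λ) / (N + λV).

    from collections import Counter

    corpus = "the cat sat on the mat".split()
    counts = Counter(corpus)
    V = len(counts) + 1      # reserve one extra type for unseen words (assumption)
    N = sum(counts.values())
    lam = 0.5                # the "lambda" in add-lambda smoothing

    def p(word):
        return (counts[word] + lam) / (N + lam * V)

    print(p("the"))          # seen word
    print(p("dog"))          # unseen word still gets nonzero probability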
Week of 9/17
Monday: No class (Rosh Hashanah); Assignment 1 due
Wednesday: Assignment 2 given: Probabilities; Limitations of CFG
  • Discussion of Asst. 1
Friday: Improving CFG with features (ppt) (see the sketch below)
  • Morphology
  • Lexicalization
  • Tenses
  • Gaps (slashes)
Suggested reading:
  • Features: J&M 15
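The point of "improving CFG with features" can be shown with a tiny agreement check (toy lexicon and hypothetical helper, not from the assignment): a rule like S → NP[num=X] VP[num=X] applies only when the two X's unify.

    # Lexicon maps each word to (category, number feature).
    LEXICON = {
        "dog": ("N", "sg"), "dogs": ("N", "pl"),
        "barks": ("V", "sg"), "bark": ("V", "pl"),
    }

    def grammatical(noun, verb):
        """S -> NP[num=X] VP[num=X]: the shared X forces agreement."""
        ncat, nnum = LEXICON[noun]
        vcat, vnum = LEXICON[verb]
        return ncat == "N" and vcat == "V" and nnum == vnum

    print(grammatical("dog", "barks"))   # True
    print(grammatical("dog", "bark"))    # False: *"dog bark"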
Week of 9/24
Monday: Assignment 3 given: Language Models; Context-free parsing (ppt)
  • What is parsing?
  • Why is it useful?
  • Brute-force algorithm
  • CKY and Earley algorithms (CKY is sketched below)
Wednesday: No class (Yom Kippur)
Friday: Context-free parsing (continued)
  • From recognition to parsing
  • Incremental strategy
  • Dotted rules
  • Sparse matrices
Suggested reading:
  • Parsing: J&M 13
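Here is a minimal CKY recognizer for a toy grammar in Chomsky normal form (grammar and sentence are illustrative): chart[i][j] collects the nonterminals that derive words i..j-1, filled bottom-up over all split points.

    from itertools import product

    UNARY  = {"the": {"Det"}, "cat": {"N"}, "sleeps": {"V"}}   # preterminal rules
    BINARY = {("Det", "N"): {"NP"}, ("NP", "V"): {"S"}}        # A -> B C rules

    def cky(words):
        n = len(words)
        chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
        for i, w in enumerate(words):
            chart[i][i + 1] = set(UNARY.get(w, set()))
        for width in range(2, n + 1):
            for i in range(n - width + 1):
                j = i + width
                for k in range(i + 1, j):                      # split point
                    for B, C in product(chart[i][k], chart[k][j]):
                        chart[i][j] |= BINARY.get((B, C), set())
        return "S" in chart[0][n]

    print(cky("the cat sleeps".split()))   # True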
Week of 10/1
Monday: Assignment 2 due; Earley's algorithm (ppt)
  • Top-down parsing
  • Earley's algorithm
Wednesday: Extending CFG (summary (ppt))
  • CCG
  • TSG, TAG, TIG
Friday: Probabilistic parsing (ppt) (PCFG scoring is sketched below)
  • PCFG parsing
  • Dependency grammar
  • Lexicalized PCFGs
Suggested reading:
  • CCG: Steedman & Baldridge; more
  • TAG/TSG: Van Noord, Guo, Zhang 1/2/3
  • Prob. parsing: M&S 12, J&M 14
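A PCFG assigns each tree the product of its rules' probabilities; this minimal sketch scores one hand-built tree (toy grammar, made-up probabilities).

    RULE_PROB = {
        ("S", ("NP", "VP")): 1.0,
        ("NP", ("Det", "N")): 0.6,
        ("VP", ("V",)): 0.3,
        ("Det", ("the",)): 0.8,
        ("N", ("cat",)): 0.1,
        ("V", ("sleeps",)): 0.05,
    }

    def tree_prob(tree):
        """tree = (label, child, ...) where leaves are plain strings."""
        label, *children = tree
        rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
        p = RULE_PROB[(label, rhs)]
        for c in children:
            if not isinstance(c, str):
                p *= tree_prob(c)           # multiply in each subtree's probability
        return p

    t = ("S", ("NP", ("Det", "the"), ("N", "cat")), ("VP", ("V", "sleeps")))
    print(tree_prob(t))   # 1.0 * 0.6 * 0.8 * 0.1 * 0.3 * 0.05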
Week of 10/8
Monday: Assignment 3 due; Assignment 4 given: Parsing; Parsing tricks (ppt)
  • Pruning; best-first
  • Rules as regexps
  • Left-corner strategy
  • Smoothing
  • Evaluation
Wednesday: Catch-up day (we'll be behind schedule by now)
  • A song about parsing
Friday: Human sentence processing (ppt); Unscrambling text (ppt)
  • Methodology
  • Frequency sensitivity
  • Incremental interpretation
Suggested reading: TBA
Week of 10/15
Monday: No class (Monday 10/15 is fall break day; class meets on Tuesday 10/16 instead, which follows a Monday schedule)
Tuesday: Semantics (ppt)
  • What is understanding?
  • Lambda terms
  • Semantic phenomena and representations
Wednesday: Semantics continued
  • More semantic phenomena and representations
Friday: Assignment 5 given: Semantics; Semantics continued
  • Adding semantics to CFG rules
  • Compositional semantics (see the sketch below)
Suggested reading:
  • J&M 17-18; also this web page, up to but not including the "denotational semantics" section; you could also try the Penn Lambda Calculator, or lambda calculus for kids
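Compositional semantics can be previewed with Python functions standing in for lambda terms (toy lexicon; the rule helpers s() and vp() are hypothetical names for the usual function-application rules).

    LEX = {
        "Alice":  "alice",
        "Bob":    "bob",
        "sleeps": lambda subj: f"sleep({subj})",
        "loves":  lambda obj: lambda subj: f"love({subj},{obj})",
    }

    def s(np, vp_sem):
        """S -> NP VP, with semantics VP'(NP')."""
        return vp_sem(np)

    def vp(v, np):
        """VP -> V NP, with semantics V'(NP')."""
        return v(np)

    print(s(LEX["Alice"], LEX["sleeps"]))                 # sleep(alice)
    print(s(LEX["Alice"], vp(LEX["loves"], LEX["Bob"])))  # love(alice,bob)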
Week of 10/22
Monday: Midterm exam (3-4:30 in classroom)
Wednesday: Forward-backward algorithm (ppt) (Excel spreadsheet; Viterbi version; lesson plan; video lecture)
  • Ice cream, weather, words and tags
  • Forward and backward probabilities (the forward pass is sketched below)
  • Inferring hidden states
  • Controlling the smoothing effect
Friday: Forward-backward continued
  • Reestimation
  • Likelihood convergence
  • Symmetry breaking
  • Local maxima
  • Uses of states
Suggested reading:
  • J&M 6, or perhaps Allen pp. 195-208 (handout); M&S 11
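The forward pass of the ice-cream example fits in a few lines (all probabilities below are made up, not the lecture spreadsheet's): hidden weather is Hot or Cold, and summing over weather sequences gives the probability of the observed ice-cream counts.

    STATES = ["H", "C"]
    INIT   = {"H": 0.5, "C": 0.5}
    TRANS  = {"H": {"H": 0.8, "C": 0.2}, "C": {"H": 0.2, "C": 0.8}}
    EMIT   = {"H": {1: 0.1, 2: 0.2, 3: 0.7}, "C": {1: 0.7, 2: 0.2, 3: 0.1}}

    def forward(obs):
        """Return p(obs), summing over all hidden state sequences."""
        alpha = {s: INIT[s] * EMIT[s][obs[0]] for s in STATES}
        for o in obs[1:]:
            alpha = {s: sum(alpha[r] * TRANS[r][s] for r in STATES) * EMIT[s][o]
                     for s in STATES}
        return sum(alpha.values())

    print(forward([2, 3, 3, 2]))   # total probability of the observation sequence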
Week of 10/29
Monday: Assignment 4 due; Assignment 6 given: Hidden Markov Models; Expectation Maximization (ppt)
  • Generalizing the forward-backward strategy
  • Inside-outside algorithm
  • Posterior decoding
Wednesday: Finite-state algebra (ppt)
  • Regexp review
  • Properties
  • Functions, relations, composition
  • Simple applications
Friday: Finite-state machines
  • Acceptors
  • Expressive power
  • Weights and semirings (see the sketch below)
  • Lattice parsing
  • Transducers
Suggested reading:
  • John Lafferty's inside-outside notes; R&S 1
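"Weights and semirings" comes down to one observation: the same machinery computes different things depending on how you add and multiply. A minimal sketch (toy acceptor, made-up weights): with (plus=+, times=×) the score sums the weights of all accepting paths; with (plus=min, times=+) it finds the cheapest path.

    # arcs: state -> [(symbol, next_state, weight)]; state 0 starts, state 2 accepts.
    ARCS = {0: [("a", 1, 0.5), ("a", 3, 0.2)],
            1: [("b", 2, 0.9)],
            3: [("b", 2, 0.4)],
            2: []}

    def score(string, plus, times, zero, one):
        weights = {0: one}                    # semiring weight of reaching each state
        for sym in string:
            nxt = {}
            for q, w in weights.items():
                for a, r, v in ARCS[q]:
                    if a == sym:
                        nxt[r] = plus(nxt.get(r, zero), times(w, v))
            weights = nxt
        return weights.get(2, zero)

    # Probability semiring: 0.5*0.9 + 0.2*0.4 = 0.53
    print(score("ab", lambda x, y: x + y, lambda x, y: x * y, 0.0, 1.0))
    # Tropical semiring (weights read as costs): min(0.5+0.9, 0.2+0.4) = 0.6
    print(score("ab", min, lambda x, y: x + y, float("inf"), 0.0))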
Week of 11/5
Monday: Finite-state implementation (ppt)
  • Finite-state operators
  • Uses of composition
  • Implementing the operators
Wednesday: Finite-state tagging (ppt)
  • The task
  • Hidden Markov Models
  • Transformation-based
  • Constraint-based
Friday: Assignment 5 due; Noisy channels and FSTs (ppt)
  • Regexps and segmentation
  • The noisy channel generalization (see the sketch below)
  • Applications of the noisy channel
  • Implementation using FSTs
Suggested reading:
  • Chaps. 2-3 of xfst book draft (only accessible from barley and other Solaris machines at JHU CS; don't distribute); R&S 2; perhaps also this paper
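The "noisy channel generalization" is Bayes' rule as a decision rule: choose the source w maximizing p(w)·p(observed|w). A minimal spelling-correction sketch (candidates and probabilities are made-up toy numbers):

    P_WORD = {"the": 0.05, "they": 0.01, "thee": 0.0001}   # source model p(w)
    P_TYPO = {                                              # channel model p(x|w)
        ("teh", "the"): 0.01, ("teh", "they"): 0.0001, ("teh", "thee"): 0.001,
    }

    def decode(observed, candidates):
        """Noisy-channel decoding: argmax_w p(w) * p(observed | w)."""
        return max(candidates,
                   key=lambda w: P_WORD[w] * P_TYPO.get((observed, w), 0.0))

    print(decode("teh", ["the", "they", "thee"]))   # "the"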
Week of 11/12
Monday: More FST examples (ppt)
  • Baby talk
  • Edit distance (see the sketch below)
  • Back-transliteration
  • Machine translation
Wednesday: Programming with regexps (ppt)
  • Analogy to programming
  • Extended finite-state operators
  • Date parsing
  • FASTUS
Friday: Morphology and phonology (ppt)
  • English, Turkish, Arabic
  • Stemming
  • Compounds, segmentation
  • Two-level morphology
  • Punctuation
  • Rewrite rules
  • OT
Suggested reading:
  • J&M 5 or M&S 10
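Edit distance, which an edit FST computes with unit-cost arcs, has this standard dynamic program (a minimal sketch):

    def edit_distance(x, y):
        """Minimum insertions, deletions, and substitutions turning x into y."""
        m, n = len(x), len(y)
        d = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            d[i][0] = i                       # delete all of x[:i]
        for j in range(n + 1):
            d[0][j] = j                       # insert all of y[:j]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                sub = 0 if x[i - 1] == y[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,        # deletion
                              d[i][j - 1] + 1,        # insertion
                              d[i - 1][j - 1] + sub)  # match / substitution
        return d[m][n]

    print(edit_distance("clara", "caca"))   # 2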
Week of 11/19
Monday: Assignment 6 due; Assignment 7 given: Finite-State Modeling; Log-linear models (ppt)
  • Text classification
  • Maximum entropy perspective
  • Gradient ascent (see the sketch below)
Wednesday: No class (Thanksgiving break)
Friday: No class (Thanksgiving break)
Suggested reading:
  • J&M 6
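Gradient ascent on a conditional log-linear model fits the text-classification setting; here is a minimal binary sketch (tiny made-up dataset; one bias feature plus one real-valued feature), where each step adds observed-minus-expected feature counts to the weights.

    import math

    DATA = [([1.0, 1.0], 1), ([1.0, 0.0], 0), ([1.0, 2.0], 1), ([1.0, -1.0], 0)]
    w = [0.0, 0.0]
    LR = 0.5

    def p1(x):
        """p(y=1 | x) under the log-linear model."""
        return 1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))

    for _ in range(200):
        grad = [0.0, 0.0]
        for x, y in DATA:
            err = y - p1(x)                   # observed minus expected
            for i in range(len(w)):
                grad[i] += err * x[i]
        w = [wi + LR * gi for wi, gi in zip(w, grad)]

    print([round(wi, 2) for wi in w], round(p1([1.0, 1.5]), 3))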
Week of 11/26
Monday: Current NLP tasks and competitions (ppt)
  • The NLP research community
  • Text annotation tasks
  • Other tasks
Also: Applied NLP continued (ppt) (not covered this year)
Wednesday/Friday: Topic models (video lecture part 1, part 2)
  • Bayesian multinomial
  • Bayesian text categorization
  • Latent Dirichlet allocation (the generative story is sketched below)
  • Dynamic topic models
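The LDA generative story can be sketched in a few lines (toy topics; the Dirichlet draw uses random.gammavariate, and the per-topic word distributions are simplified to uniform): each document draws a topic mixture, then each word draws a topic and a word from that topic.

    import random

    TOPICS = {
        "sports": ["game", "team", "score", "win"],
        "food":   ["pasta", "sauce", "oven", "taste"],
    }
    ALPHA = 0.5   # symmetric Dirichlet hyperparameter

    def dirichlet(k, alpha):
        g = [random.gammavariate(alpha, 1.0) for _ in range(k)]
        z = sum(g)
        return [x / z for x in g]

    def generate_doc(length=8):
        names = list(TOPICS)
        theta = dirichlet(len(names), ALPHA)                  # per-document mixture
        doc = []
        for _ in range(length):
            topic = random.choices(names, weights=theta)[0]   # draw topic z
            doc.append(random.choice(TOPICS[topic]))          # draw word w | z (uniform here)
        return doc

    print(generate_doc())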
Week of 12/3
Monday: Machine translation (guest lecture by Matt Post; see the word-alignment sketch below)
Wednesday: MT continued; Assignment 7 due
Friday: TAG parsing (guest lecture by Darcey Riley)
Suggested reading:
  • J&M 25, M&S 13, statmt.org
  • Tutorial (2003), workbook (1999), introductory essay (1997), technical paper (1993)
  • Tutorial (2006) focusing on more recent developments (slides, 3-hour video part 1, part 2)
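Statistical MT's classic starting point, IBM Model 1, learns word-translation probabilities by EM; this minimal sketch (two-word toy sentence pairs, uniform initialization) is an illustration under those assumptions, not the guest lecture's material.

    from collections import defaultdict

    CORPUS = [("the house".split(), "das Haus".split()),
              ("the book".split(), "das Buch".split()),
              ("a book".split(), "ein Buch".split())]

    t = defaultdict(lambda: 0.25)            # t(f|e), initialized uniformly

    for _ in range(10):
        count = defaultdict(float)           # expected counts (E-step)
        total = defaultdict(float)
        for e_sent, f_sent in CORPUS:
            for f in f_sent:
                z = sum(t[(f, e)] for e in e_sent)
                for e in e_sent:
                    c = t[(f, e)] / z        # posterior that f aligns to e
                    count[(f, e)] += c
                    total[e] += c
        for (f, e), c in count.items():      # M-step: renormalize
            t[(f, e)] = c / total[e]

    print(round(t[("Buch", "book")], 2))     # rises toward 1.0 as EM iterates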
Week of 12/10: (nothing scheduled)

Sun 12/16 is the absolute deadline for late assignments.

Final exam: Thu 12/20, 9am-noon


Old Materials

Old assignments; lectures from past years, some still useful.