
Natural Language Processing
Prof. Jason Eisner
Course # 600.465 — Spring 2016

Vital Statistics

Course catalog entry: This course is an in-depth overview of techniques for processing human language. How should linguistic structure and meaning be represented? What algorithms can recover them from text? And crucially, how can we build statistical models to choose among the many legal answers?

The course covers methods for trees (parsing and semantic interpretation), sequences (finite-state transduction such as tagging and morphology), and words (sense and phrase induction), with applications to practical engineering tasks such as information retrieval and extraction, text classification, part-of-speech tagging, speech recognition, and machine translation. There are a number of structured but challenging programming assignments. Prerequisite: 600.226 or equivalent. [Applications, 3 credits]

Course objectives: Welcome! This course is designed to introduce you to some of the problems and solutions of NLP, and their relation to linguistics and statistics. You need to know how to program (e.g., 600.120) and use common data structures (600.226). It might also be nice—though it's not required—to have some previous familiarity with automata (600.271) and probabilities (600.475, 550.420, or 550.310). At the end you should agree (I hope!) that language is subtle and interesting, feel some ownership over some of NLP's formal and statistical techniques, and be able to understand research papers in the field.

Lectures: MWF 3-4 or 3-4:15, Hackerman B17.
Prof: Jason Eisner
TA: Dingquan Wang
CAs: Miles Zhang, Akshay Srivatsan
Office hrs: For Prof: After class until 4:30; or by appt in Hackerman 324C
For TA/CAs: Ding: TuTh 10-11 in Hackerman 322. Akshay, Miles: TBA.
Discussion session: TA-led session (optional) for activities/discussion/questions/review: TBA
Discussion site: https://piazza.com/class/ijwfi1y4u3d6nh - public questions, discussion, announcements
Web page: http://cs.jhu.edu/~jason/465
Textbook: Jurafsky & Martin, 2nd ed. (semi-required - P98.J87 2009 in Science Ref section on C-Level)
Roark & Sproat (recommended - P98.R63 2007 in same section)
Manning & Schütze (recommended - free online PDF version here!)
Policies: Grading: homework 50%, participation 5%, midterm 15%, final 30%
Submission: TBA
Lateness: floating late days policy
Honesty: CS integrity code, JHU undergraduate policies, JHU graduate policies
Intellectual engagement: much encouraged
Disabilities: If you need accommodations for a disability, obtain a letter from Student Disability Services, 385 Garland, (410) 516-4720.
Announcements: Read mailing list and this page!
Related sites: List of many courses - some may have useful material!

Schedule

This class is in the "flexible time slot" MWF 3-4:30. Please keep the entire slot open. Class will usually run 3-4, followed by office hours in the classroom from 4-4:30 (stick around to get your money's worth). However, class will sometimes run till 4:15 in order to keep up with the syllabus. I'll try to give advance notice of these "long classes," which among other things make up for no-class days when I'm out of town.

We'll also schedule a once-per-week discussion session led by your TA. This optional session will focus on solving problems together. That's meant as an efficient and cooperative way to study for an hour: it reinforces the past week's class material without adding to your homework load. Also, if you come to discussion session as recommended, you won't be startled by the exam style — the discussion problems are taken from past exams and are generally interesting.

Warning: The schedule below may change. Links to future lectures and assignments may also change (they currently point to last year's versions).

Warning: I sometimes turn off the PDF links when they are not up to date with the PPT links. If they don't work, just click on "ppt" instead.

Each week below lists the Monday, Wednesday, and Friday classes, plus suggested reading.

Week of 1/25
Monday: No class (blizzard day)
Wednesday: Introduction (ppt)
  • Questionnaire
  • Why is NLP hard?
  • Levels of language
  • NLP applications
  • Random language via n-grams (see the sketch below)
  • Assignment 1 given: Designing CFGs
Friday: Modeling grammaticality (ppt)
  • What's wrong with n-grams?
  • Regular expressions, FSAs, CFGs, ...
Suggested reading:
  • Intro: J&M chapter 1
  • Chomsky hierarchy: J&M 16
  • Homework: J&M 12, M&S 3
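The "random language via n-grams" demo is easy to reproduce. Here is a minimal sketch in Python; the toy corpus, the function names, and the bigram-only design are invented for illustration and are not taken from the course materials:

    import random
    from collections import defaultdict

    def train_bigrams(tokens):
        """Map each word to the list of words observed right after it."""
        successors = defaultdict(list)
        for w1, w2 in zip(tokens, tokens[1:]):
            successors[w1].append(w2)
        return successors

    def babble(successors, start, length=10):
        """Generate text by repeatedly sampling an observed successor."""
        out = [start]
        for _ in range(length):
            nexts = successors.get(out[-1])
            if not nexts:  # dead end: this word was never followed by anything
                break
            out.append(random.choice(nexts))
        return " ".join(out)

    corpus = "the cat sat on the mat and the dog sat on the log".split()
    print(babble(train_bigrams(corpus), "the"))

Trigrams and higher orders condition on more history and sound more fluent while still being meaningless, which motivates the language-modeling weeks that follow.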
Week of 2/1
Monday: Language models (ppt)
  • Language ID
  • Text categorization
  • Spelling correction
  • Segmentation
  • Speech recognition
  • Machine translation
Wednesday: Probability concepts (ppt; video lecture)
  • Joint & conditional prob
  • Chain rule and backoff
  • Modeling sequences
  • Cross-entropy and perplexity
  • Bayes' Theorem (ppt)
Friday: Smoothing n-grams (ppt)
  • Maximum likelihood estimation
  • Bias and variance
  • Add-one or add-lambda smoothing (see the sketch below)
  • Good-Turing discounting
  • Smoothing with backoff
  • Deleted interpolation
Suggested reading:
  • Language models: M&S 6 (or R&S 6); Martin or Moore
  • Prob/Bayes: M&S 2
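Friday's add-lambda smoothing fits in a few lines. A minimal sketch, with invented counts and an assumed vocabulary size V:

    from collections import Counter

    def add_lambda_prob(counts, word, lam, vocab_size):
        """(count(word) + lambda) / (total + lambda*V):
        every vocabulary word, seen or not, gets nonzero probability."""
        total = sum(counts.values())
        return (counts[word] + lam) / (total + lam * vocab_size)

    counts = Counter("the cat sat on the mat".split())
    V = 10  # assumed vocabulary size, counting words unseen in training
    print(add_lambda_prob(counts, "the", 1.0, V))  # seen word: 3/16
    print(add_lambda_prob(counts, "dog", 1.0, V))  # unseen word: 1/16, not 0

Choosing lambda trades off bias against variance, and backing off to a lower-order model is often better still; both points come up in the reading above.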
Week of 2/8
Monday: Smoothing continued
  • Assignment 2 given: Probabilities
  • Conditional log-linear models (interactive visualization)
  • Regularization
Wednesday: Catch-up day (we'll be behind schedule by now)
  • Assignment 1 due
Friday: Improving CFG with attributes (ppt)
  • Discussion of Asst. 1
  • Morphology
  • Lexicalization
  • Tenses
  • Gaps (slashes)
Suggested reading:
  • Smoothing: M&S 6; J&M 4; Rosenfeld (2000)
  • Attributes: J&M 15
Week of 2/15
Monday: Context-free parsing (ppt)
  • Assignment 3 given: Language Models
  • What is parsing?
  • Why is it useful?
  • Brute-force algorithm
  • CKY and Earley algorithms (see the CKY sketch below)
Wednesday: Context-free parsing, continued
  • Assignment 2 due
  • From recognition to parsing
  • Incremental strategy
  • Dotted rules
  • Sparse matrices
Friday: Earley's algorithm (ppt)
  • Quick in-class quiz: Log-linear models
  • Top-down parsing
  • Earley's algorithm
Suggested reading:
  • Parsing: J&M 13
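Since CKY recurs in this week's classes and later in the parsing assignment, here is a minimal recognizer sketch; the grammar encoding (RHS tuple -> set of LHS symbols) and the toy grammar are my own illustration, not the assignment's format:

    from collections import defaultdict

    def cky_recognize(words, grammar, start="S"):
        """CKY recognition for a grammar in Chomsky normal form."""
        n = len(words)
        chart = defaultdict(set)  # chart[i, j] = nonterminals deriving words[i:j]
        for i, w in enumerate(words):
            chart[i, i + 1] |= grammar.get((w,), set())
        for width in range(2, n + 1):
            for i in range(n - width + 1):
                j = i + width
                for k in range(i + 1, j):  # split point
                    for B in chart[i, k]:
                        for C in chart[k, j]:
                            chart[i, j] |= grammar.get((B, C), set())
        return start in chart[0, n]

    grammar = {
        ("NP", "VP"): {"S"}, ("Det", "N"): {"NP"}, ("V", "NP"): {"VP"},
        ("the",): {"Det"}, ("cat",): {"N"}, ("mat",): {"N"}, ("saw",): {"V"},
    }
    print(cky_recognize("the cat saw the mat".split(), grammar))  # True

Runtime is O(n^3) in the sentence length; storing backpointers for each chart entry turns the recognizer into a parser.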
Week of 2/22
Monday: Extending CFG (summary (ppt))
  • CCG
  • TSG
  • TAG (by Darcey Riley)
Wednesday: Probabilistic parsing (ppt)
  • PCFG parsing
  • Dependency grammar
  • Lexicalized PCFGs
Friday: Parsing tricks (ppt)
  • Assignment 4 given: Parsing
  • Pruning; best-first
  • Rules as regexps
  • Left-corner strategy
  • Smoothing
  • Evaluation
  • A song about parsing
Suggested reading:
  • CCG: Steedman & Baldridge; more
  • TAG/TSG: Van Noord, Guo, Zhang 1/2/3
  • Prob. parsing: M&S 12, J&M 14
Week of 2/29
Monday: Human sentence processing (ppt); Unscrambling text (ppt)
  • Assignment 3 due
  • Methodology
  • Frequency sensitivity
  • Incremental interpretation
Wednesday: Semantics (ppt)
  • What is understanding?
  • Lambda terms
  • Semantic phenomena and representations
Friday: Semantics continued
  • More semantic phenomena and representations
Suggested reading:
  • Psycholinguistics: Tanenhaus & Trueswell (2006), Human Sentence Processing website
Week of 3/7
Monday: Semantics continued
  • Assignment 5 given: Semantics
  • Adding semantics to CFG rules
  • Compositional semantics (see the sketch below)
Wednesday: Midterm exam (3-4:30 in classroom)
Friday: Learning in the limit (ppt)
  • Gold's theorem
Suggested reading:
  • Semantics: J&M 17-18; this web page, up to but not including "denotational semantics" section; try the Penn Lambda Calculator; lambda calculus for kids
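Lambda terms can be prototyped directly as functions. A tiny sketch of compositional semantics, with Python lambdas standing in for the lambda-calculus notation from class and an invented tuple encoding for predicates:

    # [[loves]] = \y.\x.loves(x,y): apply to the object, then the subject,
    # mirroring how semantics attaches to the rules VP -> V NP and S -> NP VP.
    loves = lambda y: (lambda x: ("loves", x, y))

    vp = loves("mary")  # [[loves Mary]] = \x.loves(x, mary)
    s = vp("john")      # [[John loves Mary]]
    print(s)            # ('loves', 'john', 'mary')

The Penn Lambda Calculator linked in the reading is built for practicing exactly these reductions, with proper types and variable renaming.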
Week of 3/14
Monday, Wednesday, Friday: No class (spring break)
Week of 3/21
Monday: Forward-backward algorithm (ppt) (Excel spreadsheet; Viterbi version; lesson plan; video lecture)
  • Ice cream, weather, words and tags
  • Forward and backward probabilities (see the sketch below)
  • Inferring hidden states
  • Controlling the smoothing effect
Wednesday: Forward-backward continued
  • Assignment 4 due
  • Reestimation
  • Likelihood convergence
  • Symmetry breaking
  • Local maxima
  • Uses of states
Friday: Expectation Maximization (ppt)
  • Assignment 6 given: Hidden Markov Models
  • Generalizing the forward-backward strategy
  • Inside-outside algorithm
  • Posterior decoding
Suggested reading:
  • Forward-backward: J&M 6
  • Inside-outside: John Lafferty's notes; M&S 11
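The forward pass over an ice-cream-style HMM is short enough to sketch here. The parameters below are invented for illustration; the real worked example lives in the Excel spreadsheet linked above:

    def forward(obs, states, start_p, trans_p, emit_p):
        """Total probability of obs, summing over all hidden state paths."""
        alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
        for o in obs[1:]:
            alpha = {s2: sum(alpha[s1] * trans_p[s1][s2] for s1 in states)
                         * emit_p[s2][o]
                     for s2 in states}
        return sum(alpha.values())

    states = ["Hot", "Cold"]  # hidden weather; observations = ice creams eaten
    start_p = {"Hot": 0.5, "Cold": 0.5}
    trans_p = {"Hot": {"Hot": 0.8, "Cold": 0.2},
               "Cold": {"Hot": 0.2, "Cold": 0.8}}
    emit_p = {"Hot": {1: 0.1, 2: 0.2, 3: 0.7},
              "Cold": {1: 0.7, 2: 0.2, 3: 0.1}}
    print(forward([2, 3, 3, 1], states, start_p, trans_p, emit_p))

The backward pass is the mirror image; multiplying forward and backward values at each position gives the posterior state probabilities that reestimation (Wednesday) consumes. Replacing sum with max turns this into Viterbi decoding.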
Week of 3/28
Monday: Finite-state algebra (ppt) (Midterm exam moved here?)
  • Regexp review
  • Properties
  • Functions, relations, composition
  • Simple applications
Wednesday: Finite-state machines
  • Acceptors
  • Expressive power
  • Weights and semirings (see the sketch below)
  • Lattice parsing
  • Transducers
Friday: Finite-state implementation (ppt)
  • Assignment 5 due
  • Finite-state operators
  • Uses of composition
  • Implementing the operators
Suggested reading:
  • Finite-state machines: R&S 1
  • Finite-state operators: chaps 2-3 of XFST book draft
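One way to see "weights and semirings": best-path search over a weighted acceptor uses the tropical semiring, where weights combine by + along a path and by min across alternative paths, so Dijkstra's algorithm applies directly. A sketch with an invented three-state machine:

    import heapq

    def best_cost(arcs, start, final):
        """Cheapest path cost from start to final in a weighted FSA.
        arcs: dict state -> list of (next_state, cost)."""
        dist = {start: 0.0}
        pq = [(0.0, start)]
        while pq:
            d, q = heapq.heappop(pq)
            if q == final:
                return d
            if d > dist.get(q, float("inf")):
                continue  # stale queue entry
            for q2, w in arcs.get(q, []):
                if d + w < dist.get(q2, float("inf")):
                    dist[q2] = d + w
                    heapq.heappush(pq, (d + w, q2))
        return float("inf")

    arcs = {0: [(1, 1.0), (2, 4.0)], 1: [(2, 2.0)], 2: []}
    print(best_cost(arcs, 0, 2))  # 3.0, via state 1

Swap in the probability semiring (* along paths, + across alternatives) and sum over all accepting paths, and you are essentially back at the forward algorithm from two weeks ago.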
Week of 4/4
Monday: Noisy channels and FSTs (ppt)
  • Assignment 7 given: Finite-State Modeling
  • Segmentation
  • Spelling correction
  • The noisy channel generalization
  • Implementation using FSTs
Wednesday: Noisy-channel FSTs continued
  • Baby talk
  • Morphology
  • Edit distance (see the sketch below)
  • Transliteration
  • Speech recognition
Friday: Finite-state tagging (ppt)
  • The task
  • Hidden Markov Models
  • Transformation-based
  • Constraint-based
Suggested reading:
  • Finite-state NLP: Karttunen (1997)
  • Tagging: J&M 5 or M&S 10
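Edit distance (Wednesday) is the classic dynamic program; this week's lectures connect it to the cost of the best path through an edit transducer. A minimal sketch with unit costs; the example strings are arbitrary:

    def edit_distance(x, y):
        """dist[i][j] = cheapest way to rewrite x[:i] as y[:j]."""
        dist = [[0] * (len(y) + 1) for _ in range(len(x) + 1)]
        for i in range(len(x) + 1):
            dist[i][0] = i  # i deletions
        for j in range(len(y) + 1):
            dist[0][j] = j  # j insertions
        for i in range(1, len(x) + 1):
            for j in range(1, len(y) + 1):
                sub = 0 if x[i - 1] == y[j - 1] else 1
                dist[i][j] = min(dist[i - 1][j] + 1,        # delete
                                 dist[i][j - 1] + 1,        # insert
                                 dist[i - 1][j - 1] + sub)  # copy/substitute
        return dist[len(x)][len(y)]

    print(edit_distance("clara", "caca"))  # 2: delete 'l', substitute 'r' -> 'c'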
Week of 4/11
Monday: Programming with regexps (ppt)
  • Assignment 6 due
  • Analogy to programming
  • Extended finite-state operators
  • Date parsing
  • FASTUS
Wednesday: Morphology and phonology (ppt)
  • English, Turkish, Arabic
  • Stemming
  • Compounds, segmentation
  • Two-level morphology
  • Punctuation
  • Rewrite rules
  • OT
Friday: Optimal paths in graphs
  • The Dyna perspective
Suggested reading:
  • Morphology: R&S 2
  • Dyna tutorial and non-required assignment
Week of 4/18
Monday: Structured prediction (ppt)
  • Perceptrons (see the sketch below)
  • CRFs
  • Feature engineering
  • Generative vs. discriminative
Wednesday: Current NLP tasks and competitions (ppt)
  • The NLP research community
  • Text annotation tasks
  • Other types of tasks
Friday: Applied NLP continued (ppt)
Suggested reading:
  • Explore links in "NLP tasks" slides
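Monday's structured perceptron is a remarkably small algorithm. A minimal sketch of one update, with invented feature names for a tagging example:

    def perceptron_update(weights, gold_feats, pred_feats):
        """If the decoder's best guess differs from the gold structure,
        raise the gold features' weights and lower the prediction's."""
        for f, v in gold_feats.items():
            weights[f] = weights.get(f, 0.0) + v
        for f, v in pred_feats.items():
            weights[f] = weights.get(f, 0.0) - v
        return weights

    gold = {"tag=NN^word=duck": 1, "prevtag=DT^tag=NN": 1}
    pred = {"tag=VB^word=duck": 1, "prevtag=DT^tag=VB": 1}
    print(perceptron_update({}, gold, pred))

A CRF replaces this hard winner-take-all update with a gradient step on conditional log-likelihood; the feature engineering is the same either way.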
Week of 4/25
Monday: Applied NLP continued (ppt); Topic models
Wednesday: Machine translation
  • Assignment 7 due
Friday: Guest lecture by Matt Post or Philipp Koehn?
Suggested reading:
  • Topic models: intro readings/slides from Dave Blei, slides by Jason Eisner (video lecture part 1, part 2)
  • MT: J&M 25, M&S 13, statmt.org; tutorial (2003), workbook (1999), introductory essay (1997), technical paper (1993); tutorial (2006) focusing on more recent developments (slides, 3-hour video part 1, part 2)

Week of 5/2
Final exam: Fri 5/6, 2-5pm


Old Materials

Lectures from past years, some still useful. Old assignments.

ABET Outcomes

Course Outcomes

Program Outcomes