Range | Letter
----- | ------
93+ | A
90-92 | A-
87-89 | B+
83-86 | B
80-82 | B-
77-79 | C+
73-76 | C
70-72 | C-
67-69 | D+
63-66 | D
60-62 | D-
59 and below | F

Friday is the first day of break, so we have no class. The next class is April 14, when we’ll start the topic of Machine Translation.

Have a nice holiday!!

-James

I will assume you are familiar with most of the basic formulas from the material I presented in class. This would include:

- Bayes’ Theorem
- MLE
- MI and PMI
- Smoothing functions
- The basic similarity functions for distributions
- The basics for dimensionality reduction
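To jog your memory on a few of these, here is a minimal sketch of MLE probabilities, PMI, and add-one smoothing over bigram counts. The toy corpus and function names are illustrative, not from the course materials.

```python
import math
from collections import Counter

# Hypothetical toy corpus for illustration only.
tokens = "the cat sat on the mat the cat ate".split()
bigrams = list(zip(tokens, tokens[1:]))

unigram_counts = Counter(tokens)
bigram_counts = Counter(bigrams)
N = len(tokens)

def mle_prob(word):
    # Maximum-likelihood estimate: P(w) = count(w) / N
    return unigram_counts[word] / N

def pmi(w1, w2):
    # Pointwise mutual information: log2( P(w1, w2) / (P(w1) * P(w2)) )
    p_joint = bigram_counts[(w1, w2)] / len(bigrams)
    return math.log2(p_joint / (mle_prob(w1) * mle_prob(w2)))

def add_one_prob(w1, w2):
    # Laplace (add-one) smoothed bigram probability:
    # P(w2 | w1) = (count(w1, w2) + 1) / (count(w1) + V)
    V = len(unigram_counts)
    return (bigram_counts[(w1, w2)] + 1) / (unigram_counts[w1] + V)
```

A positive PMI (e.g. for "the cat" in this toy corpus) indicates the pair co-occurs more often than independence would predict.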

Don’t worry. The material is broad, but no question can be that deep. Good luck!!

-James

- Types of ambiguity: lexical, syntactic, semantic, pragmatic
- FSAs and Finite-state Transducers:
  - basic principles
  - examples
- Edit Distance: judge the similarity of 4-6 words in terms of edit score
- N-gram models: definitions, examples
- Noisy Channel Model
- Naive Bayes Classifiers
- Smoothing, Back-off, Interpolation
- Hidden Markov Models
- Part-of-speech Tagging
- Dependency Structures as used in parsing
- Distributional Semantics:
  - word-context vector representations
  - similarity measures
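As a refresher on the edit-distance item, here is a minimal sketch of the standard Levenshtein dynamic program with unit costs; note that the in-class version may weight substitutions differently, which changes the scores.

```python
def edit_distance(a, b):
    # Levenshtein distance with unit costs for insertion,
    # deletion, and substitution, via the standard DP table.
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all of a's prefix
    for j in range(n + 1):
        dp[0][j] = j  # insert all of b's prefix
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution / match
    return dp[m][n]
```

For example, with unit substitution cost the distance between "intention" and "execution" is 5.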

Good luck, and see you Tuesday.

-James


Sorry I was unable to make it to class, as I was called away to solve a snow-related emergency at home. I trust that Nikhil and Te covered part-of-speech tagging and the Viterbi algorithm with you. I will do a quick review of Viterbi on Friday and then we move into syntax, to prepare ourselves for the CYK algorithm!

Cheers,

James


Please take a look at the lab exercises. If you cannot do all of them, you should attend the lab, Tuesdays 6:30pm – 7:30pm.

See you guys in class tomorrow.

-Te


http://doodle.com/kqf2ap2muimdmg2h
