Professor Christopher Manning
Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)
0:00 Introduction
0:33 Overview
2:50 You use Language Models every day!
5:36 n-gram Language Models: Example
10:12 Sparsity Problems with n-gram Language Models
10:58 Storage Problems with n-gram Language Models
11:34 n-gram Language Models in practice
12:53 Generating text with an n-gram Language Model
15:08 How to build a neural Language Model?
16:03 A fixed-window neural Language Model
20:57 Recurrent Neural Networks (RNN)
22:39 An RNN Language Model
32:51 Training an RNN Language Model
36:35 Multivariable Chain Rule
37:10 Backpropagation for RNNs: Proof sketch
41:23 Generating text with an RNN Language Model
51:39 Evaluating Language Models
53:30 RNNs have greatly improved perplexity
54:09 Why should we care about Language Modeling?
58:30 Recap
59:21 RNNs can be used for tagging