- Stanford CS224N: NLP with Deep Learning | Winter 2021 | Lecture 1 - Intro & Word Vectors

For more information about Stanford's Artificial Intelligence professional and graduate programs visit: https://stanford.io/3w46jar

This lecture covers:
1. The course (10 min)
2. Human language and word meaning (15 min)
3. Word2vec algorithm introduction (15 min)
4. Word2vec objective function gradients (25 min)
5. Optimization basics (5 min)
6. Looking at word vectors (10 min or less)

Key learning: the (really surprising!) result that word meaning can be represented rather well by a large vector of real numbers.
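This "large vector of real numbers" idea can be sketched with toy values (hand-picked here for illustration; real word2vec embeddings are learned from text and typically have 100-300 dimensions):

```python
import numpy as np

# Toy 4-dimensional "word vectors" with made-up values, not learned ones.
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "apple": np.array([0.1, 0.2, 0.1, 0.9]),
}

def cosine_similarity(a, b):
    # Words with similar meanings end up as vectors pointing in
    # similar directions, so their cosine similarity is higher.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(vectors["king"], vectors["queen"]))  # higher
print(cosine_similarity(vectors["king"], vectors["apple"]))  # lower
```

With learned embeddings the same comparison recovers genuine semantic similarity, which is the surprising result the lecture builds toward.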

This course will teach:
1. The foundations of effective modern methods for deep learning applied to NLP: basics first, then key methods used in NLP such as recurrent networks, attention, and transformers.
2. A big picture understanding of human languages and the difficulties in understanding and producing them
3. An understanding of, and the ability to build, systems (in PyTorch) for some of the major problems in NLP: word meaning, dependency parsing, machine translation, question answering.

To learn more about this course visit: https://online.stanford.edu/courses/cs224n-natural-language-processing-deep-learning
To follow along with the course schedule and syllabus visit: http://web.stanford.edu/class/cs224n/

Professor Christopher Manning
Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)

0:00 Introduction
1:43 Goals
3:10 Human Language
10:07 Google Translate
10:43 GPT
14:13 Meaning
16:19 Wordnet
19:11 Word Relationships
20:27 Distributional Semantics
23:33 Word Embeddings
27:31 Word2vec
37:55 How to minimize loss
39:55 Interactive whiteboard
41:10 Gradient
48:50 Chain Rule


Stanford Online
