Attention is Cheap! (00:14:06 - 00:16:05)
Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 14 – Transformers and Self-Attention

For more information about Stanford’s Artificial Intelligence professional and graduate programs, visit: https://stanford.io/3niIw41

Professor Christopher Manning, Stanford University, Ashish Vaswani & Anna Huang, Google
http://onlinehub.stanford.edu/

Professor Christopher Manning
Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)

To follow along with the course schedule and syllabus, visit: http://web.stanford.edu/class/cs224n/index.html#schedule

0:00 Introduction
2:07 Learning Representations of Variable Length Data
2:28 Recurrent Neural Networks
4:51 Convolutional Neural Networks?
14:06 Attention is Cheap!
16:05 Attention head: Who
16:26 Attention head: Did What?
16:35 Multihead Attention
17:34 Machine Translation: WMT-2014 BLEU
19:07 Frameworks
19:31 Importance of Residuals
23:26 Non-local Means
26:18 Image Transformer Layer
30:56 Raw representations in music and language
37:52 Attention: a weighted average
40:08 Closer look at relative attention
42:41 A Jazz sample from Music Transformer
44:42 Convolutions and Translational Equivariance
45:12 Relative positions Translational Equivariance
50:21 Sequential generation breaks modes.
50:32 Active Research Area
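
The "Attention is Cheap!" and "Attention: a weighted average" chapters cover the lecture's core operation. As an illustrative sketch only (not code from the lecture; the function name and shapes are assumptions), a minimal scaled dot-product self-attention in NumPy makes the cost argument concrete: the n × n score matrix gives O(n²·d) work per layer, versus O(n·d²) sequential work for an RNN.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a length-n sequence of d-dim vectors.

    Illustrative sketch: queries, keys, and values are all X itself
    (no learned projections). The (n, n) score matrix is the source of
    the O(n^2 * d) cost per layer discussed in the lecture.
    """
    n, d = X.shape
    scores = X @ X.T / np.sqrt(d)                    # (n, n) pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)     # stabilize softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # each row sums to 1
    return weights @ X                               # weighted average of values

X = np.random.randn(5, 8)   # n = 5 positions, d = 8 dimensions
out = self_attention(X)     # same shape as X: (5, 8)
```

Multihead attention (16:35) runs several such maps in parallel on learned low-dimensional projections of X, letting different heads attend to different relations (e.g. the "Who" and "Did What?" heads at 16:05 and 16:26).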

#naturallanguageprocessing #deeplearning

#Stanford #StanfordOnline #NLP #DeepLearning #AI #CS224N
Published: March 22, 2019

Stanford Online
