10 – Self / cross, hard / soft attention and the Transformer

Course website: http://bit.ly/DLSP21-web
Playlist: http://bit.ly/DLSP21-YouTube
Speaker: Alfredo Canziani
Date: 15 June 2021

Chapters
00:00 – Welcome to class
00:15 – Listening to YouTube from the terminal
00:36 – Summarising papers with @Notion
01:45 – Reading papers collaboratively
03:15 – Attention! Self / cross, hard / soft
06:44 – Use cases: set encoding!
12:10 – Self-attention (see the code sketch after this list)
28:45 – Key-value store
29:32 – Queries, keys, and values → self-attention
39:49 – Queries, keys, and values → cross-attention
45:27 – Implementation details
48:11 – The Transformer: an encoder-predictor-decoder architecture
54:59 – The Transformer encoder
56:47 – The Transformer “decoder” (which is an encoder-predictor-decoder module)
1:01:49 – Jupyter Notebook and PyTorch implementation of a Transformer encoder
1:10:51 – Goodbye :)
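
The attention chapters above build up scaled dot-product soft attention from queries, keys, and values. Below is a minimal single-head, unbatched sketch in PyTorch (an illustration with assumed shapes and random weights, not the lecture notebook's exact code):

```python
import torch

def attention(q_in, kv_in, Wq, Wk, Wv):
    # Self-attention:  q_in and kv_in are the same set.
    # Cross-attention: queries come from q_in, keys/values from kv_in.
    q = q_in @ Wq                              # queries  (t_q, d)
    k = kv_in @ Wk                             # keys     (t_kv, d)
    v = kv_in @ Wv                             # values   (t_kv, d)
    scores = q @ k.T / k.size(-1) ** 0.5       # scaled dot-product similarities
    a = torch.softmax(scores, dim=-1)          # soft (arg)max: each row sums to 1
    return a @ v                               # convex combinations of the values

# Toy usage: a set of t tokens of dimension d, random projection weights
t, d = 5, 16
x = torch.randn(t, d)
Wq, Wk, Wv = (torch.randn(d, d) for _ in range(3))
h_self = attention(x, x, Wq, Wk, Wv)           # self-attention,  (t, d)
ctx = torch.randn(8, d)                        # another set, e.g. an encoder's output
h_cross = attention(x, ctx, Wq, Wk, Wv)        # cross-attention, (t, d)
```

The only difference between the self and cross variants is where the keys and values come from. The soft (arg)max makes every output a mixture of all the values; hard attention would instead select a single one.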

#PyTorch #NYU #YannLeCun #DeepLearning #NeuralNetworks

Viewer comments

22:34 – "Hello Alf, everyone here probably knows what you are trying to say, but I think the matrix transpose animation should be fixed."

1:01:25 – "… the 'h' after the predictor: what exactly is that h?" (The hidden representation.)
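
For context on that question: in the encoder-predictor-decoder framing used in the lecture, h is the hidden representation produced by the predictor, which the decoder then maps to the observable prediction. A schematic of that data flow (module choices and dimensions are assumptions for illustration, not the lecture's exact code):

```python
import torch
import torch.nn as nn

d_in, d_rep = 32, 64                   # hypothetical dimensions
encoder   = nn.Linear(d_in, d_rep)     # observation x  -> internal representation
predictor = nn.Linear(d_rep, d_rep)    # representation -> h, the hidden representation
decoder   = nn.Linear(d_rep, d_in)     # h -> prediction back in the target space

x = torch.randn(1, d_in)
h = predictor(encoder(x))              # the 'h' after the predictor
y = decoder(h)
```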

1:04:00 – "Hi Alf, thanks for the great explanation! A question: why do we set bias=False? Why do we not need an affine transformation of the input space, but just a rotation?"
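
On the bias=False point: with bias=False an nn.Linear layer computes a purely linear map y = Wx (strictly more general than a rotation), while bias=True would make it affine, y = Wx + b. One common rationale for dropping the bias in the q/k/v projections is that a key bias contributes the same offset to every score in a query's row, so the row-wise softmax cancels it exactly. A minimal sketch of such bias-free projections (dimensions are assumptions):

```python
import torch
import torch.nn as nn

d = 64                                   # hypothetical embedding dimension
# bias=False: y = W x, a purely linear map (no translation of the space);
# bias=True would make the map affine: y = W x + b.
q_proj = nn.Linear(d, d, bias=False)
k_proj = nn.Linear(d, d, bias=False)
v_proj = nn.Linear(d, d, bias=False)

x = torch.randn(10, d)                   # a set of 10 token embeddings
q, k, v = q_proj(x), k_proj(x), v_proj(x)
```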