– Word-level training with minimal supervision (00:00:43 - 00:20:41)
Week 6 – Lecture: CNN applications, RNN, and attention

Course website: http://bit.ly/DLSP20-web
Playlist: http://bit.ly/pDL-YouTube
Speaker: Yann LeCun
Week 6: http://bit.ly/DLSP20-06

0:00:00 – Week 6 – Lecture

LECTURE Part A: http://bit.ly/DLSP20-06-1
We discussed three applications of convolutional neural networks. We started with digit recognition and its application to 5-digit ZIP code recognition. For object detection, we talked about how to use a multi-scale architecture in a face-detection setting. Lastly, we saw how ConvNets are used in semantic segmentation tasks, with concrete examples from a robotic vision system and object segmentation in an urban environment.
0:00:43 – Word-level training with minimal supervision
0:20:41 – Face Detection and Semantic Segmentation
0:27:49 – ConvNet for Long-Range Adaptive Robot Vision and Scene Parsing

LECTURE Part B: http://bit.ly/DLSP20-06-2
We examine Recurrent Neural Networks, their problems, and common techniques for mitigating these issues. We then review a variety of modules developed to resolve RNN model issues including Attention, GRUs (Gated Recurrent Unit), LSTMs (Long Short-Term Memory), and Seq2Seq.
0:43:40 – Recurrent Neural Networks and Attention Mechanisms
0:59:09 – GRUs, LSTMs, and Seq2Seq Models
1:16:15 – Memory Networks

#CNN #YannLeCun #DeepLearning #RNN #LSTM #Attention #PyTorch #NYU

Timetable (April 13, 2020)

00:00:00 - 00:00:43 – Week 6 – Lecture

00:00:43 - 00:20:41 – Word-level training with minimal supervision

00:06:14 - 01:28:48
Hi, thanks very much for all these videos! At around 00:06:14, the second-to-last layer is 5x5, and we use different kernel widths to signify different windows. But if, for instance, you use a 5x4 kernel as the last layer, wouldn't you have an output that's 1x2? You would then have more columns, but in the picture the number of columns seems to stay the same whether you used a 5x5, 5x4, 5x3, or 5x2 kernel. Do you have to take additional steps after changing the kernel size of the last layer?
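
As a quick sanity check on the arithmetic behind this question (not something shown in the lecture), here is a minimal PyTorch sketch of a hypothetical final layer applied to a 5x5 feature map: narrowing the kernel from 5x5 down to 5x2 does widen the output from 1x1 to 1x4, exactly as the question suggests.

```python
import torch
import torch.nn as nn

# Minimal sketch (assumed sizes, not the lecture's network): the last layer
# sees a 5x5 feature map; with no padding the output width is 5 - W + 1 for a
# 5xW kernel, so a narrower kernel produces more output columns.
feature_map = torch.randn(1, 1, 5, 5)  # (batch, channels, height, width)
for kernel_width in (5, 4, 3, 2):
    last_layer = nn.Conv2d(in_channels=1, out_channels=1,
                           kernel_size=(5, kernel_width))
    out = last_layer(feature_map)
    print(kernel_width, tuple(out.shape))
# Prints: 5 (1, 1, 1, 1), 4 (1, 1, 1, 2), 3 (1, 1, 1, 3), 2 (1, 1, 1, 4)
```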

00:13:10 - 01:28:48
What values are in the big array at 00:13:10? Does he even say?

00:20:41 - 00:27:49 – Face Detection and Semantic Segmentation

00:27:49 - 00:43:40 – ConvNet for Long-Range Adaptive Robot Vision and Scene Parsing

00:39:23 - 01:28:48
At 00:39:23, it does not look like all the "paths" share the same weights, except for the last layer (just before the output). But he says they use the same kernels. I guess the illustration may be a bit misleading and the truth is that they do indeed share weights?
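
For readers puzzled by the same figure: the usual way to realize what the commenter describes is to run one ConvNet, i.e. a single set of kernels, over several rescaled copies of the input, so the drawn "paths" are the same module evaluated at different scales. The PyTorch sketch below only illustrates that weight-sharing idea under assumed layer sizes; it is not the lecture's actual face detector.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative only (assumed channel counts and kernel sizes): one module,
# hence one set of kernels, applied to an image pyramid. Every "path"
# reuses exactly the same weights; only the input resolution changes.
shared_net = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=5), nn.ReLU(),
    nn.Conv2d(8, 1, kernel_size=5),          # one score map per scale
)

image = torch.randn(1, 3, 128, 128)
for scale in (1.0, 0.5, 0.25):
    scaled = F.interpolate(image, scale_factor=scale,
                           mode="bilinear", align_corners=False)
    scores = shared_net(scaled)              # same kernels at every scale
    print(scale, tuple(scores.shape))
```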

00:43:40 - 00:59:09 – Recurrent Neural Networks and Attention Mechanisms

00:59:09 - 01:16:15 – GRUs, LSTMs, and Seq2Seq Models

01:01:51 - 01:28:48
I have a question about the slide at 01:01:51. Could it be that the bottom equation for h_t does not correspond exactly to the green diagram? It looks like z_t and 1 - z_t are switched.
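
Background that may help with this question (offered as context, not as a claim about what the slide itself shows): the GRU update is written in the literature with two opposite conventions that are equivalent up to renaming the update gate, so an equation and a diagram drawn from different sources can look as though z_t and 1 - z_t were swapped even when both are internally consistent. One common convention and the one used in the PyTorch nn.GRU documentation are shown below.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Two gate-labelling conventions for the GRU update: \hat{h}_t is the candidate
% state and z_t the update gate; they differ only in which factor multiplies
% the previous state h_{t-1}.
\[
  h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \hat{h}_t
  \qquad \text{(one common textbook convention)}
\]
\[
  h_t = (1 - z_t) \odot \hat{h}_t + z_t \odot h_{t-1}
  \qquad \text{(the convention in the PyTorch \texttt{nn.GRU} documentation)}
\]
\end{document}
```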

01:16:15 - 01:28:48 – Memory Networks

Channel: Alfredo Canziani
