03L – Parameter sharing: recurrent and convolutional nets

Course website: http://bit.ly/DLSP21-web
Playlist: http://bit.ly/DLSP21-YouTube
Speaker: Yann LeCun

Chapters
00:00:00 – Welcome to class
00:00:49 – Hypernetworks
00:02:24 – Shared weights
00:06:10 – Parameter sharing ⇒ adding the gradients
00:09:33 – Max and sum reductions
00:11:46 – Recurrent nets
00:14:20 – Unrolling in time
00:16:17 – Vanishing and exploding gradients
00:19:48 – Math on the whiteboard
00:23:18 – RNN tricks
00:24:29 – RNN for differential equations
00:27:18 – GRU
00:28:23 – What is a memory
00:41:26 – LSTM – Long Short-Term Memory net
00:43:11 – Multilayer LSTM
00:46:01 – Attention for sequence to sequence mapping
00:48:41 – Convolutional nets
00:50:50 – Detecting motifs in images
00:56:57 – Convolution definition(s)
00:59:43 – Backprop through convolutions
01:03:42 – Stride and skip: subsampling and convolution “à trous”
01:06:56 – Convolutional net architecture
01:19:08 – Multiple convolutions
01:20:37 – Vintage ConvNets
01:32:32 – How does the brain interpret images?
01:37:18 – Hubel & Wiesel's model of the visual cortex
01:42:51 – Invariance and equivariance of ConvNets
01:49:23 – In the next episode…
01:52:54 – Training time, iteration cycle, and historical remarks

#PyTorch #NYU #YannLeCun #DeepLearning #NeuralNetworks

Timetable

00:16:42 – haha well done Alfredo xD giving Yann some vfx

00:17:13 – Yann says that, if we multiply the "z" vector by the "W" matrix, the result is simply a rotation of the z vector. Can someone please explain this? Does this mean that "W" acts as a rotation matrix? If so, how?
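
(Not part of the lecture; a minimal PyTorch sketch of the idea, with all variable names my own.) An orthogonal matrix W, one with WᵀW = I, leaves the norm of z unchanged, so it acts as a pure rotation/reflection; a general matrix also rescales. Because an unrolled recurrent net applies the same W at every time step, that rescaling compounds over many steps, which is the vanishing/exploding-gradient behaviour discussed in the 00:16:17 chapter:

import torch

torch.manual_seed(0)
d = 8
z = torch.randn(d)

# An orthogonal matrix (Q @ Q.T = I) only rotates/reflects z: its norm is preserved.
Q, _ = torch.linalg.qr(torch.randn(d, d))
print(torch.linalg.norm(z).item(), torch.linalg.norm(Q @ z).item())  # ~equal

# A general matrix rotates *and* rescales. Applying the same matrix at every
# time step (as an unrolled RNN does) shrinks or blows up the state depending
# on whether that scaling sits below or above 1.
W_shrink, W_grow = 0.9 * Q, 1.1 * Q
v1, v2 = z.clone(), z.clone()
for _ in range(50):
    v1, v2 = W_shrink @ v1, W_grow @ v2
print(torch.linalg.norm(v1).item())  # ≈ 0.9**50 * ||z||, vanishes
print(torch.linalg.norm(v2).item())  # ≈ 1.1**50 * ||z||, explodes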

00:27:19 – Also, really enjoying the lecture series. Thank you for making this available. 👍

01:03:24 – Why is backprop to the input just Σ_k w_k · ∂C/∂y_(j-k)? Shouldn't it be Σ_k w_k · ∂C/∂y_(j-k) + Σ_k w_k · ∂C/∂y_(j+k), since x at index j influences a range of y's from y[j-k] to y[j+k], where k is the size of the window (assuming stride is 1)? Correct me if I am wrong.
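
(Not an official answer; a small autograd check of the slide's formula under one common convention, with all names below my own. PyTorch's conv1d computes a cross-correlation y_i = Σ_k w_k · x_(i+k).) Under that convention, x_j only feeds the outputs y_(j-K+1) … y_j, so the gradient reaching the input is exactly Σ_k w_k · ∂C/∂y_(j-k), with terms whose index falls outside the valid output range treated as zero; no second sum over y_(j+k) appears.

import torch
import torch.nn.functional as F

torch.manual_seed(0)
N, K = 12, 3
x = torch.randn(N, requires_grad=True)
w = torch.randn(K)

# Forward pass: "valid" cross-correlation, output length N - K + 1.
y = F.conv1d(x.view(1, 1, N), w.view(1, 1, K)).squeeze()

# Pretend the cost C produced some upstream gradient g = dC/dy.
g = torch.randn(y.shape)
y.backward(g)

# Hand-computed input gradient: dC/dx_j = sum_k w_k * g_(j-k),
# taking g as zero outside the valid output range.
manual = torch.zeros(N)
for j in range(N):
    for k in range(K):
        if 0 <= j - k < len(g):
            manual[j] += w[k] * g[j - k]

print(torch.allclose(x.grad, manual, atol=1e-6))  # True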

01:32:00 – That was a fantastic memory by the Professor... so cool...

01:32:20 – young Yann:

01:33:55 – Vertebrates evolved worse than the octopus:
– Hubel & Wiesel's model of the visual cortex - 03L – Parameter sharing: recurrent and convolutional nets

– Hubel & Wiesel's model of the visual cortex

03L – Parameter sharing: recurrent and convolutional nets
2021年07月21日 
01:37:18 - 01:42:51
– Invariance and equivariance of ConvNets - 03L – Parameter sharing: recurrent and convolutional nets

– Invariance and equivariance of ConvNets

03L – Parameter sharing: recurrent and convolutional nets
2021年07月21日 
01:42:51 - 01:49:23
– In the next episode… - 03L – Parameter sharing: recurrent and convolutional nets

– In the next episode…

03L – Parameter sharing: recurrent and convolutional nets
2021年07月21日 
01:49:23 - 01:52:54
– Training time, iteration cycle, and historical remarks - 03L – Parameter sharing: recurrent and convolutional nets

– Training time, iteration cycle, and historical remarks

03L – Parameter sharing: recurrent and convolutional nets
2021年07月21日 
01:52:54 - 01:59:47

01:54:44 – …'s technological problems lmao: And at the end we get the realization that Yann had to code everything in C like a madman 🙃

Alfredo Canziani
