Practical 4.1 – RNN forward and backward

Recurrent Neural Networks – Forward and backward
Full project: https://github.com/Atcold/torch-Video-Tutorials

Notes:
13:22 – x[t] is concatenated with h[t−1]; at least, that is what is written in green...
21:28 – Not quite. The unrolling number T represents the hierarchy you want to use for processing your input; it does not necessarily need to be greater than or equal to the maximum length of the dependency you want to capture, given that the state is preserved across sequence chunks (h[3].new_sequence = h[3].previous_sequence) rather than zeroed.
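The two notes above can be sketched together in code. The following is a minimal pure-Python illustration (toy dimensions, random weights, not the tutorial's Torch code) of the concatenation of x[t] with h[t−1] mentioned at 13:22, and of the hidden state being carried across chunks, rather than zeroed, mentioned at 21:28:

```python
import math
import random

random.seed(0)

def rnn_step(x_t, h_prev, W, b):
    """One RNN forward step: h[t] = tanh(W . [x[t]; h[t-1]] + b).
    The input x[t] is concatenated with the previous state h[t-1]."""
    z = x_t + h_prev  # list concatenation = [x[t]; h[t-1]]
    return [math.tanh(sum(w_ij * z_j for w_ij, z_j in zip(row, z)) + b_i)
            for row, b_i in zip(W, b)]

# Hypothetical toy sizes: input dimension 2, hidden dimension 3.
n_in, n_hid = 2, 3
W = [[random.uniform(-0.1, 0.1) for _ in range(n_in + n_hid)]
     for _ in range(n_hid)]
b = [0.0] * n_hid

# A sequence of length 4, split into chunks of length T = 2
# (T is the unrolling number used for truncated processing).
seq = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, -0.5]]
T = 2
chunks = [seq[i:i + T] for i in range(0, len(seq), T)]

h = [0.0] * n_hid  # the state is zeroed only at the very start
for chunk in chunks:
    for x_t in chunk:
        h = rnn_step(x_t, h, W, b)
    # h is NOT reset here: the last state of one chunk initializes the
    # next chunk, so dependencies longer than T can still be captured.

print(h)  # final hidden state after processing the full sequence
```

Because the state flows across the chunk boundary, the forward pass is equivalent to processing the whole sequence in one go; only the backward pass (gradient flow) is truncated at chunk boundaries.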

Alfredo Canziani

