This video introduces techniques for visualizing network architectures and for monitoring training.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
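As a small illustration of the training-monitoring side (not taken from the video itself), the following sketch plots training and validation loss curves with matplotlib; all values are synthetic and purely for illustration.

```python
# Minimal sketch of monitoring training by plotting loss curves.
# The loss values below are made up (assumption), just to show the plotting pattern.
import numpy as np
import matplotlib.pyplot as plt

epochs = np.arange(1, 51)
train_loss = 1.0 / np.sqrt(epochs) + 0.02 * np.random.rand(50)          # toy curve
val_loss   = 1.0 / np.sqrt(epochs) + 0.05 + 0.02 * np.random.rand(50)   # toy curve

plt.plot(epochs, train_loss, label="training loss")
plt.plot(epochs, val_loss, label="validation loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```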
This video explains sequence generation using RNNs.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
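To make the idea of sequence generation concrete, here is a minimal, hypothetical sampling loop for a character-level RNN: the symbol sampled at one step is fed back as the input of the next step. The vocabulary, sizes, and (untrained) random weights are assumptions for illustration only.

```python
# Minimal sketch of sequence generation with an RNN: feed the last sampled
# symbol back in as the next input. Weights are random (untrained) by assumption.
import numpy as np

rng = np.random.default_rng(0)
vocab = list("abcd ")            # toy vocabulary (assumption)
V, H = len(vocab), 16            # vocabulary and hidden sizes

W_xh = rng.normal(0, 0.1, (H, V))
W_hh = rng.normal(0, 0.1, (H, H))
W_hy = rng.normal(0, 0.1, (V, H))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

h = np.zeros(H)
idx = 0                                        # start symbol
generated = []
for _ in range(20):
    x = np.zeros(V); x[idx] = 1.0              # one-hot input
    h = np.tanh(W_xh @ x + W_hh @ h)           # Elman-style recurrence
    p = softmax(W_hy @ h)                      # output distribution over symbols
    idx = rng.choice(V, p=p)                   # sample the next symbol
    generated.append(vocab[idx])

print("".join(generated))
```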
This video discusses Gated Recurrent Units and compares them to Elman and LSTM cells.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
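For reference, a minimal NumPy sketch of a single GRU update, assuming the standard update-gate/reset-gate formulation; the weight shapes and random values are placeholders, not anything specific to the video.

```python
# Minimal sketch of one GRU cell step (update gate z, reset gate r).
# Sizes and random weights are assumptions.
import numpy as np

rng = np.random.default_rng(0)
D, H = 8, 16                     # input and hidden sizes

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# one weight matrix per gate, acting on the concatenated [x, h]
W_z, W_r, W_h = (rng.normal(0, 0.1, (H, D + H)) for _ in range(3))

def gru_step(x, h_prev):
    xh = np.concatenate([x, h_prev])
    z = sigmoid(W_z @ xh)                                      # update gate
    r = sigmoid(W_r @ xh)                                      # reset gate
    h_tilde = np.tanh(W_h @ np.concatenate([x, r * h_prev]))   # candidate state
    return (1 - z) * h_prev + z * h_tilde                      # blend old and new state

h = gru_step(rng.normal(size=D), np.zeros(H))
print(h.shape)
```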
This video discusses Long Short-Term Memory (LSTM) units.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
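A minimal sketch of one LSTM cell step, assuming the common formulation with forget, input, and output gates and a separate cell state; shapes and values are illustrative assumptions.

```python
# Minimal sketch of one LSTM cell step with forget, input, and output gates
# and a separate cell state. Weight shapes and values are assumptions.
import numpy as np

rng = np.random.default_rng(0)
D, H = 8, 16

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

W_f, W_i, W_o, W_c = (rng.normal(0, 0.1, (H, D + H)) for _ in range(4))

def lstm_step(x, h_prev, c_prev):
    xh = np.concatenate([x, h_prev])
    f = sigmoid(W_f @ xh)            # forget gate: what to keep in the cell state
    i = sigmoid(W_i @ xh)            # input gate: what to write
    o = sigmoid(W_o @ xh)            # output gate: what to expose
    c_tilde = np.tanh(W_c @ xh)      # candidate cell content
    c = f * c_prev + i * c_tilde     # updated cell state
    h = o * np.tanh(c)               # new hidden state
    return h, c

h, c = lstm_step(rng.normal(size=D), np.zeros(H), np.zeros(H))
print(h.shape, c.shape)
```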
This video discusses the training of simple RNNs using the backpropagation through time algorithm.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
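As a rough illustration of backpropagation through time (realized here via PyTorch autograd rather than a hand-derived gradient), the sketch unrolls a simple RNN over a full sequence so that backward() propagates gradients through every time step; data and sizes are made up.

```python
# Minimal sketch of backpropagation through time using autograd: the RNN is
# unrolled over the whole sequence, so backward() reaches every time step.
# Data, sizes, and the learning rate are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
B, T, D, H = 4, 10, 8, 16             # batch, time steps, input, hidden sizes

rnn = nn.RNN(D, H, batch_first=True)   # simple (Elman-style) recurrent layer
readout = nn.Linear(H, 1)
opt = torch.optim.SGD(list(rnn.parameters()) + list(readout.parameters()), lr=0.01)

x = torch.randn(B, T, D)               # toy input sequences
y = torch.randn(B, T, 1)               # toy per-step targets

out, _ = rnn(x)                        # unroll over all T steps
loss = nn.functional.mse_loss(readout(out), y)
loss.backward()                        # gradients flow back through time
opt.step()
print(float(loss))
```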
This video introduces the topic of recurrent neural networks and the Elman Cell.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
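A minimal sketch of the Elman recurrence h_t = tanh(W_xh x_t + W_hh h_{t-1} + b) applied over a short toy sequence; sizes and random weights are assumptions.

```python
# Minimal sketch of the Elman cell: the hidden state is fed back at every step.
# Sequence length, sizes, and random weights are assumptions.
import numpy as np

rng = np.random.default_rng(0)
T, D, H = 5, 8, 16                   # sequence length, input size, hidden size

W_xh = rng.normal(0, 0.1, (H, D))
W_hh = rng.normal(0, 0.1, (H, H))
b = np.zeros(H)

x_seq = rng.normal(size=(T, D))
h = np.zeros(H)
for x_t in x_seq:
    h = np.tanh(W_xh @ x_t + W_hh @ h + b)   # hidden state carries context forward

print(h.shape)
```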
This video discusses "learning to learn" approaches to architecture search and presents first results.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
This video demonstrates the many uses of residual connections in deep networks, from Inception-ResNet to DenseNet.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
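To illustrate the DenseNet-style connectivity mentioned above, the following toy sketch lets every layer receive the concatenation of all earlier outputs; fully connected layers stand in for the convolutional ones, and all sizes are assumptions.

```python
# Minimal sketch of DenseNet-style connectivity: every layer sees the
# concatenation of all earlier feature maps. A 1-D toy with linear layers
# stands in for convolutions; all sizes are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
growth, n_layers, d_in = 8, 3, 16

layers = nn.ModuleList(
    [nn.Sequential(nn.Linear(d_in + i * growth, growth), nn.ReLU())
     for i in range(n_layers)]
)

x = torch.randn(4, d_in)
features = [x]
for layer in layers:
    out = layer(torch.cat(features, dim=1))   # each layer sees all previous outputs
    features.append(out)

print(torch.cat(features, dim=1).shape)       # (4, d_in + n_layers * growth)
```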
This video discusses the idea of residual connections, which allow deep networks to grow from about 20 to more than 1,000 layers.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
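A minimal sketch of the residual idea y = F(x) + x: each block learns only a residual on top of an identity path, which is what lets very deep stacks train. Small fully connected blocks stand in for the convolutional ones; dimensions and depth are arbitrary assumptions.

```python
# Minimal sketch of a residual block and a deep stack of them.
# Fully connected layers replace convolutions; sizes are assumptions.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        return torch.relu(self.body(x) + x)   # skip connection around the body

torch.manual_seed(0)
net = nn.Sequential(*[ResidualBlock(32) for _ in range(10)])  # stack many blocks
x = torch.randn(4, 32)
print(net(x).shape)
```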
This video discusses the success of deeper models, including Inception V2 and V3. One key technique introduced with V3 is label smoothing regularization.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
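As a small worked example of label smoothing regularization, the following sketch mixes a one-hot target with a uniform distribution over the classes; the smoothing factor and class count are arbitrary choices for illustration.

```python
# Minimal sketch of label smoothing: the one-hot target is mixed with a
# uniform distribution over the K classes. Epsilon and K are assumptions.
import numpy as np

def smooth_labels(one_hot, eps=0.1):
    K = one_hot.shape[-1]
    return (1.0 - eps) * one_hot + eps / K   # q'(k) = (1 - eps) * q(k) + eps / K

one_hot = np.eye(5)[2]                       # class 2 of K = 5
print(smooth_labels(one_hot))                # [0.02, 0.02, 0.92, 0.02, 0.02]
```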