This video presents max and average pooling, introduces the concept of fully convolutional networks, and hints at how this is used to build deep networks.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
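As a quick illustration beyond the video, max and average pooling over non-overlapping windows can be sketched in a few lines of numpy (the window size and example array below are illustrative choices, not taken from the lecture):

```python
import numpy as np

def pool2d(x, size=2, mode="max"):
    """Non-overlapping 2D pooling (stride = window size), a minimal sketch."""
    h, w = x.shape
    h2, w2 = h - h % size, w - w % size          # crop to a multiple of the window
    blocks = x[:h2, :w2].reshape(h2 // size, size, w2 // size, size)
    if mode == "max":
        return blocks.max(axis=(1, 3))           # strongest response per window
    return blocks.mean(axis=(1, 3))              # average response per window

x = np.array([[1., 2., 5., 6.],
              [3., 4., 7., 8.],
              [0., 0., 1., 1.],
              [0., 4., 1., 1.]])
print(pool2d(x, 2, "max"))   # [[4. 8.] [4. 1.]]
print(pool2d(x, 2, "mean"))  # [[2.5 6.5] [1.  1. ]]
```

Both variants reduce each spatial dimension by the window size, which is what lets fully convolutional networks shrink feature maps without any fully connected layers.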
This video presents convolutional layers, including the concepts of strided and dilated convolutions and how to compute their gradients.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
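To make stride and dilation concrete, here is a minimal 1D sketch (in the cross-correlation form that deep learning libraries actually compute; the kernel and signal are made-up examples):

```python
import numpy as np

def conv1d(x, k, stride=1, dilation=1):
    """Valid 1D convolution (cross-correlation form) with stride and dilation."""
    span = dilation * (len(k) - 1) + 1            # receptive field of the dilated kernel
    out = []
    for start in range(0, len(x) - span + 1, stride):
        window = x[start:start + span:dilation]   # pick every `dilation`-th sample
        out.append(float(np.dot(window, k)))
    return np.array(out)

x = np.arange(8, dtype=float)        # [0, 1, ..., 7]
k = np.array([1., 0., -1.])
print(conv1d(x, k))                  # stride 1: [-2. -2. -2. -2. -2. -2.]
print(conv1d(x, k, stride=2))        # skips every other position: [-2. -2. -2.]
print(conv1d(x, k, dilation=2))      # spreads the taps apart: [-4. -4. -4. -4.]
```

Striding shortens the output (a built-in downsampling), while dilation enlarges the receptive field without adding weights.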
This video presents a variety of modern activation functions and the quest for finding new ones.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
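A few of the modern activations discussed here are simple enough to write out in plain numpy (the particular selection and default slopes below are a sketch, not the lecture's exhaustive list):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)                 # zero for negative inputs

def leaky_relu(x, a=0.01):
    return np.where(x > 0, x, a * x)          # small slope keeps gradients alive

def elu(x, a=1.0):
    return np.where(x > 0, x, a * (np.exp(x) - 1))

def swish(x):
    return x / (1.0 + np.exp(-x))             # x * sigmoid(x), found by automated search

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))         # [0. 0. 2.]
print(leaky_relu(x))   # [-0.02  0.    2.  ]
```

Swish is a nice example of the "quest" in the title: it came out of an automated search over candidate functions rather than a hand-designed derivation.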
This video presents the biological background of activation functions and the classical choices that were used for neural networks.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
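The classical choices, sigmoid and tanh, are easy to explore numerically; the snippet below (an illustration, not lecture code) also shows why their saturation was a problem for deep networks:

```python
import numpy as np

def sigmoid(x):
    """Classical 'firing rate' nonlinearity, squashing to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)   # at most 0.25 -> gradients shrink in deep stacks

print(sigmoid(0.0))        # 0.5
print(sigmoid_grad(0.0))   # 0.25, the maximum of the derivative
print(np.tanh(0.0))        # 0.0: tanh is zero-centered, but saturates as well
```

The bounded derivative (≤ 0.25 for the sigmoid) is one classical explanation for vanishing gradients in deep stacks of such layers.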
This video discusses details of optimization and different variants of the gradient descent procedure, such as momentum and Adam.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
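The momentum and Adam update rules fit in a few lines each; here is a sketch on the toy objective f(w) = w² (learning rate, decay constants, and iteration counts are illustrative choices):

```python
import numpy as np

def grad(w):                 # gradient of f(w) = w^2, minimum at w = 0
    return 2.0 * w

# --- SGD with momentum: the velocity accumulates past gradients ---
w, v, lr, mu = 5.0, 0.0, 0.1, 0.9
for _ in range(200):
    v = mu * v - lr * grad(w)
    w += v

# --- Adam: per-parameter step from bias-corrected moment estimates ---
wa, m, s = 5.0, 0.0, 0.0
lr, b1, b2, eps = 0.1, 0.9, 0.999, 1e-8
for t in range(1, 501):
    g = grad(wa)
    m = b1 * m + (1 - b1) * g          # 1st moment (running mean of gradients)
    s = b2 * s + (1 - b2) * g * g      # 2nd moment (running mean of squares)
    m_hat = m / (1 - b1 ** t)          # bias correction for the zero init
    s_hat = s / (1 - b2 ** t)
    wa -= lr * m_hat / (np.sqrt(s_hat) + eps)

print(w, wa)   # both end up close to the minimum at 0
```

Note how Adam divides by the root of the second moment, so the effective step size adapts to the gradient scale of each parameter.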
This video explains hinge loss and its relation to support vector machines. We also show why sub-gradients allow us to optimize functions that are not continuously differentiable. Furthermore, the hinge loss allows us to embed optimization constraints into loss functions.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
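A small sketch of these ideas: the SVM-style objective below embeds the margin constraints as hinge terms, and a valid sub-gradient handles the kink at margin = 1 (the toy data, step size, and iteration count are made up for illustration):

```python
import numpy as np

def hinge_loss(w, X, y, C=1.0):
    """Soft-margin SVM objective: margin violations plus L2 regularization."""
    margins = np.maximum(0.0, 1.0 - y * (X @ w))
    return C * margins.sum() + 0.5 * w @ w

def hinge_subgrad(w, X, y, C=1.0):
    """A valid sub-gradient: at the kink (margin exactly 1) we may pick 0."""
    active = (y * (X @ w)) < 1.0                 # samples violating the margin
    return -C * (X[active] * y[active, None]).sum(axis=0) + w

X = np.array([[2.0, 1.0], [-1.0, -2.0]])        # two linearly separable points
y = np.array([1.0, -1.0])
w = np.zeros(2)
for _ in range(200):
    w -= 0.05 * hinge_subgrad(w, X, y)
print(hinge_loss(w, X, y))   # far below the loss of 2.0 at w = 0
```

Even though max(0, ·) is not differentiable at the kink, any element of the sub-differential gives a valid descent direction, which is exactly why sub-gradient methods work here.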
This video explains how to derive L2 Loss and Cross-Entropy Loss from statistical assumptions. Highly relevant for the exam!
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
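The punchline of the derivation is that each loss is a negative log-likelihood under a noise model: Gaussian noise yields the L2 loss, a Bernoulli label model yields cross-entropy. A minimal numeric sketch (example predictions and labels are made up):

```python
import numpy as np

def l2_loss(y_hat, y):
    """Negative log-likelihood of a Gaussian noise model (up to constants)."""
    return 0.5 * np.sum((y_hat - y) ** 2)

def cross_entropy(p_hat, y, eps=1e-12):
    """Negative log-likelihood of a Bernoulli model for labels y in {0, 1}."""
    p_hat = np.clip(p_hat, eps, 1 - eps)   # guard against log(0)
    return -np.sum(y * np.log(p_hat) + (1 - y) * np.log(1 - p_hat))

y = np.array([1.0, 0.0, 1.0])
p = np.array([0.9, 0.1, 0.8])
print(l2_loss(p, y))        # 0.03: small for a good fit
print(cross_entropy(p, y))  # about 0.434
```

Maximizing the likelihood of the data is the same as minimizing these sums, which is how the statistical assumption turns into a training loss.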
This video explains backpropagation at the level of layer abstraction.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
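The layer abstraction can be sketched as a module with a `forward` that caches its input and a `backward` that returns gradients; the fully connected layer below is an illustrative example (class name and shapes are my own choices):

```python
import numpy as np

class Linear:
    """A layer as a module: forward caches inputs, backward returns gradients."""
    def __init__(self, n_in, n_out, rng):
        self.W = rng.standard_normal((n_in, n_out)) * 0.1
        self.b = np.zeros(n_out)
    def forward(self, x):
        self.x = x                        # cache the input for the backward pass
        return x @ self.W + self.b
    def backward(self, d_out):
        self.dW = self.x.T @ d_out        # gradient w.r.t. the weights
        self.db = d_out.sum(axis=0)       # gradient w.r.t. the bias
        return d_out @ self.W.T           # gradient passed to the layer below

rng = np.random.default_rng(0)
layer = Linear(3, 2, rng)
x = rng.standard_normal((4, 3))           # a batch of 4 inputs
out = layer.forward(x)
d_x = layer.backward(np.ones_like(out))   # pretend the upstream gradient is all ones
print(layer.dW.shape, d_x.shape)          # (3, 2) (4, 3)
```

Chaining such modules is all a deep learning framework does: each layer only needs its local derivatives, and backpropagation threads them together.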
This video introduces the basics of the backpropagation algorithm.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
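At its core, backpropagation is the chain rule applied to a composition of simple functions; the scalar example below (a made-up function, not from the lecture) shows the forward/backward pattern:

```python
import numpy as np

# f(x) = (sin x)^2, decomposed into primitives: u = sin x, f = u^2
x = 0.7
u = np.sin(x)           # forward pass: compute and store intermediates
f = u ** 2

df_du = 2 * u           # backward pass: local derivative of the outer function
du_dx = np.cos(x)       # local derivative of the inner function
df_dx = df_du * du_dx   # chain rule: df/dx = 2 sin(x) cos(x) = sin(2x)

print(np.isclose(df_dx, np.sin(2 * x)))  # True: matches the analytic derivative
```

Backpropagation is exactly this, scaled up: store intermediates on the way forward, then multiply local derivatives on the way back.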
This video introduces the topics of activation functions, loss, and the idea of gradient descent.
Watch on: FAU TV | FAU TV (no memes) | YouTube
Read the Transcript (Summer 2020) at: LME | Towards Data Science
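All three ideas fit into one tiny example: a single neuron with a sigmoid activation, an L2 loss, and plain gradient descent (input, target, learning rate, and step count are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, y = 2.0, 1.0          # one input and its target
w, lr = 0.0, 1.0         # one weight, tuned by gradient descent
for _ in range(100):
    y_hat = sigmoid(w * x)                           # activation
    grad_w = (y_hat - y) * y_hat * (1 - y_hat) * x   # d(L2 loss)/dw via the chain rule
    w -= lr * grad_w                                 # gradient descent step
print(sigmoid(w * x))    # moves toward the target 1.0
```

Everything in the later videos (better activations, better losses, better optimizers) refines one of the three lines inside this loop.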