Deep Learning



Time and place:

Information regarding online teaching will be added to the StudOn course.

  • Tue 12:15-13:45, Room H4

Fields of study

  • WPF INF-MA from SEM 1
  • WPF MT-MA-BDV from SEM 1

Prerequisites / Organizational information

The following lectures are recommended:

  • Introduction to Pattern Recognition (IntroPR)

  • Pattern Recognition (PR)

Application via


Deep Learning (DL) has attracted much interest from both academia and industry in a wide range of applications such as image recognition, speech recognition, and artificial intelligence.
This lecture introduces the core elements of neural networks and deep learning. It comprises:

  • (multilayer) perceptron, backpropagation, fully connected neural networks

  • loss functions and optimization strategies

  • convolutional neural networks (CNNs)

  • activation functions

  • regularization strategies

  • common practices for training and evaluating neural networks

  • visualization of networks and results

  • common architectures, such as LeNet, AlexNet, VGG, GoogLeNet

  • recurrent neural networks (RNN, TBPTT, LSTM, GRU)

  • deep reinforcement learning

  • unsupervised learning (autoencoder, RBM, DBM, VAE)

  • generative adversarial networks (GANs)

  • weakly supervised learning

  • applications of deep learning (segmentation, object detection, speech recognition, ...)
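Several of these topics (the multilayer perceptron, backpropagation, a sigmoid activation, a squared-error loss, and gradient-descent optimization) can be illustrated in a single minimal NumPy sketch. Layer sizes, learning rate, and iteration count are illustrative choices, not prescriptions from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # activation function; its derivative s * (1 - s) appears in the backward pass
    return 1.0 / (1.0 + np.exp(-x))

# XOR is not linearly separable, so a single-layer perceptron cannot solve it,
# but one hidden layer suffices.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# fully connected network: 2 inputs -> 8 hidden units -> 1 output
W1 = rng.normal(size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)
lr = 1.0  # learning rate (illustrative)

for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: chain rule through the MSE loss and both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient-descent parameter update
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(pred.ravel())
```

The exercises cover such implementations in far more depth; this sketch only shows how the forward pass, loss gradient, and weight updates fit together.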

The accompanying exercises will provide a deeper understanding of the workings and architecture of neural networks.

Recommended Literature

  • Ian Goodfellow, Yoshua Bengio, Aaron Courville: Deep Learning. MIT Press, 2016
  • Christopher Bishop: Pattern Recognition and Machine Learning. Springer, Heidelberg, 2006
  • Yann LeCun, Yoshua Bengio, Geoffrey Hinton: Deep learning. Nature 521, 436–444 (28 May 2015)

Additional information

Keywords: deep learning; machine learning

Expected participants: 120