Pattern Recognition WS 20/21

Watch now: Pattern Recognition: Episode 39 – The Viola-Jones Algorithm (WS 20/21)

In this video, we show how Adaboost is used in the Viola-Jones Algorithm for face detection. Watch on: FAU TV | YouTube
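A key ingredient of the Viola-Jones detector is that its Haar-like rectangle features can be evaluated in constant time with an integral image (summed-area table). A minimal sketch of that idea, assuming a toy two-rectangle feature; function names and the example image are illustrative, not the original implementation:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row/column prepended."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def rect_sum(ii, r, c, h, w):
    """Sum over the h-by-w rectangle with top-left corner (r, c): 4 lookups."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def haar_two_rect(ii, r, c, h, w):
    """Two-rectangle Haar-like feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, r, c, h, half) - rect_sum(ii, r, c + half, h, half)

# toy 4x4 "image" (hypothetical data, just to exercise the lookups)
img = np.arange(16, dtype=float).reshape(4, 4)
ii = integral_image(img)
full = rect_sum(ii, 0, 0, 4, 4)        # whole-image sum
feat = haar_two_rect(ii, 0, 0, 4, 4)   # left half minus right half
```

Once the table is built, every rectangle sum costs four array lookups regardless of rectangle size, which is what lets the cascade scan many windows quickly.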

Watch now: Pattern Recognition: Episode 38 – Adaboost & Exponential Loss (WS 20/21)

In this video, we show that Adaboost is actually optimizing the exponential loss. Watch on: FAU TV | YouTube
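The connection can be checked numerically: for a fixed weak learner with weighted error ε, the α that minimizes the exponential loss has the closed form α* = ½ ln((1−ε)/ε), which is exactly the Adaboost classifier weight. A small sketch with hypothetical toy labels and predictions:

```python
import numpy as np

# Exponential loss of a weighted vote F(x) = sum_m alpha_m * h_m(x):
#   L = sum_i w_i * exp(-y_i * alpha * h(x_i))
y = np.array([+1, +1, -1, -1, +1])    # toy labels (hypothetical data)
h = np.array([+1, -1, -1, +1, +1])    # one weak learner's predictions
w = np.full(5, 0.2)                    # uniform sample weights
eps = w[h != y].sum()                  # weighted error (here 0.4)

def exp_loss(alpha):
    return np.sum(w * np.exp(-alpha * y * h))

# closed-form minimizer (the Adaboost update) vs. a brute-force grid search
alpha_star = 0.5 * np.log((1 - eps) / eps)
grid = np.linspace(0.01, 2.0, 2000)
alpha_num = grid[np.argmin([exp_loss(a) for a in grid])]
```

The grid minimizer lands on the closed-form value, illustrating that the Adaboost weight update is the exact minimizer of the exponential loss for a fixed weak learner.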

Watch now: Pattern Recognition: Episode 37 – Adaboost – Concept (WS 20/21)

In this video, we introduce the Adaboost Algorithm, which fuses many weak classifiers into a strong one. Watch on: FAU TV | YouTube
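The core loop of Adaboost is short: fit a weak learner on weighted data, weight it by its error, and up-weight the samples it got wrong. A minimal sketch with 1-D threshold stumps as weak learners (data and helper names are illustrative only):

```python
import numpy as np

def adaboost(X, y, n_rounds=5):
    """Minimal Adaboost with threshold stumps on 1-D data, labels in {-1, +1}."""
    n = len(X)
    w = np.full(n, 1.0 / n)                  # sample weights, initially uniform
    stumps = []                               # list of (threshold, polarity, alpha)
    for _ in range(n_rounds):
        # pick the stump (threshold, polarity) with lowest weighted error
        best = None
        for thr in X:
            for pol in (+1, -1):
                pred = pol * np.sign(X - thr + 1e-12)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, pol, pred)
        err, thr, pol, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak classifier
        w *= np.exp(-alpha * y * pred)          # up-weight misclassified samples
        w /= w.sum()
        stumps.append((thr, pol, alpha))
    return stumps

def predict(stumps, X):
    """Strong classifier: sign of the alpha-weighted vote of all stumps."""
    score = sum(a * p * np.sign(X - t + 1e-12) for t, p, a in stumps)
    return np.sign(score)

X = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([-1.0, -1.0, -1.0, 1.0, 1.0, 1.0])
stumps = adaboost(X, y)
```

The exhaustive stump search is quadratic and only meant for tiny examples; the point is the re-weighting loop, which is the algorithm itself.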

Watch now: Pattern Recognition: Episode 36 – Performance Measures on Finite Data (WS 20/21)

In this video, we look into methods for estimating reliable performance measures on finite data, such as the jackknife, bootstrapping, and cross-validation. Watch on: FAU TV | YouTube
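Of these, k-fold cross-validation is the most common: every sample is tested exactly once, by a model that never saw it during training. A minimal sketch, using a hypothetical nearest-mean toy classifier on separable 1-D data:

```python
import numpy as np

def k_fold_cv(X, y, fit, predict, k=5, seed=0):
    """Accuracy estimate by k-fold cross-validation on finite data."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train], y[train])                   # train without fold i
        accs.append(np.mean(predict(model, X[test]) == y[test]))
    return float(np.mean(accs))

# nearest-mean toy classifier (illustrative, not from the lecture)
def fit(Xtr, ytr):
    return {c: Xtr[ytr == c].mean() for c in np.unique(ytr)}

def predict(model, Xte):
    classes = np.array(list(model))
    means = np.array([model[c] for c in classes])
    return classes[np.abs(Xte[:, None] - means[None, :]).argmin(axis=1)]

X = np.concatenate([np.linspace(0, 1, 10), np.linspace(5, 6, 10)])
y = np.array([0] * 10 + [1] * 10)
acc = k_fold_cv(X, y, fit, predict, k=5)
```

The jackknife is the special case k = n (leave-one-out); bootstrapping instead resamples the training set with replacement and tests on the samples left out.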

Watch now: Pattern Recognition: Episode 35 – No Free Lunch Theorem & Bias-Variance Trade-off (WS 20/21)

In this video, we introduce the "no free lunch" theorem and the bias-variance trade-off. Watch on: FAU TV | YouTube
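The bias-variance part can be made concrete with a simulation: the mean squared error of an estimator decomposes exactly into squared bias plus variance. A sketch estimating the mean μ of N(μ, 1) with a deliberately shrunken estimator c·x̄ (all numbers here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, c = 2.0, 10, 0.8
# 100000 repetitions of: draw n samples, take the shrunken sample mean
xbar = rng.normal(mu, 1.0, size=(100_000, n)).mean(axis=1)
est = c * xbar                         # biased, but lower-variance estimator

bias2 = (est.mean() - mu) ** 2         # ~ ((c - 1) * mu)^2 = 0.16
var = est.var()                        # ~ c^2 / n     = 0.064
mse = np.mean((est - mu) ** 2)         # equals bias2 + var
```

Shrinking (c < 1) trades bias for variance; which side of the trade-off wins depends on μ and n, which is the trade-off in miniature.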

Watch now: Pattern Recognition: Episode 34 – Measures of Non-Gaussianity (WS 20/21)

In this video, we discuss three measures for quantifying "non-Gaussianity". Watch on: FAU TV | YouTube
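One of the simplest such measures is the excess kurtosis, E[s⁴]/E[s²]² − 3, which is zero for a Gaussian, negative for sub-Gaussian densities (e.g. uniform) and positive for super-Gaussian ones (e.g. Laplacian). A quick numerical sketch with synthetic samples:

```python
import numpy as np

def excess_kurtosis(s):
    """Kurtosis-based non-Gaussianity: E[s^4] / E[s^2]^2 - 3 (0 for a Gaussian)."""
    s = s - s.mean()
    return np.mean(s**4) / np.mean(s**2) ** 2 - 3.0

rng = np.random.default_rng(0)
g = rng.standard_normal(200_000)         # Gaussian:           ~  0
u = rng.uniform(-1, 1, 200_000)          # sub-Gaussian:       ~ -1.2
lap = rng.laplace(size=200_000)          # super-Gaussian:     ~ +3
```

Kurtosis is cheap but sensitive to outliers, which is why robust alternatives such as negentropy approximations are also used in this context.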

Watch now: Pattern Recognition: Episode 33 – Independent Component Analysis and Gaussianity (WS 20/21)

In this video, we discuss the importance of non-Gaussianity for the extraction of independent components. Watch on: FAU TV | YouTube

Watch now: Pattern Recognition: Episode 32 – Independent Component Analysis – Introduction (WS 20/21)

In this video, we introduce the concept of independent component analysis. Watch on: FAU TV | YouTube

Watch now: Pattern Recognition: Episode 31 – EM Algorithm Example (WS 20/21)

In this video, we show how to apply the EM Algorithm to Magnetic Resonance Imaging for simultaneous bias field correction and image segmentation. Watch on: FAU TV | YouTube

Watch now: Pattern Recognition: Episode 30 – Expectation Maximization Algorithm (WS 20/21)

In this video, we analyze the expectation-maximization algorithm and relate it to Kullback-Leibler statistics in the context of the missing information principle. Watch on: FAU TV | YouTube
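The textbook instance of EM is the Gaussian mixture: the E-step computes posterior responsibilities of the components, the M-step re-estimates the parameters from the responsibility-weighted data. A minimal sketch for a two-component 1-D mixture (initialization and test data are illustrative choices, not from the lecture):

```python
import numpy as np

def em_gmm_1d(x, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture."""
    mu = np.array([x.min(), x.max()])        # crude but effective initialization
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each sample
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
                  / (sigma * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: maximize the expected complete-data log-likelihood
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        pi = nk / len(x)
    return mu, sigma, pi

# hypothetical data: two well-separated Gaussians
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 2000), rng.normal(6.0, 1.0, 2000)])
mu, sigma, pi = em_gmm_1d(x)
```

Each iteration provably does not decrease the data log-likelihood, which is the property the lecture derives via the Kullback-Leibler bound.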