Classification problems that evolve over time require classifiers that can adapt to previously unseen classes as well as to changes within the known classes. This is called "open world classification" and is a well-established research topic [1, 2, 3, 4, 5]. A good example is face recognition: a classifier for this task should recognize previously unseen persons and assign them to a new class. But the face of a person may also change over time, which is where incremental updates are better suited than retraining the complete model. While many classifiers perform well on the static task, it is not straightforward to turn them into efficient incremental update algorithms. The Extreme Value Machine (EVM) [6] is an open set classifier that supports incremental learning and rests on a strong theoretical foundation. It uses calibrations based on extreme value theory to fit Weibull distributions around each sample. Compared to classification via a similarity function and thresholds, the EVM yields naturally softer and more robust decision bounds. It is also less sensitive to noise and performs well, especially in open space [6].
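The per-sample Weibull fit described above can be illustrated with a minimal sketch (not the authors' implementation; the function names and the threshold-based rejection are illustrative). Each sample carries a shape (kappa) and scale (lambda) parameter fitted via extreme value theory; the inclusion probability decays smoothly with distance, which is what produces the softer bounds compared to a hard similarity threshold:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Inclusion probability of a query at the given distance from a sample,
// under that sample's fitted Weibull parameters: 1 at distance 0,
// decaying smoothly towards 0 (a reverse-Weibull-style tail).
double psi(double distance, double kappa, double lambda) {
    return std::exp(-std::pow(distance / lambda, kappa));
}

// Score of a query against one class: the best inclusion probability over
// that class's extreme vectors, given precomputed distances and per-vector
// Weibull parameters. A query whose best score over all classes falls
// below a threshold can be rejected as unknown (the open set decision).
double classScore(const std::vector<double>& distances,
                  const std::vector<double>& kappas,
                  const std::vector<double>& lambdas) {
    double best = 0.0;
    for (std::size_t i = 0; i < distances.size(); ++i)
        best = std::max(best, psi(distances[i], kappas[i], lambdas[i]));
    return best;
}
```

Because psi is a smooth function of the distance rather than a hard cutoff, a single noisy sample shifts the decision boundary only gradually.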
Although the EVM can be used incrementally, no efficient update mechanism is provided. Transferring an algorithm to incremental updates raises several direct problems, such as finding an efficient update function or coping with limited memory: one cannot store all previously seen samples. Therefore, a model reduction algorithm that approximates the set cover problem is proposed for the EVM. There are also several indirect problems, such as concept drift, which occurs when the samples of a class change, either gradually or abruptly. The model is able to adapt to such drift, but this brings another challenge, the so-called "stability-plasticity dilemma" [8]: the trade-off between a fast and a slow adaptation to change within a class. A fast adaptation can result in unwanted adaptation to noise, whereas a slow adaptation can miss cyclical or continuously gradual change in the data. Furthermore, the model reduction used in the EVM can lead to unbalanced classes and, in the extreme case, to the nearly complete removal of some classes. This is called "catastrophic forgetting" [8]. These problems are not directly solved by the EVM and will also be assessed in this work.
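The set-cover-based model reduction mentioned above can be sketched with the standard greedy approximation (a sketch under the assumption that each candidate extreme vector "covers" the set of sample indices it includes above a probability threshold; the function name and interface are illustrative, not taken from the EVM code):

```cpp
#include <cstddef>
#include <vector>

// Greedy set-cover approximation for model reduction: covers[i] lists the
// sample indices that candidate extreme vector i includes above the
// threshold. Repeatedly keep the candidate that covers the most
// still-uncovered samples, until all nSamples are covered (or no candidate
// adds coverage). Returns the indices of the kept candidates.
std::vector<std::size_t> greedyReduce(
        const std::vector<std::vector<std::size_t>>& covers,
        std::size_t nSamples) {
    std::vector<bool> covered(nSamples, false);
    std::vector<bool> used(covers.size(), false);
    std::vector<std::size_t> kept;
    std::size_t nCovered = 0;
    while (nCovered < nSamples) {
        std::size_t best = covers.size();
        std::size_t bestGain = 0;
        for (std::size_t i = 0; i < covers.size(); ++i) {
            if (used[i]) continue;
            std::size_t gain = 0;
            for (std::size_t s : covers[i])
                if (!covered[s]) ++gain;
            if (gain > bestGain) { bestGain = gain; best = i; }
        }
        if (best == covers.size()) break;  // remaining samples are uncoverable
        used[best] = true;
        kept.push_back(best);
        for (std::size_t s : covers[best])
            if (!covered[s]) { covered[s] = true; ++nCovered; }
    }
    return kept;
}
```

The sketch also makes the forgetting risk visible: a class whose samples are all covered by a few vectors of neighboring candidates can end up represented by very few, or in the extreme case no, extreme vectors of its own.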
In this thesis, the EVM will be extended by incremental update mechanisms. Starting from an exact naive approach, approximate algorithms will then be tested for both efficiency and accuracy. One of the main efficiency problems is that, in the naive EVM algorithm, adding new samples requires updating all previously trained points. Likewise, updating an existing sample requires comparing it to the samples of all other classes. The first optimizations are based on these two properties: first, not all previously trained samples have to be updated, and the challenge is to find the affected ones at the sample and class level; second, not all feature points have to be compared against while (re-)training another feature point. These variants will then be evaluated on different datasets, one important one being a face dataset, which exhibits the challenges stated above.
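The cost of the naive update can be sketched as follows (a simplified illustration, assuming Euclidean distances and tracking only each sample's nearest negative; the names Sample, dist, and naiveUpdate are hypothetical, not from the EVM code). Inserting one sample forces a distance check against every stored sample of every other class, and each sample whose margin shrinks would need its Weibull refitted:

```cpp
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// One stored training point with its distance to the closest negative
// (i.e. closest sample of any other class), which governs its Weibull fit.
struct Sample {
    std::vector<double> x;
    double nearestNegative = std::numeric_limits<double>::infinity();
};

double dist(const std::vector<double>& a, const std::vector<double>& b) {
    double s = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i)
        s += (a[i] - b[i]) * (a[i] - b[i]);
    return std::sqrt(s);
}

// Naive update on inserting newSample into class c: every sample of every
// other class must be checked, since newSample is a new negative for it.
// Returns how many stored samples would need a Weibull refit.
std::size_t naiveUpdate(std::vector<std::vector<Sample>>& classes,
                        std::size_t c, const Sample& newSample) {
    std::size_t refits = 0;
    for (std::size_t k = 0; k < classes.size(); ++k) {
        if (k == c) continue;  // same-class samples are not negatives
        for (Sample& s : classes[k]) {
            double d = dist(s.x, newSample.x);
            if (d < s.nearestNegative) {  // margin shrank: tail changed
                s.nearestNegative = d;
                ++refits;  // stands in for refitting the Weibull here
            }
        }
    }
    return refits;
}
```

The planned optimizations attack exactly the two loops in this sketch: pruning which stored samples need to be revisited, and which samples need to be compared against during a refit.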
The thesis consists of the following milestones:
• Implementation of the exact incremental update algorithm.
• Evaluation of performance on MNIST, ImageNet, and a face dataset.
• Development of approximate update algorithms.
• Further experiments regarding the learning and forgetting procedures.
The implementation should be done in C++.
[1] Walter Scheirer, Anderson Rocha, Archana Sapkota, and Terrance Boult. Toward Open Set Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35:1757–1772, July 2013.
[2] Abhijit Bendale and Terrance E. Boult. Towards Open Set Deep Networks. CoRR, abs/1511.06233, 2015.
[3] Pedro Ribeiro Mendes Júnior, Jacques Wainer, and Anderson Rocha. Specialized Support Vector Machines for Open-Set Recognition. CoRR, abs/1606.03802, 2016.
[4] Ziwei Liu, Zhongqi Miao, Xiaohang Zhan, Jiayun Wang, Boqing Gong, and Stella X. Yu. Large-Scale Long-Tailed Recognition in an Open World. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
[5] Zongyuan Ge, Sergey Demyanov, Zetao Chen, and Rahil Garnavi. Generative OpenMax for Multi-Class Open Set Classification. In British Machine Vision Conference (BMVC), 2017.
[6] Ethan M. Rudd, Lalit P. Jain, Walter J. Scheirer, and Terrance E. Boult. The Extreme Value Machine. CoRR.
[7] Manuel Günther, Steve Cruz, Ethan M. Rudd, and Terrance E. Boult. Toward Open-Set Face Recognition. CoRR, abs/1705.01567, 2017.
[8] Alexander Gepperth and Barbara Hammer. Incremental Learning Algorithms and Applications. In European Symposium on Artificial Neural Networks (ESANN), 2016.