Random Forests for Manifold Learning
Description: There are many methods for manifold learning, such as Locally Linear Embedding, MDS, Isomap, or Laplacian Eigenmaps. All of them build some form of local neighborhood to approximate the relationships in the data locally and then search for a lower-dimensional representation that preserves these local relationships. One way to obtain such local neighborhoods is to learn a partitioning of the feature space by training a density forest on the data [1]. The goal of this project is to implement a Manifold Forest algorithm that recovers a 1-D signal of length N from a series of N input images, by first learning a density forest on the data and then applying Laplacian Eigenmaps to the resulting affinities. Existing frameworks such as [2], [3], or [4] can be used as the forest implementation. The Laplacian Eigenmaps algorithm is already implemented and can be integrated.
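As a rough illustration of the intended pipeline, the following minimal sketch assumes scikit-learn's RandomTreesEmbedding (reference [4]) as the forest implementation and uses SpectralEmbedding as a stand-in for the existing Laplacian Eigenmaps code; the function name, parameter values, and array shapes are illustrative assumptions, not part of the project specification.

```python
import numpy as np
from sklearn.ensemble import RandomTreesEmbedding
from sklearn.manifold import SpectralEmbedding


def manifold_forest_1d(images, n_estimators=200, max_depth=5, random_state=0):
    """Recover a 1-D signal (one value per image) from a stack of N images.

    images: array of shape (N, H, W); each image is flattened into a feature vector.
    """
    X = images.reshape(len(images), -1)

    # Fit an unsupervised forest; each tree partitions the feature space into leaves.
    forest = RandomTreesEmbedding(
        n_estimators=n_estimators, max_depth=max_depth, random_state=random_state
    )
    # Z is a sparse one-hot encoding of the leaf each sample falls into, per tree.
    Z = forest.fit_transform(X)

    # Forest affinity: fraction of trees in which two samples share a leaf.
    affinity = (Z @ Z.T).toarray() / n_estimators

    # Laplacian Eigenmaps on the forest affinity, keeping a single component.
    embedding = SpectralEmbedding(n_components=1, affinity="precomputed")
    return embedding.fit_transform(affinity).ravel()
```

For a stack of N images loaded as a NumPy array of shape (N, H, W), calling manifold_forest_1d(images) returns one scalar per image; plotting these values against the image index should reveal the underlying 1-D signal up to sign and scale.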
The concept of Manifold Forests is also introduced in the FAU lecture Pattern Analysis by Christian Riess; candidates who have already attended this lecture are therefore preferred.
This project is intended for students looking for a 5 ECTS module such as a research internship, starting now or as soon as possible. The project will be implemented in Python.
References:
[1]: Criminisi, A., Shotton, J., & Konukoglu, E. (2012). Decision Forests: A Unified Framework for Classification, Regression, Density Estimation, Manifold Learning and Semi-Supervised Learning. Foundations and Trends® in Computer Graphics and Vision, 7(2–3), 81–227. https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/CriminisiForests_FoundTrends_2011.pdf
[2]: https://github.com/CyrilWendl/SIE-Master
[3]: https://github.com/ksanjeevan/randomforest-density-python
[4]: https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomTreesEmbedding.html#sklearn.ensemble.RandomTreesEmbedding