Science Talks
Known Operator Learning – Towards Integration of Prior Knowledge into Machine Learning
The talk introduces known operator learning, a technique for incorporating prior knowledge into machine learning and deep learning approaches. It is compatible with virtually all other deep learning methods and offers guaranteed reductions of the maximum error bound. The talk presents several applications and ends with an outlook on how these ideas can be expanded in the future.
Experiments in the section on deep learning computed tomography can be found at https://doi.org/10.24433/CO.2164960.v1. The code on learning vesselness in the section on learning from heuristic algorithms is published at https://doi.org/10.24433/CO.5016803.v2. The code for the section on deriving networks is available at https://doi.org/10.24433/CO.8086142.v2.
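For illustration only (this is not the code from the linked capsules), the following minimal sketch shows the core idea in PyTorch: a known operator is embedded as a frozen, non-trainable layer, and only the remaining layers are trained. The class name, the toy identity operator, and all parameters are assumptions chosen for this sketch.

import torch
import torch.nn as nn

class KnownOperatorNet(nn.Module):
    """Illustrative sketch: a fixed, analytically known linear operator is
    embedded as a non-trainable layer; only the remaining layer is trained."""
    def __init__(self, known_matrix: torch.Tensor, n_out: int):
        super().__init__()
        # Known operator: weights come from prior knowledge and stay frozen.
        self.known = nn.Linear(known_matrix.shape[1], known_matrix.shape[0], bias=False)
        with torch.no_grad():
            self.known.weight.copy_(known_matrix)
        self.known.weight.requires_grad_(False)
        # Trainable part: the only source of free parameters in this sketch.
        self.trainable = nn.Linear(known_matrix.shape[0], n_out)

    def forward(self, x):
        return self.trainable(torch.relu(self.known(x)))

# Toy usage with a placeholder "known" operator (identity, 16 x 16).
net = KnownOperatorNet(torch.eye(16), n_out=4)
optimizer = torch.optim.Adam([p for p in net.parameters() if p.requires_grad], lr=1e-3)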
Reference:
Maier, Andreas K., et al. “Learning with known operators reduces maximum error bounds.” Nature Machine Intelligence 1.8 (2019): 373–380. https://www.nature.com/articles/s42256-019-0077-5
Hinge Loss, SVMs, and the Loss of Users
The hinge loss is a useful loss function for training neural networks and a convex relaxation of the 0/1 cost function. There is also a direct relation to soft-margin support vector machines, as we demonstrate in this short tutorial. With this in mind, we can build new, powerful loss functions for neural network training, such as the user loss. See the full set of slides at https://www5.cs.fau.de/index.php?id=3937.
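As a small illustration (not taken from the referenced works or slides), the sketch below compares the 0/1 cost with its convex relaxation, the hinge loss max(0, 1 − y·f(x)), and trains a linear classifier by subgradient descent on the regularized hinge loss, which corresponds to a linear soft-margin SVM. The toy data and all names are chosen for this example.

import numpy as np

def zero_one_loss(y, score):
    return (y * score <= 0).astype(float)      # 1 if misclassified, else 0

def hinge_loss(y, score):
    return np.maximum(0.0, 1.0 - y * score)    # convex upper bound on the 0/1 loss

# Toy data: labels in {-1, +1}, roughly linearly separable.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))

# Subgradient descent on the regularized mean hinge loss = linear soft-margin SVM.
w, b, lam, lr = np.zeros(2), 0.0, 1e-2, 0.1
for _ in range(200):
    scores = X @ w + b
    active = (y * scores < 1).astype(float)    # samples that violate the margin
    grad_w = lam * w - (active * y) @ X / len(y)
    grad_b = -np.mean(active * y)
    w, b = w - lr * grad_w, b - lr * grad_b

print("mean hinge loss:", hinge_loss(y, X @ w + b).mean())
print("0/1 training error:", zero_one_loss(y, X @ w + b).mean())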
References:
[1] Vincent Christlein. Hand-written Document Analysis with Focus on Writer Identification and Writer Retrieval. PhD Thesis. Friedrich-Alexander-University Erlangen-Nuremberg, 2018.
[2] Shahab Zarei, Bernhard Stimpel, Christopher Syben, Andreas Maier. User Loss – A Forced-Choice-Inspired Approach to Train Neural Networks directly by User Interaction. Under Review. https://arxiv.org/abs/1807.09303
[3] Andreas Maier, Frank Schebesch, Christopher Syben, Tobias Würfl, Stefan Steidl, Jang-Hwan Choi, Rebecca Fahrig. Precision Learning: Towards Use of Known Operators in Neural Networks. International Conference on Pattern Recognition ICPR 2018 (to appear). https://arxiv.org/abs/1712.00374
Andre Aichert – Epipolar Consistency in Transmission Imaging
Epipolar consistency is a fundamental geometric relation between any pair of X-ray projection images. Andre Aichert gives an introduction to the topic and an overview of his contributions to medical imaging. Applications include motion compensation, tracking, and 2-D/3-D registration.
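As a brief sketch of the underlying relation (notation chosen here for illustration, weighting details omitted): every epipolar plane of the two views, parameterized by an angle κ, defines a pair of corresponding epipolar lines l0(κ) and l1(κ) in the two projection images. Grangeat's theorem links the radial derivative of the 2-D Radon transform of each projection along these lines to the same derivative of the object's 3-D Radon transform, so a consistent geometry requires

    ∂/∂t ρ0(l0(κ)) = ∂/∂t ρ1(l1(κ))   for all κ,

where ρ0 and ρ1 are the 2-D Radon transforms of the two images and t is the radial line parameter. A consistency metric accumulates the squared differences over all epipolar planes; motion or calibration errors show up as violations of this equality.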
Reference:
Aichert, A., Berger, M., Wang, J., Maass, N., Dörfler, A., Hornegger, J., & Maier, A. (2015). Epipolar Consistency in Transmission Imaging. IEEE Transactions on Medical Imaging, 34(10), 1-15. https://dx.doi.org/10.1109/TMI.2015.2426417
Jian Wang – Robust 2-D/3-D Registration for Real-Time Patient Motion Compensation
Presentation by Jian Wang on robust 2-D/3-D registration, summarising his Ph.D. work. The presentation includes an introduction to the point-to-plane correspondence model, which is valid for X-ray projection imaging, as well as depth-aware 2-D/3-D registration.
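To illustrate the flavor of point-to-plane constraints (a generic least-squares linearization as used in ICP-style registration, not the exact PPC metric of the referenced paper), the sketch below estimates a small rigid motion from point-to-plane correspondences; in the PPC model such planes arise from the X-ray projection geometry, whereas here random toy data stands in. Function and variable names are assumptions for this example.

import numpy as np

def solve_rigid_update(points, normals, plane_points):
    """points: Nx3 surface points; normals: Nx3 unit plane normals;
    plane_points: Nx3 points on the corresponding planes.
    Returns a small rotation (axis-angle) and translation minimizing the
    point-to-plane residuals in a linearized least-squares sense."""
    A = np.hstack([np.cross(points, normals), normals])          # N x 6 system matrix
    b = -np.einsum('ij,ij->i', normals, points - plane_points)   # signed plane distances
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]                                          # omega, t

# Toy usage with random correspondences (hypothetical data).
rng = np.random.default_rng(1)
p = rng.normal(size=(50, 3))
n = rng.normal(size=(50, 3))
n /= np.linalg.norm(n, axis=1, keepdims=True)
q = p + 0.01 * rng.normal(size=(50, 3))
omega, t = solve_rigid_update(p, n, q)
print("rotation update:", omega, "translation update:", t)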
Reference:
Wang, Jian, et al. “Dynamic 2-D/3-D rigid registration framework using point-to-plane correspondence model.” IEEE Transactions on Medical Imaging 36.9 (2017): 1939–1954. https://www5.informatik.uni-erlangen.de/Forschung/Publikationen/2017/Wang17-D2R.pdf
Yixing Huang – Compressed Sensing and Machine Learning for Limited Angle Tomography
Presentation by Yixing Huang on iterative and deep learning reconstruction, summarising his Ph.D. work. The presentation includes an introduction to iteratively re-weighted image reconstruction and to deep learning-based reconstruction for limited angle tomography.
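As a generic illustration of the iterative re-weighting idea (not the specific reconstruction algorithm of the thesis), the sketch below runs iteratively re-weighted least squares with a sparsity-promoting penalty on a toy underdetermined system, which is the basic mechanism behind re-weighted l1/TV approaches in compressed sensing. All names, parameters, and the toy data are assumptions for this example.

import numpy as np

def irls_sparse(A, b, lam=1e-2, eps=1e-3, n_outer=20):
    """Iteratively re-weighted least squares with a sparsity-promoting penalty:
    each outer iteration re-weights the quadratic penalty with 1/(|x| + eps),
    approximating an l1-type prior as used in compressed sensing."""
    x = np.zeros(A.shape[1])
    for _ in range(n_outer):
        w = 1.0 / (np.abs(x) + eps)                        # re-weighting step
        # Weighted ridge problem: (A^T A + lam * diag(w)) x = A^T b
        x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
    return x

# Toy underdetermined system with a sparse ground truth (hypothetical data).
rng = np.random.default_rng(2)
A = rng.normal(size=(40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]
x_rec = irls_sparse(A, A @ x_true)
print("largest recovered coefficients at indices:", np.argsort(-np.abs(x_rec))[:3])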
Reference:
Huang, Yixing. Consistency Conditions, Compressed Sensing and Machine Learning for Limited Angle Tomography. Dissertation, 2020. https://opus4.kobv.de/opus4-fau/frontdoor/index/index/docId/13203