Jingfeng Han

Hellmut Erzigkeit, DGPh, Geromed GmbH

One-to-one Edge Based Registration and Segmentation Based Validations in Hybrid Imaging

Abstract

During the past decade, image registration has become an essential tool for clinical treatment: it finds the spatial mapping between two images, reveals changes in anatomical structure, and merges information from different modalities. At the same time, the matching of appropriately selected features is becoming increasingly important, both for further improving registration methods and for the qualitative validation of registration results. The purpose of this thesis is to solve the following two problems: How can feature detection be integrated into a non-rigid registration framework so that a high-quality spatial mapping is achieved? How can the quality of multi-modal registration be measured systematically by automatically segmenting corresponding features?

For the first problem, we develop a general approach based on the Mumford-Shah model that simultaneously detects the edge features of two images and jointly estimates a consistent set of transformations to match them. The entire variational model is realized in a multi-scale framework of the finite element approximation. The optimization process is guided by an EM-type algorithm and an adaptive generalized gradient flow to guarantee fast and smooth relaxation. This one-to-one edge matching is a general registration method that has been successfully adapted to image registration problems in several medical applications, for example mapping inter-subject MR data or aligning retina images from different cameras.

For the second problem, we propose a new method for validating hybrid functional and morphological image fusion, in particular for the SPECT/CT modality. It focuses on measuring the deviation between corresponding anatomical structures. Two kinds of anatomical structures are investigated as validation markers: (1) a hot spot in the functional image and its counterpart in the morphological image, and (2) the kidneys in both modalities.
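The idea of matching two images through their detected edges can be illustrated, in a heavily simplified form, by the following sketch. It uses NumPy, synthetic 2-D images, gradient-magnitude thresholding as a stand-in for the Mumford-Shah edge set, and an exhaustive search over integer translations in place of the thesis's finite-element gradient flow; all function names and parameters are hypothetical.

```python
import numpy as np

def edge_map(img, thresh=0.1):
    """Toy edge indicator: threshold the gradient magnitude.
    (Illustrative stand-in for the variational edge detection.)"""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return (mag > thresh * mag.max()).astype(float)

def match_edges(ref, mov, max_shift=5):
    """Find the integer translation (dy, dx) whose application to the
    moving image's edge map best matches the reference edge map, by
    minimizing the sum of squared differences over all candidates."""
    e_ref, e_mov = edge_map(ref), edge_map(mov)
    best, best_cost = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(e_mov, dy, axis=0), dx, axis=1)
            cost = np.sum((e_ref - shifted) ** 2)
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best

# Usage: a square translated by (-2, -3) voxels is recovered as (2, 3).
ref = np.zeros((32, 32))
ref[10:20, 10:20] = 1.0
mov = np.roll(np.roll(ref, -2, axis=0), -3, axis=1)
print(match_edges(ref, mov))
```

In the actual method, the discrete shift search is replaced by a smooth non-rigid transformation estimated jointly with the edge set, so detection and matching inform each other instead of running in sequence.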
A series of specialized methods is developed to segment these structures in both modalities with minimal user interaction. The accuracy of the validation methods has been confirmed by experiments with real clinical datasets. The inaccuracies of the hot-spot-based validation for neck regions are 0.7189 ± 0.6298 mm in the X-direction, 0.9250 ± 0.4535 mm in the Y-direction and 0.9544 ± 0.6981 mm in the Z-direction, while the inaccuracies of the kidney-based validation for abdomen regions are 1.3979 ± 0.8401 mm in the X-direction, 1.9992 ± 1.3920 mm in the Y-direction and 2.7823 ± 2.0672 mm in the Z-direction. Finally, we discuss a new interpolation-based method that effectively improves the SPECT/CT fusion and present preliminary results.
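The per-axis deviation measurement underlying the validation can be sketched as follows. This is a minimal illustration assuming NumPy, binary segmentation masks for the corresponding structure in each modality, known voxel spacing, and centroid distance as the deviation measure; the function names are hypothetical and do not reproduce the thesis's segmentation pipeline.

```python
import numpy as np

def centroid_mm(mask, spacing):
    """Centre of mass of a binary segmentation mask, converted to
    millimetres via the per-axis voxel spacing (order: Z, Y, X)."""
    idx = np.argwhere(mask)
    return idx.mean(axis=0) * np.asarray(spacing, dtype=float)

def fusion_deviation(mask_spect, mask_ct, spacing):
    """Absolute per-axis deviation between the centroids of the
    corresponding structures segmented in the two modalities."""
    return np.abs(centroid_mm(mask_spect, spacing)
                  - centroid_mm(mask_ct, spacing))

# Usage: a cube offset by (1, 2, 3) voxels at spacing (2.0, 1.0, 1.0) mm
# yields per-axis deviations of (2.0, 2.0, 3.0) mm.
mask_ct = np.zeros((16, 16, 16), dtype=bool)
mask_ct[4:8, 4:8, 4:8] = True
mask_spect = np.zeros_like(mask_ct)
mask_spect[5:9, 6:10, 7:11] = True
print(fusion_deviation(mask_spect, mask_ct, (2.0, 1.0, 1.0)))
```

Reporting the deviation separately per axis, as in the numbers above, exposes direction-dependent misalignment (e.g. a larger Z error from table motion between scans) that a single scalar distance would hide.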