Inter-Subject Shape Correspondence Computation From Medical Images Without Organ Segmentation

Bibliographic Details
Main Authors: Zhaofeng Chen, Tianshuang Qiu, Li Huo, Lijuan Yu, Hongcheng Shi, Yanjun Zhang, Hongkai Wang
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Subjects: Statistical shape model; shape modeling; shape correspondences; anatomical landmark; combined-intensity-and-landmark registration
Online Access: https://ieeexplore.ieee.org/document/8830479/
id doaj-d270b013b895498eb407cbcf9149b2b4
record_format Article
spelling doaj-d270b013b895498eb407cbcf9149b2b4
  Indexed: 2021-04-05T17:32:53Z
  Language: eng
  Publisher: IEEE
  Journal: IEEE Access (ISSN 2169-3536), 2019-01-01, Vol. 7, pp. 130772-130781
  DOI: 10.1109/ACCESS.2019.2940643
  IEEE document: 8830479
  Title: Inter-Subject Shape Correspondence Computation From Medical Images Without Organ Segmentation
  Authors:
    Zhaofeng Chen (https://orcid.org/0000-0003-3492-6656), Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, China
    Tianshuang Qiu (https://orcid.org/0000-0002-1992-1120), Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, China
    Li Huo, Department of Nuclear Medicine, Peking Union Medical College Hospital, Beijing, China
    Lijuan Yu, The Affiliated Cancer Hospital of Hainan Medical University, Haikou, China
    Hongcheng Shi, Department of Nuclear Medicine, Zhongshan Hospital, Fudan University, Shanghai, China
    Yanjun Zhang, Department of Nuclear Medicine, The First Affiliated Hospital of Dalian Medical University, Dalian, China
    Hongkai Wang (https://orcid.org/0000-0002-1813-2162), Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, China
  Abstract: Statistical shape models (SSMs) have been established as robust anatomical priors for medical image segmentation, registration and anatomy modelling. To construct an SSM that accurately models inter-subject anatomical variations, it is crucial to compute accurate shape correspondence between the training samples. To achieve this goal, state-of-the-art shape correspondence computation methods typically require tedious segmentation of the training images, yet they seldom pay sufficient attention to the correspondence accuracy of key anatomical landmarks such as bone joints and vessel bifurcations. As a result, the computation of shape correspondence is time-consuming and the correspondence accuracy is imperfect. To solve these problems, this paper proposes a novel shape correspondence computation approach that eliminates the need for image segmentation by registering an organ shape template to the training images. The method allows a human expert to specify key anatomical landmarks in the training images to define the correspondence of these crucial landmarks. A combined intensity-and-landmark registration strategy is implemented to utilize both the image intensity and the expert landmarks to obtain accurate shape correspondence. The method is evaluated for the construction of a head anatomy SSM and a spine SSM based on computed tomography (CT) images. The SSMs constructed using the proposed method demonstrate better shape correspondence accuracy than other state-of-the-art correspondence methods. In particular, the method achieves pixel-level surface correspondence accuracy (1.38 mm) for the skull and sub-pixel-level accuracy (0.92 mm) for the spine. The generalisability and specificity of the SSMs constructed using our method are also superior to those of SSMs constructed using the other compared correspondence methods. In summary, the proposed approach requires less human intervention and produces higher-quality SSMs with better shape modelling accuracy.
  Online Access: https://ieeexplore.ieee.org/document/8830479/
  Keywords: Statistical shape model; shape modeling; shape correspondences; anatomical landmark; combined-intensity-and-landmark registration
collection DOAJ
language English
format Article
sources DOAJ
author Zhaofeng Chen
Tianshuang Qiu
Li Huo
Lijuan Yu
Hongcheng Shi
Yanjun Zhang
Hongkai Wang
spellingShingle Zhaofeng Chen
Tianshuang Qiu
Li Huo
Lijuan Yu
Hongcheng Shi
Yanjun Zhang
Hongkai Wang
Inter-Subject Shape Correspondence Computation From Medical Images Without Organ Segmentation
IEEE Access
Statistical shape model
shape modeling
shape correspondences
anatomical landmark
combined-intensity-and-landmark registration
author_facet Zhaofeng Chen
Tianshuang Qiu
Li Huo
Lijuan Yu
Hongcheng Shi
Yanjun Zhang
Hongkai Wang
author_sort Zhaofeng Chen
title Inter-Subject Shape Correspondence Computation From Medical Images Without Organ Segmentation
title_short Inter-Subject Shape Correspondence Computation From Medical Images Without Organ Segmentation
title_full Inter-Subject Shape Correspondence Computation From Medical Images Without Organ Segmentation
title_fullStr Inter-Subject Shape Correspondence Computation From Medical Images Without Organ Segmentation
title_full_unstemmed Inter-Subject Shape Correspondence Computation From Medical Images Without Organ Segmentation
title_sort inter-subject shape correspondence computation from medical images without organ segmentation
publisher IEEE
series IEEE Access
issn 2169-3536
publishDate 2019-01-01
description Statistical shape models (SSMs) have been established as robust anatomical priors for medical image segmentation, registration and anatomy modelling. To construct an SSM that accurately models inter-subject anatomical variations, it is crucial to compute accurate shape correspondence between the training samples. To achieve this goal, state-of-the-art shape correspondence computation methods typically require tedious segmentation of the training images, yet they seldom pay sufficient attention to the correspondence accuracy of key anatomical landmarks such as bone joints and vessel bifurcations. As a result, the computation of shape correspondence is time-consuming and the correspondence accuracy is imperfect. To solve these problems, this paper proposes a novel shape correspondence computation approach that eliminates the need for image segmentation by registering an organ shape template to the training images. The method allows a human expert to specify key anatomical landmarks in the training images to define the correspondence of these crucial landmarks. A combined intensity-and-landmark registration strategy is implemented to utilize both the image intensity and the expert landmarks to obtain accurate shape correspondence. The method is evaluated for the construction of a head anatomy SSM and a spine SSM based on computed tomography (CT) images. The SSMs constructed using the proposed method demonstrate better shape correspondence accuracy than other state-of-the-art correspondence methods. In particular, the method achieves pixel-level surface correspondence accuracy (1.38 mm) for the skull and sub-pixel-level accuracy (0.92 mm) for the spine. The generalisability and specificity of the SSMs constructed using our method are also superior to those of SSMs constructed using the other compared correspondence methods. In summary, the proposed approach requires less human intervention and produces higher-quality SSMs with better shape modelling accuracy.
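The description outlines the usual SSM pipeline: establish point-wise correspondence across training shapes, build a point-distribution model from the corresponded points, and assess model quality via generalisability and specificity. The short Python sketch below illustrates these standard ideas only; it is not the authors' implementation. The function names (build_ssm, generalisability, combined_cost) and the simple weighted-sum form of the combined intensity-and-landmark objective are placeholders introduced here as assumptions.

    # Minimal sketch (see assumptions above): PCA-based point-distribution SSM
    # from corresponded shapes, plus a leave-one-out generalisability estimate.
    import numpy as np

    def build_ssm(shapes, variance_kept=0.95):
        """shapes: (n_subjects, n_points, 3) array of corresponded surface points."""
        n_subjects, n_points, _ = shapes.shape
        X = shapes.reshape(n_subjects, -1)        # each shape as a 3*n_points vector
        mean = X.mean(axis=0)
        _, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
        var = s ** 2 / max(n_subjects - 1, 1)     # variance captured by each mode
        k = int(np.searchsorted(np.cumsum(var) / var.sum(), variance_kept)) + 1
        return mean, Vt[:k], var[:k]              # mean shape, modes, mode variances

    def generalisability(shapes):
        """Leave-one-out RMS surface reconstruction error (in mm if inputs are in mm)."""
        errors = []
        for i in range(len(shapes)):
            mean, modes, _ = build_ssm(np.delete(shapes, i, axis=0), variance_kept=1.0)
            x = shapes[i].reshape(-1)
            b = modes @ (x - mean)                # project held-out shape onto the modes
            recon = mean + modes.T @ b
            errors.append(np.linalg.norm(recon - x) / np.sqrt(shapes[i].shape[0]))
        return float(np.mean(errors))

    def combined_cost(intensity_dissimilarity, landmark_distance, weight=0.5):
        """Illustrative weighted sum of an image-intensity term and an expert-landmark
        term; the paper's actual weighting scheme is not specified here."""
        return (1.0 - weight) * intensity_dissimilarity + weight * landmark_distance

Specificity could be estimated analogously by sampling shape parameters from the learned mode variances and measuring the distance of each sampled shape to its nearest training shape.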
topic Statistical shape model
shape modeling
shape correspondences
anatomical landmark
combined-intensity-and-landmark registration
url https://ieeexplore.ieee.org/document/8830479/
work_keys_str_mv AT zhaofengchen intersubjectshapecorrespondencecomputationfrommedicalimageswithoutorgansegmentation
AT tianshuangqiu intersubjectshapecorrespondencecomputationfrommedicalimageswithoutorgansegmentation
AT lihuo intersubjectshapecorrespondencecomputationfrommedicalimageswithoutorgansegmentation
AT lijuanyu intersubjectshapecorrespondencecomputationfrommedicalimageswithoutorgansegmentation
AT hongchengshi intersubjectshapecorrespondencecomputationfrommedicalimageswithoutorgansegmentation
AT yanjunzhang intersubjectshapecorrespondencecomputationfrommedicalimageswithoutorgansegmentation
AT hongkaiwang intersubjectshapecorrespondencecomputationfrommedicalimageswithoutorgansegmentation
_version_ 1721539477573206016