LEADER 03233 am a22002533u 4500
001 126069
042 |a dc
100 1 0 |a Golland, Polina |e author
710 2 |a Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory |e contributor
710 2 |a Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science |e contributor
700 1 0 |a Wells, William M. |e author
245 0 0 |a Non-rigid registration of 3D ultrasound for neurosurgery using automatic feature detection and matching
260 |b Springer Nature, |c 2020-07-07T17:29:18Z.
856 |z Get fulltext |u https://hdl.handle.net/1721.1/126069
520 |a Purpose: The brain undergoes significant structural change over the course of neurosurgery, including highly nonlinear deformation and resection. It can be informative to recover the spatial mapping between structures identified in preoperative surgical planning and the intraoperative state of the brain. We present a novel feature-based method for achieving robust, fully automatic deformable registration of intraoperative neurosurgical ultrasound images. Methods: A sparse set of local image feature correspondences is first estimated between ultrasound image pairs, after which rigid, affine, and thin-plate spline models are used to estimate dense mappings throughout the image. Correspondences are derived from 3D features, distinctive generic image patterns that are automatically extracted from 3D ultrasound images and characterized in terms of their geometry (i.e., location, scale, and orientation) and a descriptor of local image appearance. Feature correspondences between ultrasound images are established by nearest-neighbor descriptor matching combined with a probabilistic voting model similar to the Hough transform. Results: Experiments demonstrate our method on intraoperative ultrasound images acquired before and after opening of the dura mater, during resection, and after resection in nine clinical cases. A total of 1620 automatically extracted 3D feature correspondences were manually validated by eleven experts and used to guide the registration. Then, using manually labeled corresponding landmarks in the pre- and post-resection ultrasound images, we show that our feature-based registration reduces the mean target registration error from an initial value of 3.3 mm to 1.5 mm. Conclusions: This result demonstrates that 3D features promise to offer a robust and accurate solution for 3D ultrasound registration and for correcting brain shift in image-guided neurosurgery.
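Note: the abstract above outlines a feature-based registration pipeline (nearest-neighbor descriptor matching followed by a thin-plate spline mapping, evaluated as mean target registration error on held-out landmarks). The following is a minimal Python sketch of that idea only, not the authors' implementation: 3D feature extraction is assumed to be done elsewhere, the Hough-like probabilistic voting is replaced here by a simple ratio test, and all inputs are synthetic stand-ins.

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.interpolate import RBFInterpolator

    def match_features(desc_fixed, desc_moving, ratio=0.8):
        """Nearest-neighbor descriptor matching with a Lowe-style ratio test.

        The paper uses probabilistic Hough-like voting to reject outliers; the
        ratio test is a simpler stand-in used here for illustration only.
        Returns paired indices (into the fixed and moving feature sets).
        """
        tree = cKDTree(desc_moving)
        dist, idx = tree.query(desc_fixed, k=2)        # two nearest neighbors per descriptor
        keep = dist[:, 0] < ratio * dist[:, 1]         # reject ambiguous matches
        return np.flatnonzero(keep), idx[keep, 0]

    def fit_thin_plate_spline(pts_fixed, pts_moving):
        """Fit a 3D thin-plate spline map from fixed-image to moving-image coordinates."""
        return RBFInterpolator(pts_fixed, pts_moving, kernel="thin_plate_spline")

    def mean_tre(warp, lm_fixed, lm_moving):
        """Mean Euclidean distance (mm) between warped and true landmark positions."""
        return float(np.mean(np.linalg.norm(warp(lm_fixed) - lm_moving, axis=1)))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)

        # Synthetic stand-ins for automatically detected 3D features:
        # locations in mm (N, 3) and appearance descriptors (N, 64).
        pts_fixed = rng.uniform(0.0, 100.0, size=(200, 3))
        pts_moving = pts_fixed + 3.0 * np.sin(pts_fixed / 20.0)          # smooth synthetic deformation
        desc_fixed = rng.normal(size=(200, 64))
        desc_moving = desc_fixed + 0.01 * rng.normal(size=(200, 64))     # nearly identical descriptors

        i_fixed, i_moving = match_features(desc_fixed, desc_moving)
        warp = fit_thin_plate_spline(pts_fixed[i_fixed], pts_moving[i_moving])

        # Held-out landmark pairs, analogous to the manually labeled landmarks
        # used for target registration error (TRE) in the abstract.
        lm_fixed = rng.uniform(0.0, 100.0, size=(20, 3))
        lm_moving = lm_fixed + 3.0 * np.sin(lm_fixed / 20.0)
        print("mean TRE before registration (mm): %.2f"
              % float(np.mean(np.linalg.norm(lm_fixed - lm_moving, axis=1))))
        print("mean TRE after registration (mm): %.2f" % mean_tre(warp, lm_fixed, lm_moving))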
536 |a National Institutes of Health (U.S.) (Grant P41-EB015898-09)
536 |a National Institutes of Health (U.S.) (Grant P41-EB015902)
536 |a National Institutes of Health (U.S.) (Grant R01-NS049251)
536 |a Portuguese Foundation for International Cooperation in Science, Technology and Higher Education (Grant PD/BD/105869/2014)
536 |a Portuguese Foundation for International Cooperation in Science, Technology and Higher Education (Grant IDMEC/LAETA UID/EMS/50022/2013)
546 |a en
655 7 |a Article
773 |t 10.1007/S11548-018-1786-7
773 |t International Journal of Computer Assisted Radiology and Surgery