Deriving Anatomical Context from 4D Ultrasound

Abstract
Real-time three-dimensional (also known as 4D) ultrasound imaging using matrix array probes has the potential to capture large-volume information of entire organs such as the liver without external tracking hardware. This information can in turn be placed into the context of a CT or MRI scan of the same patient. However, for such an approach, many image processing challenges need to be overcome and sources of error addressed, including reconstruction drift, anatomical deformations, varying appearance of anatomy, and imaging artifacts. In this work, we present a fully automatic system including robust image-based ultrasound tracking, a novel learning-based global initialization of the anatomical context, and joint mono- and multi-modal registration. In an evaluation on 4D US sequences and MRI scans of eight volunteers, we achieve automatic reconstruction and registration without any user interaction, assess the registration errors based on physician-defined landmarks, and demonstrate real-time tracking of free-breathing sequences.
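
As a rough illustration of the multi-modal registration step mentioned in the abstract, the sketch below rigidly aligns a reconstructed 3D ultrasound volume to an MRI volume of the same patient using SimpleITK with a mutual-information metric. This is not the authors' implementation: the similarity measure, optimizer, rigid transform model, and the file names us_volume.nii / mri_volume.nii are assumptions made purely for the example, and the paper uses a learning-based global initialization rather than the center-of-volume initialization shown here.

import SimpleITK as sitk

def register_us_to_mri(us_path: str, mri_path: str) -> sitk.Transform:
    """Hedged sketch: rigid US-to-MRI registration with mutual information."""
    fixed = sitk.ReadImage(mri_path, sitk.sitkFloat32)   # MRI as fixed image
    moving = sitk.ReadImage(us_path, sitk.sitkFloat32)   # reconstructed US as moving image

    reg = sitk.ImageRegistrationMethod()
    # Mattes mutual information is a common choice for multi-modal similarity.
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetMetricSamplingStrategy(reg.RANDOM)
    reg.SetMetricSamplingPercentage(0.1)
    reg.SetInterpolator(sitk.sitkLinear)
    reg.SetOptimizerAsRegularStepGradientDescent(
        learningRate=1.0, minStep=1e-4, numberOfIterations=200)
    reg.SetOptimizerScalesFromPhysicalShift()

    # Simple geometric initialization (volume centers); the paper instead
    # derives a global initialization of the anatomical context by learning.
    init = sitk.CenteredTransformInitializer(
        fixed, moving, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY)
    reg.SetInitialTransform(init, inPlace=False)

    return reg.Execute(fixed, moving)

# Example (hypothetical paths):
# transform = register_us_to_mri("us_volume.nii", "mri_volume.nii")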
Citation

@inproceedings{10.2312:vcbm.20141196,
  booktitle = {Eurographics Workshop on Visual Computing for Biology and Medicine},
  editor    = {Ivan Viola and Katja Buehler and Timo Ropinski},
  title     = {{Deriving Anatomical Context from 4D Ultrasound}},
  author    = {Müller, Markus and Helljesen, Linn E. S. and Prevost, Raphael and Viola, Ivan and Nylund, Kim and Gilja, Odd Helge and Navab, Nassir and Wein, Wolfgang},
  year      = {2014},
  publisher = {The Eurographics Association},
  ISSN      = {2070-5778},
  ISBN      = {978-3-905674-62-0},
  DOI       = {10.2312/vcbm.20141196},
  URL       = {https://diglib.eg.org/handle/10.2312/vcbm.20141196.173-180}
}