by Florian Heinrich, Kai Bornemann, Kai Lawonn, Christian Hansen
Abstract:
Medical volume data is usually explored on monoscopic monitors. Displaying this data in three-dimensional space facilitates the development of mental maps and the identification of anatomical structures and their spatial relations. Using augmented reality (AR) may further enhance these effects by spatially aligning the volume data with the patient. However, conventional interaction methods, e.g., mouse and keyboard, may not be applicable in this environment. Appropriate interaction techniques are needed to manipulate the image data naturally and intuitively. To this end, a user study was conducted comparing four gestural interaction techniques with respect to both clipping and windowing tasks. Image data was displayed directly on a phantom using stereoscopic projective AR and direct volume visualization. Participants were able to complete both tasks with all interaction techniques, with similar clipping accuracy and windowing efficiency, respectively. However, results suggest advantages of gestures based on motion-sensitive devices in terms of reduced task completion time and lower subjective workload. This work presents an important first step towards a surgical AR visualization system enabling intuitive exploration of volume data. Yet, more research is required to assess the interaction techniques’ applicability for intraoperative use.
Reference:
Interacting with Medical Volume Data in Projective Augmented Reality (Florian Heinrich, Kai Bornemann, Kai Lawonn, Christian Hansen), In Proceedings of the International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) (Anne L. Martel, Purang Abolmaesumi, Danail Stoyanov, Diana Mateus, Maria A. Zuluaga, S. Kevin Zhou, Daniel Racoceanu, Leo Joskowicz, eds.), Springer International Publishing, volume 12263, pp. 429–439, 2020.
Bibtex Entry:
@inproceedings{heinrich_interacting_2020,
	address = {Lima, Peru},
	title = {Interacting with {Medical} {Volume} {Data} in {Projective} {Augmented} {Reality}},
	volume = {12263},
	isbn = {978-3-030-59716-0},
	abstract = {Medical volume data is usually explored on monoscopic monitors. Displaying this data in three-dimensional space facilitates the development of mental maps and the identification of anatomical structures and their spatial relations. Using augmented reality (AR) may further enhance these effects by spatially aligning the volume data with the patient. However, conventional interaction methods, e.g., mouse and keyboard, may not be applicable in this environment. Appropriate interaction techniques are needed to manipulate the image data naturally and intuitively. To this end, a user study was conducted comparing four gestural interaction techniques with respect to both clipping and windowing tasks. Image data was displayed directly on a phantom using stereoscopic projective AR and direct volume visualization. Participants were able to complete both tasks with all interaction techniques, with similar clipping accuracy and windowing efficiency, respectively. However, results suggest advantages of gestures based on motion-sensitive devices in terms of reduced task completion time and lower subjective workload. This work presents an important first step towards a surgical AR visualization system enabling intuitive exploration of volume data. Yet, more research is required to assess the interaction techniques’ applicability for intraoperative use.},
	booktitle = {Proceedings of {International} {Conference} on {Medical} {Image} {Computing} and {Computer} {Assisted} {Intervention} ({MICCAI})},
	publisher = {Springer International Publishing},
	author = {Heinrich, Florian and Bornemann, Kai and Lawonn, Kai and Hansen, Christian},
	editor = {Martel, Anne L. and Abolmaesumi, Purang and Stoyanov, Danail and Mateus, Diana and Zuluaga, Maria A. and Zhou, S. Kevin and Racoceanu, Daniel and Joskowicz, Leo},
	year = {2020},
	pages = {429--439}
}