by André Mewes, Patrick Saalfeld, Oleksandr Riabikin, Martin Skalej, Christian Hansen
Abstract:
PURPOSE: Interacting with interventional imaging systems within a sterile environment is a challenging task for physicians. Direct physician-machine interaction during an intervention is severely limited by sterility and workspace restrictions. METHODS: We present a gesture-controlled projection display that enables direct and natural physician-machine interaction during computed tomography (CT)-based interventions. To this end, a graphical user interface is projected onto a radiation shield located in front of the physician. Hand gestures in front of this display are captured and classified using a Leap Motion Controller. We propose a gesture set for controlling basic functions of intervention software, such as 2D image exploration, 3D object manipulation, and selection. Our methods were evaluated in a clinically oriented user study with 12 participants. RESULTS: The results of the user study confirm that the display and the underlying interaction concept are accepted by clinical users. Gesture recognition is robust, although there is potential for improvement. Gesture training times are under 10 min but vary considerably between participants. The developed gestures are logically connected to the intervention software and intuitive to use. CONCLUSIONS: The proposed gesture-controlled projection display counters current thinking: it gives the radiologist complete control of the intervention software. It opens new possibilities for direct physician-machine interaction during CT-based interventions and is well suited to become an integral part of future interventional suites.
Reference:
A gesture-controlled projection display for CT-guided interventions (André Mewes, Patrick Saalfeld, Oleksandr Riabikin, Martin Skalej, Christian Hansen), In International Journal of Computer Assisted Radiology and Surgery, volume 11, number 1, pp. 157–164, 2016.
Bibtex Entry:
@article{mewes_gesture-controlled_2016,
	title = {A gesture-controlled projection display for {CT}-guided interventions},
	volume = {11},
	issn = {1861-6429, 1861-6410},
	doi = {10.1007/s11548-015-1215-0},
	abstract = {PURPOSE: Interacting with interventional imaging systems within a sterile environment is a challenging task for physicians. Direct physician-machine interaction during an intervention is severely limited by sterility and workspace restrictions. METHODS: We present a gesture-controlled projection display that enables direct and natural physician-machine interaction during computed tomography (CT)-based interventions. To this end, a graphical user interface is projected onto a radiation shield located in front of the physician. Hand gestures in front of this display are captured and classified using a Leap Motion Controller. We propose a gesture set for controlling basic functions of intervention software, such as 2D image exploration, 3D object manipulation, and selection. Our methods were evaluated in a clinically oriented user study with 12 participants. RESULTS: The results of the user study confirm that the display and the underlying interaction concept are accepted by clinical users. Gesture recognition is robust, although there is potential for improvement. Gesture training times are under 10 min but vary considerably between participants. The developed gestures are logically connected to the intervention software and intuitive to use. CONCLUSIONS: The proposed gesture-controlled projection display counters current thinking: it gives the radiologist complete control of the intervention software. It opens new possibilities for direct physician-machine interaction during CT-based interventions and is well suited to become an integral part of future interventional suites.},
	language = {eng},
	number = {1},
	journal = {International Journal of Computer Assisted Radiology and Surgery},
	author = {Mewes, André and Saalfeld, Patrick and Riabikin, Oleksandr and Skalej, Martin and Hansen, Christian},
	month = jan,
	year = {2016},
	pmid = {25958060},
	keywords = {*Gestures, *Software, *User-Computer Interface, Computer-assisted surgery, Gesture control, Hand, Human-computer interaction, Humans, Intra-operative visualization, Motion, Tomography, X-Ray Computed/*methods},
	pages = {157--164}
}