Natural User Interface
for 3D Medical Imaging - RSNA 2014

Your hand is your input device



[Figure captions: "User bound by mouse" / "Use your hand as input device" / "Free hand exploration of the 3D data"]

METHODS & MATERIALS
Off-the-shelf gesture-based user interface technology (e.g., Leap Motion) is used to develop a new human-computer interface for 3D image post-processing that allows the user to:

  1. define the position and orientation of oblique MPR (multiplanar reformation) and sliding-thin-slab MIP (maximum intensity projection) images
  2. define the viewing angle and position for VRT (volume rendering technique) and MIP
  3. control other commonly used (2D) functions such as zoom & pan and "windowing"

with simple hand movements.
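As a rough illustration of item 1 above, the following is a minimal sketch of how a tracked hand pose could drive an oblique MPR plane: the palm position sets the plane's origin and the palm normal sets the slice normal. This is an assumption-based sketch, not the authors' actual implementation; the function name, the input format, and the coordinate conventions are all hypothetical.

```python
import numpy as np

def mpr_plane_from_hand(palm_position, palm_normal):
    """Map a tracked hand pose to an oblique MPR plane.

    palm_position: (x, y, z) palm centre in sensor coordinates (mm).
    palm_normal:   vector perpendicular to the palm (need not be unit).
    Returns the plane origin and an orthonormal basis (u, v, n):
    n is the slice normal; u and v span the reformatted image plane.
    """
    origin = np.asarray(palm_position, dtype=float)
    n = np.asarray(palm_normal, dtype=float)
    n = n / np.linalg.norm(n)

    # Build two in-plane axes orthogonal to the normal. Start from a
    # world axis that is not (nearly) parallel to n to avoid degeneracy.
    helper = np.array([0.0, 1.0, 0.0])
    if abs(np.dot(helper, n)) > 0.9:
        helper = np.array([1.0, 0.0, 0.0])
    u = np.cross(helper, n)
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    return origin, u, v, n
```

On each tracking frame, the returned basis would be handed to the renderer to resample the volume along (u, v) around the origin, so that tilting the hand tilts the reformation plane.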

RESULTS
The presented natural, "deviceless" user interface provides more degrees of freedom than conventional input devices. However, it lacks accuracy for certain tasks and tends to cause fatigue. It works well in combination with a conventional input device such as a standard keyboard, and when restricted to simple gestures, e.g., an extended hand, which can be detected reliably.
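The "extended hand" gesture mentioned above can be detected with a very simple rule. The sketch below is a hedged illustration only: the threshold, function name, and fingertip/palm input format are assumptions, not the authors' code.

```python
import math

def is_extended_hand(palm, fingertips, min_reach_mm=55.0):
    """Heuristic detector for an open, extended hand.

    palm:       (x, y, z) palm centre in sensor coordinates (mm).
    fingertips: list of (x, y, z) fingertip positions.
    A hand counts as extended when all five fingertips lie at least
    min_reach_mm away from the palm centre (fingers stretched out).
    """
    if len(fingertips) != 5:
        return False  # occluded or only partially tracked hand
    return all(math.dist(tip, palm) >= min_reach_mm for tip in fingertips)
```

Requiring all five fingers to clear a fixed distance makes the gesture easy to hold deliberately and unlikely to fire by accident, which matches the poster's observation that simple, reliably detectable gestures work best.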

CONCLUSIONS
Compared to conventional input devices (mouse, keyboard, trackball, touchscreen), a gesture-based natural user interface

  1. is not a suitable substitute with regard to (2D) routine tasks
  2. can be a valuable addition with regard to advanced 3D imaging tasks


Comparison of Gesture-controlled NATURAL USER INTERFACE vs. Conventional user interface



Live presentation of the NATURAL USER INTERFACE at the RSNA 2014 meeting


Teistler M, Flensburg University of Applied Sciences, Germany
Bott OJ, Hannover University of Applied Sciences, Germany
Zak A, Singularity University, Moffett Field, CA, USA
Breiman RS, UCSF School of Medicine, San Francisco, CA, USA
Brunberg JA, UC Davis, Sacramento, CA, USA

Special thanks to:
Matthias Gramm
Franziska Loh
Benjamin Schulz
Michael Seitz
Alena Simon
Matthias Süncksen
Jesse Wilmot