Optimized Hand-Eye Coordination in Augmented Reality Systems for Image Guided Surgery

The Holste Foundation funds the research project "Optimization of Hand-Eye Coordination in video see-through Augmented Reality (AR) systems for applications in Image Guided Surgery" (project term: 2004–2005).

Augmented Reality (AR) is generally defined as the context-specific enhancement of the real view with virtual information. Its most widespread form is the overlay of human visual perception with graphical information through a Head-Mounted Display (HMD). This information is meant to augment the real view so that the user receives the needed data at the appropriate time and place. This makes it possible, for instance, to omit printed instructions during manual assembly, or to overlay the intraoperative situs with a computer-generated planning model or navigation data during surgery.

Video see-through HMDs pose a special challenge because they give the user no direct view of the environment; instead, video cameras capture the real image, which is displayed augmented with graphics in real time. A key open issue with these displays is Hand-Eye Coordination (HEC): HEC performance is impaired both by the latency of the video image and by the displacement of the cameras with respect to the eyes. The use of video see-through HMDs is especially critical in medicine, where HEC plays a crucial role.

In the project funded by the Holste Foundation, the Institute of Industrial Engineering and Ergonomics (IAW), in cooperation with the Surgical Therapy Technology group of the Chair of Applied Medical Engineering at the Helmholtz-Institute for Biomedical Engineering (HIA), will investigate the effect of camera displacement on human HEC performance. The question to be answered is how HEC performance changes when, in a video see-through viewing system, the cameras are displaced from the eyes in different directions. Furthermore, a function shall be found that describes these changes as a mathematical relation.
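The geometric core of the displacement problem can be illustrated with a simple sketch: a camera mounted at an offset perpendicular to the viewing direction sees a nearby object along a line of sight that deviates from the true eye-line by an angle that shrinks with object distance. The numbers below are purely illustrative assumptions, not project results or the mathematical relation the project seeks.

```python
import math

def apparent_shift_deg(displacement_m: float, object_distance_m: float) -> float:
    """Angular deviation (in degrees) between the true eye-line and the
    line of sight of a camera displaced by `displacement_m` perpendicular
    to the viewing direction, for an object at `object_distance_m`."""
    return math.degrees(math.atan2(displacement_m, object_distance_m))

# Illustrative assumption: camera mounted 5 cm away from the eye axis,
# object at roughly arm's length (0.5 m):
print(f"{apparent_shift_deg(0.05, 0.5):.1f} degrees")  # ~5.7 degrees
```

The sketch shows why displacement matters most for near-field tasks such as surgery: doubling the object distance roughly halves the angular deviation, so the perceptual conflict is largest exactly where precise manipulation takes place.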
With the help of this function, manufacturers and users of AR systems can choose the optimal camera position while taking cost-effectiveness and technical feasibility into account, significantly improving the usability of video see-through systems with respect to HEC. The developed function will subsequently be tested and validated for applications in Image Guided Surgery: an appropriate AR module will be developed at the HIA and integrated into a surgical navigation system for visual guidance of the surgeon in microsurgical operations.
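The overlay principle behind such an AR module can be sketched as image compositing: each displayed pixel blends the captured camera pixel with the virtual graphics pixel. The following minimal grayscale sketch is only an illustration of the idea; a real video see-through system performs this per color channel on live camera frames, typically on the GPU.

```python
def blend_overlay(frame, overlay, alpha):
    """Composite virtual graphics onto a captured video frame.
    `frame` and `overlay` are same-sized 2D lists of grayscale values
    (0-255); `alpha` is the overlay opacity in [0, 1]."""
    return [
        [round((1 - alpha) * f + alpha * o) for f, o in zip(frow, orow)]
        for frow, orow in zip(frame, overlay)
    ]

# A 2x2 "camera frame" and a bright virtual marker, blended half-opaque:
frame = [[100, 100], [100, 100]]
overlay = [[255, 0], [0, 255]]
print(blend_overlay(frame, overlay, 0.5))  # [[178, 50], [50, 178]]
```

At `alpha = 0` the user sees only the camera image, at `alpha = 1` only the virtual model; intermediate values produce the augmented view in which, for example, a planning model appears superimposed on the situs.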


  • S. Serefoglou, W. Lauer, A. Perneczky, T. Lutze & K. Radermacher: Multimodal User Interface for a Semi-Robotic Visual Assistance System for Image Guided Neurosurgery. In: H.U. Lemke, K. Inamura, K. Doi, M. Vannier & A. Farman (eds.): Proc. CARS 2005, pp. 624-629
  • S. Serefoglou, M. Park, K. Radermacher & L. Schmidt: Untersuchung der Hand-Auge-Koordination bei einem videobasierten Spiegel-HMD [Investigation of Hand-Eye Coordination with a Video-Based Mirror HMD]. Zustandserkennung und Systemgestaltung, 6, 2005, pp. 287-290
  • M. Park, S. Serefoglou, L. Schmidt, K. Radermacher, C. Schlick & H. Luczak: Hand-Eye Coordination Using a Video See-Through Augmented Reality System. The Ergonomics Open Journal, 1, 2008, pp. 46-53