The Australian National University

Natural Interaction Enhanced User Interface Design and Evaluation for Teleoperation

Dingyun Zhu (ANU and CSIRO ICT Centre)

CSIRO ICT

DATE: 2011-02-21
TIME: 14:00:00 - 15:00:00
LOCATION: Seminar Room S206 CSIRO ICTC Building 108

ABSTRACT:
In teleoperation, operators often have to control multiple complex devices simultaneously, for example driving a robot while steering a video camera. Conventional input devices such as joysticks, mice, and keyboards are normally used in such settings, forcing operators to switch hands frequently between controllers to accomplish their operational tasks. One way of improving this is to bring natural human interactive information, such as head motion, eye gaze, or gestures, into the remote control interface. Recent computer-vision-based head-tracking and eye-tracking technologies have expanded the possibilities for designing and developing more natural and intuitive user interfaces across a wide range of applications. We have designed and implemented several novel control interfaces and prototypes, for example applying real-time head motion and eye gaze as intrinsic elements of teleoperation for remote camera control in a multi-task setting. Results from our user studies provide clear evidence that eye-tracking control significantly outperforms the other control approaches we developed, on both objective and subjective measures.
BIO:
Dingyun is currently with the School of Computer Science, Australian National University, working on a collaborative project with the CSIRO ICT Centre under the Minerals Down Under (MDU) National Research Flagship. His research addresses the design and evaluation of natural-interaction-enhanced user interfaces for remote control of complex machinery, particularly for mining teleoperation. Examples of natural interaction modes include eye gaze, head motion, and gestures. Augmenting traditional control interfaces (which use input devices such as joysticks, keyboards, and mice) with these natural interaction modes offers the possibility of creating machine control interfaces that are easier to learn and provide more effective control of complex machinery than traditional systems.

Updated: 22 February 2011