Article ID: 736048
Journal: Optics and Lasers in Engineering
Published Year: 2006
Pages: 18
File Type: PDF
Abstract

This study describes the design and combination of an eye-controlled and a head-controlled human–machine interface system. The system detects head movement from the changing positions and number of light sources mounted on the head. When the user browses a computer screen with a head-mounted display, the system captures images of the user's eyes with CCD cameras, which also measure the angle and position of the light sources. In the eye-tracking subsystem, the computer program locates the center of each pupil in the images and records the movement traces and pupil diameters. In the head-gesture measurement subsystem, the user wears an eyeglass frame carrying two light sources, and a CCD camera placed in front of the user captures images of the user's head. The program locates the center point of the head, transfers it to screen coordinates, and the user can then control the cursor by head motions. The combined eye-controlled and head-controlled interface is intended for virtual reality applications.

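A minimal sketch of the two processing steps the abstract describes: locating a pupil (or head) center in a CCD frame and transferring that point to screen coordinates for cursor control. The dark-pixel threshold-and-centroid segmentation and the linear image-to-screen mapping are illustrative assumptions, not the authors' reported implementation; function names and the threshold value are hypothetical.

```python
import numpy as np

def pupil_center(gray_image, threshold=40):
    """Estimate the pupil center as the centroid of dark pixels.

    Assumed segmentation: the abstract only states that the program
    locates the center of each pupil, not how it segments the pupil.
    gray_image: 2-D array of 8-bit intensities.
    """
    ys, xs = np.nonzero(gray_image < threshold)   # dark (pupil-like) pixels
    if xs.size == 0:
        return None                               # no dark region found
    return float(xs.mean()), float(ys.mean())     # (x, y) centroid

def to_screen(point, image_size, screen_size):
    """Map an image-plane point to screen coordinates by linear scaling.

    Assumed mapping; the abstract only says the head center point is
    "transferred to the screen coordinates".
    """
    x, y = point
    img_w, img_h = image_size
    scr_w, scr_h = screen_size
    return x * scr_w / img_w, y * scr_h / img_h

# Example: a synthetic 640x480 frame with a dark blob standing in for the pupil
frame = np.full((480, 640), 200, dtype=np.uint8)
frame[220:260, 300:340] = 10
center = pupil_center(frame)
cursor = to_screen(center, (640, 480), (1920, 1080))
print(center, cursor)
```

In this sketch the same centroid-plus-scaling idea serves both subsystems; in practice the head-gesture path would track the two light sources on the eyeglass frame rather than a dark pupil region.
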
Related Topics
Physical Sciences and Engineering Engineering Electrical and Electronic Engineering
Authors