Adaptive Eye-Gaze Tracking Using Neural-Network-Based User Profiles to Assist People with Motor Disability


Abstract:
 
Sesin A., Adjouadi M., Ayala M., Cabrerizo M., and Barreto A. (2008). "Adaptive Eye-Gaze Tracking Using Neural-Network-Based User Profiles to Assist People with Motor Disability."

This study developed an adaptive real-time human-computer interface (HCI) that serves as an assistive technology tool for people with severe motor disabilities. The proposed HCI design uses eye gaze as the primary computer input device. Controlling the mouse cursor with raw eye coordinates results in sporadic motion of the pointer because of the saccadic nature of the eye. Even though these eye movements are subtle and imperceptible under normal circumstances, they considerably affect the accuracy of an eye-gaze-based HCI. The proposed HCI system is novel in that it adapts to each user's distinct and potentially changing jitter characteristics through the configuration and training of an artificial neural network (ANN) structured to minimize mouse jitter. The ANN is fed a user's eye-gaze behavior recorded during a short training session and learns the relationship between the gaze coordinates and the mouse cursor position, based on the multilayer perceptron model. An embedded graphical interface is used during the training session to generate the user profiles that make up these unique ANN configurations. Results with 12 subjects in test 1, which involved following a moving target, showed an average jitter reduction of 35%; results with 9 subjects in test 2, which involved following the contour of a square object, showed an average jitter reduction of 53%. In both tests, the resulting trajectories were significantly smoother and reached fixed or moving targets with relative ease, within a 5% error margin or deviation from the desired trajectories. The positive effects of this jitter reduction are presented graphically for visual appreciation.
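The core idea — training a multilayer perceptron on a user's recorded gaze behavior so that it maps jittery gaze coordinates to a smoother cursor position — can be illustrated with a minimal sketch. This is not the paper's implementation: the synthetic trajectory, the sliding-window feature layout, and the scikit-learn MLPRegressor are all assumptions made here for illustration; the actual system derives its training data from each user's recorded session via the embedded graphical interface.

```python
# Hypothetical sketch of ANN-based gaze-jitter reduction (not the paper's code).
# Assumption: the network sees a short window of recent noisy gaze samples and
# predicts the current smooth cursor position; the paper's features may differ.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic smooth "intended" trajectory and jittery gaze observations.
t = np.linspace(0, 8 * np.pi, 4000)
true_xy = np.column_stack([np.cos(t), np.sin(t)])          # normalized units
gaze_xy = true_xy + rng.normal(scale=0.08, size=true_xy.shape)  # saccadic jitter

# Features: the last W gaze samples, flattened; target: current true position.
W = 7
X = np.array([gaze_xy[i - W + 1:i + 1].ravel() for i in range(W - 1, len(t))])
y = true_xy[W - 1:]

# "Training session": fit the per-user MLP profile on the first half of the data.
half = len(X) // 2
ann = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
ann.fit(X[:half], y[:half])

# Evaluate jitter as RMS deviation from the true trajectory on the second half.
pred = ann.predict(X[half:])
rms_raw = np.sqrt(np.mean((gaze_xy[W - 1:][half:] - y[half:]) ** 2))
rms_ann = np.sqrt(np.mean((pred - y[half:]) ** 2))
reduction = 100 * (1 - rms_ann / rms_raw)
print(f"jitter reduction: {reduction:.0f}%")
```

In this toy setup the network effectively learns a per-user smoothing filter over the gaze window, which is the same role the paper's trained ANN profile plays between the eye tracker output and the mouse cursor.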