New algorithms that address micro-saccadic eye movements and the inaccuracy of eye gaze tracking for controlling on-screen pointers are presented and explored. Multimodal fusion algorithms combining eye gaze and finger tracking are presented and validated, yielding important results on gaze-controlled interfaces and on drivers' visual responses when encountering oncoming road hazards. The author also presents a set of user trials, conducted in driving simulators, to validate these algorithms.
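By way of illustration only (this is not the book's actual algorithm), one common way such ideas fit together is to smooth the gaze signal to damp micro-saccadic jitter and then fuse it with a finger-tracking estimate using a confidence-weighted average; the Python sketch below assumes hypothetical names and parameters throughout.

    from dataclasses import dataclass

    @dataclass
    class PointerEstimate:
        x: float           # screen x-coordinate (pixels)
        y: float           # screen y-coordinate (pixels)
        confidence: float  # tracker-reported confidence in [0, 1]

    class SmoothedGaze:
        """Exponential smoothing to damp micro-saccadic jitter in raw gaze samples."""
        def __init__(self, alpha: float = 0.3):
            self.alpha = alpha  # smoothing factor: lower = smoother but laggier
            self._x = None
            self._y = None

        def update(self, raw: PointerEstimate) -> PointerEstimate:
            if self._x is None:
                self._x, self._y = raw.x, raw.y
            else:
                self._x = self.alpha * raw.x + (1 - self.alpha) * self._x
                self._y = self.alpha * raw.y + (1 - self.alpha) * self._y
            return PointerEstimate(self._x, self._y, raw.confidence)

    def fuse(gaze: PointerEstimate, finger: PointerEstimate) -> PointerEstimate:
        """Confidence-weighted average of gaze and finger pointer estimates."""
        total = gaze.confidence + finger.confidence
        if total == 0:
            return gaze  # no usable signal; fall back to the gaze estimate
        wg, wf = gaze.confidence / total, finger.confidence / total
        return PointerEstimate(
            wg * gaze.x + wf * finger.x,
            wg * gaze.y + wf * finger.y,
            max(gaze.confidence, finger.confidence),
        )

    if __name__ == "__main__":
        smoother = SmoothedGaze(alpha=0.3)
        gaze_raw = PointerEstimate(640.0, 360.0, 0.6)  # noisy gaze sample
        finger = PointerEstimate(655.0, 350.0, 0.9)    # finger-tracking sample
        fused = fuse(smoother.update(gaze_raw), finger)
        print(f"fused pointer at ({fused.x:.1f}, {fused.y:.1f})")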
Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments will be of great importance to researchers and designers alike in the fields of automotive design and engineering, human-computer interaction (HCI), and intelligent interfaces.