Eye-Interact: A Low-Cost Eye Movement Controlled Communication System

Authors

  • Amir Sadeghian, Department of Systems and Computer Engineering, Carleton University
  • Chol-Ho Yim, Department of Systems and Computer Engineering, Carleton University
  • Adrian D.C. Chan, Department of Systems and Computer Engineering, Carleton University
  • James R. Green, Department of Systems and Computer Engineering, Carleton University

Abstract

In this paper, we introduce Eye-Interact, a low-cost, eye movement controlled communication system for persons with high-level physical disabilities. The system consists of a computer equipped with a webcam. The Hough transform and a K-Nearest Neighbor (KNN) classifier are used to determine the eye position and gaze direction, respectively. The Eye-Interact user is presented with a dynamic user interface consisting of a computer screen segmented into a 3×3 grid, with each of the nine grid areas displaying a different user input. The user selects an input by gazing at the desired grid area; activation of the input can be determined by gaze sequence, gaze duration, or blinking.
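The sketch below illustrates the kind of pipeline the abstract describes, assuming OpenCV for the circular Hough transform and scikit-learn for the KNN classifier; the function names, parameter values, calibration scheme, and the choice of iris-centre coordinates as the feature are illustrative assumptions, not the authors' exact implementation.

```python
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def detect_iris_centre(eye_gray):
    """Locate the iris with a circular Hough transform; return (x, y) or None."""
    blurred = cv2.medianBlur(eye_gray, 5)
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=blurred.shape[0] // 2,
        param1=100, param2=20, minRadius=5, maxRadius=40)
    if circles is None:
        return None
    x, y, _r = circles[0, 0]
    return float(x), float(y)


def train_gaze_classifier(calib_points, calib_labels, k=5):
    """Fit a KNN classifier mapping iris (x, y) positions to grid cells 0..8."""
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(calib_points, calib_labels)
    return knn


def classify_gaze(knn, eye_gray):
    """Map the current iris position to one of the nine grid areas, or None."""
    centre = detect_iris_centre(eye_gray)
    if centre is None:
        return None
    return int(knn.predict([centre])[0])


if __name__ == "__main__":
    # Illustrative calibration data: nine cluster centres (one per grid cell)
    # with small noise, standing in for iris positions recorded while the
    # user fixates each cell during a hypothetical calibration phase.
    rng = np.random.default_rng(0)
    centres = [(cx, cy) for cy in (20, 35, 50) for cx in (25, 45, 65)]
    pts = np.vstack([c + rng.normal(0, 1.5, (20, 2)) for c in centres])
    labels = np.repeat(np.arange(9), 20)
    knn = train_gaze_classifier(pts, labels)

    # A synthetic grey "eye image" with a dark disc standing in for the iris.
    eye = np.full((70, 90), 200, dtype=np.uint8)
    cv2.circle(eye, (45, 35), 12, 40, -1)
    print("Predicted grid cell:", classify_gaze(knn, eye))
```

In a live system, the eye region would come from each webcam frame rather than a synthetic image, and the predicted grid cell would drive the dynamic interface, with selection confirmed by gaze sequence, dwell time, or a blink.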

Published

2007-12-31

How to Cite

[1]
A. Sadeghian, C.-H. Yim, A. D. C. Chan, and J. R. Green, “Eye-Interact: A Low-Cost Eye Movement Controlled Communication System”, CMBES Proc., vol. 30, no. 1, Dec. 2007.

Issue

Vol. 30 No. 1 (2007)

Section

Academic