Authors
Ted Selker, J Scott, W Burleson
Publication date
2002/6
Journal
LREC 2002, Workshop on Multi-Modal Resources and Multi-Modal System Evaluation
Pages
78-83
Description
The Eye-bed prototype introduces new ergonomic language scenarios. This paper focuses on developing a demonstration eye-gesture language for intelligent user interfaces and computer control. Controls are integrated with a user model containing a history of previous interactions with the interface. Context recognition enables the Eye-bed environment to continually adapt to the user's inferred and demonstrated preferences. Staring, gazing, glancing, and blinking are all part of person-to-person visual communication. By creating a computer interface language composed of exaggerated eye gestures, we create a multi-modal interface that is easy to learn, use, and remember.
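To make the abstract's idea concrete, here is a minimal, hypothetical Python sketch (not from the paper) of an eye-gesture vocabulary: raw eye activity is classified as a blink, glance, gaze, or stare by duration, and a simple history-based user model infers preferences from past gestures. The thresholds, class names, and data layout are all invented for illustration.

from dataclasses import dataclass, field

@dataclass
class EyeEvent:
    eyes_closed: bool   # True for a blink, False for a fixation
    duration_s: float   # how long the state was held
    target: str         # screen region the eyes were aimed at

@dataclass
class UserModel:
    # History of previous interactions, as described in the abstract.
    history: list = field(default_factory=list)

    def record(self, gesture: str, target: str) -> None:
        self.history.append((gesture, target))

    def preferred_target(self):
        # Crude preference inference: the most frequently dwelt-on target.
        targets = [t for g, t in self.history if g in ("gaze", "stare")]
        return max(set(targets), key=targets.count) if targets else None

def classify(event: EyeEvent) -> str:
    # Map raw eye activity onto the gesture vocabulary (thresholds invented).
    if event.eyes_closed:
        return "blink"
    if event.duration_s < 0.3:
        return "glance"
    if event.duration_s < 2.0:
        return "gaze"
    return "stare"

model = UserModel()
for ev in [EyeEvent(False, 0.2, "menu"),
           EyeEvent(False, 2.5, "video"),
           EyeEvent(True, 0.1, "video")]:
    gesture = classify(ev)
    model.record(gesture, ev.target)
    print(f"{gesture} at {ev.target}")
print("Inferred preference:", model.preferred_target())

In a design like this, exaggerating the gestures (long stares, deliberate blinks) widens the duration gaps between classes, which is what makes the vocabulary easy to learn, produce, and recognize reliably.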