Authors
Michael Johnston, Srinivas Bangalore, Gunaranjan Vasireddy, Amanda Stent, Patrick Ehlen, Marilyn Walker, Steve Whittaker, Preetam Maloor
Publication date
July 2002
Conference
Proceedings of the 40th annual meeting of the Association for Computational Linguistics
Pages
376-383
Description
Mobile interfaces need to allow the user and system to adapt their choice of communication modes according to user preferences, the task at hand, and the physical and social environment. We describe a multimodal application architecture which combines finite-state multimodal language processing, a speech-act based multimodal dialogue manager, dynamic multimodal output generation, and user-tailored text planning to enable rapid prototyping of multimodal interfaces with flexible input and adaptive output. Our testbed application MATCH (Multimodal Access To City Help) provides a mobile multimodal speech-pen interface to restaurant and subway information for New York City.
Total citations
[Per-year citation chart, 2002–2023; counts not recoverable from extraction]