REAL: Situated Dialogues in Instrumented Environments

We survey the research project REAL, in which we investigate how a system can proactively assist its user in solving various tasks in an instrumented environment by sensing implicit interaction and utilising distributed presentation media. First, we introduce the architecture of our instrumented environment, which uses a blackboard to coordinate the components of the environment, such as sensing and positioning services and interaction devices. A ubiquitous user model provides contextual information on the user's characteristics, actions and locations; users may access and control their profiles via a web interface. We then present two mobile applications that employ this environmental support for situated dialogues: a shopping assistant and a pedestrian navigation system. Both applications allow for multi-modal interaction through a combination of speech, gesture and sensed actions such as motion.
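
The abstract names a blackboard as the coordination mechanism but does not fix an API, so the following Java sketch is purely illustrative: sensing and positioning services post typed events to a shared blackboard, and interaction devices subscribe to the event types they can handle. All identifiers here (Blackboard, Event, Subscriber, post, subscribe) are assumptions made for this example, not names from the REAL implementation.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;

// Hypothetical sketch of a blackboard coordinating environment components.
// None of these names are taken from REAL; they only illustrate the pattern.
public class Blackboard {

    /** A typed entry posted by a sensing or positioning service. */
    public record Event(String type, String source, Object payload) {}

    /** Anything that reacts to events, e.g. a public display or a PDA. */
    public interface Subscriber {
        void onEvent(Event event);
    }

    private final Map<String, List<Subscriber>> subscribers = new ConcurrentHashMap<>();

    /** Interaction devices register for the event types they can render. */
    public void subscribe(String eventType, Subscriber subscriber) {
        subscribers.computeIfAbsent(eventType, t -> new CopyOnWriteArrayList<>())
                   .add(subscriber);
    }

    /** Sensing services post events; all matching subscribers are notified. */
    public void post(Event event) {
        subscribers.getOrDefault(event.type(), List.of())
                   .forEach(s -> s.onEvent(event));
    }

    public static void main(String[] args) {
        Blackboard bb = new Blackboard();
        // A presentation device listens for position updates.
        bb.subscribe("position", e ->
            System.out.println("Display reacts to " + e.source() + ": " + e.payload()));
        // A positioning service senses the user passing a beacon.
        bb.post(new Event("position", "beacon-12", "user=alice, zone=aisle-3"));
    }
}
```

A blackboard of this kind decouples producers from consumers, which is why it suits an environment where sensors and presentation media come and go independently.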
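
The abstract likewise describes a ubiquitous user model holding the user's characteristics, actions and locations, with owner control via a web interface, but gives no schema. The sketch below assumes a simple store of timestamped statements with a per-statement visibility flag that the owner could toggle from a web front end; every name in it (UserModel, Statement, hide) is hypothetical.

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a ubiquitous user model: timestamped statements
// about a user's characteristics, actions, and locations, each carrying a
// visibility flag the owner can clear (e.g. from a web interface).
public class UserModel {

    public enum Kind { CHARACTERISTIC, ACTION, LOCATION }

    public record Statement(Kind kind, String value, Instant when, boolean visible) {}

    private final String userId;
    private final List<Statement> statements = new ArrayList<>();

    public UserModel(String userId) { this.userId = userId; }

    /** Sensing services add observations as they occur. */
    public void add(Kind kind, String value) {
        statements.add(new Statement(kind, value, Instant.now(), true));
    }

    /** The owner can hide a statement, e.g. via the web interface. */
    public void hide(Statement s) {
        statements.replaceAll(t -> t.equals(s)
                ? new Statement(t.kind(), t.value(), t.when(), false) : t);
    }

    /** Applications query only what the owner has left visible. */
    public List<Statement> visibleStatements(Kind kind) {
        return statements.stream()
                .filter(s -> s.visible() && s.kind() == kind)
                .toList();
    }

    public static void main(String[] args) {
        UserModel alice = new UserModel("alice");
        alice.add(Kind.LOCATION, "entrance hall");
        alice.add(Kind.ACTION, "picked up product 4711");
        alice.visibleStatements(Kind.ACTION)
             .forEach(s -> System.out.println(alice.userId + ": " + s));
    }
}
```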