Recent Technology Advances in Multimodal Context-aware Interface

In this paper, we introduce multimodal context-aware technologies that can provide a more intuitive and convenient user interface in an environment where personal devices are becoming smaller, more diverse, and more intelligent. A multimodal interface offers multiple methods of input and output between the user and the system and must be able to process them simultaneously. Context-aware technologies provide a more intelligent interface by collecting various kinds of information about the user's surroundings and using that information to find the most appropriate service. We introduce the concepts of multimodal interfaces and context-aware technologies along with worldwide advances in this field. Finally, we examine multimodal context-aware interface technologies by surveying technology trends in the field of ubiquitous middleware.

Keywords: Multimodal interface, Context-aware technology
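To make the two ideas above concrete, the following minimal Python sketch (not part of the original paper; all names such as ModalityInput, Context, and select_service are hypothetical) shows one simplified way near-simultaneous inputs from different modalities could be fused and then resolved to a service using basic context such as location and ambient noise.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ModalityInput:
    modality: str          # e.g. "speech", "touch", "gesture"
    value: str             # recognized content, e.g. "play music"
    confidence: float      # recognizer confidence in [0, 1]
    timestamp_ms: int      # capture time, used to pair near-simultaneous inputs

@dataclass
class Context:
    location: str          # e.g. "living_room"
    device: str            # e.g. "tv", "phone"
    noise_level: float     # ambient noise estimate in [0, 1]

def fuse_inputs(inputs: List[ModalityInput], window_ms: int = 500) -> List[ModalityInput]:
    """Keep only inputs arriving within one time window so they are interpreted together."""
    if not inputs:
        return []
    anchor = min(i.timestamp_ms for i in inputs)
    return [i for i in inputs if i.timestamp_ms - anchor <= window_ms]

def select_service(inputs: List[ModalityInput], ctx: Context) -> Optional[str]:
    """Toy context-aware selection: weight each modality by context, then map
    the winning input plus the user's location and device to a service name."""
    fused = fuse_inputs(inputs)
    if not fused:
        return None
    def score(i: ModalityInput) -> float:
        # In a noisy room, down-weight speech before choosing the best input.
        penalty = ctx.noise_level if i.modality == "speech" else 0.0
        return i.confidence - penalty
    best = max(fused, key=score)
    return f"{best.value}@{ctx.location}:{ctx.device}"

if __name__ == "__main__":
    inputs = [
        ModalityInput("speech", "play music", confidence=0.8, timestamp_ms=1000),
        ModalityInput("touch", "open photo album", confidence=0.95, timestamp_ms=1200),
    ]
    ctx = Context(location="living_room", device="tv", noise_level=0.4)
    print(select_service(inputs, ctx))  # -> "open photo album@living_room:tv"

This is only a sketch under the stated assumptions; a real multimodal context-aware system would rely on standardized input annotation, interaction management, and middleware for gathering context, rather than a single in-process function.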