Recent Research Trend of Gesture-based User Interfaces

In this paper, we review recent research trends in gesture-based user interfaces that support natural user interaction with media contents. With the development of information and communication technology, users can enjoy various media contents on diverse devices anytime and anywhere. From the perspective of user interaction, however, we still suffer from unnatural user interfaces and the burden of learning a new interface for each type of media content and device. To support more natural and comfortable interaction, there has therefore been a large body of research on gesture-based user interfaces. Compared with other modalities such as speech, haptics, and eye-gaze, the gesture-based user interface is relatively intuitive and simple, so it can support more natural user interaction. In particular, the gesture-based user interface is especially effective on mobile devices because of this intuitiveness and simplicity. We therefore believe that a personalized gesture-based user interface is required to support user-centered multi-modal interaction in ubiquitous computing environments.

Keywords: User Interface, Gesture-based Interface, Context-awareness, UbiComp, Augmented Reality
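The abstract notes that gesture input is especially attractive on mobile devices because recognition can stay intuitive and simple. As a concrete illustration (not taken from the paper), the minimal sketch below recognizes a "shake" gesture from tri-axis accelerometer samples, in the spirit of the accelerometer-driven mobile interfaces this kind of survey covers; the class name, thresholds, and sample values are all hypothetical.

```python
# Minimal sketch (not from the paper): recognizing a "shake" gesture from
# tri-axis accelerometer samples. All names, thresholds, and sample values
# are hypothetical.
import math
from collections import deque

GRAVITY = 9.81          # m/s^2; subtracted to isolate the dynamic component
SHAKE_THRESHOLD = 6.0   # minimum dynamic acceleration magnitude (hypothetical)
WINDOW = 10             # number of recent samples inspected
MIN_PEAKS = 3           # strong samples required within the window


class ShakeDetector:
    """Flags a shake when the recent window contains several samples whose
    dynamic acceleration magnitude exceeds SHAKE_THRESHOLD."""

    def __init__(self):
        self.samples = deque(maxlen=WINDOW)

    def update(self, ax, ay, az):
        # Dynamic acceleration = total magnitude minus gravity at rest.
        magnitude = math.sqrt(ax * ax + ay * ay + az * az) - GRAVITY
        self.samples.append(magnitude)
        if len(self.samples) < WINDOW:
            return False  # not enough history yet
        peaks = sum(1 for m in self.samples if abs(m) > SHAKE_THRESHOLD)
        return peaks >= MIN_PEAKS


# Usage: feed samples as they arrive from the sensor.
at_rest = (0.0, 9.8, 0.0)
shaking = [(15.0, 9.8, 0.0), (-14.0, 9.8, 0.0)]  # strong alternating motion
detector = ShakeDetector()
for sample in [at_rest] * 5 + shaking * 3:
    if detector.update(*sample):
        print("shake gesture recognized")
        break
```

A windowed peak count like this trades precision for simplicity; real mobile interfaces typically add per-axis direction checks or a trained classifier, but the window-plus-threshold structure is the common starting point.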
