TY - GEN
T1 - An intelligent guiding system using face information and vision-based mouse-interaction user interface
AU - Chang, Cheng Yu
AU - Chung, Pau Choo
AU - Yeh, Yu Sheng
AU - Yang, Jar Ferr
PY - 2006
Y1 - 2006
N2 - A guiding system is a specific mechanism used to provide key responses or information to visitors. However, traditional guiding systems have several disadvantages, such as the lack of real-time interaction with users and monotonous presentation. This paper presents an intelligent guiding system that allows a user to interact with it in real time without any additional auxiliary devices. First, real-time front-view face detection using Haar-like features decides when the guiding system should wake up and become interactive with the user. After system initialization, feature points are continually tracked within the detected face area, and the orientation of the user's head is estimated via pyramidal Lucas-Kanade optical flow tracking. The vision-based mouse interaction in our system reaches 20 fps, and users receive the correct response in 1.9 seconds on average on a Pentium IV 1 GHz PC. Compared with traditional guiding systems, our system offers more flexibility: information providers can choose suitable display equipment according to the surrounding environment. Compared with other non-vision-based input devices such as gloves or markers, our system offers a simple, useful, and economical solution for real-time interaction between the user and the computer.
AB - A guiding system is a specific mechanism used to provide key responses or information to visitors. However, traditional guiding systems have several disadvantages, such as the lack of real-time interaction with users and monotonous presentation. This paper presents an intelligent guiding system that allows a user to interact with it in real time without any additional auxiliary devices. First, real-time front-view face detection using Haar-like features decides when the guiding system should wake up and become interactive with the user. After system initialization, feature points are continually tracked within the detected face area, and the orientation of the user's head is estimated via pyramidal Lucas-Kanade optical flow tracking. The vision-based mouse interaction in our system reaches 20 fps, and users receive the correct response in 1.9 seconds on average on a Pentium IV 1 GHz PC. Compared with traditional guiding systems, our system offers more flexibility: information providers can choose suitable display equipment according to the surrounding environment. Compared with other non-vision-based input devices such as gloves or markers, our system offers a simple, useful, and economical solution for real-time interaction between the user and the computer.
UR - http://www.scopus.com/inward/record.url?scp=37649028657&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=37649028657&partnerID=8YFLogxK
U2 - 10.1109/ICCIS.2006.252272
DO - 10.1109/ICCIS.2006.252272
M3 - Conference contribution
AN - SCOPUS:37649028657
SN - 1424400236
SN - 9781424400232
T3 - 2006 IEEE Conference on Cybernetics and Intelligent Systems
BT - 2006 IEEE Conference on Cybernetics and Intelligent Systems
T2 - 2006 IEEE Conference on Cybernetics and Intelligent Systems
Y2 - 7 June 2006 through 9 June 2006
ER -