Able-bodied people use a keyboard and mouse as the standard input devices for controlling a computer. However, these input devices are usually unsuitable for people with severe physical disabilities. This study aims to design and implement a suitable and reliable human–computer interaction (HCI) interface for disabled users by integrating eye tracking technology with lip motion recognition. Eye movements control the cursor position on the computer screen, and an eye gaze followed by a mouth-opening lip motion serves as a mouse click. Seven lip motion features were extracted to discriminate mouth openings from mouth closures using the cumulative sum (CUSUM) control chart algorithm. A novel smoothing technique, the threshold-based Savitzky–Golay smoothing filter, was proposed to stabilize cursor movement against the inherently jittery motion of the eyes and to reduce eye tracking latency. A fixation experiment with nine dots was carried out to evaluate the efficacy of eye gaze data smoothing, and a Chinese text entry experiment based on an on-screen keyboard with four keypad sizes was designed to evaluate the influence of keypad size on the Chinese text entry rate. The results of the fixation experiment indicated that the threshold-based Savitzky–Golay smoothing filter with a threshold of two standard deviations, a polynomial order of 3, and a window length of 61 significantly improved the stability of eye cursor movements by 44.86% on average. The average Chinese text entry rate reached 4.41 wpm with dynamically enlargeable keypads. These results encourage the future use of the proposed HCI interface by disabled users.
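The abstract does not specify which of the seven lip features feeds the CUSUM detector or which parameters were used, so the following is only a minimal sketch of a one-sided upper CUSUM chart applied to a single hypothetical mouth-opening feature (e.g., lip aperture height); the reference level `k` and decision limit `h` are illustrative assumptions, not the paper's values.

```python
def cusum_detect(feature, target, k=0.5, h=4.0):
    """One-sided upper CUSUM: accumulate deviations of the lip
    feature above (target + k); flag a mouth opening once the
    cumulative sum exceeds the decision limit h.

    `target` is the in-control (mouth-closed) feature level;
    k and h are hypothetical tuning parameters."""
    s = 0.0
    flags = []
    for x in feature:
        # Reset to zero whenever the sum would go negative,
        # so only sustained upward shifts accumulate.
        s = max(0.0, s + (x - target - k))
        flags.append(s > h)
    return flags
```

With a feature stream that jumps from its baseline when the mouth opens, the detector fires after a few consecutive elevated samples rather than on a single noisy spike, which is the usual motivation for CUSUM over a simple threshold.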
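The abstract gives the filter's parameters (two-standard-deviation threshold, polynomial order 3, window length 61) but not the exact gating rule, so the sketch below assumes one plausible reading: smooth the gaze trace with a standard Savitzky–Golay filter, then keep the raw sample wherever it deviates from the smoothed trace by more than two standard deviations of the residual, so that genuine saccades pass through with low latency while fixation jitter is smoothed.

```python
import numpy as np
from scipy.signal import savgol_filter

def threshold_savgol(gaze, window_length=61, polyorder=3, k=2.0):
    """Threshold-based Savitzky-Golay smoothing of a 1-D gaze
    coordinate trace (one axis of the cursor position).

    Samples whose residual from the smoothed trace exceeds
    k standard deviations are assumed to be intentional rapid
    movements and are left unsmoothed (assumed gating rule)."""
    gaze = np.asarray(gaze, dtype=float)
    smoothed = savgol_filter(gaze, window_length, polyorder)
    residual = gaze - smoothed
    threshold = k * residual.std()
    # Keep raw samples beyond the threshold; use smoothed elsewhere.
    return np.where(np.abs(residual) > threshold, gaze, smoothed)
```

Each axis of the cursor position would be filtered independently; window length 61 at a typical eye tracker rate of 60 Hz or more corresponds to roughly one second of context per output sample.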