Error-tolerant sign retrieval using visual features and maximum a posteriori estimation

Chung-Hsien Wu, Yu-Hsien Chiu, Kung-Wei Cheng

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

This paper proposes an efficient error-tolerant approach to retrieving sign words from a Taiwanese Sign Language (TSL) database. The database is tagged with visual gesture features and organized as a multilist code tree. These features are defined in terms of the visual characteristics of sign gestures; they are used to index signs for retrieval and to display them through an anthropomorphic interface. Maximum a posteriori (MAP) estimation is exploited to retrieve the most likely sign word given an input feature sequence. An error-tolerant mechanism based on a mutual information criterion is proposed to retrieve a sign word of interest efficiently and robustly. A user-friendly anthropomorphic interface is also developed to assist TSL learning. Several experiments were performed in an educational environment to investigate the system's retrieval accuracy. The proposed approach outperformed a dynamic programming algorithm on this task and showed tolerance to user input errors.
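The retrieval rule described in the abstract is standard MAP estimation: choose the sign word w that maximizes P(w | F) ∝ P(F | w) P(w) for the observed feature sequence F. The following is a minimal Python sketch of that idea only; all names (SIGN_PRIORS, FEATURE_LIKELIHOODS, retrieve_sign) and the toy probabilities are hypothetical, and the paper's actual multilist code tree and mutual-information error-tolerance mechanism are not reproduced here.

    # Minimal sketch of MAP-based sign retrieval over a feature-tagged sign lexicon.
    # Hypothetical example; not the authors' implementation.

    import math

    # Hypothetical prior probabilities P(w) for each sign word.
    SIGN_PRIORS = {"thanks": 0.4, "hello": 0.35, "sorry": 0.25}

    # Hypothetical per-feature likelihoods P(f | w) for visual gesture features
    # (e.g., handshape, location, movement labels).
    FEATURE_LIKELIHOODS = {
        "thanks": {"flat_hand": 0.6, "chin": 0.7, "forward": 0.5},
        "hello":  {"flat_hand": 0.5, "forehead": 0.8, "outward": 0.6},
        "sorry":  {"fist": 0.7, "chest": 0.8, "circular": 0.6},
    }

    SMOOTHING = 1e-3  # floor for unseen features, so a mistyped input does not zero a score


    def retrieve_sign(features):
        """Return the sign word maximizing P(w | features), i.e. P(features | w) * P(w)."""
        best_word, best_score = None, float("-inf")
        for word, prior in SIGN_PRIORS.items():
            log_score = math.log(prior)
            for f in features:
                log_score += math.log(FEATURE_LIKELIHOODS[word].get(f, SMOOTHING))
            if log_score > best_score:
                best_word, best_score = word, log_score
        return best_word


    print(retrieve_sign(["flat_hand", "chin", "forward"]))   # -> "thanks"
    print(retrieve_sign(["flat_hand", "chin", "circular"]))  # last feature is wrong, "thanks" still wins

The second call illustrates the error-tolerance intuition at a toy scale: an erroneous feature lowers the score but, with smoothing, does not eliminate the correct candidate.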

Original language: English
Pages (from-to): 495-508
Number of pages: 14
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 26
Issue number: 4
DOIs
Publication status: Published - 2004 Apr

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics

