Color Correction Parameter Estimation on the Smartphone and Its Application to Automatic Tongue Diagnosis

Min-Chun Hu, Ming Hsun Cheng, Kun-Chan Lan

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)

Abstract

Background: An automatic tongue diagnosis framework is proposed to analyze tongue images taken by smartphones. Unlike the input to conventional tongue diagnosis systems, our tongue images are usually of low resolution and taken under unknown lighting conditions; consequently, existing tongue diagnosis methods cannot be applied directly and still give accurate results. Materials and Methods: We use a support vector machine (SVM) to predict the lighting condition, and hence the corresponding color correction matrix, from the color difference between images taken with and without flash. We also modify the state-of-the-art fur and fissure detection method for tongue images by taking hue information into account and adding a denoising step. Results: Our method corrects the color of tongue images under different lighting conditions (e.g., fluorescent, incandescent, and halogen illuminants) and detects tongue features more accurately, with less processing complexity, than the prior work. Conclusions: In this work, we propose an automatic tongue diagnosis framework that can run on smartphones. Unlike prior work, which operates only in a controlled environment, our system adapts to different lighting conditions by employing a novel color correction parameter estimation scheme.
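The color correction step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the training features, illuminant classes, and 3×3 correction matrices below are synthetic placeholders, and scikit-learn's `SVC` stands in for whatever SVM configuration the paper actually uses. The idea is that the per-image mean RGB difference between the flash and no-flash shots serves as the feature, the SVM picks an illuminant class, and the class's correction matrix is applied to the pixels.

```python
# Hedged sketch of the paper's pipeline: an SVM classifies the illuminant
# from flash/no-flash color differences, then the matching 3x3 color
# correction matrix is applied. All numbers here are placeholders.
import numpy as np
from sklearn.svm import SVC

# Synthetic training set: mean RGB difference (flash minus no-flash) per
# image, labeled 0=fluorescent, 1=incandescent, 2=halogen (placeholder data).
rng = np.random.default_rng(0)
centers = np.array([[30.0, 10.0, -5.0],    # fluorescent (placeholder)
                    [-20.0, 5.0, 25.0],    # incandescent (placeholder)
                    [5.0, -15.0, 10.0]])   # halogen (placeholder)
X = np.vstack([c + rng.normal(0.0, 2.0, size=(40, 3)) for c in centers])
y = np.repeat([0, 1, 2], 40)

clf = SVC(kernel="rbf", C=1.0).fit(X, y)

# One placeholder 3x3 correction matrix per illuminant class; in the paper
# these would be estimated offline for each lighting condition.
correction = {
    0: np.eye(3),
    1: np.array([[1.10, 0.00, 0.00],
                 [0.00, 1.00, 0.00],
                 [0.00, 0.00, 0.90]]),
    2: np.array([[1.05, 0.02, 0.00],
                 [0.01, 1.00, 0.00],
                 [0.00, 0.00, 0.95]]),
}

def correct_image(pixels_rgb, diff_feature):
    """Predict the illuminant class from the flash/no-flash color difference
    and apply the corresponding correction matrix to N x 3 RGB pixels."""
    cls = int(clf.predict(diff_feature.reshape(1, -1))[0])
    return pixels_rgb @ correction[cls].T, cls
```

For example, a color-difference feature near the (hypothetical) incandescent cluster selects class 1, and each pixel is corrected by that class's matrix; the corrected image can then be passed to the fur and fissure detection stage.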

Original language: English
Article number: 18
Journal: Journal of Medical Systems
Volume: 40
Issue number: 1
DOIs
Publication status: Published - 2016 Jan 1

All Science Journal Classification (ASJC) codes

  • Medicine (miscellaneous)
  • Information Systems
  • Health Informatics
  • Health Information Management

