Tongue features are an important objective basis for clinical diagnosis and treatment in both Western and Chinese medicine. The need for continuous monitoring of health conditions motivates us to develop an automatic tongue diagnosis system based on the built-in sensors of smartphones. However, tongue images taken by smartphones vary considerably in color under different lighting conditions, which affects diagnosis, especially when the appearance of the tongue fur is used to infer health conditions. In this paper, we capture paired tongue images with and without flash, and use the color difference between the paired images to estimate the lighting condition with a Support Vector Machine (SVM). Color correction matrices for three common light sources (i.e., fluorescent, halogen, and incandescent) are pre-trained with a ColorChecker-based method, and the pre-trained matrix corresponding to the estimated lighting is then applied to eliminate the color distortion. We further use tongue fur detection as an example to discuss how different model parameters and ColorCheckers affect the training of the tongue color correction matrix under different lighting conditions. Finally, to demonstrate the potential of the proposed system, we recruited 246 patients over a period of 2.5 years from a local hospital in Taiwan and examined the correlations between the captured tongue features and alanine aminotransferase (ALT)/aspartate aminotransferase (AST), which are important biomarkers for liver diseases. We found that some tongue features have a strong correlation with AST or ALT, which suggests that tongue features captured on a smartphone could provide an early warning of liver diseases.
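The correction pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the SVM features, training samples, and correction matrices shown here are placeholders (identity matrices stand in for the ColorChecker-calibrated ones), and the function name `correct_tongue_image` is hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical training data: mean RGB differences (flash minus no-flash)
# for three lighting classes: 0=fluorescent, 1=halogen, 2=incandescent.
# In the real system these would come from labeled paired captures.
X_train = np.array([
    [30.0, 25.0, 40.0],   # fluorescent-like difference
    [28.0, 24.0, 42.0],
    [45.0, 20.0, 10.0],   # halogen-like difference
    [47.0, 22.0, 12.0],
    [55.0, 18.0,  5.0],   # incandescent-like difference
    [53.0, 16.0,  6.0],
])
y_train = np.array([0, 0, 1, 1, 2, 2])

clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X_train, y_train)

# Pre-trained 3x3 color correction matrices, one per light source.
# Identity matrices are placeholders; the real ones would be obtained
# from ColorChecker-based calibration under each light.
correction = {0: np.eye(3), 1: np.eye(3), 2: np.eye(3)}

def correct_tongue_image(img_no_flash, img_flash):
    """Estimate the light source from the flash/no-flash mean color
    difference, then apply the matching 3x3 correction matrix."""
    diff = img_flash.mean(axis=(0, 1)) - img_no_flash.mean(axis=(0, 1))
    light = int(clf.predict(diff.reshape(1, -1))[0])
    M = correction[light]
    h, w, _ = img_no_flash.shape
    corrected = img_no_flash.reshape(-1, 3) @ M.T
    return np.clip(corrected, 0, 255).reshape(h, w, 3), light
```

The key design point is that the flash acts as a known reference illuminant: the color shift it induces depends on the ambient light, so the paired difference is a usable feature for classifying the light source without any external calibration target at capture time.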