TY - JOUR
T1 - Texture-Map-Based Branch-Collaborative Network for Oral Cancer Detection
AU - Chan, Chih Hung
AU - Huang, Tze Ta
AU - Chen, Chih Yang
AU - Lee, Chien Cheng
AU - Chan, Man Yee
AU - Chung, Pau Choo
N1 - Funding Information:
Manuscript received November 19, 2018; revised March 30, 2019; accepted May 3, 2019. Date of publication May 22, 2019; date of current version July 26, 2019. This work was supported in part by the Ministry of Science and Technology, Taiwan, under Grant MOST-107-2634-F-006-004 and in part by the Delta-NCKU joint project. This paper was recommended by Associate Editor J. Lü. (Corresponding author: Pau-Choo Chung.) C.-H. Chan is with the NVIDIA Corporation, Taipei 114, Taiwan (e-mail: [email protected]).
Publisher Copyright:
© 2007-2012 IEEE.
PY - 2019/8
Y1 - 2019/8
N2 - This paper proposes an innovative deep convolutional neural network (DCNN) combined with a texture map for detecting cancerous regions and marking the region of interest (ROI) automatically in a single model. The proposed DCNN model contains two collaborative branches, namely an upper branch that performs oral cancer detection and a lower branch that performs semantic segmentation and ROI marking. The upper branch extracts the cancerous regions, and the lower branch refines these regions to improve their precision. To make the features in the cancerous regions more regular, the network model extracts texture images from the input image. A sliding window is then applied to compute the standard deviation values of the texture image. Finally, the standard deviation values are used to construct a texture map, which is partitioned into multiple patches and used as the input data to the deep convolutional network model. The proposed method is called the texture-map-based branch-collaborative network. In the experimental results, the average sensitivity and specificity of detection reach 0.9687 and 0.7129, respectively, when based on the wavelet transform, and 0.9314 and 0.9475, respectively, when based on the Gabor filter.
AB - This paper proposes an innovative deep convolutional neural network (DCNN) combined with a texture map for detecting cancerous regions and marking the region of interest (ROI) automatically in a single model. The proposed DCNN model contains two collaborative branches, namely an upper branch that performs oral cancer detection and a lower branch that performs semantic segmentation and ROI marking. The upper branch extracts the cancerous regions, and the lower branch refines these regions to improve their precision. To make the features in the cancerous regions more regular, the network model extracts texture images from the input image. A sliding window is then applied to compute the standard deviation values of the texture image. Finally, the standard deviation values are used to construct a texture map, which is partitioned into multiple patches and used as the input data to the deep convolutional network model. The proposed method is called the texture-map-based branch-collaborative network. In the experimental results, the average sensitivity and specificity of detection reach 0.9687 and 0.7129, respectively, when based on the wavelet transform, and 0.9314 and 0.9475, respectively, when based on the Gabor filter.
UR - http://www.scopus.com/inward/record.url?scp=85070949853&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85070949853&partnerID=8YFLogxK
U2 - 10.1109/TBCAS.2019.2918244
DO - 10.1109/TBCAS.2019.2918244
M3 - Article
C2 - 31135368
AN - SCOPUS:85070949853
SN - 1932-4545
VL - 13
SP - 766
EP - 780
JO - IEEE Transactions on Biomedical Circuits and Systems
JF - IEEE Transactions on Biomedical Circuits and Systems
IS - 4
M1 - 8719967
ER -