The paper proposes a deep convolutional neural network (DCNN) combined with a texture map that detects cancerous regions and marks the region of interest (ROI) in a single model automatically. The proposed DCNN contains two collaborative branches: an upper branch that performs oral cancer detection, and a lower branch that performs semantic segmentation and ROI marking. The upper branch extracts the cancerous regions, while the lower branch refines those regions for greater precision. To make the features within the cancerous regions more regular, the model first extracts texture images from the input image. A sliding window is then applied to compute the standard deviation of the texture image within each window. Finally, these standard deviation values are used to construct a texture map, which is partitioned into multiple patches and fed to the deep convolutional network as input. The method proposed in this paper is called a texture-map-based branch-collaborative network. In the experimental results, the average sensitivity and specificity of detection reach 0.9687 and 0.7129, respectively, when the texture is extracted with the wavelet transform, and 0.9314 and 0.9475, respectively, when it is extracted with a Gabor filter.
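The texture-map construction described above (sliding a window over the texture image and recording the per-window standard deviation) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the window size, stride, and non-overlapping layout are assumptions.

```python
import numpy as np

def texture_map(texture_img, win=8, stride=8):
    """Build a texture map by sliding a win x win window over the
    texture image and storing the standard deviation of each window.
    Window size and stride are illustrative assumptions."""
    h, w = texture_img.shape
    rows = (h - win) // stride + 1
    cols = (w - win) // stride + 1
    tmap = np.empty((rows, cols), dtype=np.float64)
    for i in range(rows):
        for j in range(cols):
            patch = texture_img[i * stride:i * stride + win,
                                j * stride:j * stride + win]
            tmap[i, j] = patch.std()  # local texture variability
    return tmap
```

A uniform region yields zero standard deviation everywhere, while highly textured (e.g. Gabor- or wavelet-filtered) regions yield larger values, so the map highlights areas of irregular texture before patching.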
| Number of pages | 15 |
| Journal | IEEE Transactions on Biomedical Circuits and Systems |
| Publication status | Published - 2019 Aug |
All Science Journal Classification (ASJC) codes
- Biomedical Engineering
- Electrical and Electronic Engineering