TY - GEN
T1 - Deep Learning-based Computerized Tomographic Imaging for Differentiation and Segmentation of Parotid Gland Neoplasm
AU - Chang, Chan-Chi
AU - Horng, Ming Huwi
AU - Jiang, Jheng You
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - We applied convolutional neural networks (CNNs) to parotid tumor classification and segmentation. CNN bounding-box prediction was used to detect the areas of parotid tumors, and the YOLOv4 method achieved an AP50 of 0.964. Furthermore, ResNet+CBAM and ResNet+BiFPN were applied to classify each image as a mixed, Warthin, or malignant tumor. The classification accuracies of ResNet+BiFPN and ResNet+CBAM were 0.8526 and 0.8419 (for mixed, malignant, and Warthin tumors) and 0.8216 and 0.8111 (for mixed and malignant tumors). To effectively classify the slice images of patients and normal participants, we developed a decision tree that integrates the per-image classifications into a final decision. Using U-Net and U-Net++, we segmented the tumors in the images; on 1493 tumor images, U-Net and U-Net++ achieved Dice coefficients of 0.850 and 0.863, respectively. Overall, the classification reached an accuracy of 87% and the segmentation a Dice coefficient of 0.91.
AB - We applied convolutional neural networks (CNNs) to parotid tumor classification and segmentation. CNN bounding-box prediction was used to detect the areas of parotid tumors, and the YOLOv4 method achieved an AP50 of 0.964. Furthermore, ResNet+CBAM and ResNet+BiFPN were applied to classify each image as a mixed, Warthin, or malignant tumor. The classification accuracies of ResNet+BiFPN and ResNet+CBAM were 0.8526 and 0.8419 (for mixed, malignant, and Warthin tumors) and 0.8216 and 0.8111 (for mixed and malignant tumors). To effectively classify the slice images of patients and normal participants, we developed a decision tree that integrates the per-image classifications into a final decision. Using U-Net and U-Net++, we segmented the tumors in the images; on 1493 tumor images, U-Net and U-Net++ achieved Dice coefficients of 0.850 and 0.863, respectively. Overall, the classification reached an accuracy of 87% and the segmentation a Dice coefficient of 0.91.
UR - http://www.scopus.com/inward/record.url?scp=85193252114&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85193252114&partnerID=8YFLogxK
U2 - 10.1109/ECEI60433.2024.10510809
DO - 10.1109/ECEI60433.2024.10510809
M3 - Conference contribution
AN - SCOPUS:85193252114
T3 - 2024 IEEE 7th Eurasian Conference on Educational Innovation: Educational Innovations and Emerging Technologies, ECEI 2024
SP - 127
EP - 131
BT - 2024 IEEE 7th Eurasian Conference on Educational Innovation
A2 - Meen, Teen-Hang
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 7th IEEE Eurasian Conference on Educational Innovation, ECEI 2024
Y2 - 26 January 2024 through 28 January 2024
ER -