Prostate segmentation and volume estimation in MRI

Chuan Yu Chang, Chuan Huan Chiu, Yuh Shyan Tsai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

To detect prostate diseases, urologists usually use ultrasound images or magnetic resonance images (MRI) for clinical diagnosis. In clinical practice, the prostate region is outlined manually by the urologist; however, manually outlining the prostate boundary is highly time-consuming. In this paper, a method for prostate segmentation and volume estimation in MRI is proposed. An active contour model (ACM) is adopted to obtain the initial contour of the prostate. Four textural features extracted from the prostate are used to train an SVM classifier, and non-prostate regions are then excluded by the trained SVM. A quick convex hull is applied to refine the shape of the prostate, and the volume of the prostate is finally estimated from the series of segmented prostate regions. The proposed segmentation method achieves a high accuracy of 93.7%. Our experimental results show that the proposed prostate segmentation and volume estimation method has high potential for assisting urologists in clinical diagnosis.
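The final step of the pipeline, estimating volume from the series of segmented slices, can be sketched as summing segmented pixel areas across slices and scaling by slice thickness. This is a minimal illustration, not the paper's implementation; the pixel spacing and slice thickness values below are hypothetical example parameters.

```python
# Hedged sketch of slice-based volume estimation: sum the segmented
# in-plane area of each MRI slice, then multiply by slice thickness.
# pixel_spacing_mm and slice_thickness_mm are illustrative values only.

def estimate_volume(masks, pixel_spacing_mm=0.5, slice_thickness_mm=3.0):
    """Estimate volume (mm^3) from a series of 2-D binary masks.

    masks: list of 2-D masks (lists of lists of 0/1), one per MRI slice.
    """
    pixel_area = pixel_spacing_mm ** 2  # in-plane area of one pixel (mm^2)
    total = 0.0
    for mask in masks:
        segmented_pixels = sum(sum(row) for row in mask)  # count of 1s
        total += segmented_pixels * pixel_area * slice_thickness_mm
    return total

# Example: two slices, each with 4 segmented pixels
masks = [
    [[0, 1, 1], [0, 1, 1], [0, 0, 0]],
    [[1, 1, 0], [1, 1, 0], [0, 0, 0]],
]
print(estimate_volume(masks))  # 8 pixels * 0.25 mm^2 * 3 mm = 6.0
```

In practice the spacing and thickness would be read from the DICOM headers of the MRI series rather than hard-coded.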

Original language: English
Title of host publication: Intelligent Systems and Applications - Proceedings of the International Computer Symposium, ICS 2014
Editors: William Cheng-Chung Chu, Stephen Jenn-Hwa Yang, Han-Chieh Chao
Publisher: IOS Press
Pages: 1907-1917
Number of pages: 11
ISBN (Electronic): 9781614994831
DOIs
Publication status: Published - 2015
Event: International Computer Symposium, ICS 2014 - Taichung, Taiwan
Duration: 2014 Dec 12 to 2014 Dec 14

Publication series

Name: Frontiers in Artificial Intelligence and Applications
Volume: 274
ISSN (Print): 0922-6389


All Science Journal Classification (ASJC) codes

  • Artificial Intelligence

