Pupil localization for ophthalmic diagnosis using anchor ellipse regression

Horng Horng Lin, Zheng Yi Li, Min Hsiu Shih, Yung Nien Sun, Ting Li Shen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Recent developments in deep neural networks, such as Mask R-CNN, have shown significant advances in simultaneous object detection and segmentation. We thus apply deep learning to pupil localization for ophthalmic diagnosis and propose a novel anchor ellipse regression approach, based on the region proposal network and Mask R-CNN, that simultaneously detects pupils, estimates pupil shape parameters, and segments pupil regions in infrared images. Experiments demonstrate that this extension of Mask R-CNN with anchor ellipse regression is effective for size and rotation estimation of elliptical objects, as well as for object detection and segmentation. Temporal pupil size estimates produced by the proposed approach for normal and abnormal subjects provide meaningful indices of pupil size change for ophthalmic diagnosis.
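The anchor ellipse regression described above extends the four anchor-box deltas of a region proposal network to the five parameters of an ellipse (center, semi-axes, rotation). A minimal sketch of one plausible delta encoding, analogous to Faster R-CNN box-delta regression, is below; the function names and the exact parameterization are assumptions for illustration, not the paper's published formulation.

```python
import math

def encode_ellipse_deltas(anchor, ellipse):
    """Encode an ellipse (cx, cy, a, b, theta) as regression targets
    relative to an anchor box (x_center, y_center, w, h).
    This parameterization is an illustrative assumption, modeled on the
    standard RPN box-delta encoding; the paper's exact scheme may differ."""
    ax, ay, aw, ah = anchor
    cx, cy, a, b, theta = ellipse
    return (
        (cx - ax) / aw,        # center offsets, normalized by anchor size
        (cy - ay) / ah,
        math.log(2 * a / aw),  # log ratio of ellipse diameter to anchor width
        math.log(2 * b / ah),  # log ratio of ellipse diameter to anchor height
        theta,                 # rotation angle regressed directly
    )

def decode_ellipse_deltas(anchor, deltas):
    """Invert the encoding to recover ellipse parameters from predicted deltas."""
    ax, ay, aw, ah = anchor
    dx, dy, da, db, dt = deltas
    return (
        ax + dx * aw,
        ay + dy * ah,
        0.5 * aw * math.exp(da),
        0.5 * ah * math.exp(db),
        dt,
    )
```

With this kind of encoding, the network regresses bounded, scale-normalized targets rather than raw pixel coordinates, which is the same design rationale as the log-space width/height deltas used for axis-aligned anchor boxes.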

Original language: English
Title of host publication: Proceedings of the 16th International Conference on Machine Vision Applications, MVA 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9784901122184
DOIs
Publication status: Published - 2019 May
Event: 16th International Conference on Machine Vision Applications, MVA 2019 - Tokyo, Japan
Duration: 2019 May 27 - 2019 May 31

Publication series

Name: Proceedings of the 16th International Conference on Machine Vision Applications, MVA 2019

Conference

Conference: 16th International Conference on Machine Vision Applications, MVA 2019
Country: Japan
City: Tokyo
Period: 19-05-27 - 19-05-31

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Signal Processing
  • Computer Vision and Pattern Recognition

