Pupil Segmentation Using Orientation Fields, Radial Non-Maximal Suppression and Elliptic Approximation
LEE, S.
Author keywords
image edge detection, image segmentation, image texture analysis, iris recognition, pattern analysis
About this article
Date of Publication: 2019-05-31
Volume 19, Issue 2, Year 2019, On page(s): 69 - 74
ISSN: 1582-7445, e-ISSN: 1844-7600
Digital Object Identifier: 10.4316/AECE.2019.02009
Web of Science Accession Number: 000475806300009
SCOPUS ID: 85066319709
Abstract
This paper proposes a novel pupil segmentation method for robust iris recognition systems. The proposed method uses orientation fields to accurately detect an initial pupil center and applies radial non-maximal suppression to remove non-pupil boundary edges. The pupil boundary is then fitted iteratively by radius updating, center shifting, and region-of-interest (ROI) shrinking, which adjust the radius and center of a circular model; finally, the estimated boundary is approximated with a novel elliptic model. The elliptic approximation segments pupil boundaries more accurately than circular models, and the detection hit ratio is greatly improved by the robust detection of initial centers. Experimental results show that the proposed method accurately detects pupils in a variety of iris images.
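No implementation is provided on this page; the following is a minimal sketch of the radial non-maximal suppression step, assuming a grayscale eye image, Sobel gradients, and a previously estimated pupil center. The function name and the parameters `n_rays` and `r_max` are illustrative assumptions, not the authors' implementation.

```python
# Sketch: radial non-maximal suppression around a candidate pupil center.
# Assumptions: grayscale input, Sobel gradient magnitude; parameter names
# and values are hypothetical, not taken from the paper.
import numpy as np
from scipy import ndimage

def radial_non_maximal_suppression(gray, center, n_rays=360, r_max=120):
    """Keep only the strongest gradient sample along each ray from `center`.

    Returns an (n_kept, 2) array of (y, x) edge candidates likely to lie on
    the pupil boundary; weaker edges along the same radius are suppressed.
    """
    # Gradient magnitude (illustrative choice: Sobel operators).
    gray = gray.astype(float)
    gy = ndimage.sobel(gray, axis=0)
    gx = ndimage.sobel(gray, axis=1)
    mag = np.hypot(gx, gy)

    cy, cx = center
    radii = np.arange(1, r_max)
    kept = []
    for theta in np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False):
        # Sample the gradient magnitude along one radial direction.
        ys = np.clip((cy + radii * np.sin(theta)).round().astype(int), 0, gray.shape[0] - 1)
        xs = np.clip((cx + radii * np.cos(theta)).round().astype(int), 0, gray.shape[1] - 1)
        profile = mag[ys, xs]
        k = int(np.argmax(profile))  # strongest response along this ray
        if profile[k] > 0:
            kept.append((ys[k], xs[k]))
    return np.asarray(kept, dtype=int)
```

Under these assumptions, the retained edge candidates could then drive the iterative circular fit (radius updating, center shifting, ROI shrinking) and the final elliptic approximation described in the abstract.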