ECAI 2018 – International Conference – 10th Edition
Electronics, Computers and Artificial Intelligence, 28 June – 30 June 2018, Iasi, ROMÂNIA

IMPROVED CROSS SPECTRAL IRIS MATCHING USING MULTI SCALE WEBERFACE-GABOR LOCAL BINARY PATTERN (MGLBP)

Maulisa Oktiana2, Khairun Saddami2, Fitri Arnia1,2, Yuwaldi Away1,2, Khairul Munadi1,2*
1Department of Electrical and Computer Engineering
2Postgraduate Program in Engineering
Syiah Kuala University, Banda Aceh, Indonesia
*Corresponding author: [anonimizat]

Abstract – Iris matching is widely used in various biometric systems, such as person identification and verification. Cross spectral iris matching is defined as matching between two iris images acquired in different electromagnetic spectra. One of the problems in cross spectral iris matching is the modality gap between the images; therefore, feature descriptors that are illumination invariant and yield higher accuracy are required. This paper proposes an improved cross spectral iris matching framework using the Multi Scale Weberface (MSW). In the proposed method, MSW is integrated with the Gabor Local Binary Pattern (GLBP). Experimental results demonstrate that the proposed framework outperforms the previous methods with higher accuracy. By using MGLBP, the recognition rate increases by 40% compared to LBP and by 9% compared to GLBP.

Keywords – iris matching; cross spectral; NIR images; visible images; GLBP; MSW.

I. INTRODUCTION

Iris matching has received increasing attention in recent years. This research is in line with applications of biometric identification such as national ID programs, security applications, and unique identification authorities [1]. Previously, iris matching used the uniqueness of iris texture in an image acquired in the visible light spectrum (VIS) [2-4]. Recently, iris images have been matched against images captured in a different domain, such as Near Infrared (NIR) versus VIS, to increase iris recognition performance [5].
The identification becomes more robust because the cross-spectral scheme exploits the additional information that exists in the two different spectra. The key idea of cross spectral iris matching is to produce the same representation for both the VIS and the NIR image. Producing a similar representation of VIS and NIR images is a great challenge due to the modality gap between them. Therefore, a feature descriptor that is invariant to illumination and accurately represents the contents of VIS and NIR images is required in cross spectral iris matching.

Multi Scale Weberface (MSW) is a photometric normalization that aims to reduce local variations caused by differences of modality and illumination between VIS and NIR images. MSW is used in face recognition at the preprocessing stage. By using MSW, the details of the face become clear even when the original image quality is very poor. MSW extracts the prominent pattern of an image by calculating the ratio between the relative intensity difference of the current pixel to its neighbors and the intensity of the current pixel itself. Therefore, in this study MSW is applied to improve the detail of the iris and pupil, which have poor quality due to illumination variations in NIR and VIS images.

In this paper, we propose integrating MSW and the Gabor Local Binary Pattern (GLBP) in order to generate robust feature descriptors. For practical reasons, our approach is termed MGLBP. With MGLBP, we can enhance the iris features, which results in a higher recognition rate. Moreover, we also propose adapting a face recognition approach to the cross spectral iris matching domain.

This paper is organized as follows. Section 2 provides a relevant literature review, and Section 3 explains the proposed framework. Results and discussion are given in Section 4, and the conclusion is summarized in Section 5.

II.
LITERATURE REVIEW

Cross spectral matching refers to the ability to recognize objects presented in two different modalities.
Figure 1. Iris image in (a) visible light spectrum and (b) near-infrared spectrum
Figure 2. Proposed framework

The modalities used in cross spectral matching are the visible light spectrum (VIS) and Near Infrared (NIR). Iris recognition is one of the benchmark tasks addressed with a cross spectral scheme, and various feature extraction methods have been used for it. Sharma et al. employed the pyramid of histograms of oriented gradients (PHOG) for feature extraction [8]; the PHOG feature vector is then used as the input to a neural network. Another interesting approach was conducted by Ramaiah and Kumar [9-10]. They exploited the real value of the Log-Gabor phase for feature representation; this real-valued Log-Gabor phase can be adopted for domain adaptation learning in order to improve matching accuracy. Furthermore, Ramaiah and Kumar also investigated the integration of the real value of the 1-D log-Gabor filter with various descriptors [11]. They convolved the 1-D log-Gabor filter with two variants of the Local Binary Pattern (LBP), namely the Four Patch LBP (FPLBP) and the Three Patch LBP (TPLBP). Recently, Behera et al. alleviated the wavelength difference between NIR and VIS periocular images by convolving the images with Difference of Gaussian (DoG) filtering [12]; the features are then extracted using Histogram of Oriented Gradients (HOG) descriptors. Abdullah et al. explored various photometric normalization techniques and descriptors, namely Gabor Difference of Gaussian (G-DoG), Gabor Binarized Statistical Image Features (G-BSIF), and Gabor Multi Scale Weberface (G-MSW), to increase matching accuracy using decision fusion [13]; the matching results of the methods are fused with the AND rule. In this paper, we integrate the Multi Scale Weberface (MSW) with the 1D log-Gabor filter to reduce cross-spectral iris variations and enhance feature extraction.
Multi Scale Weberface (MSW) is known as one of the photometric normalization techniques capable of reducing illumination variation in face recognition [14]. MSW is insensitive to variations of the lighting sources because it compares the local intensity variation with its background. Furthermore, the structural details in the illumination-normalized iris image are captured by LBP. To the best of our knowledge, this is the first work in the literature to propose this framework as a feature representation for VIS-to-NIR iris matching.

III. PROPOSED METHOD

In this section, we briefly explain the proposed method for NIR and visible iris matching. For feature extraction, we exploit the advantages of the Multi Scale Weberface and the Gabor Local Binary Pattern. The proposed framework is illustrated in Fig. 2. Each stage is explained as follows.

A. VIS and NIR Image Segmentation
The purpose of segmentation is to isolate the valid iris and pupil boundaries in VIS and NIR images. In this framework, VIS and NIR images are segmented using the Circular Hough Transform (CHT). CHT finds the circular boundaries in a set of edge points obtained with the Canny edge detector [15]. The segmentation process is as described in [16].

B. VIS and NIR Iris Normalization
The VIS and NIR iris normalization is performed using Daugman's rubber sheet normalization as in [17-18]. This process transforms the 'doughnut'-shaped iris region into polar coordinates. This step produces normalized VIS and NIR irises with a fixed resolution, namely a radial resolution of 20 and an angular resolution of 240.
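The rubber-sheet unwrapping of step B can be sketched as follows. This is a minimal illustration assuming concentric pupil and iris circles (real CHT segmentations are generally non-concentric and include eyelid masking); the function and parameter names are our own.

```python
import numpy as np

def rubber_sheet_normalize(img, center, pupil_r, iris_r,
                           radial_res=20, angular_res=240):
    """Unwrap the annular iris region into a fixed-size polar strip
    (Daugman's rubber-sheet model), here simplified to concentric circles."""
    cx, cy = center
    thetas = np.linspace(0, 2 * np.pi, angular_res, endpoint=False)
    radii = np.linspace(0, 1, radial_res)
    out = np.zeros((radial_res, angular_res), dtype=img.dtype)
    for i, r in enumerate(radii):
        # interpolate each sample radius between the pupil and iris boundaries
        rho = pupil_r + r * (iris_r - pupil_r)
        xs = np.clip((cx + rho * np.cos(thetas)).astype(int), 0, img.shape[1] - 1)
        ys = np.clip((cy + rho * np.sin(thetas)).astype(int), 0, img.shape[0] - 1)
        out[i] = img[ys, xs]
    return out
```

With the paper's settings (radial resolution 20, angular resolution 240), every segmented iris maps to a 20 x 240 strip regardless of pupil dilation or image scale.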
[Figure 2 depicts the pipeline: both the enrollment (NIR) image and the query (VIS, red channel) image pass through CHT segmentation, rubber sheet iris normalization, Multi Scale Weberface, the 1D log-Gabor filter, and the Local Binary Pattern to produce an iris code; the two iris codes are compared by similarity measurement, and the resulting matching score is classified as genuine or imposter against the reference threshold.]
C. Multi Scale Weberface
Multi Scale Weberface (MSW) is used to eliminate illumination variations of VIS and NIR images based on Weber's law. The Weberface is computed as follows [14]:

Weberface(x, y) = arctan( α Σ_{i=0}^{p−1} (x_c − x_i) / x_c )    (1)

where α is a parameter that balances the intensity variations between neighboring pixels, x_c is the center pixel, x_i (i = 0, 1, …, p−1) are the neighboring pixels, and p is the number of neighbors. MSW is used in face recognition to calculate the relative gradient in Weber contrast for several neighborhood sizes. By using MSW, the details of the face become clear even when the original image quality is very poor. Three parameters affect the MSW results: the standard deviation of the Gaussian filter used in the smoothing step (σ), the size of the neighborhood used for computing the Weberfaces, and α. In this experiment we used σ = 1, 0.75, 0.5; neighborhood sizes of 9, 25, 49; and α = 2, 0.2, 0.02, following the previous study [13].

D. 1D Log-Gabor Filter
The 1D log-Gabor filter has good theoretical properties and is relevant to human vision; it is therefore often integrated with other features [19]. Log-Gabor filters are used because they have a zero DC component for arbitrarily large bandwidths; in addition, the size distribution of features in an image is often logarithmic. The 1D log-Gabor filter is defined as [20]:

G(f) = exp( −0.5 [log(f/f₀)]² / [log(σ/f₀)]² )    (2)

where f₀ is the centre frequency and σ determines the bandwidth of the filter.

E. Local Binary Pattern
The Local Binary Pattern (LBP) is a local texture descriptor in which texture differences are represented by the gray-scale value of each pixel. For each neighborhood, the value of every neighboring pixel is compared with the center pixel, and the result is encoded as a binary 0 or 1. The LBP feature extraction is as described in [21].

F. Similarity Measurement
The matching process is defined as the degree of similarity between two feature sets. In this study, the Hamming distance similarity measurement is used.
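Eq. (1) at the three (σ, neighborhood, α) scales can be sketched as below. This is a minimal illustration, not the authors' implementation: it assumes SciPy's Gaussian smoothing for the preprocessing step, maps the neighbor counts 9, 25, 49 to 3x3, 5x5, 7x7 windows, and combines the scales by a plain sum; a small ε guards against division by zero.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def weberface(img, sigma=1.0, window=3, alpha=2.0, eps=1e-6):
    """Single-scale Weberface (Eq. 1): arctan of alpha times the summed
    relative intensity differences between each pixel and its neighbors."""
    smoothed = gaussian_filter(img.astype(float), sigma)
    h = window // 2
    padded = np.pad(smoothed, h, mode='edge')
    acc = np.zeros_like(smoothed)
    for dy in range(-h, h + 1):
        for dx in range(-h, h + 1):
            if dy == 0 and dx == 0:
                continue  # skip the center pixel itself
            neighbor = padded[h + dy:h + dy + img.shape[0],
                              h + dx:h + dx + img.shape[1]]
            acc += (smoothed - neighbor) / (smoothed + eps)
    return np.arctan(alpha * acc)

def multiscale_weberface(img,
                         scales=((1.0, 3, 2.0), (0.75, 5, 0.2), (0.5, 7, 0.02))):
    """Combine Weberfaces at the paper's three (sigma, window, alpha) settings;
    windows 3, 5, 7 correspond to 9, 25, 49 neighbors."""
    return sum(weberface(img, s, w, a) for s, w, a in scales)
```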
Hamming distance is used to measure how many bits differ between two template bit patterns, so that it can be determined whether the two patterns come from the same or different irises. The Hamming distance is calculated according to [22]:

HD = ( Σ_{j=1}^{N} (T1_j ⊕ T2_j) ∧ ¬mask1_j ∧ ¬mask2_j ) / ( N − Σ_{k=1}^{N} (mask1_k ∨ mask2_k) )    (3)

where T1 is the bit-wise template 1, T2 is the bit-wise template 2, N is the number of bits per template, and mask1 and mask2 are the corresponding noise masks of each template.

TABLE I. HAMMING DISTANCE VALUES

Query   LBP       GLBP       MGLBP
1       0.36958   0          0
2       0.4899    0.1016949  0.0660714
3       0.11501   0.1136363  0
4       0.04397   0.0131795  0.0019342
5       0.19264   0.0076628  0
6       0.19435   0          0
7       0.18409   0.0204081  0.0019342
8       0.25474   0.0195694  0.0057692
9       0.10422   0.060176   0
10      0.01374   0          0

G. Reference Threshold
The reference threshold is used to decide whether a matching score is genuine or imposter [23]. The reference threshold value is calculated according to [22]:

T = |μ_a − μ_e| / √((σ_a² + σ_e²)/2)    (4)

where T is the reference threshold, μ_a is the mean of the intra-class Hamming distance distribution, and μ_e is the mean of the inter-class Hamming distance distribution. σ_a and σ_e are the standard deviations of the intra- and inter-class Hamming distance distributions, respectively.

H. Performance Evaluation
The accuracy of cross-spectral iris matching is measured by the recognition rate, while the efficacy is evaluated using the False Acceptance Rate (FAR) and the False Rejection Rate (FRR). The recognition rate is defined as the ratio of images correctly recognized by the system. A false acceptance occurs when the Hamming distance of an imposter is smaller than the reference threshold. A false rejection occurs when the Hamming distance of a genuine user is greater than the reference threshold, so the identical image is not recognized by the system. The best iris recognition system achieves low FAR and FRR together with a high recognition rate [23].
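Eq. (3) can be sketched as follows for boolean NumPy templates. This follows the noise-mask convention of Masek [22] (a mask bit of 1 marks an unusable position); the function name and the fallback for templates with no usable bits are our own assumptions.

```python
import numpy as np

def hamming_distance(t1, t2, mask1, mask2):
    """Masked Hamming distance (Eq. 3): fraction of disagreeing bits,
    counting only positions not flagged as noise in either template."""
    noise = mask1 | mask2                  # bits unusable in either template
    valid = t1.size - int(noise.sum())     # N - sum(mask1 OR mask2)
    if valid == 0:
        return 1.0                         # no usable bits: report maximal distance
    disagree = int(((t1 ^ t2) & ~noise).sum())
    return disagree / valid
```

Identical templates give HD = 0, which matches the zero entries for MGLBP in Table I; unrelated templates drift toward 0.5, since each bit pair agrees by chance.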
The recognition rate is computed as:

Recognition rate = (correct comparisons / total comparisons) × 100%    (5)

FAR is computed as:

FAR = (wrongly accepted comparisons / total comparisons) × 100%    (6)
TABLE II. CROSS SPECTRAL IRIS MATCHING PERFORMANCE (%)

Reference Threshold 0.38
                   Inter-class matching       Intra-class matching
                   LBP     GLBP    MGLBP      LBP     GLBP    MGLBP
FAR                96.47   98.8    99.3       0       0       0
FRR                10      0       0          13.37   0       0
Recognition Rate   90      100     100        86.67   100     100

Reference Threshold 0.2
                   LBP     GLBP    MGLBP      LBP     GLBP    MGLBP
FAR                76.5    95.7    99.3       0       0       0
FRR                30      0       0          54.7    0       0
Recognition Rate   70      100     100        45.3    100     100

Reference Threshold 0.1
                   LBP     GLBP    MGLBP      LBP     GLBP    MGLBP
FAR                49.5    85.47   96.6       0       0       0
FRR                80      20      0          70.67   7.33    0.067
Recognition Rate   20      80      100        29.33   92.67   99.93

Reference Threshold 0.05
                   LBP     GLBP    MGLBP      LBP     GLBP    MGLBP
FAR                24.4    62.8    83.07      0       0       0
FRR                80      30      0.067      74.7    30.667  9.33
Recognition Rate   20      70      90         25.3    69.33   90.67

Reference Threshold 0.01
                   LBP     GLBP    MGLBP      LBP     GLBP    MGLBP
FAR                3.06    23.5    57.07      0       0       0
FRR                100     60      0.133      83.3    75.3    28
Recognition Rate   0       40      80         16.7    24.67   72

FRR is computed as:

FRR = (wrongly rejected comparisons / total comparisons) × 100%    (7)

IV. EXPERIMENTAL RESULTS AND DISCUSSION

The simulation is conducted using the PolyU iris database [24]. The PolyU database consists of 150 subjects with 15 left- and right-iris images per person. Ten iris images are used as queries and 150 iris images are used for enrollment. Intra-class matching (comparison with the same subject) is performed using 10 iris images as queries and 15 iris images from the same subject as enrollment. For inter-class matching (different subjects), 10 iris images from different subjects are used as queries and 150 different iris images are used for enrollment. Intra-class matching results in 150 comparisons and inter-class matching results in 1500 comparisons. In this study, we use five reference threshold values: 0.38, 0.2, 0.1, 0.05, and 0.01.
The value 0.38 is obtained by calculating the means and standard deviations of the intra-class and inter-class Hamming distance distributions. The remaining values are used as variations of the reference threshold in order to determine the separation point that yields the best recognition performance.
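The evaluation described by Eqs. (5)-(7) can be sketched as a short scoring routine. Following the equations as written, all three rates share the same denominator (the total number of comparisons); treating a score exactly equal to the threshold as an acceptance is our own assumption, and the function name is illustrative.

```python
def evaluate(genuine_scores, imposter_scores, threshold):
    """Compute FAR, FRR and recognition rate (Eqs. 5-7) from the Hamming
    distances of genuine and imposter comparisons at a given threshold."""
    false_rejects = sum(s > threshold for s in genuine_scores)    # genuine, rejected
    false_accepts = sum(s <= threshold for s in imposter_scores)  # imposter, accepted
    total = len(genuine_scores) + len(imposter_scores)
    far = 100.0 * false_accepts / total
    frr = 100.0 * false_rejects / total
    recognition_rate = 100.0 * (total - false_accepts - false_rejects) / total
    return far, frr, recognition_rate
```

Sweeping `threshold` over the five reference values reproduces the trade-off discussed below: lowering the threshold drives FAR down and FRR up.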
Figure 3. Inter-class matching performance
Figure 4. Intra-class matching performance

Table I illustrates the Hamming distance values for inter-class cross spectral iris matching. Each score represents the Hamming distance between a query image and its identical counterpart in the database. With MGLBP, the Hamming distance is smaller than with the other descriptors; almost all comparisons with the same subject have a small Hamming distance, indicating that the two iris templates are generated from the same iris. Table II presents the performance of cross spectral iris matching for intra-class and inter-class comparisons. At the 0.38 reference threshold, MGLBP and GLBP yield ideal intra-class cross spectral iris matching with a 100% recognition rate and 0.00% for both FAR and FRR. With a reference threshold of 0.2, MGLBP still yields a higher recognition rate with 0.00% FAR and FRR compared to the others. FRR increases when the 0.1 reference threshold is used: 70.67% of genuine users are rejected by the system using LBP, 7.33% using GLBP, and 0.067% using MGLBP, while the recognition rates are 29.33%, 92.67%, and 99.93%, respectively. At the 0.05 reference threshold, the recognition rate starts to decrease: LBP yields a 25.3% recognition rate and GLBP 69.33%, while MGLBP still achieves a higher accuracy with a recognition rate of about 90.67%. Meanwhile, the rejection rate increases at this threshold: 74.7% of genuine users are rejected by the system using LBP, 30.667% using GLBP, and 9.33% using MGLBP. We conclude that the smaller the reference threshold, the greater the rejection rate (FRR) and the lower the acceptance error rate (FAR). An ideal iris recognition system has FAR and FRR values that are both low or equal. For intra-class comparison, perfect recognition is achieved using the reference thresholds 0.38 and 0.2, which give 0.00% for both FAR and FRR and a 100% recognition rate.
For inter-class cross spectral matching, the ideal recognition is obtained by selecting the 0.1 reference threshold. Figs. 3 and 4 illustrate the impact of the reference threshold on LBP, GLBP, and MGLBP for intra-class and inter-class matching. When a small reference threshold is used, the system becomes more secure but the recognition rate is lower. Conversely, when a higher reference threshold is used, the system becomes less reliable because many impostors whose Hamming scores fall under the reference threshold are recognized as genuine. Across the different reference threshold values, MGLBP consistently yields a higher recognition rate and better performance than GLBP and LBP. Table III illustrates the effect of adding MSW to the descriptors. For all reference thresholds used in this simulation, MGLBP has a higher recognition rate than LBP and GLBP because the salient patterns in the iris image are extracted very well by MGLBP. By adding MSW, the recognition rate increases by 40% compared to LBP and by 9% compared to GLBP.
TABLE III. EFFECT OF ADDING MSW ON THE RECOGNITION RATE (%)

Reference Threshold   LBP        GLBP       MGLBP
0.38                  94.11765   100        100
0.2                   50         94.11765   97.0588
0.1                   17.64706   82.35294   85.2941
0.05                  14.70588   61.76471   73.5294
0.01                  0          29.41176   61.7647

V. CONCLUSION

This paper proposes a novel framework that improves the Gabor Local Binary Pattern for cross spectral iris matching by integrating GLBP with the Multi Scale Weberface. MSW extracts the details of the normalized iris and pupil clearly even when the image quality is poor due to illumination variations. Compared with LBP and GLBP, the experiments show that MGLBP achieves a higher recognition rate at various reference thresholds. Our future work will investigate Gabor- and MSW-based feature extraction techniques to propose a new feature descriptor.

ACKNOWLEDGMENT

This research was supported by the Ministry of Research, Technology, and Higher Education of the Republic of Indonesia, under the PMDSU scheme.

REFERENCES

[1] A. K. Jain, K. Nandakumar, and A. Ross, "50 years of biometric research: accomplishments, challenges, and opportunities," Pattern Recognit. Lett., vol. 79, pp. 80–105, 2016.
[2] K. W. Bowyer, K. Hollingsworth, and P. J. Flynn, "Image understanding for iris biometrics: A survey," Comput. Vis. Image Underst., vol. 110, no. 2, pp. 281–307, 2008.
[3] M. J. Burge and K. W. Bowyer, Handbook of Iris Recognition, vol. 1542, 2013.
[4] J. Daugman, "Biometric personal identification system based on iris analysis," U.S. Patent No. 5,291,560, March 1994.
[5] K. K. Fasna, P. Athira, and K. J. S. Remya, "A review on iris feature extraction methods," Int. J. of Engineering Research and General Science, vol. 4, no. 2, pp. 663–667, 2016.
[6] Y. C. Cabrera, J. L. G. Rodriguez, and E. G. Liano, "Iris feature extraction methods," 2014.
[7] M. Trokielewicz and E. Bartuzi, "Cross-spectral iris recognition for mobile applications using high-quality color images," Journal of Telecommunications and Information Technology, pp. 91–97, 2016.
[8] A.
Sharma, S. Verma, M. Vatsa, and R. Singh, "On cross spectral periocular recognition," in 2014 IEEE Int. Conf. Image Process. (ICIP), 2014, pp. 5007–5011.
[9] N. Ramaiah and A. Kumar, "Towards more accurate iris recognition using cross-spectral matching," IEEE Trans. Image Process., 2016.
[10] N. P. Ramaiah and A. Kumar, "Advancing cross-spectral iris recognition research using bi-spectral imaging," in Machine Intelligence and Signal Processing, R. Singh, M. Vatsa, A. Majumdar, and A. Kumar, Eds. Springer, 2016, pp. 1–10.
[11] N. P. Ramaiah and A. Kumar, "On matching cross-spectral periocular images for accurate biometrics identification," in Biometrics Theory, Applications and Systems (BTAS), 2016.
[12] S. S. Behera, M. Gour, and V. Kanhangad, "Periocular recognition in cross-spectral scenario," in International Joint Conference on Biometrics (IJCB), 2017, pp. 681–687.
[13] M. A. M. Abdullah, S. S. Dlay, W. L. Woo, and J. A. Chambers, "A novel framework for cross-spectral iris matching," IPSJ Trans. Comput. Vis. Appl., vol. 8, no. 1, p. 9, 2016.
[14] B. Wang, W. Li, W. Yang, and Q. Liao, "Illumination normalization based on Weber's law with application to face recognition," vol. 18, no. 8, pp. 462–465, 2011.
[15] T. Kazakov, "Iris detection and normalization," Birmingham, 2011.
[16] R. Jillela et al., "Iris segmentation for challenging periocular images," in Handbook of Iris Recognition, M. J. Burge and K. W. Bowyer, Eds. London: Springer-Verlag London, 2013, pp. 281–308.
[17] M. Moh, "Unconstrained iris recognition," De Montfort University, 2014.
[18] T. Johar and P. Kaushik, "Iris segmentation and normalization using Daugman's rubber sheet model," vol. 1, no. 1, pp. 11–14, 2015.
[19] K. Chen and D. Zhang, "Gabor surface feature for face recognition," in 1st Asian Conference on Pattern Recognition, 2011.
[20] A. T. Khalil and F. E. M.
Abou Chadi, "Generation of iris codes using 1D log-Gabor filter," in International Conference on Computer Engineering & Systems, 2010.
[21] T. Ojala, M. Pietikäinen, and T. Mäenpää, "Multiresolution gray-scale and rotation invariant texture classification with local binary patterns," IEEE Trans. Pattern Anal. Mach. Intell., vol. 24, no. 7, pp. 971–987, 2002.
[22] L. Masek, "Recognition of human iris patterns for biometric identification," 2003.
[23] J. Malik and D. Girdhar, "Reference threshold calculation for biometric authentication," vol. 2, no. January, pp. 46–53, 2014.
[24] http://www4.comp.polyu.edu.hk
