
Improving less constrained iris recognition

Hu, Yang (2017) Improving less constrained iris recognition. Doctor of Philosophy (PhD) thesis, University of Kent. (KAR id:59928)

PDF (16MB), Language: English

Abstract

The iris has been one of the most reliable biometric traits for automatic human authentication due to its highly stable and distinctive patterns. Traditional iris recognition algorithms have achieved remarkable performance in strictly constrained environments, with the subject standing still and the iris captured at a close distance. This has enabled the wide deployment of iris recognition systems in applications such as border control and access control. However, in less constrained environments with the subject at-a-distance and on-the-move, iris recognition performance deteriorates significantly, since such environments introduce noise and degradation into iris captures. This restricts the applicability and practicality of iris recognition technology for real-world applications with more open capturing conditions, such as surveillance, forensics and mobile device security. Therefore, robust algorithms for less constrained iris recognition are desirable for the wider deployment of iris recognition systems.

This thesis focuses on improving less constrained iris recognition. Five methods are proposed to improve the performance of different stages in less constrained iris recognition. First, a robust iris segmentation algorithm is developed using l1-norm regression and model selection. This algorithm formulates iris segmentation as robust l1-norm regression problems. To further enhance robustness, multiple segmentation results are produced by applying l1-norm regression to different models, and a model selection technique is used to select the most reliable result. Second, an iris liveness detection method using regional features is investigated. This method exploits not only low-level features but also high-level feature distributions for more accurate and robust iris liveness detection. Third, a signal-level information fusion algorithm is presented to mitigate the noise in less constrained iris captures. Given multiple noisy iris captures, this algorithm uses a sparse-error low-rank matrix factorization model to separate noiseless iris structures from noise. The noiseless structures are preserved and emphasised during the fusion process, while the noise is suppressed, in order to obtain more reliable signals for recognition. Fourth, a method to generate optimal iris codes is proposed. This method considers iris code generation from the perspective of optimization: it formulates the traditional iris code generation method as an optimization problem, and an additional objective term modelling the spatial correlations in iris codes is added to this problem to produce more effective iris codes. Fifth, an iris weight map method is studied for robust iris matching. This method considers both intra-class bit stability and inter-class bit discriminability in iris codes. It emphasises highly stable and discriminative bits during matching, enhancing the robustness of iris matching.
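To illustrate the idea behind the fifth method, the snippet below sketches a weight-map-based comparison of binary iris codes: a weighted Hamming distance in which stable, discriminative bits count more than unreliable ones. This is a minimal, hypothetical NumPy sketch of the general technique, not the thesis's actual algorithm; the function name and the uniform toy weights are illustrative assumptions.

```python
import numpy as np

def weighted_hamming(code_a, code_b, weights):
    """Weighted Hamming distance between two binary iris codes.

    Bits with a higher weight (i.e. judged more stable within a class
    and more discriminative between classes) contribute more to the
    distance. Hypothetical sketch, not the thesis implementation.
    """
    disagree = code_a != code_b                      # bitwise disagreement mask
    return float(np.sum(weights * disagree) / np.sum(weights))

# Toy example: with uniform weights this reduces to the plain
# (normalized) Hamming distance used in traditional iris matching.
a = np.array([0, 1, 1, 0, 1, 0, 0, 1])
b = np.array([0, 1, 0, 0, 1, 1, 0, 1])
w = np.ones(8)
print(weighted_hamming(a, b, w))  # 0.25 (2 of 8 bits disagree)
```

In practice the weight map would be learned from training captures (e.g. from bit flip rates across samples of the same iris), so that fragile bits near noise-prone regions are down-weighted.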

Comprehensive experimental analyses are performed on benchmark datasets for each of the above methods. The results indicate that the presented methods are effective for less constrained iris recognition, generally improving on state-of-the-art performance.

Item Type: Thesis (Doctor of Philosophy (PhD))
Thesis advisor: Howells, Gareth
Thesis advisor: Sirlantzis, Konstantinos
Uncontrolled keywords: Iris Recognition, Less Constrained Environments, Iris Segmentation, Iris Liveness Detection, Information Fusion, Iris Code, Iris Matching
Date Deposited: 19 Jan 2017 12:00 UTC
Last Modified: 15 Nov 2020 04:09 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/59928