Academic Journals Database
Disseminating quality controlled scientific knowledge

Performance issues in biometric authentication based on information theoretic concepts: A review

Author(s): Bhatnagar Jay | Lall Brejesh | Patney R

Journal: IETE Technical Review
ISSN 0256-4602

Volume: 27
Issue: 4
Start page: 273
Date: 2010

Keywords: Biometric authentication | Biometrics | Constrained capacity | Error exponents | Probability of random correspondence | Recognition capacity | Uniqueness.

Many performance evaluation techniques for biometric authentication use error probabilities to yield a measure called the receiver operating characteristic (ROC). The ROC is based on Neyman-Pearson hypothesis testing and is obtained by varying a decision threshold. This measure depends on database partitioning and on the choice of thresholds; moreover, obtaining the probability distributions, and thus the ROC, is computationally complex. Recent approaches based on information-theoretic models partially overcome these limitations and also provide insight into the performance of biometric authentication techniques. Measures in line with Chernoff capacity and Shannon capacity have been proposed, called recognition capacity and constrained capacity, respectively. Measures that are largely independent of data size and quality are based on minimizing false matches between templates; they are good indicators of a biometric's uniqueness or, equivalently, of its random correspondence. The parameters related to confidence intervals are, however, obtained empirically. One such measure is obtained from the probability distribution of the Hamming distance between templates of the iris biometric. Similarly, the relative entropy of features between a user and the population yields a uniqueness measure for the face biometric. Another approach is to measure the probability of this random correspondence (PRC). The PRC of the fingerprint biometric has been obtained using compound statistical distributions. A further formulation of the PRC is based on a rate-distortion framework, applying a distortion constraint to a codebook of binarized features. Biometric templates are akin to noisy source symbols in an information-theoretic setup. A bound on the PRC has been obtained by developing error exponents for noisy biometrics represented in terms of a binary source-channel model. This method has low computational complexity, is not limited to a specific biometric, and does not require empirically obtained confidence intervals.
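To make the threshold-varying ROC procedure concrete, the following is a minimal illustrative sketch, not taken from the reviewed paper: synthetic binary templates play the role of iris-style codes, genuine pairs differ by random bit noise, impostor pairs are independent random codes, and sweeping a decision threshold over the normalized Hamming distance traces out the false-match / false-non-match trade-off. All names, sizes, and noise rates here are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary "templates" (illustrative parameters, not from the paper):
# genuine pairs differ by ~10% bit-flip noise; impostor pairs are
# independent random codes, so their normalized Hamming distance
# concentrates near 0.5.
n_bits, n_pairs = 256, 1000
enroll = rng.integers(0, 2, size=(n_pairs, n_bits))
genuine = enroll ^ (rng.random((n_pairs, n_bits)) < 0.10)
impostor = rng.integers(0, 2, size=(n_pairs, n_bits))

# Normalized Hamming distance between each template pair.
d_gen = np.mean(enroll != genuine, axis=1)
d_imp = np.mean(enroll != impostor, axis=1)

# Sweep the decision threshold t: accept a pair when distance <= t.
# FMR = fraction of impostor pairs accepted (false matches);
# FNMR = fraction of genuine pairs rejected (false non-matches).
thresholds = np.linspace(0.0, 0.5, 51)
fmr = [(d_imp <= t).mean() for t in thresholds]
fnmr = [(d_gen > t).mean() for t in thresholds]

for t, a, r in zip(thresholds[::10], fmr[::10], fnmr[::10]):
    print(f"t={t:.2f}  FMR={a:.3f}  FNMR={r:.3f}")
```

The sweep illustrates the dependence criticized in the abstract: each (FMR, FNMR) point is tied to one threshold and to the particular genuine/impostor partition of the data, which is what the information-theoretic measures discussed above seek to avoid.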