Similar Literature
10 similar articles found.
1.
A biometric discretization scheme converts biometric features into a binary string by segmenting each one-dimensional feature space into multiple labelled intervals, assigning a short binary label to the interval that captures each feature element, and concatenating the binary outputs of all feature elements into a bit string. This paper proposes a bit allocation algorithm for biometric discretization that allocates bits dynamically to every feature element based on a Binary Reflected Gray code. Unlike existing bit allocation schemes, ours is based on a combination of bit statistics (a reliability measure) and signal-to-noise ratio (a discriminability measure) in performing the feature selection and bit allocation procedures. Extensive empirical comparisons on two popular face datasets justify the efficiency and feasibility of the proposed approach.
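The Gray-code labelling step can be sketched as follows. This is a minimal illustration of Binary Reflected Gray code interval labelling only, not the paper's dynamic bit-allocation algorithm; the function names and the uniform-interval segmentation are my own assumptions.

```python
def gray_code(n: int) -> int:
    """Binary Reflected Gray code of n: consecutive integers differ in one bit."""
    return n ^ (n >> 1)

def interval_labels(bits: int) -> list:
    """Gray-coded bit-string labels for the 2**bits intervals of a 1-D feature space."""
    return [format(gray_code(i), f"0{bits}b") for i in range(2 ** bits)]

def discretize(value: float, lo: float, hi: float, bits: int) -> str:
    """Map a feature value in [lo, hi) to the Gray label of its (uniform) interval."""
    n = 2 ** bits
    idx = min(int((value - lo) / (hi - lo) * n), n - 1)
    return format(gray_code(idx), f"0{bits}b")
```

Because adjacent interval labels differ in exactly one bit, a small measurement perturbation that shifts a feature into a neighbouring interval flips only a single output bit.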

2.
Feiniu Yuan 《Pattern recognition》2012,45(12):4326-4336
Traditional methods for video smoke detection easily achieve very low training error, but their generalization performance is poor owing to the arbitrary shapes of smoke, intra-class variations, occlusions and clutter. To overcome these problems, a double mapping framework is proposed to extract partition-based features with AdaBoost. The first mapping is from an original image to block features. A feature vector is formed by concatenating histograms of edge orientation, edge magnitude and Local Binary Pattern (LBP) bits, together with densities of edge magnitude, LBP bits, color intensity and saturation. Each component of the feature vector produces a feature image. To obtain shape-invariant features, a detection window is divided into a set of small blocks called a partition, and many multi-scale partitions are generated by varying block sizes and partition schemes. The sum of each feature image within each block of each partition is computed to generate block features. The second mapping is from the block features to statistical features. Statistics of the block features, such as mean, variance, skewness, kurtosis and Hu moments, are computed over all partitions to form a feature pool. AdaBoost is used to select discriminative shape-invariant features from the feature pool. Experiments show that the proposed method has better generalization performance and lower sensitivity to geometric transforms than traditional methods.
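The two mappings can be sketched roughly as below: block sums of a feature image over one partition (first mapping), then simple statistics of those block features (second mapping). This is an illustration under my own simplifications, not the paper's implementation.

```python
import numpy as np

def block_sums(feature_img: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """First mapping: sum of a feature image within each block of a rows x cols partition."""
    h, w = feature_img.shape
    sums = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            r0, r1 = r * h // rows, (r + 1) * h // rows
            c0, c1 = c * w // cols, (c + 1) * w // cols
            sums[r, c] = feature_img[r0:r1, c0:c1].sum()
    return sums

def statistical_features(blocks: np.ndarray) -> np.ndarray:
    """Second mapping: mean, variance, skewness and kurtosis of the block features."""
    x = blocks.ravel()
    m, v = x.mean(), x.var()
    skew = ((x - m) ** 3).mean() / (v ** 1.5 + 1e-12)
    kurt = ((x - m) ** 4).mean() / (v ** 2 + 1e-12)
    return np.array([m, v, skew, kurt])
```

In the full method this would be repeated over many partitions and all feature images to build the pool from which AdaBoost selects.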

3.
This paper develops an approach to measure the information content of a biometric feature representation. We define biometric information as the decrease in uncertainty about the identity of a person due to a set of biometric measurements. We then show that the biometric feature information for a person may be calculated as the relative entropy D(p‖q) between the population feature distribution q and the person's feature distribution p. The biometric information for a system is the mean D(p‖q) over all persons in the population. In order to practically measure D(p‖q) with limited data samples, we introduce an algorithm which regularizes a Gaussian model of the feature covariances. An example of this method is shown for PCA and Fisher linear discriminant (FLD) based face recognition, with biometric feature information calculated to be 45.0 bits (PCA), 37.0 bits (FLD) and 55.6 bits (fusion of PCA and FLD features). Finally, we discuss general applications of this measure.
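For Gaussian models of the person and population feature distributions, the relative entropy has a closed form. A minimal sketch in bits, assuming full-rank covariances (the function name is mine; the paper's regularization step is omitted):

```python
import numpy as np

def gaussian_relative_entropy_bits(mu_p, cov_p, mu_q, cov_q) -> float:
    """D(p||q) in bits between Gaussian p (one person) and Gaussian q (population)."""
    d = len(mu_p)
    iq = np.linalg.inv(cov_q)
    diff = np.asarray(mu_q) - np.asarray(mu_p)
    nats = 0.5 * (np.trace(iq @ cov_p) + diff @ iq @ diff - d
                  + np.log(np.linalg.det(cov_q) / np.linalg.det(cov_p)))
    return float(nats / np.log(2))  # convert nats to bits
```

Averaging this quantity over all enrolled persons gives the system-level biometric information described above.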

4.
Biometric cryptosystems have proven to be a promising approach to template protection. Cryptosystems such as the fuzzy extractor and fuzzy commitment require discriminative and informative binary biometric input to offer accurate and secure recognition. In multi-modal biometric recognition, binary features can be produced by fusing the real-valued unimodal features and binarizing the fused features. However, when the extracted features of a certain modality are represented in binary and the extraction parameters are not known, the real-valued features of the other modalities need to be binarized and the feature fusion carried out at the binary level. In this paper, we propose a binary feature fusion method that extracts a set of fused binary features with high discriminability (small intra-user and large inter-user variations) and high entropy (weak dependency among bits and high bit uniformity) from multiple sets of binary unimodal features. Unlike existing fusion methods, which mainly focus on discriminability, the proposed method addresses both feature discriminability and system security: it 1) extracts a set of weakly dependent feature groups from the multiple unimodal features; and 2) fuses each group to a bit using a mapping that minimizes the intra-user variations and maximizes the inter-user variations and the uniformity of the fused bit. Experimental results on three multi-modal databases show that the fused binary features of the proposed method have both higher discriminability and higher entropy than the unimodal features and the fused features generated by state-of-the-art binary fusion approaches.
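As a rough illustration of the fused-bit criteria, one can measure the uniformity of a fused bit across users with its Shannon entropy. The majority-vote fusion below is a hypothetical stand-in for the paper's learned group-to-bit mapping, not the proposed method itself.

```python
import math

def majority_fuse(group) -> int:
    """Hypothetical fusion: collapse a group of weakly dependent bits by majority vote."""
    return int(sum(group) * 2 > len(group))

def bit_uniformity(bit_column) -> float:
    """Shannon entropy of one fused bit across users; 1.0 bit means perfectly uniform."""
    p = sum(bit_column) / len(bit_column)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
```

A fused bit that is 1 for roughly half the population carries one full bit of entropy, which is what the uniformity criterion above rewards.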

5.
Although the use of biometrics for security access is convenient and easy to implement, it also introduces privacy and other security concerns when the original biometric templates are compromised. BioHash was introduced as a form of cancellable or replaceable biometrics that integrates a set of user-specific random numbers with biometric features to address these concerns. However, the main drawback of the original form of BioHash is its inferior performance when an impostor obtains a legitimate token and uses it to masquerade as a genuine user (the stolen-token scenario). In this paper, the problem is circumvented by a user-dependent multi-state discretization method. Experimental results on the FVC2002 fingerprint database demonstrate a promising performance improvement in the stolen-token scenario when this discretization method is incorporated into the BioHash scheme. Moreover, the discretization method yields a long bit string, a useful feature for resisting brute-force attacks. Desired properties such as one-way transformation and diversity are also analyzed.
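BioHash is commonly described as token-seeded random projection followed by thresholding; the sketch below follows that general description. The zero threshold and the orthonormalised projection basis are my assumptions, and the multi-state discretization proposed by the paper is not shown.

```python
import numpy as np

def biohash(features, token_seed: int, n_bits: int) -> np.ndarray:
    """BioHash sketch: project the feature vector onto token-seeded random
    orthonormal directions and threshold at zero to obtain a bit string."""
    rng = np.random.default_rng(token_seed)
    # Orthonormalise a random matrix so the projection directions are independent.
    basis = np.linalg.qr(rng.standard_normal((len(features), n_bits)))[0]
    return (np.asarray(features) @ basis > 0).astype(int)
```

Re-issuing the user's token (a new seed) produces a fresh, unrelated bit string from the same biometric, which is what makes the template cancellable.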

6.
Biometric authentication is gaining popularity in a wide range of applications. However, the storage of the biometric templates and/or encryption keys that such applications require is a matter of serious concern, as the compromise of templates or keys necessarily compromises the information secured by those keys. In this paper, we propose a novel method that requires storage of neither biometric templates nor encryption keys, by generating the keys directly from statistical features of biometric data. The process is as follows: given biometric samples, a set of statistical features is first extracted from each sample. On each feature subset or single feature, we model the intra- and inter-user variation by clustering the data into natural clusters using a fuzzy genetic clustering algorithm. Based on the modelling results, we then quantify the consistency of each feature subset or single feature for each user. By selecting the most consistent feature subsets and/or single features for each user individually, we generate the key reliably without compromising its relative security. The proposed method is evaluated on handwritten signature data and compared with related methods, and the results are very promising.
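A toy version of per-user consistency scoring and feature selection is sketched below, using a simple spread-based score in place of the paper's fuzzy genetic clustering; the names and the scoring formula are mine.

```python
import numpy as np

def consistency(samples) -> np.ndarray:
    """Per-feature consistency of one user's samples: features whose values vary
    little relative to their magnitude score close to 1 (illustrative measure)."""
    s = np.asarray(samples, float)
    spread = s.std(axis=0)
    return 1.0 / (1.0 + spread / (np.abs(s.mean(axis=0)) + 1e-12))

def select_features(samples, k: int) -> np.ndarray:
    """Indices of the k most consistent features for one user."""
    return np.argsort(-consistency(samples))[:k]
```

Selecting only a user's most consistent features is what allows the key to be regenerated reliably from fresh samples of that user.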

7.
叶学义 (Ye Xueyi) 《计算机工程》(Computer Engineering) 2008,34(5):182-184
Attacks on biometric feature data are the main threat to the security of biometric recognition itself. To improve the security of iris feature data, and drawing on the data characteristics of the feature templates in the main existing iris recognition methods and their Hamming-distance-based matching, this paper proposes a bit-stream data-hiding algorithm that embeds iris feature template data into a face image. Experimental results show that the algorithm is highly imperceptible, that the hiding algorithm itself has a zero bit-error rate, and that it is computationally efficient; it does not affect the performance of iris recognition itself, effectively protects the feature template data, and strengthens the security of the iris recognition system.
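The general idea of embedding a template bit stream into a cover image can be illustrated with a least-significant-bit scheme. This is an illustrative stand-in only; the paper's bit-stream algorithm is its own construction, not necessarily LSB substitution.

```python
import numpy as np

def embed_bits(cover: np.ndarray, bits) -> np.ndarray:
    """Hide a bit string in the least-significant bits of the first len(bits) pixels."""
    flat = cover.ravel().copy()
    flat[:len(bits)] = (flat[:len(bits)] & 254) | np.asarray(bits, dtype=flat.dtype)
    return flat.reshape(cover.shape)

def extract_bits(stego: np.ndarray, n: int) -> list:
    """Read back the first n hidden bits."""
    return (stego.ravel()[:n] & 1).tolist()
```

Because extraction inverts embedding exactly, such a scheme recovers the hidden template with a zero bit-error rate as long as the stego image is not further processed.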

8.
Biometric cryptosystems have been widely studied in the literature as a means of protecting biometric templates. To ensure sufficient security of a biometric cryptosystem against the offline brute-force attack (also called the FAR attack), it is critical to reduce the FAR of the system. One of the most effective approaches to improving accuracy is multibiometric fusion, which can be divided into three categories: feature level fusion, score level fusion, and decision level fusion. Among them, only feature level fusion can be applied to the biometric cryptosystem, for security and accuracy reasons. Conventional feature level fusion schemes, however, require a user to input all of the enrolled biometric samples at each authentication, which makes the system inconvenient. In this paper, we first propose a general framework for feature level sequential fusion, which combines biometric features and makes a decision each time the user inputs a biometric sample. We then propose a feature level sequential fusion algorithm that minimizes the average number of inputs, and prove its optimality theoretically. We apply the proposed scheme to the fuzzy commitment scheme and demonstrate its effectiveness through experiments using a finger-vein dataset that contains six fingers from 505 subjects. We also analyze the security of the proposed scheme against various attacks: attacks that exploit the relationship between multiple protected templates, the soft-decoding attack, the statistical attack, and the decodability attack.
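The fuzzy commitment primitive referred to above binds an error-correcting codeword to a biometric bit string by XOR. A toy sketch with a 3x repetition code follows; the paper applies the scheme to finger-vein features with much stronger codes, so this only shows the mechanism.

```python
import numpy as np

def rep3_encode(message) -> np.ndarray:
    """Toy ECC: repeat each message bit three times."""
    return np.repeat(np.asarray(message), 3)

def rep3_decode(noisy_codeword) -> np.ndarray:
    """Majority vote over each triple corrects one bit error per triple."""
    return (np.asarray(noisy_codeword).reshape(-1, 3).sum(axis=1) >= 2).astype(int)

def commit(bio_bits, codeword) -> np.ndarray:
    """Helper data: codeword XOR enrolled biometric bits (neither stored in the clear)."""
    return np.bitwise_xor(np.asarray(bio_bits), np.asarray(codeword))

def decommit(helper, query_bits) -> np.ndarray:
    """XOR a noisy query against the helper data, then decode to recover the message."""
    return rep3_decode(np.bitwise_xor(np.asarray(helper), np.asarray(query_bits)))
```

XORing the query with the helper data leaves the codeword corrupted only where the query disagrees with the enrolled biometric, so the ECC absorbs that biometric noise.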

9.
In this paper, we propose and investigate a novel iris weight map method for the iris matching stage to improve less-constrained iris recognition. The proposed iris weight map considers both the intra-class bit stability and the inter-class bit discriminability of iris codes. We model the intra-class bit stability in a stability map to improve intra-class matching. The stability map assigns more weight to bits whose values are more consistent with their noiseless, stable estimates, obtained using a low-rank approximation from a set of noisy training images. We express the inter-class bit discriminability in a discriminability map to enhance inter-class separation, calculating it with a 1-to-N strategy that emphasizes the bits with more discriminative power in the iris codes. The final iris weight map is the combination of the stability map and the discriminability map. Experimental analysis on four publicly available datasets captured under varying less-constrained conditions demonstrates that the proposed iris weight map achieves generally improved identification and verification performance compared to state-of-the-art methods.
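In matching, such a weight map is typically applied as a weighted fractional Hamming distance over the iris codes. A minimal sketch follows; the elementwise product used to combine the two maps is an assumption for illustration, not necessarily the paper's exact fusion rule.

```python
import numpy as np

def weight_map(stability, discriminability) -> np.ndarray:
    """One simple combination of the stability and discriminability maps."""
    return np.asarray(stability) * np.asarray(discriminability)

def weighted_hamming(code_a, code_b, weights) -> float:
    """Weighted fractional Hamming distance between two iris codes."""
    disagree = np.bitwise_xor(np.asarray(code_a), np.asarray(code_b))
    return float((disagree * weights).sum() / np.asarray(weights).sum())
```

Bits judged unstable or undiscriminative get low weight, so disagreements on them barely affect the match score.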

10.
With the emergence and popularity of identity verification by biometrics, biometric systems that can assure both security and privacy have received more and more attention from the research and industry communities. In the field of secure biometric authentication, one branch combines biometrics with cryptography; among the solutions in this branch, the fuzzy commitment scheme is a pioneering and effective security primitive. In this paper, we propose a novel fixed-length binary feature generation method for fingerprints. The alignment procedure, regarded as a difficult task in the encrypted domain, is avoided in the proposed method through the use of minutiae triplets. Using the generated binary features as input, and based on the fuzzy commitment scheme, we construct biometric cryptosystems with various error-correcting codes, including a BCH code, a concatenation of BCH and Reed-Solomon codes, and an LDPC code. Experiments conducted on three fingerprint databases, one in-house and two in the public domain, demonstrate that the proposed binary feature generation method is effective and promising, and that the biometric cryptosystem constructed from the features outperforms most existing biometric cryptosystems in terms of ZeroFAR and security strength. For instance, on the whole of FVC2002 DB2, the proposed biometric cryptosystem achieves a 4.58% ZeroFAR with a security strength of 48 bits.
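The alignment-free property of minutiae triplets comes from describing each triplet by quantities that are invariant to translation and rotation. A minimal geometric sketch using side lengths only; the paper's actual descriptor is richer than this.

```python
import numpy as np

def triplet_feature(p1, p2, p3) -> list:
    """Alignment-free descriptor of a minutiae triplet: the sorted side lengths of
    the triangle the three points form (invariant to translation and rotation)."""
    pts = [np.asarray(p, float) for p in (p1, p2, p3)]
    return sorted(float(np.linalg.norm(pts[i] - pts[(i + 1) % 3])) for i in range(3))
```

Because the descriptor is unchanged when the whole fingerprint is shifted or rotated, templates can be compared without the alignment step that is hard to perform in the encrypted domain.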
