A Survey on Fingerprint Compression Methods
Lakshmipriya. S1
1Dept. of Computer Science & Engg.
Sree Buddha College of Engg. for Women, Pathanamthitta, Kerala, India
Abstract— Fingerprint analysis plays an important role in legal matters such as the investigation of crime. A fingerprint image contains an enormous amount of data, so compression techniques are used to reduce its size. Fingerprint compression algorithms face particular difficulties, chief among them the need to preserve the minutiae (ridge endings and bifurcations) that are used in identification.
Keywords—Fingerprint compression, Sparse representation, WSQ, Bezier curve, Contourlet Transform
I. INTRODUCTION
Fingerprint recognition is the most mature and established technique among all biometric systems. Biometrics-based verification has attracted a great deal of attention because of the unprecedented scale of identity fraud in our society and the growing emphasis on emerging automatic personal identification applications. Commonly used biometric traits include fingerprint, face, iris, hand geometry, voice, palm print, and handwritten signature. Biometric systems come in two kinds: identification and verification. In an identification system, the biometric signature of a new person is matched against a database of biometric signatures of known individuals to determine whether the person is previously known or a stranger. In a verification system, a user presents a biometric signature together with a claimed identity, and the system verifies the claim [7]. Fingerprint recognition is preferred for user identification because of its low cost, reliable performance, and usability compared with other biometrics such as signature, face, and iris, and it is used in forensic and commercial applications such as criminal investigation and electronic personal ID cards.
A fingerprint recognition system's database may contain millions of fingerprint images. Because each image holds a large amount of data, fingerprint databases occupy huge secondary storage devices, and large collections of fingerprints are gathered in many applications, including forensics and access control. Since this data consumes a large amount of memory, the information contained in fingerprints must be compressed by extracting only the essential elements. Fingerprint images have high energy in certain high-frequency bands arising from the ridge-valley pattern and other structures.
Image compression methods are either lossy or lossless. Lossy compression transforms an image into another domain, then quantizes and encodes its coefficients; it is used for multimedia data such as video, audio, and still images, and it discards redundant or unnecessary data. Lossless compression reconstructs the original image exactly from the compressed data and is used for source code, text documents, and executable programs. Images can be compressed with the help of transformations, the two most common being the Discrete Cosine Transform (DCT) and the Discrete Wavelet Transform (DWT). The DCT expresses a sequence of data points as a sum of cosine functions oscillating at different frequencies; it is a Fourier-related transform similar to the Discrete Fourier Transform (DFT) but uses only real numbers. The DWT uses a discrete set of wavelet scales and translations obeying defined rules; it decomposes the signal into a mutually orthogonal set of wavelets, each constructed from a scaling function that describes its scaling properties. Wavelet Scalar Quantization (WSQ) is a DWT-based algorithm for compressing gray-scale fingerprint images and is the Federal Bureau of Investigation (FBI) standard for the compression of fingerprint images. In addition to WSQ, the Contourlet Transform underlies another fingerprint compression algorithm.
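To make the wavelet decomposition concrete, the following minimal sketch (in Python, assuming the NumPy and PyWavelets packages are available; an illustration rather than the FBI's WSQ implementation) decomposes an image into sub bands with the CDF 9/7 biorthogonal filters, which PyWavelets names 'bior4.4', and verifies perfect reconstruction before any quantization.

```python
import numpy as np
import pywt  # PyWavelets

# Stand-in for a gray-scale fingerprint image (a real system would load
# a scanned image); values are floats in [0, 1].
image = np.random.rand(512, 512)

# Three-level 2-D DWT with the CDF 9/7 biorthogonal filters ('bior4.4'),
# the filter family used by the WSQ standard.
coeffs = pywt.wavedec2(image, wavelet='bior4.4', level=3)

# coeffs[0] is the coarsest approximation sub band; each later entry is a
# (horizontal, vertical, diagonal) tuple of detail sub bands, coarsest first.
print("approximation sub band:", coeffs[0].shape)
for detail in coeffs[1:]:
    print("detail sub bands:", [d.shape for d in detail])

# Without quantization the transform is perfectly invertible.
restored = pywt.waverec2(coeffs, wavelet='bior4.4')
print("max reconstruction error:", np.abs(restored - image).max())
```

A lossy coder such as WSQ would quantize each sub band with a table of step sizes and entropy-code the result instead of reconstructing directly.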
Bezier curves are used in many fields, such as industrial and computer-aided design, vector-based drawing, font design, and 3D modeling, and also in computer graphics to model smooth curves. Because a Bezier curve lies in the convex hull of its control points, the points can be displayed graphically, and the control points can be used to manipulate the curve intuitively.
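As a small illustration of this representation, the sketch below evaluates a cubic Bezier curve from its four control points using the Bernstein polynomial form; the control point values are purely illustrative, not taken from any fingerprint ridge.

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=100):
    """Evaluate a cubic Bezier curve at n parameter values t in [0, 1]:
    B(t) = (1-t)^3 p0 + 3(1-t)^2 t p1 + 3(1-t) t^2 p2 + t^3 p3."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

# Illustrative control points: two end points (p0, p3) and two interior
# control points (p1, p2) that shape the curve without lying on it.
p0, p1 = np.array([0.0, 0.0]), np.array([10.0, 40.0])
p2, p3 = np.array([30.0, 40.0]), np.array([40.0, 0.0])
curve = cubic_bezier(p0, p1, p2, p3)
print(curve[0], curve[-1])  # the end points are interpolated exactly
```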
II. FINGERPRINT COMPRESSION METHODS
T. Hopper et al. [2] proposed a method based on Wavelet Scalar Quantization (WSQ), an algorithm for compressing gray-scale fingerprint images. The algorithm is based on wavelet theory and is used for the storage and exchange of fingerprints. The WSQ encoder decomposes the fingerprint image into a number of sub bands, each of which represents the information in a particular frequency band. The sub band decomposition is obtained by a discrete wavelet transformation of the fingerprint image. Each sub band is then quantized using values from a quantization table, and the quantized coefficients are passed to a Huffman encoding procedure that compresses the data. Huffman table specifications must be supplied to the encoder.
Shohreh Kasaei et al. [4] proposed an algorithm that solves the wedge problem in the pyramidal lattice shells. It achieves a high compression ratio and high reconstructed image quality with a low computational load. The algorithm uses three lattice densities separated by two surfaces of constant probability: the first lattice is the densest, the second is designed to cover the less probable high-energy vectors, and the third is empty. The algorithm starts from the condition that all input vectors lie inside the two lattices with the smallest scaling factors. The image is divided into sub images, which are classified into different levels; D4-PLVQ is applied to all sub images except the diagonal sub image of the first level and the vertical and low-passed sub images of the fourth level. The input vectors are scaled and quantized. After the essential data are selected, a binary image containing the mean of the positive and negative data is generated. Because the quality of the reconstructed image is preserved, the algorithm can be simplified, and it improves the quality of the reconstructed image.
Karl Skretting et al. [6] proposed a method to compress images using dictionaries learned by RLS-DLA and compared it with K-SVD; the bit rate achieved is lower than that of the other algorithms considered. In this method, a set of vectors is formed from the non-overlapping patches of the image, a sparse coefficient matrix is obtained by sparse approximation, and the non-zero weights are quantized and coded. The dictionary is not included in the compressed file; it is a general dictionary that forms an internal part of the compression scheme, and all the learned dictionaries are general for the large class of gray-scale natural images. The method explores the compression capability of sparse approximations with dictionaries learned by RLS-DLA both in the pixel domain and in the 9/7 wavelet domain, and it uses entropy coding similar to that of wavelet-based and Discrete Cosine Transform (DCT) schemes.
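Both the RLS-DLA scheme above and the sparse-representation method of [1] described later rest on sparse approximation: each patch is approximated by a few atoms from a dictionary. The sketch below implements plain greedy matching pursuit over a random, purely illustrative unit-norm dictionary; a learned dictionary, as in [6] or [1], would simply replace it.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy matching pursuit: repeatedly pick the dictionary atom most
    correlated with the residual and subtract its contribution.
    `dictionary` has unit-norm columns (atoms)."""
    residual = signal.astype(float).copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_atoms):
        correlations = dictionary.T @ residual
        k = int(np.argmax(np.abs(correlations)))  # best-matching atom
        coeffs[k] += correlations[k]              # accumulate its weight
        residual -= correlations[k] * dictionary[:, k]
    return coeffs, residual

# Illustrative setup: a random unit-norm dictionary and a random "patch".
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)  # normalize each atom
x = rng.standard_normal(64)
c, r = matching_pursuit(x, D, n_atoms=8)
print("non-zero coefficients:", np.count_nonzero(c),
      "residual norm:", np.linalg.norm(r))
```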
Vani Perumal et al. [3] proposed a method for fingerprint compression using Bezier curve representations. The ridges present in the fingerprint image are extracted along with their coordinate values, and the control points are determined by treating each ridge as a Bezier curve. The control points of all the ridges are stored and used to represent the fingerprint image; when needed, the image is reconstructed from the stored control points using Bezier curves. The method consists of two steps: extraction of the ridges along with their coordinate values, and compression using Bezier curve representations. Preprocessing and ridge extraction are the main stages of the first step; preprocessing involves histogram equalization, orientation field estimation, the fast Fourier transform, binarization, enhancement, and region-of-interest (ROI) extraction by morphological operations. In the compression step, each ridge is treated as a cubic Bezier curve and its Bezier control points, i.e., two end points and two interior control points, are determined; the set of control points serves as the compressed form of an individual ridge. The method achieves an effective reduction in the memory space required to store the fingerprint.
T. Veerakumar et al. [5] proposed a coding technique for fingerprint compression based on the Contourlet Transform and Self-Organizing Feature Map (SOFM) vector quantization. The Contourlet Transform is a directional transform that captures contours and fine details in images: the Laplacian pyramid decomposes the image into sub bands, and directional filter banks then analyze each detail image. The SOFM has lower computational complexity and is easier to implement than the LBG algorithm. The technique is able to preserve the minutiae of fingerprints because contourlets preserve edges and ridges, and because unsupervised learning can extract image features through self-organization. The main steps, sketched in code after this list, are:
1. Apply the contourlet transform to the fingerprint to decorrelate the relationship between pixels.
2. Group the neighboring contourlet coefficients into one vector.
3. Initialize the weights to random values.
4. Present each input vector to the network and compute the distance between the input vector and each output node.
5. Take the output node with minimum distance as the winning cluster.
6. Update the weight of the winning cluster so that it resembles the input vector.
7. Encode the indices by Huffman coding.
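The sketch below illustrates steps 3 to 6 as a toy winner-take-all vector quantizer. A full SOFM would also update a neighborhood around the winner, and the Huffman coding of the indices (step 7) is omitted; all sizes and learning parameters are illustrative assumptions.

```python
import numpy as np

def sofm_vq(vectors, n_nodes=64, epochs=10, lr=0.2):
    """Toy winner-take-all vector quantizer in the spirit of SOFM:
    for each input, find the weight vector at minimum Euclidean
    distance (the winning cluster) and pull it toward the input."""
    rng = np.random.default_rng(0)
    weights = rng.standard_normal((n_nodes, vectors.shape[1]))  # step 3
    for _ in range(epochs):
        for v in vectors:
            dists = np.linalg.norm(weights - v, axis=1)  # step 4
            win = int(np.argmin(dists))                  # step 5
            weights[win] += lr * (v - weights[win])      # step 6
    # Encoding: each vector is represented by the index of its winner.
    indices = np.array([int(np.argmin(np.linalg.norm(weights - v, axis=1)))
                        for v in vectors])
    return weights, indices

# Illustrative stand-ins for grouped contourlet coefficient vectors.
vecs = np.random.default_rng(1).standard_normal((500, 16))
codebook, indices = sofm_vq(vecs)
print("codebook:", codebook.shape, "first indices:", indices[:5])
```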
Guangqi Shao et al. [1] proposed a fingerprint compression algorithm based on sparse representation. Sparse representation has been used in several image processing applications; although methods based on it do not work well for general image compression, the approach suits fingerprint images, where it gives good performance and consistently better compression efficiency than other algorithms. The algorithm comprises construction of the dictionary, compression of the fingerprint, coding and quantization, and analysis of algorithm complexity.
i. Construction of the Dictionary. The dictionary is obtained from a training set. The fingerprint images are cut into fixed-size square patches. Initially the dictionary is empty, so the first patch is added. Each subsequent patch is compared with the patches already in the dictionary: if it is similar to one of them it is discarded; otherwise it is added. This test is repeated until all the patches have been processed.
ii. Compression of the Fingerprint. The fingerprint is sliced into square patches of the same size. The patch size has a direct impact on compression efficiency: a larger patch makes the algorithm more efficient but also increases the computational complexity and the size of the dictionary, so a proper size must be chosen. For each patch the mean is calculated and subtracted from the patch, and the l0-minimization problem is solved with the Matching Pursuit method. Coefficients whose absolute values fall below a given threshold are set to zero, and the remaining coefficients and their locations are recorded. For each patch, the atom number, the mean value, and the indexes are encoded, and the coefficients are quantized and encoded.
iii. Coding and Quantization. Static arithmetic coders are used for entropy coding of the atom number and the mean value of each patch, the indexes, and the coefficients; the atom number and the mean value are coded separately. The Lloyd algorithm, sketched in code after this section, is used for quantizing the coefficients.
iv. Analysis of Algorithm Complexity. The algorithm includes two processes, training and compression. The training process runs off-line, so only the complexity of the compression process is analyzed. The compressed stream does not include the dictionary or the information about the models; it consists of the atom number, the mean value, the indexes, and the coefficients.
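The Lloyd algorithm mentioned in step iii alternates between assigning each sample to its nearest quantization level and moving each level to the mean of its assigned samples. The following scalar sketch is illustrative, with Laplacian-distributed stand-ins for the sparse coefficients, and is not the authors' implementation.

```python
import numpy as np

def lloyd_quantizer(samples, n_levels=16, iters=50):
    """Lloyd algorithm for scalar quantization: alternate between
    nearest-level assignment and recentering each level at the mean
    of the samples assigned to it."""
    levels = np.linspace(samples.min(), samples.max(), n_levels)
    assign = np.zeros(samples.size, dtype=int)
    for _ in range(iters):
        # Assignment step: nearest quantization level for every sample.
        assign = np.argmin(np.abs(samples[:, None] - levels[None, :]), axis=1)
        # Update step: move each level to the centroid of its cell.
        for k in range(n_levels):
            cell = samples[assign == k]
            if cell.size:
                levels[k] = cell.mean()
    return levels, assign

# Laplacian-distributed stand-ins for sparse-representation coefficients.
coeffs = np.random.default_rng(2).laplace(scale=1.0, size=10_000)
levels, assign = lloyd_quantizer(coeffs)
print("quantization MSE:", np.mean((coeffs - levels[assign]) ** 2))
```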
III. CONCLUSION
Fingerprint compression using sparse representation achieves a high compression ratio compared with the other techniques. Among the surveyed methods, some achieve high compression ratios, some offer high performance, and some improve the quality of the reconstructed images. One of the main difficulties in fingerprint compression algorithms is the need to preserve the minutiae used in identification; during compression and reconstruction, the method based on sparse representation holds most of the minutiae robustly.
REFERENCES
[1]. G. Shao, Y. Wu, Y. A, X. Liu, and T. Guo, "Fingerprint compression based on sparse representation," IEEE Trans. Image Process., vol. 23, no. 2, Feb. 2014.
[2]. T. Hopper, C. Brislawn, and J. Bradley, "WSQ gray-scale fingerprint image compression specification," Federal Bureau of Investigation, Criminal Justice Information Services, Washington, DC, USA, Tech. Rep. IAFIS-IC0110-V2, Feb. 1993.
[3]. V. Perumal and J. Ramaswamy, "An innovative scheme for effectual fingerprint data compression using Bezier curve representations," Int. J. Comput. Sci. Inf. Security (IJCSIS), vol. 6, no. 1, 2009.
[4]. S. Kasaei and M. Deriche, "Fingerprint compression using a piecewise-uniform pyramid lattice vector quantization," in Proc. IEEE ICASSP, 1997.
[5]. T. Veerakumar, S. Esakkirajan, R. Sudhakar, and V. S. Murugan, "Fingerprint compression using contourlet transform and self-organizing feature map," Iranian J. Elect. Comput. Eng., vol. 6, no. 2, Summer-Fall 2007.
[6]. K. Skretting and K. Engan, "Image compression using learned dictionaries by RLS-DLA and compared with K-SVD," in Proc. IEEE ICASSP, May 2011, pp. 1517-1520.
[7]. P. J. Phillips, A. Martin, C. L. Wilson, and M. Przybocki, "An introduction to evaluating biometric systems," IEEE Computer, vol. 33, no. 2, pp. 56-63, Feb. 2000.