With the exponential increase in dependence on mobile devices in everyday life, there is growing concern over privacy and security in the Gulf countries; it is therefore imperative that security threats be analyzed in detail. Mobile devices store enormous amounts of personal and financial information, often without adequate protection. Biometrics has been applied to secure mobile devices against various threats and has been shown to be effective. However, biometric mobile applications are themselves vulnerable to several types of attacks that can decrease their security. Biometric information is itself sensitive data: fingerprints leave traces on touched objects, and facial images can be captured anywhere or accessed by an attacker if stored on a lost or stolen device. Hence, an attacker can easily forge the identity of a legitimate user and access data on a device. In this paper, we investigate the effects of a trace attack on biometric mobile applications in terms of security and user privacy. Experimental results on facial and fingerprint mobile authentication applications using different databases show that these applications are vulnerable to the proposed attack, which poses a serious threat to overall system security and user privacy.
It is inevitable that the use of a PIN (personal identification number) as the sole secret for controlling authenticated access will become obsolete, given the exponential growth and accessibility of handheld devices. As an alternative, biometric-based authentication techniques on mobile devices can efficiently verify a person's identity, not only for unlocking the device but also for approving payments and as part of multifactor authentication services. Biometrics refers to physiological or behavioral traits such as fingerprint, iris, face, and keystroke dynamics. Unlike a password, a biometric trait cannot be forgotten or lost. This makes biometric modalities well suited to authentication applications, especially from the users' perspective. Biometric-based authentication applications consist of two stages. The first stage is enrollment, in which the user's biometric data are captured and stored as a reference; the second is authentication, in which a newly captured sample is compared against that reference.
Although biometrics can increase the security of mobile applications beyond that of classical authentication techniques, these technologies have drawbacks and are vulnerable to several attacks that undermine the authentication process, either by bypassing the system's security or by disrupting its operation [
In this paper, we present a new attack on biometric mobile applications based on the alteration of user images. We assume that the impostor possesses modified versions of the user's images and uses them to gain unauthorized access. This type of alteration has not yet been presented in the literature, nor has it been applied to biometric mobile applications. We evaluate the effect of this attack on the security of fingerprint and facial biometric mobile applications and on user privacy using different types of image alterations.
The rest of the paper is organized as follows. Section
Biometric-based applications are vulnerable to several types of attacks, which can be classified into direct and indirect attacks as shown in Figure
Attack types and levels in a biometric authentication system.
Indirect attacks can be launched against the interfaces between modules or against the software modules themselves. On the interface between modules, the attacker can resubmit previously intercepted biometric data, either before or after feature extraction (a replay attack). The transmission channel between the comparator and the database can be tampered with, and the output of the comparator can also be compromised [
For mobile biometric applications, spoofing is by far the most common direct attack. The impostor can use information from a mobile device (left unattended or stolen) to gain illegitimate access to its applications. In [
In the case of indirect attacks, several studies have concluded that the majority of mobile application users do not understand permission warnings when malicious software (e.g., a backdoor) is installed, allowing the attacker to gain system privileges and remotely access the device's camera [
Despite active research in recent years on the evaluation of biometric-based mobile applications, very few studies have focused on the effect of alteration on the security and robustness of these systems. Fingerprint alteration has been used to hide an impostor's identity and gain unauthorized access to biometric systems [
In this paper, we present other types of alterations that can be applied on different biometric authentication systems, especially biometric mobile applications. This attack can be applied using different modalities, making it dangerous not only in the case of mobile applications based on fingerprint or facial authentication but also in iris- and voice-based mobile applications. Unlike the alterations in [
Difference between sensor attacks and the proposed attack in a biometric authentication system.
The modified version of the user's image can be recovered from biometric traces, for example, from the user's picture or from fingerprint traces left on a touched surface. The impostor can use this image in an access request to gain unauthorized entry or to acquire information about the user, which affects the user's privacy. We focus on six categories of alteration based on the changes made to the reference images, as shown in Figure
Samples of altered images from the FVC2002 and Yale database.
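As a rough illustration (not the authors' implementation), the six alteration categories can be sketched on a grayscale image represented as a nested list of pixel intensities; all function names and parameters here are our own assumptions:

```python
import random

def adjust_luminosity(img, delta):
    """Shift every pixel by delta, clamping to [0, 255]."""
    return [[max(0, min(255, p + delta)) for p in row] for row in img]

def negative(img):
    """Invert pixel intensities."""
    return [[255 - p for p in row] for row in img]

def add_noise(img, level, seed=0):
    """Replace a fraction `level` of pixels with random intensities."""
    rng = random.Random(seed)
    out = [row[:] for row in img]
    for r in range(len(out)):
        for c in range(len(out[0])):
            if rng.random() < level:
                out[r][c] = rng.randrange(256)
    return out

def box_blur(img):
    """3x3 mean filter (border pixels kept as-is for brevity)."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            out[r][c] = sum(img[r + dr][c + dc]
                            for dr in (-1, 0, 1) for dc in (-1, 0, 1)) // 9
    return out

def partial(img, rows, cols):
    """Keep only the top-left `rows` x `cols` region of the image."""
    return [row[:cols] for row in img[:rows]]

def mosaic(parts):
    """Combine four equally sized quadrants [[tl, tr], [bl, br]]."""
    (tl, tr), (bl, br) = parts
    top = [a + b for a, b in zip(tl, tr)]
    bottom = [a + b for a, b in zip(bl, br)]
    return top + bottom
```

A mosaic, for instance, stitches four quadrants recovered from separate traces into a single probe image.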
In order to evaluate the security of biometric mobile applications against the proposed attack, we define a criterion that measures the percentage of attempts in which an impostor using altered images gains illegitimate access. We named this criterion
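This acceptance criterion can be sketched as follows (a minimal illustration; the function name and the threshold semantics are our own assumptions):

```python
def acceptance_rate(scores, threshold):
    """Percentage of altered-image attempts whose matching score
    meets or exceeds the system's decision threshold."""
    accepted = sum(1 for s in scores if s >= threshold)
    return 100.0 * accepted / len(scores)
```

For example, with four altered-image attempts scoring 70, 40, 90, and 55 against a threshold of 50, three are accepted, giving a 75% acceptance rate.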
Since biometric information is highly sensitive, its misuse or abuse poses a serious threat to user privacy. We therefore analyze the effect of the alteration attack on user privacy, assuming that the impostor does not know the system parameters. Our goal is to quantify the amount of information about the reference images that can be gained from altered images. To this end, we perform an information-theoretic analysis under various kinds of alteration attacks and examine the information content of the biometric data. We use mutual information [
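A standard histogram-based estimate of mutual information between two images can be sketched as follows (our own sketch; the bin count and quantization scheme are assumptions, not the paper's exact procedure):

```python
import math
from collections import Counter

def mutual_information(img_a, img_b, bins=8):
    """Histogram-based mutual information (in bits) between two
    equally sized grayscale images given as flat pixel lists."""
    assert len(img_a) == len(img_b)
    n = len(img_a)
    # Quantize 8-bit intensities down to `bins` levels.
    qa = [p * bins // 256 for p in img_a]
    qb = [p * bins // 256 for p in img_b]
    pa, pb = Counter(qa), Counter(qb)
    pab = Counter(zip(qa, qb))
    mi = 0.0
    for (a, b), c in pab.items():
        p_ab = c / n
        # p_ab / (p_a * p_b) simplifies to c * n / (pa[a] * pb[b]).
        mi += p_ab * math.log2(c * n / (pa[a] * pb[b]))
    return mi
```

For identical images the estimate equals the image's entropy, and for a constant (uninformative) altered image it drops to zero, which is what makes it a natural leakage measure here.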
We test the different categories of the proposed attack against fingerprint and facial mobile applications using different databases. Both applications are evaluated at two levels: security and privacy. To evaluate security, we calculate the matching score and the number of matched associations for the altered images used by the impostor to gain unauthorized access. At the privacy level, we evaluate the amount of information about the reference image that the impostor can obtain.
The fingerprint authentication application is implemented based on four stages (Figure
Different steps of minutia point extraction.
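The paper does not spell out its matcher here; as a hedged sketch, a matching score between two minutiae sets can be computed by greedily pairing points that fall within a spatial tolerance (the function names, tolerance value, and normalization are our own assumptions):

```python
def match_minutiae(ref, probe, tol=8.0):
    """Greedy one-to-one pairing of minutia points (x, y) whose
    Euclidean distance is within `tol` pixels."""
    unused = list(probe)
    matched = 0
    for (x, y) in ref:
        best, best_d = None, tol
        for q in unused:
            d = ((x - q[0]) ** 2 + (y - q[1]) ** 2) ** 0.5
            if d <= best_d:
                best, best_d = q, d
        if best is not None:
            unused.remove(best)   # enforce one-to-one pairing
            matched += 1
    return matched

def matching_score(ref, probe, tol=8.0):
    """Percentage score: matched pairs over the larger set size."""
    m = match_minutiae(ref, probe, tol)
    return 100.0 * m / max(len(ref), len(probe))
```

Real matchers additionally align the two sets (rotation/translation) and compare minutia orientations; this sketch only illustrates how a score can be derived from paired points.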
First, to show the effect of the alteration level on the security of the biometric mobile application, we evaluate the fingerprint-based mobile application against the alteration attack for all users at each alteration level. Figure
Alteration based on luminosity for fingerprint authentication applications.
To evaluate the effect of blurring on the fingerprint authentication system based on FVC2004 and FVC2002 databases, blurred images are used. As shown in Figure
Alteration based on blurring for fingerprint authentication applications.
In order to study the effect of noise alteration, we first calculate the peak signal to noise ratio (PSNR) [
PSNR with respect to noise level using fingerprint images.
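The PSNR between a reference image and its noisy version follows the usual definition; a minimal sketch (function name and flat-list representation are our own choices):

```python
import math

def psnr(ref, altered, peak=255.0):
    """Peak signal-to-noise ratio (dB) between two equally sized
    grayscale images given as flat pixel lists."""
    mse = sum((a - b) ** 2 for a, b in zip(ref, altered)) / len(ref)
    if mse == 0:
        return float("inf")   # identical images
    return 10.0 * math.log10(peak ** 2 / mse)
```

Higher noise levels drive the MSE up and the PSNR down, which is why PSNR serves as a convenient proxy for the alteration strength.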
Moreover, we also consider the case of biometric mobile applications where images are preprocessed and then postprocessed. Hence, we compare the extracted features from noisy images and the reference image of the user.
We present the variation of the matching score with the noise level. We notice that the matching score remains high even when the percentage of noise in the altered images increases (Figure
Alteration based on noise for fingerprint authentication applications.
When the impostor possesses a partial reference image of the legitimate user, he/she can use a partial attack to gain unauthorized access to the biometric authentication system. To illustrate this attack scenario, we use different parts of the user's image and calculate the matching score between the features extracted from the partial image and the complete reference image (Figure
Alteration based on part of user’s image for fingerprint authentication applications.
In the case of alteration using a negative image, as shown in Figure
Alteration based on negative image for fingerprint authentication applications.
For alterations based on a mosaic image, we combine four different parts of the user’s biometric trait images to create a mosaic image. As shown in Figure
Alteration based on mosaic image for fingerprint images.
The second aspect evaluated in this paper is the privacy impact of the different types of alterations. To assess the effect of information leakage on user privacy, we first measure the amount of information leaked for each user. Then, for each type of alteration, we calculate the average mutual information over all altered images at different levels for FVC2002 (Figure
Average of mutual information for FVC2002 database.
Average of mutual information for FVC2004 database.
For each user, the impostor can extract more information about the reference image from altered images, especially in the case of noisy images and increased luminosity. This vulnerability varies from one user to another; the attack does not affect all users equally. This can be explained by differences in image quality between users and by interclass variability.
To create the face-based authentication application, we calculate the number of associations between the reference and request images. First, local features are detected and extracted using the scale-invariant feature transform (SIFT) [
Figure
SIFT-based facial authentication system.
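Full SIFT extraction is beyond a short example, but the association-counting step can be illustrated on precomputed descriptor vectors using Lowe's nearest/second-nearest ratio test (a sketch under our own assumptions; toy descriptors, ratio value, and function name are not from the paper):

```python
def count_associations(desc_a, desc_b, ratio=0.8):
    """Count one-way matches between two descriptor lists: a
    descriptor matches when its nearest neighbor is clearly
    closer than its second-nearest (Lowe's ratio test)."""
    def dist(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v)) ** 0.5
    count = 0
    for d in desc_a:
        ds = sorted(dist(d, e) for e in desc_b)
        if len(ds) >= 2 and ds[0] < ratio * ds[1]:
            count += 1
        elif len(ds) == 1:
            count += 1
    return count
```

The ratio test discards ambiguous matches, which is what makes the association count a meaningful similarity measure between the reference and request images.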
Figure
Number of associations using luminosity alteration for facial authentication applications.
Figure
Number of associations using blurring alteration for facial authentication applications.
In the case of noise alteration, we first calculate the difference between the reference and altered images without considering the biometric authentication application (Figure
PSNR with respect to the noise levels.
On the other hand, considering the facial authentication application, we notice in Figure
Number of associations using noise alteration for facial authentication applications.
In order to illustrate the effect of partial images on the face-based mobile application, we measure the number of matched associations between different partial images and the reference image of the user (Figure
Number of associations using partial alteration for facial authentication applications.
In the case of alteration using mosaic images, we notice in Figure
Number of associations using mosaic alteration for facial authentication application.
For negative images, the facial authentication application based on the SIFT descriptor rejects the negative-image attack. This is due to the SIFT matching process, in which the associations are matched essentially at random. The failure to match negative images cannot be generalized to all face-based authentication systems; this type of attack may succeed against facial authentication systems based on other feature extraction processes. Figure
Matching of the original and the negative image.
In order to assess the privacy consequences for face-based mobile applications, we calculate the mutual information between the reference and altered images. The average mutual information for each user is measured using all of that user's altered images for every type of alteration.
It can be clearly noted that, as shown in Figure
Average of mutual information using different types of alterations for the AR database.
Average of mutual information using different types of alterations for the Yale database.
This can be explained by the nature of the Yale facial database, which contains grayscale images, unlike the AR facial database, which contains RGB color images. Moreover, we notice that even though an impostor using negative images is not accepted by the system, he/she can still gain important information about the user, which represents a privacy concern.
Tables
Results of fingerprint authentication application.
| Database | Alteration | Level | Matching score | Features in reference image | Features in altered image |
|---|---|---|---|---|---|
| FVC2002 | Blur | 1 | 71.42 | 403 | 273 |
| FVC2002 | Blur | 2 | 55.1 | 403 | 345 |
| FVC2002 | Blur | 6 | 31 | 403 | 352 |
| FVC2002 | Noise | 1.45 | 77.7 | 162 | 807 |
| FVC2002 | Noise | 49.9 | 88.9 | 162 | 490 |
| FVC2002 | Noise | 82.29 | 55.66 | 162 | 210 |
| FVC2002 | Luminosity | −84.8 | 78 | 360 | 598 |
| FVC2002 | Luminosity | −12.25 | 75 | 360 | 681 |
| FVC2002 | Luminosity | 50 | 50 | 360 | 418 |
| FVC2002 | Part of user image | 31.16 | 52 | 600 | 672 |
| FVC2002 | Part of user image | 93.5 | 28 | 600 | 321 |
| FVC2002 | Part of user image | 115.83 | 16 | 600 | 196 |
| FVC2002 | Mosaic | — | 66.66 | 287 | 1,167 |
| FVC2002 | Mosaic | — | 37.5 | 700 | 1,071 |
| FVC2002 | Mosaic | — | 30.43 | 505 | 658 |
| FVC2002 | Negative | — | 84.78 | 505 | 1,728 |
| FVC2002 | Negative | — | 66.66 | 700 | 1,216 |
| FVC2002 | Negative | — | 44.44 | 527 | 1,147 |
| FVC2004 | Blur | 1 | 70.58 | 476 | 703 |
| FVC2004 | Blur | 2 | 55.88 | 476 | 623 |
| FVC2004 | Blur | 6 | 50 | 476 | 405 |
| FVC2004 | Noise | 1.45 | 14.84 | 518 | 801 |
| FVC2004 | Noise | 49.9 | 58.06 | 518 | 370 |
| FVC2004 | Noise | 90.36 | 70 | 518 | 598 |
| FVC2004 | Luminosity | −84.8 | 100 | 356 | 34 |
| FVC2004 | Luminosity | −12.25 | 100 | 356 | 518 |
| FVC2004 | Luminosity | 50 | 22.22 | 356 | 51 |
| FVC2004 | Part of user image | 80 | 65 | 339 | 612 |
| FVC2004 | Part of user image | 120 | 55 | 339 | 490 |
| FVC2004 | Part of user image | 160 | 25 | 339 | 108 |
| FVC2004 | Mosaic | — | 60 | 339 | 875 |
| FVC2004 | Mosaic | — | 54.54 | 176 | 629 |
| FVC2004 | Mosaic | — | 42.85 | 237 | 531 |
| FVC2004 | Negative | — | 82.5 | 518 | 528 |
| FVC2004 | Negative | — | 75 | 339 | 503 |
| FVC2004 | Negative | — | 57.14 | 231 | 233 |
Results of face authentication application.
| Database | Alteration | Level | Number of associations | Associations in reference image | Associations in altered image |
|---|---|---|---|---|---|
| Yale | Blur | 1 | 103 | 163 | 134 |
| Yale | Blur | 2 | 65 | 163 | 106 |
| Yale | Blur | 6 | 12 | 163 | 43 |
| Yale | Noise | 1.45 | 65 | 79 | 169 |
| Yale | Noise | 49.9 | 75 | 79 | 170 |
| Yale | Noise | 90.36 | 119 | 79 | 123 |
| Yale | Luminosity | −44.61 | 86 | 79 | 118 |
| Yale | Luminosity | −15.64 | 114 | 79 | 126 |
| Yale | Luminosity | 31 | 104 | 79 | 118 |
| Yale | Part of user image | 48 | 68 | 79 | 118 |
| Yale | Part of user image | 96 | 61 | 79 | 99 |
| Yale | Part of user image | 240 | 36 | 79 | 60 |
| Yale | Mosaic | — | 65 | 98 | 110 |
| Yale | Mosaic | — | 35 | 79 | 113 |
| Yale | Mosaic | — | 8 | 141 | 150 |
| AR | Blur | 1 | 158 | 195 | 169 |
| AR | Blur | 3 | 133 | 195 | 152 |
| AR | Blur | 6 | 81 | 195 | 121 |
| AR | Noise | 1.4 | 137 | 195 | 314 |
| AR | Noise | 49.62 | 189 | 195 | 248 |
| AR | Noise | 90.36 | 140 | 195 | 198 |
| AR | Luminosity | −44.61 | 181 | 195 | 194 |
| AR | Luminosity | −15.64 | 190 | 195 | 195 |
| AR | Luminosity | 31 | 151 | 195 | 192 |
| AR | Part of user image | 48 | 147 | 195 | 204 |
| AR | Part of user image | 144 | 130 | 195 | 179 |
| AR | Part of user image | 240 | 105 | 195 | 164 |
| AR | Mosaic | — | 151 | 264 | 305 |
| AR | Mosaic | — | 146 | 251 | 451 |
| AR | Mosaic | — | 131 | 195 | 245 |
In this paper, we have presented, to the best of our knowledge, the first alteration attack on biometric mobile applications. The attack is based on image traces: altered versions of a user's reference images are used to gain illegitimate access to biometric mobile applications. We have distinguished six types of alteration attacks and studied their effects on face- and fingerprint-based mobile authentication applications. We altered the user's images through modifications of luminosity, noise, blurring, and negative images, and we also considered the case in which an impostor has one or several parts of the user's image(s). Experiments were conducted on fingerprint authentication using the FVC2002 and FVC2004 databases and on face-based authentication using the Yale and AR databases. We evaluated the matching score of both systems under the alteration attack and then studied its effects on user privacy. The experimental results show that mobile applications based on fingerprint and facial images are vulnerable to the proposed attack. Furthermore, using this attack, the impostor can gain additional information about the user's reference image, which compromises the user's privacy. In future work, we intend to extend this study to the effect of trace attacks on biometric mobile devices protected with template protection algorithms, such as fuzzy vault and fuzzy commitment.
The authors declare that they have no competing interests.
This research project was supported by a grant from the “Research Center of the Female Scientific and Medical Colleges,” Deanship of Scientific Research, King Saud University.