
Are Biometrics Race-Neutral?

posted by: Shoshana Magnet // 11:59 PM // June 05, 2007 // ID TRAIL MIX


Biometrics are regularly described as technologies able to provide both "mechanical objectivity" [1] and race-neutrality. The suggestion is that biometrics can automate identity inspection and verification, replacing the subjective eye of the inspector with the neutral eye of the scanner. In this way, biometric technologies are represented as able to circumvent racism: they are held up as bias-free technologies that will objectively and equally scan everyone's bodily identity. Frances Zelazny, the director of corporate communications for Visionics (a leading US manufacturer of biometric systems), asserted that the corporation's newly patented iris scanning technology "is neutral to race and color, as it is based on facial features recognized by the software" (2002). In an online discussion of the use of iris scanners at the US-Canada border, one discussant claimed he would prefer "race-neutral" biometric technologies to racist customs border officials:

If I was a member of one of the oft-"profiled" minorities, I'd sign up for sure. Upside--you can walk right past the bonehead looking for the bomb under your shirt just because of your tan and beard. . . . In short, I'd rather leave it up to a device that can distinguish my iris from a terrorist's, than some bigoted lout who can't distinguish my skin, clothing or accent from same (Airport Starts Using Iris Screener, 2005).

Biometrics are central to the attempt to make suspect bodies newly visible. This is a complicated task, and one that is regularly tied to problematic assumptions around race, class and gender identity. It is not surprising, therefore, that when biometric technologies are enlisted in this task they fail easily and often. What is most interesting about biometric malfunctions is the specific ways in which they fail. As biometrics are deployed to make othered bodies visible, they regularly break down at the intersection of the body's class, race, gender and dis/abled identity. In this way, biometrics fail precisely at the task they have been set.

As biometric technologies are developed in a climate of increased anxiety concerning suspect bodies, stereotypes around "inscrutable" racialized bodies are technologized. For example, biometric technologies are notably unable to distinguish the individual bodies of people of colour. Research on the use of biometric fingerprint scanners has regularly found that it is difficult to fingerprint "Asian women . . . [as they] had skin so fine it couldn't reliably be used to record or verify a fingerprint" (Sturgeon, 2004). Stereotypes concerning the inscrutability of orientalized bodies are thus arguably codified in the biometric scanner itself.

These biometric failures result in part from the technologies' reliance on outdated and erroneous assumptions that race is biological. These assumptions can be noted partly from the titles of the studies that describe biometric identification technologies. For example, one paper is titled "Facial Pose Estimation Based on the Mongolian Race's Feature Characteristic" (Li et al., 2004). Other titles include "Towards Race-Related Face Identification" (Yin et al., 2004) and "A Real Time Race Classification System" (Ou et al., 2005).

[Image from "A Real Time Race Classification System" (Ou et al., 2005). Original caption: "Two detected faces and the associated race estimates."]

The suggestion that race is a stable biological entity that reliably yields common measurable characteristics is deeply problematic. Yet such assumptions are repeated in a number of articles that claim to classify "faces on the basis of high-level attributes, such as sex, 'race' and expression" (Lyons et al., 2000). Although the quotation marks around the word "race" suggest that the authors acknowledge that race is not biological, they still proceed to train their computers to identify both gender and race as if it were. This task is accomplished by scanning facial images and assigning each a gender and race label, repeating the process until the computer is claimed to be able to classify faces on its own. Unsurprisingly, error rates remain high. Neither gender nor race is a stable category that can consistently be identified by the human eye, let alone by computer imaging processes.
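
To make the kind of pipeline these papers describe more concrete, the sketch below trains a toy supervised classifier on human-assigned labels. It is purely illustrative and assumes Python with NumPy and scikit-learn; the feature vectors, labels and the logistic-regression model are stand-ins of my own choosing, not the Gabor-wavelet or Eigenface systems of the articles cited above.

```python
# Illustrative sketch only (not the systems cited above): a supervised
# classifier fit to human-assigned category labels over synthetic features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins for feature vectors extracted from face images
# (e.g. wavelet or eigenface coefficients in the cited papers).
features = rng.normal(size=(500, 64))

# Stand-ins for the human-assigned "race" or "gender" labels the papers rely on.
labels = rng.integers(0, 2, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.3, random_state=0)

classifier = LogisticRegression(max_iter=1000)
classifier.fit(X_train, y_train)  # "train the computer" on the labelled faces

# Report the held-out error rate, as the papers do.
error_rate = 1.0 - classifier.score(X_test, y_test)
print(f"held-out error rate: {error_rate:.2%}")
```

On purely random data the classifier of course hovers around chance; the broader point is that nothing in such a pipeline tests whether the categories themselves are coherent, only how often the machine reproduces the labels it was given.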

The assumption that biometric performance depends on racial and ethnic identity can also be noted in the way hypotheses around race and biometrics shift with the location of each study. In the US, biometric technologies have failed to distinguish "Asian" bodies. In the UK, biometric technologies have difficulty distinguishing "Black" bodies. In Japan, one study posited that it would be most difficult for biometrics to identify "non-Japanese" faces (Tanaka et al., 2004).

Nor do the failures of biometrics end with the errors that result from codifying a biological understanding of race. Biometric technologies are consistently unable to identify those who deviate from the norm of the young, able-bodied person. In general, studies have shown that "one size fits all" biometric technologies do not work. For example, biometric facial recognition technology works poorly with elderly persons and failed more than half the time to identify those who were disabled (Black Eye for ID Cards, 2005; Woolf et al., 2005). Other studies of biometric iris scanners have shown that the technologies are particularly bad at identifying those with visual impairments and those who are wheelchair users (Gomm, 2005).

Class is also a factor that affects the functioning of biometric technologies. Persons with occupations in the categories "clerical, manual, [and] maintenance" have been found difficult to fingerprint biometrically (UK Biometrics Working Group, 2001). Biometric iris scanners have failed to work with very tall persons (Gomm, 2005), and biometric fingerprint scanners could not identify 20% of those with non-normative fingers: "One out of five people failed the fingerprint test because the scanner was 'too small to scan a sufficient area of fingerprint from participants with large fingers'" (Black Eye for ID Cards, 2005). Many kinds of bodily breakdown give rise to biometric failure. "Worn down or sticky fingertips for fingerprints, medicine intake in iris identification (atropine), hoarseness in voice recognition, or a broken arm for signature" all give rise to temporary biometric failures, while "[w]ell-known permanent failures are, for example, cataracts, which makes retina identification impossible or [as we saw] rare skin diseases, which permanently destroy a fingerprint" (Bioidentification, 2007).

In addition to technologizing problematic notions around the comprehensibility of difference, biometrics are discursively deployed in ways that continue to target the specific demographics of suspect bodies. For example, biometric facial recognition technology requires Muslim women to completely remove their veils in order to receive new forms of ID cards, whereas older forms of identification, such as the photos on driver's licenses, required only their partial removal. In this way, biometric technologies are literally deployed to further the state's invasion of the bodily privacy of Muslim women, an application that surely is not "race-neutral."

The examples cited above demonstrate that the claimed objectivity and race-neutrality of biometrics need to be called into question.

[1] I take this phrase from Daston and Galison (1992).


References

(2005). "Airport Starts Using Iris Screener." Available at http://www.vivelecanada.ca/article.php/20050715193518919. April 27, 2007.

(2005). "Black Eye for ID Cards." Available at http://www.blink.org.uk/pdescription.asp?key=7477&grp=21&cat=99. April 27, 2007.

Bioidentification. (2007). "Biometrics: Frequently asked questions." Available at http://www.bromba.com/faq/biofaqe.htm. April 27, 2007.

Daston, L. and P. Gallison. 1992. "The image of objectivity." Representation 40, Fall.

Gomm, K. 2005. "U.K. agency: Iris recognition needs work". News.com, October 20.

Li, H., M. Zhou, et al. 2004. "Facial Pose Estimation Based on the Mongolian Race’s Feature Characteristic from a Monocular Image ". In S. Z. Li, Z. Sun, T. Tanet al (eds.) Advances in Biometric Person Authentication.

Lyons, M. J., J. Budynek, et al. 2000. Classifying Facial Attributes using a 2-D Gabor Wavelet Representation and Discriminant Analysis. Fourth IEEE International Conference on Automatic Face and Gesture Recognition, 2000. Proceedings, Grenoble, France.

Ou, Y., X. Wu, et al. 2005. A Real Time Race Classification System. Proceedings of the 2005 IEEE: International Conference on Information Acquisition, Hong Kong and Macau, China,.

Roy, S. "Biometrics: Security boon or busting privacy?" PC World.

Sturgeon, W. (2004). "Law & Policy Cheat Sheet: Biometrics." Available at http://management.silicon.com/government/0,39024677,39120120,00.htm. April 27, 2007.

Tanaka, K., K. Machida, et al. 2004. Comparison of racial effect in face identification systems based on Eigenface and GaborJet. SICE 2004 Annual Conference.

UK Biometrics Working Group. (2001). "Biometrics for Identification and Authentication - Advice on Product Selection." Available at http://www.idsysgroup.com/ftp/Biometrics%20Advice.pdf. April 27, 2007.

Woolf, M., F. Elliott, et al. 2005. "ID Card Scanning System Riddled with Errors ". The Independent, October 16.

Yin, L., J. Jia, et al. 2004. Towards Race-related Face Identification: Research on skin color transfer. Sixth IEEE International Conference on Automatic Face and Gesture Recognition.
