
Surveillance in Spheres of Mobility: Privacy, Technical Design and the Flow of Personal Information on the Transportation and Information Superhighways

posted by: Michael Zimmer // 11:59 PM // March 21, 2006 // ID TRAIL MIX


A recent Nassau County Supreme Court ruling held that data retrieved from a vehicle’s black box - a computer module that records a vehicle’s speed and telemetry data in the last five seconds before airbags deploy in a collision - could be admitted as evidence even though law enforcement officials did not have a search warrant. The court ruled that by driving the vehicle on a public highway, “the defendant knowingly exposed to the public the manner in which he operated his vehicle on public highways. ...What a person knowingly exposes to the public is not subject to Fourth Amendment protection.” A federal judge in upstate New York made a similar ruling, stating that police officers did not need a warrant to secretly attach a Global Positioning System device to a suspect’s vehicle. The judge said that a suspect traveling on a highway has no reasonable expectation of privacy.
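
To make the mechanism concrete, here is a minimal sketch, in Python, of how such a "black box" might retain only the last few seconds of telemetry in a rolling buffer. It is an illustration under assumed parameters (the names, sampling rate, and recorded fields are all hypothetical), not a description of any actual automotive module.

```python
# Illustrative sketch only: a toy event data recorder ("black box") that,
# like the module described above, keeps roughly the last five seconds of
# vehicle telemetry. All names and parameters are hypothetical.
from collections import deque
from dataclasses import dataclass

SAMPLE_RATE_HZ = 10   # assumed sampling rate
WINDOW_SECONDS = 5    # retain the last five seconds, as in the ruling

@dataclass
class TelemetrySample:
    timestamp: float      # seconds since ignition
    speed_kmh: float
    throttle_pct: float
    brake_applied: bool

class EventDataRecorder:
    """Fixed-size ring buffer: old samples fall off as new ones arrive."""

    def __init__(self) -> None:
        self.buffer: deque[TelemetrySample] = deque(
            maxlen=SAMPLE_RATE_HZ * WINDOW_SECONDS)

    def record(self, sample: TelemetrySample) -> None:
        self.buffer.append(sample)

    def freeze_on_airbag_deploy(self) -> list[TelemetrySample]:
        # On deployment, the buffer contents become the persisted record --
        # the data later retrievable from the vehicle after a collision.
        return list(self.buffer)
```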

In January 2006, the web search engine Google resisted requests from the U.S. Department of Justice to turn over a large amount of data, including records of all Google searches from any one-week period, partially on the grounds that it would violate their users’ privacy. This event generated widespread concern over the privacy of web search histories, and prompted many users to question the extent to which this component of their online intellectual activities might be shared with law enforcement agencies. (Indeed, it was later revealed that three other search engine providers – America Online, Yahoo and Microsoft – had previously complied with government subpoenas in the case, without public notice.) Similar concerns have arisen over commercial access to search engine histories as the vast databases of search histories held by these providers are increasingly matched up with individual searchers and demographic information from other search-related services in order to provide individually targeted search results and advertising.

The two technological systems described above - networked vehicle information systems and web search engines - represent important tools for the successful navigation of two vital spheres of mobility: physical space and cyberspace. However, they also share a reliance on the capturing and processing of personal information flows, and provide the platforms for surveillance of the person on the move. Networked vehicle information systems, which include GPS-based navigational tools, automated toll collection systems, automobile black boxes, and vehicle safety communication systems, rely on the transmission, collection and aggregation of a person’s location and vehicle telemetry data as she travels along the public highways. Similarly, web search engines, striving to provide personalized results and deliver contextually relevant advertising, depend on the monitoring and aggregation of a user’s online activities as she surfs the World Wide Web. Taken together, these two technical systems are compelling examples of the increased “everyday surveillance” (Staples, 2000) of individuals within their various spheres of mobility: networked vehicle systems constitute large-scale infrastructures enabling the widespread surveillance of drivers traveling on the public highways, while web search engines are part of a larger online information infrastructure which facilitates the monitoring and aggregation of one’s intellectual activities on the information superhighway.
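
On the search side, a hedged sketch of the kind of record-keeping that makes this aggregation possible: each query is typically stored alongside identifiers that link queries from the same user over time. The field names and grouping logic below are hypothetical illustrations, not any provider's actual logging schema.

```python
# Hypothetical illustration of why search logs raise these concerns: each
# query is stored with enough context that, aggregated over time, the log
# profiles the searcher. Field names are assumptions, not a real schema.
from dataclasses import dataclass

@dataclass
class QueryLogEntry:
    timestamp: str    # when the search occurred
    client_ip: str    # locates the searcher on the network
    cookie_id: str    # links queries from the same browser over time
    query: str        # the intellectual activity itself

def profile(log: list[QueryLogEntry]) -> dict[str, list[str]]:
    """Group queries by persistent cookie: the aggregation step that turns
    individual searches into a longitudinal record of one person's interests."""
    history: dict[str, list[str]] = {}
    for entry in log:
        history.setdefault(entry.cookie_id, []).append(entry.query)
    return history
```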

The political and value implications of these infrastructures on individuals as they navigate through these spaces cannot be overstated, yet they generally remain unexplored. These implications include shifts in the contextual integrity of the norms of personal information flows, challenges to the expectation of privacy in public spaces, concerns over whether one’s online intellectual activities are shared with third parties, and the potential for the “panoptic sorting” (Gandy, 1993) of citizens into disciplinary categories. Taken together, these infrastructures of everyday surveillance increasingly threaten the privacy of one’s personal information, and contribute to a rapidly emerging “soft cage” (Parenti, 2003) of everyday surveillance, a growing environment of discipline and social control.

In his book Technopoly, Neil Postman warned that we tend to be “surrounded by the wondrous effects of machines and are encouraged to ignore the ideas embedded in them. Which means we become blind to the ideological meaning of our technologies” (1992, p. 94). As the ubiquity of networked vehicle systems and web search engines intensifies, it becomes increasingly difficult for users to recognize or question their political and value implications, and more tempting to simply take the design of such tools “at interface value” (Turkle, 1995, p. 103). It becomes vital, then, to heed Postman’s warning, remove the blinders, prevent the political and value implications of networked vehicle systems and web search engines from disappearing from public awareness, and to critically engage with the design communities to mitigate these unintended consequences.

To accomplish this, three things must happen:

1. Broaden conceptual understanding of privacy: Efforts must be made to broaden the conceptual understanding of privacy to fully appreciate how the introduction of these new technologies disrupts the norms of personal information flows in the contexts of their particular use. A starting point is embracing more contextually based theories of privacy, such as Helen Nissenbaum’s formulation of privacy as “contextual integrity.” Contextual integrity is a benchmark theory of privacy under which the privacy of one’s personal information is maintained only if certain norms of information flow remain undisturbed. Rather than aspiring to universal prescriptions for privacy, contextual integrity works from within the normative bounds of a particular context. If the introduction of a new technology into a particular context violates either the norms of information appropriateness or the norms of information distribution, the contextual integrity of the flow of one’s personal information has been violated (a schematic sketch of this decision rule follows this item).

The theory of privacy as contextual integrity is particularly well suited, then, to consider how the introduction of networked vehicle information systems and web search information infrastructures might impact the governing norms of the flow of personal information in the contexts of highway travel and online intellectual activities. (For a starting point in such an analysis, see my paper presented at the “Contours of Privacy” conference.)
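
For readers who think in code, here is a schematic sketch of contextual integrity as a decision rule, assuming a deliberately simplified reading of Nissenbaum’s framework: a flow of personal information preserves contextual integrity only if it respects both the appropriateness norms and the distribution norms of its context. The contexts and norms listed are hypothetical examples, not part of the theory itself.

```python
# A minimal sketch of contextual integrity as a decision rule. The contexts
# and norms below are hypothetical examples for illustration only.
APPROPRIATENESS_NORMS = {
    # context -> information types it is appropriate to collect there
    "highway travel": {"toll payment", "license plate at toll booth"},
    "web search": {"query terms for returning results"},
}

DISTRIBUTION_NORMS = {
    # context -> parties to whom collected information may flow
    "highway travel": {"toll authority"},
    "web search": {"search provider"},
}

def violates_contextual_integrity(context: str, info_type: str,
                                  recipient: str) -> bool:
    """True if the flow breaches either norm governing this context."""
    inappropriate = info_type not in APPROPRIATENESS_NORMS.get(context, set())
    misdistributed = recipient not in DISTRIBUTION_NORMS.get(context, set())
    return inappropriate or misdistributed

# E.g., a black box reporting second-by-second speed to law enforcement:
# violates_contextual_integrity("highway travel",
#                               "second-by-second speed",
#                               "law enforcement")  -> True
# Neither norm of the pre-existing context sanctions that flow.
```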

2. Engage in value-sensitive design: The notion that the design and use of technical systems have certain political and value consequences suggests the possibility of achieving alternative technical designs that might help to resist or otherwise mitigate such threats prior to their final design and deployment. It becomes vital, then, to engage directly with these technical design communities to raise awareness of the political and value implications of their design decisions and to make the value of privacy a constitutive part of the technological design process.

The multi-disciplinary perspective known as value-sensitive design is well suited to guide this endeavor. Value-sensitive design has emerged to identify, understand, anticipate and address the ethical and value-laden concerns that arise from the rapid design and deployment of media and information technologies. Recognizing that technologies embody ethical and value biases, value-sensitive design aims to shape technology so that human values are accounted for during the conception and design process, rather than retrofitted after completion (one such privacy-preserving design choice is sketched below).
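
As a hedged illustration of what making privacy a “constitutive” part of design can mean in practice, consider building data minimization into the collection step itself, so the system never stores more than it needs. The thresholds and fields below are illustrative assumptions, not an established standard.

```python
# A hedged example of value-sensitive design in practice: privacy is built
# into the collection step rather than retrofitted. The recorder coarsens
# location and truncates the network address before anything is stored.
# Thresholds and field choices are illustrative assumptions.
def minimize_for_storage(lat: float, lon: float, ip: str) -> dict:
    """Degrade precision at the point of capture so the stored record
    cannot single out an individual as easily as the raw signal could."""
    coarse_lat = round(lat, 2)   # ~1 km cells instead of exact position
    coarse_lon = round(lon, 2)
    anon_ip = ".".join(ip.split(".")[:3] + ["0"])  # drop the final octet
    return {"lat": coarse_lat, "lon": coarse_lon, "ip": anon_ip}

# minimize_for_storage(40.72912, -73.99345, "192.0.2.57")
# -> {'lat': 40.73, 'lon': -73.99, 'ip': '192.0.2.0'}
```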

3. Foster critical technical practices: Recognizing that the choices designers make in shaping these systems are guided by their conceptual understandings of the values at play, work must be done to ensure technical designers possess the conceptual tools to reflect critically on the hidden assumptions, ideologies and values underlying their design decisions. This is best accomplished by fostering “critical technical practices” within the design community. Formulated by Phil Agre, critical technical practice works to increase awareness of, and spark reflection on, those hidden assumptions among technical designers and engineers. An example of critical technical practice in action is the Culturally Embedded Computing Group at Cornell University, which seeks to elucidate the ways in which technologies reflect and perpetuate cultural assumptions, as well as to design new computing devices that reflect alternative possibilities. Their work provides a model for integrating critical technical practices into the technical design communities of networked vehicle information systems and web search information infrastructures.

At a moment when concern over government surveillance of citizens is high, the prospect of a nationwide networked vehicle infrastructure capable of monitoring vehicle location and activity gives pause. Similarly, general concerns over the privacy of web search histories are further aggravated by the possibility of that information being shared with government authorities. Broadening the conceptualizations of privacy to include approaches such as contextual integrity can help raise awareness of the political and value implications of these emerging information technologies. Further, embracing the pragmatic tools of “value-sensitive design” and “critical technical practice” will help ensure that attention to political and ethical values becomes integral to the conception, design, and development of technologies, rather than being considered only after completion and deployment.

These prescriptions mark the first steps toward avoiding the ideological blindness Postman feared, engendering critical exploration of both the privacy threats of these emerging technologies and their potential to trigger widespread surveillance and social control within two vital spheres of mobility.

Michael Zimmer is a PhD student in the Department of Culture and Communication at New York University, and maintains a blog at www.michaelzimmer.org.
