
PETS are Dead; Long Live PETs!

posted by: A Privacy Advocate // 11:59 PM // August 14, 2007 // ID TRAIL MIX


In this Google Era of unlimited information creation and availability, it is becoming an increasingly quixotic task to advocate for limits on the collection, use, disclosure and retention of personally identifiable information ("PII"), or for meaningful direct roles for individuals to play regarding the disposition of their PII "out there" in the Networked Cloud. Information has become the currency of the Modern Era, and there is no going back to practical obscurity. Where personal privacy is concerned, the basic choices seem to be engagement or abstinence, so overwhelming are the imperatives of the Information Age, so unstoppable the technologies that promise new services, conveniences and efficiencies. Privacy, as we knew it, is dying.

Privacy advocates are starting to play the role of reactive Luddites: suspicious of motives, they criticize and raise alarm bells; they oppose big IT projects like data mining and profiling, electronic health records and national ID cards; and they incite others to join in their concerns and opposition. Privacy advocates tend to react to information privacy excesses by seeking stronger oversight and enforcement controls, and by calling for better education and awareness. Some are more proactive, however, and seek to encourage the development and adoption of privacy-enhancing technologies (PETs). If information and communication technologies (ICTs) are partly the cause of the information privacy problem, the thinking goes, then perhaps ICTs should also be part of the privacy solution.

In May the European Commission endorsed the development and deployment of PETs (1), in order to help “ensure that certain breaches of data protection rules, resulting in invasions of fundamental rights including privacy, could be avoided because they would become technologically more difficult to carry out.” The UK Information Commissioner issued similar guidance on PETs in November 2006 (2). Other international and European authorities have released studies and reports discussing and supporting PETs in recent years (see references and links below).

PETs as a Personal Tool/Application

Are PETs the answer to information privacy concerns? A closer look at the European and UK communiqués suggests otherwise: for all their timeliness and prominence, they reflect thinking about PETs that is becoming outdated. The reports cite, as examples of PETs, technologies such as personal encryption tools for files and communications, cookie cutters, anonymous proxies and P3P (a privacy negotiation protocol). Not a single new privacy-enhancing technology category here in seven years. Other web pages dedicated to promoting PETs list more technologies, such as password managers, file scrubbers, and firewalls, but otherwise don't appear to list significantly new categories of tools (3,4).
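
To make the first of these categories concrete, here is a minimal sketch of a personal file-encryption tool of the kind the reports have in mind. It uses the third-party Python cryptography package as an assumed building block; any authenticated-encryption library would serve equally well.

    # A minimal sketch of a "personal encryption tool" for files:
    # the user alone holds the key; anyone storing or relaying the
    # ciphertext learns nothing about the contents.
    from cryptography.fernet import Fernet  # third-party package (assumed)

    key = Fernet.generate_key()              # kept secret by the user
    cipher = Fernet(key)

    token = cipher.encrypt(b"my private notes")   # safe to store anywhere
    assert cipher.decrypt(token) == b"my private notes"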

The general intent of the PETs endorsements seems clear and laudable enough: publicize and promote technologies that place more control into the hands of individuals over the disclosure and use of their personal information and online activities. PETs should directly enable informational self-determination. Empowered by PETs, online users can mitigate the privacy risks arising from the observability, identifiability and linkability of their online personal data and behaviours by others.

Unfortunately, few of the privacy-enhancing tools cited by advocates have enjoyed widespread public adoption or viability (unless installed and activated by default on users’ computers, e.g. SSL and Windows firewalls). The reasons are several and varied: PETs are too complicated, too unreliable, untrusted, expensive or simply not feasible to use. The threat model they respond to, and the benefits they offer, are not always clear or measurable to users. PETs may interfere with the normal operation of computer applications and communications; for example, they can render web pages non-functional. In the case of P3P, viable user agents were simply never developed (except for a modest but largely incomprehensible cookie implementation in IE6 and IE7). PETs simply haven't taken off in the marketplace, and the bottom-line reason seems to be that there are few incentives for organizations to develop them and make them available. (Where there has been a congruence of interests between users and organizations, some PETs have thrived, for example SSL for encrypted secure web traffic and e-commerce. Perhaps the same is happening for anti-spam and anti-phishing tools, since deployment of these technologies helps to promote confidence and trust in online transactions.)
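
For a sense of what that IE6/IE7 cookie implementation did, here is a rough Python sketch of a P3P "user agent" check. The compact-policy tokens (IVA, IVD, CON, TEL, OTP) come from the W3C P3P vocabulary, but the acceptance rule below is a simplified illustration, not Internet Explorer's actual logic.

    # Sketch of a P3P compact-policy check: fetch a site's P3P header
    # and flag cookie policies whose purpose tokens indicate use of
    # individually identifiable data. Simplified for illustration.
    import urllib.request

    # W3C P3P purpose tokens tied to identifiable data (a subset)
    UNSATISFACTORY = {"IVA", "IVD", "CON", "TEL", "OTP"}

    def cookie_policy_acceptable(url: str) -> bool:
        with urllib.request.urlopen(url) as resp:
            cp = resp.headers.get("P3P", "")   # e.g. P3P: CP="CAO PSA OUR"
        tokens = cp.split('"')[1].split() if '"' in cp else []
        return not UNSATISFACTORY.intersection(tokens)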

Perhaps the underlying difficulty is the conceptualization of PETs as a technology, tool or application exclusively for use by individuals, complete in itself, expressed perhaps in its purest form by David Chaum’s digital cash and Stefan Brands' private credentials. As brilliant as those ideas are, they have had limited deployment and viability to date. It seems that, to be viable, PETs must also meet specific, recognizable needs of organizations. Secure Sockets Layer (SSL) is a good example, responding as it did to well-understood problems of interception, surveillance and consumer trust online. SSL succeeded because organizations had a mutual interest in seeing that it was baked into the cake of all browsers, and its use is largely transparent to users.
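
That transparency is easy to see from the client side: in a modern scripting language the whole handshake, certificate check and encryption sit below a single wrapper call. A Python sketch, using example.com as a stand-in host:

    # Sketch: SSL/TLS from a client's point of view. One wrapped socket
    # and the key exchange, server authentication and encryption all
    # happen below the application layer, invisibly to the user.
    import socket, ssl

    ctx = ssl.create_default_context()   # trusted CA roots "baked in"
    with socket.create_connection(("example.com", 443)) as raw:
        with ctx.wrap_socket(raw, server_hostname="example.com") as tls:
            tls.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
            print(tls.recv(200))         # response arrives decrypted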

Meanwhile, technology marches on. Many PETs weren't very practical to use. Sure, you can surf anonymously, if you don't mind a little latency and the need to tweak or disable browser functionality. But as soon as you want to carry out an online transaction, sign on to a site, make a purchase, or otherwise become engaged online in a sustained way, you have to identify yourself: provide a credit card, login credential, registration form, mailing address, and so on. Privacy suffers from the hundredth-window syndrome: your house, just like your privacy, could be Fort Knox secure, but all it takes is one open window and the security (privacy) is compromised. Privacy required too much knowledge, effort and responsibility on the part of individuals to sustain in an ongoing way. Online privacy was just too much work.

And anyway, the benefits of online privacy tended to pale beside the immediate gratification, greater convenience, personalization, efficiency and essential connectedness afforded by consent and trust. The privacy emphasis slides inexorably towards holding others accountable for the personal information they must inevitably collect about us, not towards PETs. The only effective privacy option for most people in the online world is disengagement and abstinence.

PETs as a Security Technology

Certain consumer PETs have thrived, such as SSL, firewalls, anti-virus/anti-spyware tools and secure authentication tools. Perhaps anti-phishing tools and whole-disk encryption will follow, if incorporated and activated by default in users’ hardware and software. But note: these are all largely information security tools. PETs have tended to become equated with information security. Safeguards are certainly an important component of privacy. We may not be able to stifle the global information explosion, but with appropriate deployment of PETs we can help ensure that our data stays where it belongs, and is not accessed inappropriately, tampered with, or otherwise subject to breaches of confidentiality, integrity and availability.

Personal security tools like firewalls, virus/spyware detection and encryption are available to individuals. But to the extent that PETs have been adopted by organizations, public and private, rather than by users, they have been security technologies. Legal and regulatory compliance obligations for managing sensitive information in accountable ways and for notifying individuals of data breaches, as well as the desire to build brand and promote consumer trust, have helped drive innovation and growth in the data security products market. Organizations, both public and private, today are deploying information security technologies throughout their operations: from web SSL to encrypted backup tapes, to data ingress and egress filtering, to strong authentication and access controls, to privacy policy enforcement tools such as intrusion detection/prevention systems, transaction logging and audit trails, and so forth. When it comes to organizational PET deployments in practice, security is the name of the game.
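
One of the safeguards named above, the audit trail, shows how mechanical these deployments are. A common construction, sketched below in Python with hypothetical field names, chains each log entry to the one before it with a hash, so that after-the-fact tampering with any record becomes detectable:

    # Sketch of a tamper-evident audit trail: each entry commits to the
    # previous entry's digest, so modifying or deleting a past record
    # breaks the chain and can be detected on verification.
    import hashlib, json

    def append_entry(log: list, event: dict) -> None:
        prev = log[-1]["digest"] if log else "genesis"
        body = json.dumps(event, sort_keys=True)
        digest = hashlib.sha256((prev + body).encode()).hexdigest()
        log.append({"event": event, "digest": digest})

    audit_log: list = []
    append_entry(audit_log, {"actor": "clerk7", "action": "read", "record": 1042})
    append_entry(audit_log, {"actor": "clerk7", "action": "update", "record": 1042})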

But are these technologies really PETs? They may be technologies that are deployed with the end user in mind (it is their data, after all), but they don't really involve the user in any meaningful way in the life-cycle management of the information. The security measures listed above are put in place mainly to protect the interests of the organization. Of course, some organizations do go further and put in place technologies that help express important principles of fair information practices, such as technologies that promote openness and accountability in organizational practices, that capture user consent and preferences, and that allow clients a measure of direct access to, and correction of, the data and preferences stored about them. But this is still the exception rather than the norm.

PETs as Data Minimization Tools

More critically, security-enhancing and access/accountability technologies miss the final ingredient of a true PET: data minimization. Information privacy is nothing if not about data minimization. The best way to ensure data privacy is not to disclose, use or retain the data at all. The minimization impulse is well captured by the fair information practices that require purposes to be specified and limited, and which seek to place limits on all data collected, used, disclosed and retained pursuant to those purposes. But such limitations run contrary to the impulse of most information-intensive organizations today, which is to collect and stockpile as much data as possible (and then to secure it as best they can), because it may be useful later. More data, not less, is the trend. Why voluntarily limit a potential competitive advantage?
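
In code, minimization is less a product than a habit of design: derive only what the stated purpose needs, and discard the rest at the point of capture. A toy Python sketch, in which the purpose and field names are hypothetical:

    # Sketch of "collect less" as a design habit. Purpose: verify that
    # a customer is an adult. The full date of birth is read once and
    # never retained; only the derived boolean is stored.
    from datetime import date

    def minimize_for_age_check(raw: dict) -> dict:
        # approximate (ignores month and day); good enough for a sketch
        birth_year = int(raw["date_of_birth"][:4])   # raw holds "YYYY-MM-DD"
        return {"is_adult": date.today().year - birth_year >= 18}

    profile = minimize_for_age_check({"date_of_birth": "1961-05-20"})
    # profile == {"is_adult": True}; the birth date itself is gone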

Apart from its being a legal requirement, the arguments for data minimization should be compelling, beginning with fewer costs and liabilities associated with maintaining and securing the data against leaks and misuse, or with bad decisions based upon old, stale and inaccurate data, as well as reputation and brand issues. (Faced with growing public concerns about excessive data collection, use and retention, major search engines and transportation agencies alike are now adopting more limited data usage policies and practices, but of course these are policy-level decisions, not PETs.)

The problem is that there are few benchmarks against which to judge whether data minimization is being observed through the use of a technology. How much less is enough to qualify as a PET? Is a networked, real-time passenger/terrorist screening program that flashes only a red, yellow or green light to front-line border security personnel a PET, because the program design minimized unnecessary transmission and display of sensitive passenger PII? Similarly, is an information technology that automatically aggregates data after analysis, or which mines data and computes assessments on individuals for decision-making, or which is capable of delivering targeted but pseudonymous ads, a true PET because the actual personal information used in the process was minimized so as not to be revealed to a human being? If a specific technology's purposes for collecting, using, disclosing and retaining customer or citizen data are "sharply limited" to "providing better services" and "security purposes," can it properly be considered a PET?
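
The traffic-light example is worth pinning down, because the pattern is real even if the judgment about it is hard. A Python sketch of the interface follows, with a hypothetical watch list and scoring rule; the point is only that the PII stays behind the trust boundary while a minimal signal crosses it:

    # Sketch of the "red/yellow/green" screening interface: the service
    # sees the full passenger record, the front-line client receives
    # only a colour. Watch list and scoring rule are hypothetical.
    from enum import Enum

    WATCHLIST = {"J. Doe"}                      # stand-in for real matching

    class Signal(Enum):
        GREEN = "green"
        YELLOW = "yellow"
        RED = "red"

    def risk_score(record: dict) -> float:
        # placeholder logic; a real system would match many attributes
        return 1.0 if record.get("name") in WATCHLIST else 0.0

    def screen_passenger(record: dict) -> Signal:
        score = risk_score(record)              # PII never leaves this side
        if score > 0.9:
            return Signal.RED
        return Signal.YELLOW if score > 0.5 else Signal.GREEN

    # The officer's terminal only ever calls screen_passenger() and
    # displays the colour; is that minimization enough to be a PET?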

PETs as Expressions of the Fair Information Principles (FIPs)

PETs minimize data, but not all technologies that minimize data are PETs. Data minimization is a necessary but insufficient condition for being a PET. Enhanced information security is a necessary but insufficient condition for being a PET. User empowerment is a necessary but insufficient condition for being a PET. Together, all these impulses are expressed in the ten principles of the CSA fair information practices, all of which must be substantially satisfied, within a defined context, in order for a given technology to be judged a PET worthy of the name, and of public support and adoption:

To enable user empowerment, we find the CSA fair information principles of:
1. Accountability; 2. Informed Consent; 3. Openness; 4. Access; and 5. Challenging Compliance. These principles and practices should be substantially operationalized by PETs.

To enable data minimization, we find the CSA fair information principles requiring 1. Identifying Purposes; 2. Limiting Collection; and 3. Limiting Use, Disclosure, and Retention.

Finally, the CSA Privacy Code calls for Security (Safeguards) appropriate to the sensitivity of the information.

[Comment: The CSA principle ‘Accuracy’ can fit under all three categories, since it implies a right for users to inspect and correct errors, an obligation upon organizations to discard stale and/or inaccurate data, and a security obligation to assure the integrity of data against unauthorized tampering and modification.]

A more comprehensive approach to defining and using PETs is required - one that clearly accommodates the interests and rights of individuals in a substantial way, yet which can be adopted or at least accommodated by organizations with whom individuals must inevitably deal. This requires a more systemic, process-oriented, life-cycle, and architectural approach to engineering privacy into information technologies and systems.

PETs as we know them are effectively dead, reduced to a niche market for paranoids and criminals, claimed by some security products (e.g., two-factor authentication dongles) or else deployed by organizations as a public relations exercise to assuage specific customer fears and to build brand confidence (e.g. banks' anti-phishing tools, web seals).

PETs as Information Architecture?

The future of PETs is architecture, not applications. Large-scale, IT-intensive transformations are underway across public and private sector organizations, from real-time passenger screening programs and background/fraud checking, to the creation of networked electronic health records and eGovernment portals, to national identity systems for use across physical and logical domains. What is needed is a comprehensive, systematic process for ensuring that PETs are fully enabled and embedded in the design and operation of these complex data systems. If code is law, as Lawrence Lessig posited, then systems architecture will be the rightful domain for privacy technologies to flourish in the current Google era.

The time has come to speak of privacy-enabling technologies and systems that help create favorable conditions for privacy-enhancing technologies to flourish and to express the three essential privacy impulses: user empowerment, data minimization, and enhanced security. Objective and auditable standards are essential preconditions.

Examples abound: privacy-embedded "Laws of Identity" can enable privacy-enhanced identity systems and technologies to emerge, as can the development of 'smart' data that carries with it enforceable conditions on its use, in a manner similar to digital rights management technologies. Another example is intelligent software agents that can negotiate and express the preferences of individuals, and take action on their behalf, with respect to the disposition of their personal data held by others. Yet another promising development is new and innovative technologies that enable secure but pseudonymous user authentication and access to remote resources. These and other new information technologies may be the true future of PETs in the Google Era of petabytes squared, and worthy of public support and encouragement.
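
To give that last example some shape: the essential property of pseudonymous authentication is that a user is recognizably the same person on return visits to one service, while two services cannot link her accounts. Real private-credential systems (Chaum's, Brands') achieve this with blind signatures and zero-knowledge proofs; the Python sketch below only approximates the unlinkability property with per-service pseudonyms derived from a user-held master secret.

    # Sketch of per-service pseudonyms: stable within one service,
    # unlinkable across services, derived from a user-held secret.
    # This illustrates the goal, not a real private-credential scheme.
    import hmac, hashlib, secrets

    master_secret = secrets.token_bytes(32)      # held only by the user

    def pseudonym_for(service_id: str) -> str:
        return hmac.new(master_secret, service_id.encode(),
                        hashlib.sha256).hexdigest()

    print(pseudonym_for("bank.example"))    # same value on every visit...
    print(pseudonym_for("forum.example"))   # ...but uncorrelatable with it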

Recap

So, to summarize: the essential messages of this think piece are:
* PETs are attracting renewed interest and support, after several years of neglect and failure.
* PETs are an essential ingredient for protecting and promoting privacy in the Information Age (along with regulation and awareness/education), but their conception and execution in practice are highly variable and still rooted in last-century thinking.
* True PETs should incorporate into information technologies ALL of the principles of fair information practices, rather than any subset of them.
* In today's Information Age, true PETs must be comprehensive, and involve all actors and processes. Evaluating PETs will increasingly be a function of whole systems and information architectures, not standalone products.
* It may be more useful to think of privacy-enabling technologies and architectures, which enable and make possible specific PETs.


Endnotes:

(1) European Commission Supports PETs
Promoting Data Protection by Privacy Enhancing Technologies (2 May 2007)
http://ec.europa.eu/information_society/newsroom/cf/itemlongdetail.cfm?item_id=3402
Background Memo (2 May 2007): http://europa.eu/rapid/pressReleasesAction.do?reference=MEMO/07/159&format=HTML&aged=0&language=EN&guiLanguage=en

(2) Office of the UK Information Commissioner
Data Protection Technical Guidance Note: Privacy enhancing technologies (Nov 2006)
www.ico.gov.uk/upload/documents/library/data_protection/detailed_specialist_guides/privacy_enhancing_technologies.pdf

(3) Center for Democracy and Technology
Page on Privacy Enhancing Technologies
www.cdt.org/privacy/pet/

(4) EPIC Online Guide to Practical Privacy Tools
www.epic.org/privacy/tools.html


Other Useful Resources:

Dutch Ministry of the Interior and Kingdom Relations, the Netherlands
—Privacy-Enhancing Technologies. White paper for decision-makers (December 2004)
www.dutchdpa.nl/downloads_overig/PET_whitebook.pdf

OECD Directorate For Science, Technology And Industry
—Committee For Information, Computer And Communications Policy
Inventory Of Privacy-Enhancing Technologies (January 2002)
www.olis.oecd.org/olis/2001doc.nsf/LinkTo/dsti-iccp-reg(2001)1-final

Danish Ministry of Science, Technology and Innovation
—Privacy Enhancing Technologies
Report prepared by the META Group v1.1 (March 2005)
www.itst.dk/image.asp?page=image&objno=198999309

Office of the UK Information Commissioner
—Data protection best practice guidance (May 2002)
Report prepared by UMIST
www.hispec.org.uk/public_documents/BPDMay02.pdf

—Privacy enhancing technologies state of the art review (Feb 2002) www.hispec.org.uk/public_documents/7_1PETreview3.pdf

EU PRIME Project
—White paper v2 (June 2007)
https://www.prime-project.eu/prime_products/whitepaper/PRIME-Whitepaper-V2.pdf

Andreas Pfitzmann & Marit Hansen,
TU Dresden, Department of Computer Science, Institute For System Architecture
—Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management - A Consolidated Proposal for Terminology (Version v0.29 - July 2007)
http://dud.inf.tu-dresden.de/Anon_Terminology.shtml

EU FIDIS Project
—Identity and impact of privacy enhancing technologies (2006)
www.fidis.net/fileadmin/fidis/deliverables/fidis-wp13-del13.1.identity_and_impact_PET.pdf

Roger Clarke
—Introducing PITs and PETs: Technologies Affecting Privacy (Feb 2001)
www.anu.edu.au/people/Roger.Clarke/DV/PITsPETs.html

Office of the Ontario Information and Privacy Commissioner & Dutch Registratiekamer
—Privacy-Enhancing Technologies: The Path to Anonymity (Volume I - August 1995)
www.ipc.on.ca/index.asp?layid=86&fid1=329

George Danezis, University of Cambridge Computer Laboratory (date unknown)
—An Introduction to Privacy-Enhancing Technologies
www.isoc.ch/events/show/privacy/july2004/150704_Georges_Danezis_Isocgva-PETS.pdf
