Existing and Emerging Privacy-based Limits In Litigation and Electronic Discovery
posted by:Alex Cameron // 11:59 PM // August 28, 2007 // ID TRAIL MIX
Privacy law is increasingly important in litigation in Canada. Contemporary litigants routinely file requests for access to their personal information under PIPEDA and its provincial counterparts. Such requests can give a party a partial head-start on litigation discovery, or aid a party in rooting out information held by an opponent or potential opponent.
That said, with some possible room for improvement (at least in the case of PIPEDA),  data protection law in Canada takes a relatively hands-off approach when it comes to legal proceedings. Parties in legal proceedings are generally required to disclose information in accordance with long-standing litigation rules and are largely exempted from restrictions that might otherwise be applicable under data protection laws in other contexts. Yet, this does not mean that privacy considerations are not relevant or applicable to discovery in legal proceedings. This short article identifies some existing and emerging privacy-based limits in litigation discovery at the intersection between privacy interests and the need for full disclosure in litigation.
I. The Implied Undertaking Rule
As a starting point, it is important to note that privacy protections are built into discovery at a fundamental level. Information obtained through discovery is generally subject to an implied undertaking of confidentiality. This prohibits parties from using or disclosing information obtained during discovery for purposes outside of the litigation. The implied undertaking rule is based on a recognition by Canadian courts of the general right of privacy that a person has with respect to his or her own documents.  Many Canadian decisions cite the English text Discovery by Matthews & Malek for the principle behind the rule:
The primary rationale for the imposition of the implied undertaking is the protection of privacy. Discovery is an invasion of the right of the individual to keep his own documents to himself. It is a matter of public interest to safeguard that right. The purpose of the undertaking is to protect, so far as is consistent with the proper conduct of the action, the confidentiality of a party’s documents. 
A party may apply for relief from the implied undertaking rule where a party's interest in using information outweighs the privacy interest protected or where the document is otherwise available. However, the courts do not take the principle of privacy behind the rule lightly, as such applications for relief are frequently denied, for example, on the basis that it would be “an unwarranted intrusion on [the party’s] privacy rights”. 
Privacy has similarly been invoked as a limitation in defining what is and is not reasonable in discovery. For example, in Fraser v. Houston, the court declined to order production of the plaintiff’s financial documents on the basis of privacy concerns, despite concluding that the documents had “at least marginal probative value” to an allegation of economic duress:
I am satisfied that this line of questioning, […] could result in a detailed exploration of a man’s state of wealth or state of non-wealth as the case may be, and that that is a major invasion into a man's privacy which is generally only allowed in matters of execution on judgments that are not paid and perhaps, in some other circumstances. However, in the present case I am of the view that to allow an exploration of the nature that is requested by the defendants has a potential prejudicial effect upon Mr. Fraser's privacy which well outweighs any apparent probative value that there may be. 
Information potentially subject to disclosure in legal proceedings could be held directly by a party to the litigation or by a third party, such as an Internet service provider (ISP). In each of these categories, discussed in turn below, courts have balanced privacy considerations against the interests of full disclosure in litigation.
II. Information Held by a Party
A. Motions for Production
In Park v. Mullin,  a party applied for discovery of its opponent’s computer. Relying on earlier Supreme Court of Canada jurisprudence, Dorgan J. expressly drew on privacy considerations in refusing to order disclosure:
That the issue of privacy is a robust and real issue should be taken into account on an application such as this. In [A.M. v. Ryan, 1997 CanLII 403 (S.C.C.)], McLachlin J. commented on a party’s privacy interests in the context of an application for third party clinical records under Rule 26(11). […]:
... I accept that a litigant must accept such intrusions upon her privacy as are necessary to enable the judge or jury to get to the truth and render a just verdict. But I do not accept that by claiming such damages as the law allows, a litigant grants her opponent a licence to delve into private aspects of her life which need not be probed for the proper disposition of the litigation.
In my view, similar privacy concerns should be considered in a determination under Rule 26(10) where the order sought is so broad it has the potential to unnecessarily “delve into private aspects” of the opposing party’s life. 
Privacy also played an integral role in the leading case Desgagne v. Yuen , where the Court balanced the relevance of the information sought against other considerations, including privacy. The plaintiff had been injured in an accident, and the defendant sought production of her hard drive, Palm Pilot, video game unit, and photographs (both electronic and hard copies) taken since the accident. The defendant argued that the information was relevant since it would shed light on the plaintiff’s post-accident cognitive abilities and quality of life. Myers J. refused to order production of the plaintiff’s photographs because of privacy considerations:
In my opinion, the vacation photographs (and other photographs relating to the plaintiff’s family, friends and hobbies) sought have limited - if any - probative value on this matter. Production of these photographs, however, is invasive of the plaintiff’s personal life, because the photographs are largely of moments spent with her family and friends. The limited probative value considered against the invasiveness of production leads me to conclude that production of the photographs should not be ordered. 
Access to the plaintiff’s video game unit, Palm Pilot, and Internet browsing history was also denied on the basis that the probative value of those items was outweighed by the plaintiff’s privacy interest and the invasiveness of ordering their production. Similar reasoning was applied in Goldman, Sachs & Co. v. Sessions,  Ireland v Low , and Baldwin Janzen Insurance Services (2004) Ltd. v. Janzen. 
B. Motions for Preservation
In the context of preserving evidence for discovery, ex parte orders for the seizure of evidence (such as Anton Piller orders) allow litigation opponents access to documents that may contain personal or confidential information. Although such orders relate to the preservation of evidence, they form part of the overall process of document discovery. Given the invasiveness of such orders, privacy considerations can play an important role in Anton Piller cases. Courts urged taking a cautionary approach to Anton Piller orders as early as 1981. In the words of Browne-Wilkinson J. (as he then was) in Thermax Ltd v. Schott Industrial Glass Ltd: 
As time goes on and the granting of Anton Pillar [sic] orders becomes more and more frequent, there is a tendency to forget how serious an intervention they are in the privacy and rights of defendants. One is also inclined to forget the stringency of the requirements as laid down by the Court of Appeal. 
In Harris Scientific Products Ltd. v. Araujo,  the Court found that an Anton Piller order had been improperly obtained and improperly executed. The plaintiff had misrepresented a material fact in its application for the order, and the court found numerous and serious breaches of the order’s execution by the plaintiff. Two of the more serious breaches included the seizure of material subject to solicitor-client privilege and the seizure of an audio cassette that clearly had no relation to the proceedings (“a state-assisted major invasion of Mr. Araujo’s privacy on an unrelated matter”) . When considering the quantum of damages to be awarded, the court reiterated how seriously such breaches of privacy are taken:
Damages for trespass resulting from a defective Anton Piller order should not be so low as to condone the wrongdoing; the use of state powers to breach an individual’s privacy must be jealously guarded. Even where the target of the order has suffered no, or little, in the way of pecuniary damage, the level of damages awarded can be more than nominal and can reflect mental distress. 
Finally, in CIBC World Markets v. Genuity Capital Markets,  an order in the nature of an Anton Piller order was made for full preservation of “computers, Blackberries and other types of similar electronic devices of every nature and kind” including all devices “owned or used by others including spouses, children or other relatives”.  An order for a seizure of this magnitude obviously has a broad privacy impact. However, the order provided that a technical consultant would perform the imaging and indexing of information and that the imaged drives and information would not initially be shared with the plaintiffs.  The court addressed the matters of relevance and confidentiality in a subsequent order, holding that if there were confidential or irrelevant documents contained in the devices imaged, then the defendants could apply to have the full index of documents sealed and one made public that only contained relevant material. 
III. Information Held by a Non-Party
Privacy also plays an important role in contouring limits to discovery from non-parties in litigation. A great deal of personal information is held by non-parties such as ISPs and banks; it is increasingly sought out by parties in litigation.
In BMG v. Doe,  the Federal Court of Appeal considered an appeal by music providers who were seeking disclosure of the identities of customers alleged to have infringed copyrights by sharing music on peer-to-peer networks. Sexton JA, for the court, held that plaintiffs must conduct their initial investigations in a way that minimized privacy invasion; failure to do so could justify a court refusing to order ISPs to identify potential defendant customers as requested by the plaintiffs:
If private information irrelevant to the copyright issues is extracted, and disclosure of the user’s identity is made, the recipient of the information may then be in possession of highly confidential information about the user. If this information is unrelated to copyright infringement, this would be an unjustified intrusion into the rights of the user and might well amount to a breach of PIPEDA by the ISPs, leaving them open to prosecution. Thus in situations where the plaintiffs have failed in their investigation to limit the acquisition of information to the copyright infringement issues, a court might well be justified in declining to grant an order for disclosure of the user's identity. 
In other similar cases of discovery from non-parties, courts have relied on privacy as one of the key considerations factoring into whether production should be granted. For example, in Irwin Toy Ltd. v. Doe,  Wilkins J. provided the following view of privacy considerations: “some degree of privacy or confidentiality with respect to the identity of the internet protocol address of the originator of a message has significant safety value and is in keeping with what should be perceived as being good public policy.”  Although the court ordered the ISP to disclose the identity of the targeted ISP customer, it required the plaintiffs to meet a privacy-informed threshold test before disclosure would be granted.
Finally, discovery limits based on privacy considerations may also be developed after the fact, in the form of sanctions for wrongful behaviour. Where ex parte orders for evidence seizure (such as Anton Piller orders) are obtained or executed improperly in a way that has an impact on privacy, the courts may step in. This may result in the removal of the offending party’s counsel, or possibly even a stay of proceedings. For example, Grenzservice Speditions Ges.m.b.H. v. Jans  concerned an order in the nature of an Anton Piller order. The Court found that the plaintiff’s solicitor allowed flagrant abuses of privacy in the execution of that order, including questioning of the occupants of the home and videotaping of the proceedings surrounding the search. Because of the egregious nature of the infringement on the individual’s right to privacy, Huddart J. (as she then was) disqualified the plaintiff's counsel from further involvement in the case, in order to “assure the defendants and members of the public, all of whom are potential subjects of search and seizure orders, that their rights will be protected.” 
This article has briefly reviewed some of the rules and jurisprudence at the intersection between privacy and litigation discovery. Although data protection legislation has an impact on discovery, it generally leaves established litigation rules untouched. However, as seen in the cases reviewed here, there are a number of existing and emerging privacy-based limits on discovery in litigation. Conflicts between the need for full disclosure in litigation and privacy interests will certainly arise more frequently in light of the increasing prominence of electronic discovery and the increasing role that electronic devices play in the creation, processing and storage of personal information.
| Comments (0) |
 Statutory Review of the Personal Information Protection and Electronic Documents Act (PIPEDA), Fourth Report of the Standing Committee on Access to Information, Privacy and Ethics, Tom Wappel, MP, Chairman, May 2007, 39th Parliament, 1st Session, online: Standing Committee on Access to Information, Privacy and Ethics
 See Lac d'Amiante du Québec Ltée v. 2858-0702 Québec Inc., 2001 SCC 51 (CanLII) at para. 61.
 Paul Matthews and Hodge M. Malek, Discovery (London: Sweet & Maxwell, 1992) at 253, cited in Goodman v. Rossi,  O.J. No. 1906 (C.A.) (QL) at para. 29. See also Tanner v. Clark, 2003 CanLII 41640 (ON C.A.); Royal Bank of Canada v. Bacon (1999), 218 N.B.R. (2d) 98 (Q.B.); Vitapharm Canada Ltd. v. F. Hoffmann-La Roche Ltd.,  O.J. No. 1400 (S.C.) (QL).
 Letourneau v. Clearbrook Iron Works Ltd., 2003 FC 949 (CanLII) at para. 5.
 Kunz v. Kunz Estate, 2004 SKQB 410 (CanLII) at para. 17. See also Letourneau v. Clearbrook Iron Works Ltd., ibid.; L. H. v. Caughell,  O.J. No. 3331 (Ont. Gen. Div.); Sezerman v. Youle, 1996 CanLII 5610 (NS C.A.).
 Fraser v. Houston, 1997 CanLII 3227 (BC S.C.) at para. 21.
 Park v. Mullin, 2005 BCSC 1813 (CanLII).
 Ibid. at para 21.
 Desgagne v. Yuen, 2006 BCSC 955 (CanLII).
 Ibid. at para. 49.
 Goldman, Sachs & Co. v. Sessions, 2000 BCSC 67 (CanLII).
 Ireland v Low, 2006 BCSC 393 (CanLII).
 Baldwin Janzen Insurance Services (2004) Ltd. v. Janzen, 2006 BCSC 554 (CanLII).
 Thermax Ltd v. Schott Industrial Glass Ltd,  F.S.R. 289 (Ch. D.).
 Ibid. at 294.
 Harris Scientific Products Ltd. v. Araujo, 2005 ABQB 603 (CanLII).
 Ibid. at para. 103.
 Ibid. at para. 105.
 CIBC World Markets Inc. v. Genuity Capital Markets, 2005 CanLII 3944 (ON S.C.).
 Ibid. at para. 3.
 Persons connected to the defendants were entitled to review the information in order to assess whether to advance claims of privilege.
 CIBC World Markets v. Genuity Capital Markets, 2006 CanLII 11908 at para. 5.
 BMG Canada Inc. v. Doe, 2005 FCA 193 (CanLII).
 Ibid. at para. 44.
 Irwin Toy Ltd. v. Doe,  O.J. No. 3318 (S.C.) (QL).
 Ibid. at para. 11.
 Grenzservice Speditions Ges.m.b.H. v. Jans 1995 CanLII 2507 (BC S.C.).
 Ibid. at para. 116.
Blogging While Female, Online Inequality and the Law
posted by:Louisa Garib // 11:59 PM // August 21, 2007 // ID TRAIL MIX
“Those who worry about the perils women face behind closed doors in the real world will also find analogous perils facing women in cyberspace. Rape, sexual harassment, prying, eavesdropping, emotional injury, and accidents happen in cyberspace and as a consequence of interaction that commences in cyberspace.”
- Anita Allen, “Gender and Privacy” (2000) 52 Stan. L. Rev. at 1184.
In 2006, the University of Maryland’s Clark School of Engineering released a study assessing the threat of attacks associated with the chat medium IRC (Internet Relay Chat). The authors observed that users with female identifiers were “far more likely” to receive malicious private messages and slightly more likely to receive files and links.  Users with ambiguous names were less likely to receive malicious private messages than female users, but more likely to receive them than male users.  The results of the study indicated that the attacks came from human chat-users who selected their targets, rather than automated scripts programmed to send attacks to everyone on the channel.
The findings of this study highlight the realities that many women face when they are online. From the early days of cyberspace, users who identify as female have frequently been subjected to hostility and harassment in gendered and sexually threatening terms.  These attacks typically come from anonymous users.
Recent news articles from around the world have chronicled the latest spate of online misogyny.  Not only have the women bloggers in these cases been personally threatened, their images distorted and disseminated, in some cases their blogs and websites have also been subject to denial of service (DoS) attacks. Feminists  and women who blog about contentious political or social issues are not the only women who are singled out for abuse. Similar patterns of violent threats have also been directed toward women who blog about the daily life of a single mother,  computer programming,  and a variety of ordinary interests on sites with a female following, but no feminist content or agenda.
Repeated online harassment has profound consequences for women’s equality online and in the real world. Online threats and attacks can have a chilling effect on women’s expression.  Some women may either stop participating in open online forums, unless under the cloak of anonymity or pseudonymity, or self-censor their speech, rather than risk being the subject of violent threats or DoS attacks. These choices reduce a woman’s online identity to that of an invisible woman, or a quieter, edited version of herself. Fortunately, women actively continue to blog and participate in cyber-life in the face of threats and harassment, with the support of both women and men in online communities.
Women’s retreat from the Internet can also have an economic impact on those seeking entry into technology-based labour markets. One prominent technology blogger observed: “If women aren’t willing to show up for networking events [because of harassment], either offline or online, then they’re never going to be included in the industry.”  Women’s absence from the creative process also has implications for equality in terms of influencing what kinds of technology are made, and what societal interests those innovations ultimately serve. 
To date, the law has provided a limited response to harms directed against women online. Traditional torts such as defamation are available, but are difficult to pursue against multiple, anonymous individuals who could be anywhere in the world. In light of the uncertainty in Canadian case law,  a claim for invasion of privacy would be very challenging to make in the absence of an appellate-level decision recognizing the right to privacy. An action for intentional or negligent infliction of emotional distress may also be possible, although plaintiffs must meet stringent standards to succeed.  Complainants may have difficulty overcoming the view that in the absence of physical contact, no real harm can be inflicted in the virtual world, particularly within the context of fantasy/gaming environments.
Without a more complete and critical examination of actions that target women in cyberspace, there is the danger of reinforcing substantive inequality by dismissing the individual and social harm experienced as a “natural” part of online life. Although tort actions represent some avenues for redress, they are individual, private law remedies that do not speak to the public nature of harms against women. While criminal sanctions for assault, obscenity, hate speech and uttering threats are possible, they would only apply if actions could be proved to fall within Criminal Code  definitions and precedents. It should not be forgotten that women continue to face difficulties with the law in seeking protection from, and compensation for, violence, harassment, discrimination and exploitation experienced in the real world. 
Given the market drive for more intense and realistic sensory experiences in the virtual world, it is not far-fetched to foresee online acts that more closely reflect conventional legal and social notions of physical and sexual violence in the future.  As “[t]he courts will increasingly be confronted with issues that are ‘lying in wait’ as virtual worlds expand,”  so too will feminists, lawyers, and policy makers be faced with opportunities to think about how to expand the law in favour of greater equality.
 Robert Meyer and Michel Cukier, “Assessing the Attack Threat due to IRC Channels,” (2006) University of Maryland School of Engineering, at 5-6 http://www.enre.umd.edu/content/rmeyer-assessing.pdf
 See Rebecca K. Lee, “Romantic and Electronic Stalking in a College Context,” (1998) 4 WM. & Mary J. Women & L. 373 at 404, 405-6 which discusses sexual harassment from e-mail messages, in chat rooms, and Usenet newsgroups. A well-known account of sexualized threats towards female and androgynous virtual personas and the emotional harm experienced by the real-life participants is in Julian Dibbell’s, “A Rape in Cyberspace,” My Tiny Life (1998), ch. 1 http://www.juliandibbell.com/texts/bungle.html.
 Jessica Valenti, “How the web became a sexists’ paradise” The UK Guardian (April 6, 2007) http://www.guardian.co.uk/g2/story/0,,2051394,00.html; Anna Greer, “Misogyny bares its teeth on Internet,” Sydney Morning Herald (August 21, 2007) http://www.smh.com.au/news/opinion/misogyny-bares-its-teeth-on-internet/2007/08/20/1187462171087.html;
Ellen Nakashima, “Sexual Threats Stifle Some Female Bloggers,” Washington Post (April 30, 2007)
 See Posts on “Greatest Hits: The Public Woman” and “What do we do about Online Harassment?” on Feministe http://feministe.powweb.com/blog/archives/2007/08/09/what-do-we-do-about-online-harassment/?s=online+harassment&submit=Search
 Ellen Nakashima, Washington Post, supra note 4.
 BBC News, “Blog Death Threat Sparks Debate” (27 March 2007) http://news.bbc.co.uk/1/hi/technology/6499095.stm
 Deborah Fallows, “How Women and Men Use the Internet,” Pew Internet & American Life Project (December 28, 2005), at 14 <http://www.pewinternet.org/pdfs/PIP_Women_and_Men_online.pdf>. The report states: “The proportion of internet users who have participated in online chats and discussion groups dropped from 28% in 2000 to as low as 17% in 2005, entirely because of women’s fall off in participation. The drop off occurred during the last few years and coincided with increased awareness of and sensitivity to worrisome behavior in chat rooms.”
 Nakashima, Washington Post, supra note 4.
 For a study on women, technology and power see Judy Wajcman, Technofeminism (Polity Press: Cambridge, UK, 2004).
 Recently, lower courts in Ontario have found that complainants are free to make a case for invasion of privacy: Somwar v. McDonald’s Restaurant of Canada Ltd.,  O.J. No. 64 (Ont. S.C.J.) and Re: Shred-Tech Corp. v. Viveen  O.J. No. 4893. However, the Ontario Court of Appeal has explicitly found that there is no right to privacy in Euteneier v. Lee,  O.J. No. 4533 (SCJ); rev’d  O.J. No. 4239 (SCJ, Div Ct); rev’d (2005) 77 O.R. (2d) 621 (CA) at para 22.
 Jennifer McPhee, “New and Novel Torts for Problems in Cyberspace,” Law Times (30 July-August 6 2007) at 13.
 Criminal Code ( R.S., 1985, c. C-46 )
 Just two examples are: Jane Doe, The Story of Jane Doe: A Book About Rape (Random House: Toronto, 2003) and Patricia Monture-Angus, Thunder in my Soul: A Mohawk Woman Speaks. (Halifax: Fernwood Publishing, 1995). For an analysis of the limitations of the Supreme Court’s privacy analysis in obscenity, hate propaganda and child pornography cases, see Jane Bailey, Privacy as a Social Value - ID Trail Mix: http://www.anonequity.org/weblog/archives/2007/04/privacy_as_a_social_value_by_j.php
 Lydia Dotto, “Real lawsuits set to materialize from virtual worlds; Harm, theft in online gaming may land players in the courts: Precedents few, but Vancouver lawyer thinks cases coming” Toronto Star (2 May 2005) at D 04 (ProQuest).
PETS are Dead; Long Live PETs!
posted by:A Privacy Advocate // 11:59 PM // August 14, 2007 // ID TRAIL MIX
In this Google Era of unlimited information creation and availability, it is becoming an increasingly quixotic task to advocate for limits on the collection, use, disclosure and retention of personally identifiable information ("PII"), or for meaningful direct roles for individuals to play regarding the disposition of their PII "out there" in the Networked Cloud. Information has become the currency of the Modern Era, and there is no going back to practical obscurity. Regarding personal privacy, the basic choices seem to be engagement or abstinence, so overwhelming are the imperatives of the Information Age, so unstoppable the technologies that promise new services, conveniences and efficiencies. Privacy, as we knew it, is dying.
Privacy advocates are starting to play the role of reactive luddites: suspicious of motives, they criticize, they raise alarm bells; they oppose big IT projects like data-mining and profiling, electronic health records and national ID cards; and they incite others to join in their concerns and opposition. Privacy advocates tend to react to information privacy excesses by seeking stronger oversight and enforcement controls, and by calling for better education and awareness. Some are more proactive, however, and seek to encourage the development and adoption of privacy-enhancing technologies (PETs). If information and communication technologies (ICTs) are partly the cause of the information privacy problem, the thinking goes, then perhaps ICTs should also be part of the privacy solution.
In May the European Commission endorsed the development and deployment of PETs(1), in order to help “ensure that certain breaches of data protection rules, resulting in invasions of fundamental rights including privacy, could be avoided because they would become technologically more difficult to carry out.” The UK Information Commissioner issued similar guidance on PETs in November 2006(2). Other international and European authorities have released studies and reports discussing and supporting PETs in recent years. (see references and links below)
PETs as a Personal Tool/Application
Are PETs the answer to information privacy concerns? A closer look at the European and UK communiqués suggests otherwise - for all their timeliness and prominence, they reflect thinking about PETs that is becoming outdated. The reports cite, as examples of PETs, technologies such as personal encryption tools for files and communications, cookie cutters, anonymous proxies and P3P (a privacy negotiation protocol). Not a single new privacy-enhancing technology category has appeared in seven years. Other web pages dedicated to promoting PETs list more technologies, such as password managers, file scrubbers, and firewalls, but otherwise don’t appear to add significantly new categories of tools.(3,4)
The general intent of the PETs endorsements seems clear and laudable enough: publicize and promote technologies that place more control in the hands of individuals over the disclosure and use of their personal information and online activities. PETs should directly enable informational self-determination. Empowered by PETs, online users can mitigate the privacy risks arising from the observability, identifiability and linkability of their online personal data and behaviours by others.
Unfortunately, few of the privacy-enhancing tools cited by advocates have enjoyed widespread public adoption or viability (unless installed and activated by default on users’ computers, e.g. SSL and Windows firewalls). The reasons are several and varied: PETs are too complicated, too unreliable, untrusted, too expensive or simply not feasible to use. The threat model they respond to, and the benefits they offer, are not always clear or measurable to users. PETs may interfere with the normal operation of computer applications and communications; for example, they can render web pages non-functional. In the case of P3P, a privacy negotiation protocol, viable user agents were simply never developed (except for a modest but largely incomprehensible cookie implementation in IE6 and IE7). PETs simply haven't taken off in the marketplace, and the bottom-line reason seems to be that there are few incentives for organizations to develop them and make them available. (Where there has been a congruence of interests between users and organizations, some PETs have thrived, for example SSL for encrypted secure web traffic and e-commerce. Perhaps the same is happening for anti-spam and anti-phishing tools, since deployment of these technologies helps to promote confidence and trust in online transactions.)
Perhaps the underlying difficulty is a conceptualization of PETs as a technology, tool or application exclusively for use by individuals, complete in itself - expressed perhaps in its purest form by David Chaum’s digital cash and Stefan Brands' private credentials. As brilliant as those ideas are, they have had limited deployment and viability to date. It seems that, to be viable, PETs must also meet specific, recognizable needs of organizations. Secure Sockets Layer (SSL) is a good example, responding as it did to well-understood problems of interception, surveillance and consumer trust online. SSL succeeded because organizations had a mutual interest in seeing that it was baked into the cake of all browsers and its use made largely transparent to users.
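The transparency that made SSL viable is easy to see in code. In the minimal Python sketch below (the host name and helper name are invented for illustration), the certificate validation, key exchange and encryption all happen inside the standard library; the person on whose behalf the connection is made contributes nothing and needs to know nothing.

```python
import socket
import ssl

# SSL/TLS is the rare PET that thrived: the handshake, certificate
# validation and encryption all happen inside the library, with no
# knowledge, effort or configuration required of the individual.
context = ssl.create_default_context()  # loads the system's trusted CAs

def fetch_tls_version(host: str) -> str:
    """Open an encrypted connection to `host` and report the TLS version negotiated."""
    with socket.create_connection((host, 443), timeout=10) as raw:
        with context.wrap_socket(raw, server_hostname=host) as tls:
            return tls.version()
```

Note that the privacy-protective defaults (hostname checking, mandatory certificate verification) are switched on by the library itself; the user never has to ask for them.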
Meanwhile, technology marches on. Many PETs weren't very practical to use. Sure, you can surf anonymously, if you don't mind a little latency and the need to tweak or disable browser functionality. But as soon as you want to carry out an online transaction, sign on to a site, make a purchase, or otherwise become engaged online in a sustained way, you have to identify yourself: provide a credit card, a login credential, a registration form, a mailing address, etc. Privacy suffers from the 100th-window syndrome: your house, just like your privacy, could be Fort Knox secure, but all it takes is one open window and the security (privacy) is compromised. Privacy required too much knowledge, effort and responsibility on the part of individuals to sustain in an ongoing way. Online privacy was just too much work.
And, anyway, the benefits of online privacy tended to pale beside the immediate gratification, greater convenience, personalization, efficiency, and essential connectedness afforded by consent and trust. The privacy emphasis slides inexorably towards holding others accountable for the personal information they must inevitably collect about us, not towards PETs. The only effective privacy option for most people in the online world is disengagement and abstinence.
PETs as a Security Technology
Certain consumer PETs have thrived, such as SSL, firewalls, anti-virus/anti-spyware tools, and secure authentication tools. Perhaps anti-phishing tools and whole-disk encryption will follow, if incorporated and activated by default in users’ hardware/software. But note: these are all largely information security tools. PETs have tended to become equated with information security. Safeguards are certainly an important component of privacy. We may not be able to stifle the global information explosion, but with appropriate deployment of PETs we can help ensure that our data stays where it belongs and is not accessed inappropriately, tampered with, or otherwise subject to breaches of confidentiality, integrity and availability.
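A minimal sketch may make the security-tool character of these "PETs" concrete. The integrity safeguard below (the key, record and function names are invented for illustration) protects a stored record from tampering, but notice who it serves: the key belongs to the organization, and the data subject plays no role at all.

```python
import hmac
import hashlib

# An integrity safeguard in the "PETs as security" mould: it detects
# tampering with a stored record, but gives the individual the record
# describes no say in how the data is collected, used or retained.
SECRET_KEY = b"org-held-key"  # held by the organization, not the user

def seal(record: bytes) -> str:
    """Return an HMAC tag the organization can store alongside the record."""
    return hmac.new(SECRET_KEY, record, hashlib.sha256).hexdigest()

def verify(record: bytes, tag: str) -> bool:
    """Check the record against its tag using a constant-time comparison."""
    return hmac.compare_digest(seal(record), tag)
```

The safeguard is real and valuable, but it expresses the organization's interest in data integrity, not the individual's interest in informational self-determination.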
But are these technologies really PETs? They may be deployed with the end-user in mind (it is their data, after all), but they don't involve the user in any meaningful way in the life-cycle management of the information. The security measures listed above are put in place mainly to protect the interests of the organization. Of course, some organizations do go further and deploy technologies that express important principles of fair information practices: technologies that promote openness and accountability in organizational practices, that capture user consent and preferences, and that allow clients a measure of direct access and correction rights to the data and preferences stored about them. But this is still the exception rather than the norm.
PETs as Data Minimization Tools
More critically, security-enhancing and access/accountability technologies miss the final ingredient of a true PET: data minimization. Information privacy is nothing if not about data minimization. The best way to ensure data privacy is not to disclose, use or retain the data at all. The minimization impulse is well captured by the fair information practices that require purposes to be specified and limited, and that seek to place limits on all data collected, used, disclosed and retained pursuant to those purposes. But such limitations run contrary to the impulse of most information-intensive organizations today, which is to collect and stockpile as much data as possible (and then to secure it as best as possible) because it may be useful later. More data, not less, is the trend. Why voluntarily limit a potential competitive advantage?
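The minimization impulse can be made concrete with a toy sketch (all field names and values here are hypothetical): rather than stockpiling a full customer record, a system keeps only the fields its stated purpose requires and replaces the direct identifier with a one-way pseudonym.

```python
import hashlib

# Hypothetical full record an organization might be tempted to stockpile.
full_record = {
    "name": "Alice Example",
    "email": "alice@example.com",
    "birthdate": "1980-01-01",
    "purchase": "snowboard",
}

# Purpose specification: only this field is needed to fulfil the order.
PURPOSE_FIELDS = {"purchase"}

def minimize(record: dict, keep: set, salt: bytes) -> dict:
    """Retain only purpose-bound fields; swap the identifier for a pseudonym."""
    pseudonym = hashlib.sha256(salt + record["email"].encode()).hexdigest()[:16]
    return {"pseudonym": pseudonym, **{k: record[k] for k in keep}}

minimized = minimize(full_record, PURPOSE_FIELDS, salt=b"per-deployment-secret")
# The stored record no longer contains name, email, or birthdate.
```

The design choice is the point: the name, email and birthdate are never written to storage at all, so there is nothing to secure, leak, or subpoena later.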
Apart from being a legal requirement, the arguments for data minimization should be compelling: fewer costs and liabilities associated with maintaining and securing data against leaks and misuse, fewer bad decisions based upon old, stale and inaccurate data, and fewer reputation and brand issues. Faced with growing public concerns about excessive data collection, use and retention, major search engines and transportation agencies alike are now adopting more limited data usage policies and practices, but of course these are policy-level decisions, not PETs.
The problem is that there are few benchmarks against which to judge whether data minimization is being observed through the use of technologies. How much less is enough to qualify as a PET? Is a networked, real-time passenger/terrorist screening program that flashes only a red, yellow or green light to front-line border security personnel a PET because the program design minimized unnecessary transmission and display of sensitive passenger PII? Similarly, is an information technology that automatically aggregates data after analysis, or that mines data and computes assessments of individuals for decision-making, or that is capable of delivering targeted but pseudonymous ads, a true PET because the actual personal information used in the process was minimized so as not to be revealed to a human being? If a technology's purposes for collecting, using, disclosing and retaining customer or citizen data are limited only to "providing better services" and "security purposes," can it properly be considered a PET?
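The screening example above is essentially an interface decision, and can be sketched as such: the backend may hold a sensitive passenger file, but the function exposed to front-line staff returns only a traffic-light status. All identifiers and thresholds below are invented for illustration.

```python
# Hypothetical server-side risk scores; front-line staff never see this data.
_passenger_risk = {
    "PX-1001": 0.05,
    "PX-1002": 0.55,
    "PX-1003": 0.92,
}

def screening_light(passenger_id: str) -> str:
    """Return only RED/YELLOW/GREEN; the sensitive record stays server-side."""
    score = _passenger_risk.get(passenger_id)
    if score is None:
        return "YELLOW"  # unknown traveller: refer for manual check
    if score >= 0.8:
        return "RED"
    if score >= 0.5:
        return "YELLOW"
    return "GREEN"
```

Whether such display-minimization alone earns the label "PET" is exactly the question the essay poses: the PII still exists and is still processed; only its exposure to one class of human reader is reduced.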
PETs as expressing the Fair Information Principles (FIPs)
PETs minimize data, but not all technologies that minimize data are PETs. Data minimization is a necessary but insufficient requirement to become a PET. Enhanced information security is a necessary but insufficient requirement to become a PET. User empowerment is a necessary but insufficient requirement to become a PET. Together, all these impulses are expressed in the ten principles of (CSA) fair information practices, all of which must be substantially satisfied, within a defined context, in order for a given technology to be judged a PET worthy of the name, and of public support and adoption:
To enable user empowerment, we find the (CSA) fair information practices of:
1. Accountability; 2. Informed Consent; 3. Openness; 4. Access; and 5. Challenging Compliance. These principles and practices should be substantially operationalized by PETs.
To enable data minimization, we find the CSA fair information principles requiring 1. Identifying Purposes; 2. Limiting Collection; and 3. Limiting Use, Disclosure, and Retention.
Finally, the CSA Privacy Code calls for Security (Safeguards) appropriate to the sensitivity of the information.
[Comment: The CSA principle ‘Accuracy’ can fit under all three categories, since it implies a right for users to inspect and correct errors, as well as an obligation upon organizations to discard stale and/or inaccurate data, as well as a security obligation to assure integrity of data against unauthorized tampering and modification.]
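The grouping above can be restated as a simple checklist structure, useful when assessing whether a technology substantially satisfies all ten CSA principles. The mapping follows the categories in this piece (with "Accuracy" appearing under all three, as the comment notes); the function name is an invented illustration.

```python
# CSA fair information principles, grouped by the three privacy impulses.
CSA_FIPS_BY_IMPULSE = {
    "user_empowerment": [
        "Accountability", "Informed Consent", "Openness",
        "Access", "Challenging Compliance", "Accuracy",
    ],
    "data_minimization": [
        "Identifying Purposes", "Limiting Collection",
        "Limiting Use, Disclosure, and Retention", "Accuracy",
    ],
    "security": ["Safeguards", "Accuracy"],
}

def is_candidate_pet(satisfied: set) -> bool:
    """Per the essay's test: every principle must be substantially satisfied."""
    required = {p for group in CSA_FIPS_BY_IMPULSE.values() for p in group}
    return required <= satisfied
```

A technology satisfying only a subset, say, Safeguards alone, fails the test, which is the essay's central claim about security tools marketed as PETs.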
A more comprehensive approach to defining and using PETs is required - one that clearly accommodates the interests and rights of individuals in a substantial way, yet which can be adopted or at least accommodated by organizations with whom individuals must inevitably deal. This requires a more systemic, process-oriented, life-cycle, and architectural approach to engineering privacy into information technologies and systems.
PETs as we know them are effectively dead, reduced to a niche market for paranoids and criminals, claimed by some security products (e.g., two-factor authentication dongles) or else deployed by organizations as a public relations exercise to assuage specific customer fears and to build brand confidence (e.g. banks' anti-phishing tools, web seals).
PETs as Information Architecture?
The future of PETs is architecture, not applications. Large-scale IT-intensive transformations are underway across public and private sector organizations, from real-time passenger screening programs and background/fraud checking, to the creation of networked electronic health records and eGovernment portals, to national identity systems for use across physical and logical domains. What is needed is a comprehensive, systematic process of ensuring that PETs are fully enabled and embedded into the design and operation of these complex data systems. If code is law, as Lawrence Lessig posited, then systems architecture will be the rightful domain for privacy technologies to flourish in the current Google era.
The time has come to speak of privacy-enabling technologies and systems that help create favorable conditions for privacy-enhancing technologies to flourish and to express the three essential privacy impulses: user empowerment, data minimization, and enhanced security. Objective and auditable standards are essential preconditions.
Examples abound: privacy-embedded "Laws of Identity" can enable privacy-enhanced identity systems and technologies to emerge, as can the development of 'smart' data that carries with it enforceable conditions on its use, in a manner similar to digital rights management technologies. Another example is intelligent software agents that can negotiate, express the preferences of, and take action on behalf of individuals with respect to the disposition of their personal data held by others. Yet another promising development is new and innovative technologies that enable secure but pseudonymous user authentication and access to remote resources. These and other new information technologies may be the true future of PETs in the Google Era of petabytes squared, and worthy of public support and encouragement.
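The 'smart data' idea, data carrying enforceable conditions on its own use (sometimes called a sticky policy), can be sketched as a record that refuses to release its payload unless the requested use matches the attached policy. This is a minimal illustrative sketch; all class and field names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class StickyRecord:
    """Personal data bundled with machine-readable conditions on its use."""
    payload: str
    allowed_uses: set = field(default_factory=set)
    expiry_year: int = 2030

    def release(self, use: str, year: int) -> str:
        """Hand over the payload only for a permitted, unexpired use."""
        if year > self.expiry_year:
            raise PermissionError("data has expired and must be discarded")
        if use not in self.allowed_uses:
            raise PermissionError(f"use '{use}' not permitted by attached policy")
        return self.payload

record = StickyRecord("alice@example.com", allowed_uses={"order-fulfilment"})
record.release("order-fulfilment", 2025)  # permitted use succeeds
# record.release("marketing", 2025) would raise PermissionError
```

In a real deployment the policy would have to be cryptographically bound to the data and enforced by trusted infrastructure (as in DRM); a plain data structure like this only shows the shape of the idea.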
So, to summarize, the essential messages of this think piece are:
* PETs are attracting renewed interest and support, after several years of neglect and failure
* PETs are an essential ingredient for protecting and promoting privacy in the Information Age (along with regulation and awareness/education), but their conception and execution in practice is highly variable and still rooted in last-century thinking.
* True PETs should incorporate into information technologies ALL of the principles of fair information practices, rather than any subset of them.
* In today's Information Age, true PETs must be comprehensive, and involve all actors and processes. Evaluating PETs will increasingly be a function of whole systems and information architectures, not standalone products.
* It may be more useful to think of privacy-enabling technologies and architectures, which enable and make possible specific PETs.
(1) European Commission Supports PETs
Promoting Data Protection by Privacy Enhancing Technologies (2 May 2007)
Background Memo (2 May 2007): http://europa.eu/rapid/pressReleasesAction.do?reference=MEMO/07/159&format=HTML&aged=0&language=EN&guiLanguage=en
(2) Office of the UK Information Commissioner
Data Protection Technical Guidance Note: Privacy enhancing technologies (Nov 2006)
(3) Center for Democracy and Technology
Page on Privacy Enhancing Technologies
(4) EPIC Online Guide to Practical Privacy Tools
Other Useful Resources:
Dutch Ministry of the Interior and Kingdom Relations, the Netherlands
—Privacy-Enhancing Technologies. White paper for decision-makers (December 2004)
OECD Directorate For Science, Technology And Industry
—Committee For Information, Computer And Communications Policy
Inventory Of Privacy-Enhancing Technologies (January 2002)
Danish Ministry of Science, Technology and Innovation
—Privacy Enhancing Technologies
Report prepared by the META Group v1.1 (March 2005)
Office of the UK Information Commissioner
—Data protection best practice guidance (May 2002)
Report prepared by UMIST
—Privacy enhancing technologies state of the art review (Feb 2002) www.hispec.org.uk/public_documents/7_1PETreview3.pdf
EU PRIME Project
—White paper v2 (June 2007)
Andreas Pfitzmann & Marit Hansen,
TU Dresden, Department of Computer Science, Institute For System Architecture
—Anonymity, Unlinkability, Undetectability, Unobservability, Pseudonymity, and Identity Management - A Consolidated Proposal for Terminology (Version v0.29 - July 2007)
EU FIDIS Project
—Identity and impact of privacy enhancing technologies (2006)
—Introducing PITs and PETS Technologies: technologies affecting privacy (Feb 2001)
Office of the Ontario Information and Privacy Commissioner & Dutch Registratiekamer
—Privacy-Enhancing Technologies: The Path to Anonymity (Volume I - August 1995)
George Danezis, University of Cambridge Computer Lab (Date Unknown)
—An Introduction to Privacy-Enhancing Technologies
posted by:Jeremy Hessing-Lewis // 11:59 PM // August 07, 2007 // ID TRAIL MIX
A short story on the ID Trail
Incorrect username or password. Please try again.
He tried again.
Incorrect username or password. Please try again.
He tried again.
Incorrect username or password. Your ID is now locked. Please proceed to the nearest SECURE ID Validation Center for formal authentication. The nearest location can be found using the GoogleFED Search Tool.
After sitting stunned for a couple moments, Ross began to appreciate the full gravity of the situation. His ID was frozen. Everything was frozen. He just couldn't remember his damn PIN and that was the end of it. No PIN. No renewal. No ID. No authentication. No anything.
Since the government had launched the Single Enhanced Certification Using Reviewed Examination [SECURE] initiative, he really hadn't thought too much about it. Aside from a couple of headlines describing massive budget overruns and the usual privacy geeks heralding the end of the world, the New Government had pushed everything through without much fanfare.
That was four years ago. Since Ross already had a passport, the conversion to SECURE ID was pretty painless. He vaguely remembered something to do with a strand of hair and that they didn't even give him a card or anything, just read him his reauthorization PIN, thanked him for his time, and took his passport.
Since the carbon rationing system came into place in 2012, Ross really hadn't traveled anywhere off-line. There was no way he was going to save up carbon credits just to take a damn flight to some 45° cesspool. Plus, Google Travel could put him anywhere in the world in two clicks. A couple weeks ago he made some sangria and hit-up all the top clubs in Spain. He even bought a t-shirt at one which arrived in the mail two days later. That's why the SECURE ID renewal caught him off guard – it just rarely came-up for someone in his position.
Ross was just trying to buy a new snowboard for his Third Life avatar when things went wrong. He was notified that the transaction could not be processed because his GoogleCash account had been frozen pending authorization of his SECURE ID. Like just about everything else on or off-line, his identity was always confirmed back to this single source. While his ID Keychain supported a Federated identity management system in which he currently had 47 profiles (male, female, and gecko), they were all meaningless without reference to the master ID.
The SECURE system required multiple layers of redundancy. The PIN component would be required in addition to variable biometric authenticators. He had specifically written his 10 digit reauthentication PIN on a piece of paper and put it somewhere “safe.” So much for high-tech. That was four years ago and now, “safe” could be anywhere. The idea behind the routine expiry of SECURE IDs was to prevent identity theft from the deceased using stolen biometrics. Grave-robbing had been rampant for the first couple years of the program.
Ross grabbed his jacket and headed off to the SECURE ID Validation Center downtown knowing full well that he was as good as useless until he could authenticate himself.
The SECURE ID Validation Center was run by Veritas-SECURE, a public-private-partnership born of the New Deal 3.0. The idea was to exploit private-sector efficiencies while delivering top-notch public services. This P3 mantra had been something of an ongoing joke for years now but the government was unlikely to admit the error of its ways any time soon. Interestingly, the company that won the contract also ran the municipal waste disposal system. The critics couldn't stop talking about “synergies” and “leveraging technical expertise” when the winning bid was announced.
Ross arrived at the blue-glassed Veritas facility just after noon. He couldn't even buy lunch because the digital wallet in his phone had been deactivated when his SECURE ID was frozen. The day before, Ross had been mired in expense reports, cursing his multiple digital cash accounts associated with different profiles, devices, and credit sources.
Today, he had been thwarted by the keystone ID, the one that held everything else together and couldn’t be separated from his DNA.
The line for Formal Authentication zigzagged around two corners of the building against a cold marble wall. The only consolation was a nice big overhang covering the identity refugees from a light rain. He stepped into line behind a professional looking man with a brown leather briefcase and gray sports jacket.
Normally, he would've passed the time by watching movies on his iPod. Along with everything else, the DRM on his iPod was frozen pending authentication. The days of watching movies, or doing much of anything without authentication had evaporated long ago.
After a couple minutes of preliminary boredom, he tapped the gentleman with the briefcase on the shoulder asking with generalized ennui “Is this line even moving?”
“It depends how you define moving” the man replied, “if you're talking physics, then the answer is not for at least an hour. If you mean the decay of civil rights, then I guess you might say that we’re racing straight to the bottom.”
Somewhat surprised by the unprovoked disapproval, Ross was just happy to have a conversation to pass the time. He nodded his head enthusiastically. “This new ID system is only moderately infuriating though” he said. “I just hate these queues and the way they always try to make you feel like you're just another number.”
“Are you kidding? I would love nothing more than to be a number. Instead, I'm cursed with Jihad!” the man spat the final words.
Ross glanced up anxiously looking for the nearest Proxycam. Those things all had microphones and speakers these days and he was sure that the unit would ask the two of them to step out of line for questioning. Nothing happened.
The man quickly realized his error and extended his right hand saying. “I’m very sorry if I shocked you. My name is Jihad Azim, but everyone calls me Azi. I’m a university professor.”
Ross relaxed immediately, shaking the man’s hand as Azi continued “It’s just that my name brings me no end of grief. Jihad is actually a somewhat common name, but that sure isn't what you find with a Google search. The reason I'm stuck in this forsaken line is that they've red flagged my SECURE ID again! It happens every couple of weeks. I'm supposed to fly to Scottsdale for a conference tomorrow, but I'm pretty much grounded until I get this cleared up. The minions at the airport could neither confirm nor deny that the sky was blue, so I had to come down here. That's why I'd like nothing more than to be identified as a number. Then at least some fool with a grade 9 education wouldn't be fighting a holy war against my parents’ choice of name.”
“But couldn't you just change your name?” Ross asked, without giving it much thought.
“I could, but then I'd have a yellow flag on my ID noting that there'd been a change to my identity profile. That could be even worse. A colleague of mine has retinal implants and had to have her SECURE ID changed accordingly. Now she can't do anything without being questioned about the changes.” Azi said.
“I couldn't help but hear you two,” said a woman who had approached behind Ross and was pushing a stroller. “I know that this new system has been hard on some people, but you've gotta admit that this whole country is safer for it.”
Ross could see that this logic was going to make Azi angry, so he intervened first, questioning “But don't you think that sacrificing anonymity and privacy in the name of security is something of a false dichotomy?” Ross wasn’t entirely sure what he’d said, but he'd heard the line before and was satisfied that it sounded smart.
“Well, there might have been a better way.” She replied, “But I don't mind sacrificing a little privacy. I don't have anything to hide. And my daughter here, I'd gladly sacrifice my privacy for the security of my daughter. I can't bear to think of all those sickos out there. We’re here today for her first formal authentication so that they can confirm the samples they took at birth. Did you know that the SECURE ID is issued at birth now? I feel better knowing that she's already in the system.”
“You people are so out of it,” a new voice chimed in, “haven't you ever stopped to ask what an ID really is? It's not a number or name.” It was a young woman sitting crosslegged in front of Azi and wearing a pair of yoga jeans.
She continued “Identity doesn't come from some guy behind a computer representing the Government. Identity is how you tell the world who you are. My identity changes all the time. Like when I get a new job, or new friends, or a new hook-up. It seems like the older you get, the more attached you get to who you are. I don’t really care, for the last two weeks my avatar was a gecko.”
“No kidding.” Ross nostalgically remembered going through his gecko days.
The young woman cleared her throat and continued “The point is, you can't let The Man tell you who you are. It should be the other way around. We should control our identities.”
“So why are you here then?” the new mother retorted sarcastically. “Shouldn't you be busy launching DoS attacks against the ‘corporate agenda’ and all the complicit government agencies that hold it together?”
“I want to go volunteer at a monastery in New Burma, but The Man won't let me leave the country without a valid SECURE ID.”
Ross jumped in, noting "Hey, I was at a New Burmese monastery a couple weeks ago with Google Travel. Because of the time change, prayers don't begin until four in the afternoon our time. It's perfect."
The young woman was clearly not impressed. “No, like a REAL monastery with air and things you can touch.”
Ross had this debate all the time. “But…”
Azi was clearly not impressed by where this was going and interrupted "Well, I appreciate your helpful commentary. On the way to Scottsdale, maybe I'll try 'I am whoever I say I am and I choose to fly anonymously. If you absolutely must be provided with an ID, I happen to enjoy green tea, string theory, and the colour orange. Now please let me board the plane.'"
As Azi was dismissing the young woman, a man in a gray suit neared Ross and stared blankly into the horizon of the queue. The man's face was pale, as if he'd seen a ghost.
“Hey, so what's your story?” Ross couldn't help but ask.
“Ummm, I don’t know” the man replied.
“You don’t know? How can you not know?” Ross said.
“I just don’t know who I am anymore,” the man stuttered. “My identity has been stolen.”
The others gasped.
“Well, it's not that I don't know who I am, it’s just that the system has canceled my identity file as a result of concurrent use. There’s no way to verify that I am who I say I am because all my biometrics have been compromised.”
The others remained silent. The SECURE ID system had been designed to be unbreakable. The authentication routine was so strong, and identity theft so difficult, that victim recovery was nearly impossible. Everybody knew this. The only option was to create a new ID and start from scratch. The media labeled these victims “Born Agains.” Ross hadn't actually met one, but he’d read a couple of blogs describing depressing encounters with these unfortunate souls. It was like being killed but having your body left to rot.
The young woman stood up, approached the identityless man, gave him a hug and gently requested: “Please, go in front of me.” The others tried not to make eye contact.
Out of sight and far down the line came a call for: “NEXT!” The line moved forward one meter.
Jeremy Hessing-Lewis is a law student at the University of Ottawa. He is writing a travel guide entitled “101 Must See Hikes in Google Maps” as well as his first novel “Things That are Square” (2009).