Reasonable Expectation of Privacy Workshop Movies
posted by:Jeremy Hessing-Lewis // 02:10 PM // May 23, 2007 // Computers, Freedom & Privacy Conference (CFP) | General | TechLife
The IDTrail Team produced two short films exploring the "reasonable expectations of privacy". They were used at the Computers, Freedom, and Privacy (CFP) 2007 conference in Montreal, Canada. The short films were produced and directed by Max Binnie, Katie Black and Jeremy Hessing-Lewis with contributions from Daniel Albahary, Ian Kerr, and Jane Bailey. They are available for download under a Creative Commons Attribution 2.5 license after the jump.
The first film, "Tessling-Just the Facts", is a brief dramatization of the facts that gave rise to R. v. Tessling, a criminal case which addressed the concept of the "reasonable expectation of privacy" with respect to forward-looking infrared (FLIR) technology.
Download Tessling-Just the Facts (Save As...)
Format: .mov [QuickTime], Duration: 4 min 22 sec, Size: 9.53 MB.
The second film, "CFP-Interviews", is a documentary that gives the viewer a taste of various public interest perspectives on how to conceive of "reasonable expectations of privacy". It features short interviews with the following experts in privacy, civil rights, and law:
Starring (in order of appearance):
Clayton Ruby, Ruby & Edwardh
Andrew Clement, University of Toronto
Peter Jordan, Engineer (ret.)
Chris Hoofnagle, Samuelson Clinic, UC Berkeley
Eugene Oscapella, Lawyer, Foundation for Drug Policy
David Sobel, Electronic Frontier Foundation (EFF)
Pippa Lawson, Canadian Internet Policy and Public Interest Clinic (CIPPIC)
Jim Karygiannis, MP Scarborough-Agincourt
Marc Rotenberg, Electronic Privacy Information Center (EPIC)
Cindy Cohn, Electronic Frontier Foundation (EFF)
Marlene Jennings, MP Notre-Dame-de-Grâce -- Lachine
Deirdre Mulligan, Samuelson Clinic, UC Berkeley
Download Public Interest Perspectives (Save As...)
Format: .mov [QuickTime], Duration: 25 min 52 sec, Size: 54.8 MB.
| Comments (0) |
On E-Government Authentication and Privacy
posted by:Stefan Brands // 01:40 PM // November 01, 2005 // Computers, Freedom & Privacy Conference (CFP) | Digital Activism and Advocacy | Digital Democracy: law, policy and politics | ID TRAIL MIX | Surveillance and social sorting | TechLife
Governments around the world are working to implement digital identity and access management infrastructures for citizen and business access to government services. E-government has the potential to bring major cost, convenience, and security benefits to citizens, businesses, and government alike. There are major architectural challenges, however, that cannot be solved by simply adopting modern enterprise architectures for identity management: these architectures involve a central server that houses the capability to electronically trace, profile, impersonate, and falsely deny access to any user. In the context of an e-government infrastructure, the privacy and security implications for citizens of such a panoptical identity architecture would be unprecedented.
By way of example, consider the implications of adopting the Liberty Alliance ID-FF architecture (the leading industry effort for so-called "federated" identity management) for e-government. The ID-FF describes a mechanism by which a group of service providers and one or more identity providers form circles of trust. Within a circle of trust, users can federate their identities at multiple service providers with a central identity provider. Users can also engage in single sign-on to access all federated local identities without needing to authenticate individually with each service provider. Liberty Alliance ID-FF leaves the creation of user account information at the service provider level, and each service provider knows each user only under a unique “alias” (also referred to by ID-FF as a “pseudonym”). However, the user aliases in Liberty Alliance ID-FF are not pseudonyms at all: they are centrally generated and doled out by the identity provider, which acts in the security interests of the service providers.
While the Liberty Alliance ID-FF architecture may be fine for corporate management of the identities of employees who access corporate resources, it would have scary implications if adopted for government-to-citizen identity management. The identity provider and the service providers would house the power to electronically monitor all citizens in real time across government services. Furthermore, insiders (including hackers and viruses) would have the power to commit undetectable, massive identity theft with a single press of a central button. Carving out independent “circles of trust” is not a solution: the only way to break out of the individual circle-of-trust “silos” that would result would be to merge them into a “super” circle by reconciling all user identifiers at the level of the identity providers. This would only exacerbate the ID-FF privacy and security problems.
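The structural problem described above can be illustrated with a minimal sketch. This is not the actual ID-FF protocol (which involves SAML messages and signed assertions); the class and method names are illustrative. The point it demonstrates is the one made in the text: per-service "aliases" look unlinkable to service providers, but the identity provider that mints them retains the linking table.

```python
import secrets

class IdentityProvider:
    """Toy model of a central identity provider that issues per-service
    "aliases". These are not true pseudonyms, because the issuer keeps
    the table that links every alias back to the user."""

    def __init__(self):
        self._alias_table = {}  # (user, service) -> alias

    def federate(self, user: str, service: str) -> str:
        # Issue (or look up) a distinct random alias for this user at
        # this service provider.
        key = (user, service)
        if key not in self._alias_table:
            self._alias_table[key] = secrets.token_hex(8)
        return self._alias_table[key]

    def link_all(self, user: str) -> dict:
        # The privacy problem: the identity provider can reconcile every
        # alias, tracing one user across all service providers.
        return {svc: alias
                for (u, svc), alias in self._alias_table.items() if u == user}

idp = IdentityProvider()
a1 = idp.federate("alice", "tax-office")
a2 = idp.federate("alice", "health-agency")
assert a1 != a2                          # the two providers see unlinkable aliases...
assert len(idp.link_all("alice")) == 2   # ...but the IdP links them all centrally
```

Merging circles of trust, as the text notes, amounts to handing `link_all` over an even larger alias table to a single party.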
More generally, replacing local non-electronic identifiers with universal electronic identifiers removes the natural segmentation of traditional activity domains; as a consequence, the damage that identity thieves can do is no longer confined to narrow domains, nor are identity thieves slowed any longer by the inherent friction of a non-electronic identity infrastructure. At the same time, when the same universal electronic identifiers are relied on by a plurality of autonomous service providers in different domains, the security and privacy threats to service providers no longer come only from wiretappers and other traditional outsiders: a rogue system administrator, a hacker, a virus, or an identity thief with insider status can cause massive damage to service providers, can electronically monitor the identities and visiting times of all their clients, and can impersonate or falsely deny access to those clients.
On the legal side, the compatibility of modern enterprise identity architectures with data protection legislation and program statutes is highly questionable. The adoption of enterprise identity architectures in the context of e-government would also directly interfere with Article 8 rights under the European Convention on Human Rights. Specifically, any interference with Article 8 privacy rights must be limited to the minimum degree necessary. Enterprise identity architectures violate this requirement: far less intrusive means exist for achieving the objectives of e-government.
Specifically, over the course of the past two decades, the cryptographic research community has developed an array of privacy-preserving technologies that can be used as building blocks for e-government in a manner that simultaneously meets the security needs of government and the legitimate privacy and security needs of individuals and service providers. Relevant privacy-preserving technologies include digital credentials, secret sharing, private information retrieval, and privacy-preserving data mining.
By properly using privacy-preserving technologies, individuals can be represented in their interactions with service providers by local electronic identifiers. Service providers can electronically link their legacy account data on individuals to these local electronic identifiers, which by themselves are untraceable and unlinkable. As a result, any pre-existing segmentation of activity domains is fully preserved. At the same time, verifier-trusted authorities can securely embed into all of an individual’s local identifiers a unique “master identifier” (such as a random number). These embedded identifiers remain unconditionally hidden when individuals identify themselves on the basis of their local electronic identifiers, but their hidden presence can be leveraged by service providers for all kinds of security and data sharing purposes without introducing privacy problems. The privacy guarantees do not require users to rely on third parties: the power to link and trace the activities of a user across his or her activity domains resides solely in the hands of that user.
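A minimal sketch of the local-identifier idea follows. It is only a stand-in for the cryptographic digital credentials the text refers to: real schemes achieve unconditional hiding and selective disclosure with zero-knowledge proofs, whereas this toy uses plain hash derivation (only computationally hiding), and all names here are illustrative. What it shows is the key property: local identifiers are stable within a domain, unlinkable across domains, and only the user holds the material that links them.

```python
import hashlib
import secrets

class User:
    """Toy sketch of "local identifiers" derived from a hidden master
    identifier. NOT a real credential system: hash-based derivation is
    used here only to illustrate per-domain unlinkability."""

    def __init__(self):
        self.master = secrets.token_bytes(16)  # hidden master identifier
        self.salts = {}                        # per-domain blinding values

    def local_id(self, domain: str) -> str:
        # A fresh random salt per domain makes local identifiers mutually
        # unlinkable to anyone who does not hold the salts.
        salt = self.salts.setdefault(domain, secrets.token_bytes(16))
        return hashlib.sha256(self.master + salt + domain.encode()).hexdigest()

    def reveal_link(self, domain: str) -> bytes:
        # Only the user can expose the linkage between domains (crudely,
        # by disclosing a salt; real systems would prove it in zero
        # knowledge without disclosure).
        return self.salts[domain]

u = User()
id_tax = u.local_id("tax-office")
id_health = u.local_id("health-agency")
assert id_tax != id_health                  # domains cannot correlate the two
assert u.local_id("tax-office") == id_tax   # but each identifier is stable per domain
```

The design point matches the text: segmentation of activity domains is preserved because correlation requires secrets that reside solely with the user.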
In the context of e-government, security and privacy are not opposites but mutually reinforcing, assuming proper privacy-preserving technologies are deployed. In order to move forward with e-government, it is important for government to adopt technological alternatives that hold the promise of multi-party security while preserving privacy.
For more information, interested readers are referred to my personal blog at www.idcorner.org. | Comments (0) |
CFP PATRIOT Act session
posted by:Catherine Thompson // 12:28 PM // April 15, 2005 // Computers, Freedom & Privacy Conference (CFP)
We’re sitting in the FISA and PATRIOT Act session. Ronald Lee spoke in part about Section 215, which gives the director of the FBI wide production order powers, and about whether there should be a dedicated national security department in the U.S. Kevin Bankston talked about the “new normal” and how anti-terrorism has brought a lot of law enforcement reforms, as well as the SAFE Act, but argued that we need to be safer. Peter Swire right now is talking about how to do secret surveillance in an open society. He’s speaking about how we can limit the use of gag orders... | Comments (0) | | TrackBack
CFP - Second day
posted by:Veronica Pinero // 03:00 PM // April 14, 2005 // Computers, Freedom & Privacy Conference (CFP)
An amazing debate is going on! Today is the second day of the CFP conference, and the panel, which is moderated by Marcia Hofmann (EPIC), is to address the problems arising from Data Mining & Public Records.
The dynamic of this panel is that the presenters, Doug Klunder (ACLU-WA Privacy Project), Daniel Solove (George Washington University Law School), and Cindy Southworth (Safety Net: the National Safe & Strategic Technology Project at the National Network to End Domestic Violence), are to discuss the audience’s solutions to two problems put to the audience before the panel started. The problems and questions are the following:
Problem 1 – Public Records
Problem: a state is planning to put its public records online and make them available over the Internet. The state is working on a policy about disclosing the following information about individuals: (A) home address; (B) phone numbers; and (C) Social Security Number.
1. To what extent should each type of information be disclosed?
2. At one extreme, should each piece of information be redacted entirely from any public documents and never disclosed under any circumstances?
3. At the other extreme, should each piece of information be fully disclosed without limitation?
4. Are there any workable compromises between these two extremes?
Problem 2 – Background checks
Problem: State X currently provides for the disclosure of conviction records from a single agency, which maintains a centralized database. This database is updated to remove records when a conviction is reversed on appeal, vacated, pardoned, etc. A data broker proposes to provide a “full” background check service by creating its own database that includes every arrest and criminal proceeding which will be retained forever in the database.
Question: Should limits (and if so what limits) be placed on the agencies releasing information, on the data broker, on the entities obtaining background checks from the broker, or on some combination?
What is your own position with regard to this? I have not thought very much about the first problem, but with regard to the second, I have some concerns:
First of all, what is the purpose of a criminal conviction? Or, what is the purpose of sentencing? If we think about this, we will realize that, among other things, the purposes of sentencing include rehabilitation and reintegration. How are we going to achieve such purposes while implementing (or allowing the implementation of) these sorts of intrusive practices?
Second, there is an assumption: the information provided by State X is not shared with other states; it is “static information”. What happens if this information is shared with other countries, and therefore subjected to foreign jurisdiction (“dynamic information”)? How do we ensure that all criminal information that has been modified (for instance, convictions reversed on appeal, pardons granted, etc.) is actually modified by the foreign jurisdiction that got the information?
Computers, Freedom, and Privacy Conference
posted by:Veronica Pinero // 05:51 PM // April 13, 2005 // Computers, Freedom & Privacy Conference (CFP)
Good morning! Today was the first day of the CFP Conference. The first panel, which was moderated by Anita Ramasastry and organized by Stephanie Perrin, addressed the topic of Sousveillance in the Panopticon. The first presenter, Steve Mann, opened the debate with his presentation about Equiveillance, the balance between sous- and surveillance. He introduced ten hypotheses to his audience and the panel, which were to be accepted or rejected by the latter (this text is available at www.anonequity.org).
The first presenter to address Steve’s hypotheses was David Brin. He noted that freedom and privacy were not opposing notions, but notions that can work together. His presentation was followed by Latanya Sweeney’s and Ivan Szekely’s presentations, the former addressing “When sousveillance becomes surveillance” and the latter “New democracies? New Panopticon? Lessons learnt from Central and Eastern Europe”. The panel was closed by Simon Davies, who asked whether sousveillance was a way to fight surveillance, or was just the same thing. He noted that sousveillance, too, can invade privacy.
Many interesting questions were put by the audience, which led to a fascinating debate.
The second panel addressed the privacy risks of new passport technologies, and focused on United States and European legislation.
CFP first day summary
posted by:Catherine Thompson // 01:49 PM // // Computers, Freedom & Privacy Conference (CFP)
Yesterday was a very busy and very amazing day!!! The Anonymity Project held an all day workshop and attendance was incredible! When we first arrived, we were surprised to find we were assigned one of the ballrooms. After all, we only had about 40 registrants. But as the day wore on, it became clear that there was great interest in our session. There were about 80 seats, all filled, as well as people sitting on the floor by the walls. We figure there were about 100 people who came to check us out!
After an introduction by Stephanie Perrin and Ian Kerr, I was the first to present. My topic was Intelligent Transportation Systems – the short paper for it is available on the anonequity.org website along with most of the other speakers’ papers. Next was Alex Cameron who talked about the similarities between the panopticon and digital rights management technologies. Although he apologized for talking about philosophy first thing in the morning, the audience was nevertheless very interested!
Ed Hasbrouck spoke about the undiagnosed post-traumatic stress disorder that policy makers are suffering from. EPIC’s Marcia Hofmann spoke about the Hiibel case and various legislative initiatives that mistakenly put faith in the ability to detect terrorists by increasing identification requirements among citizens. Peter Hope-Tindall spoke about the latest biometric technology, as well as Canadian and American projects in the area. Veronica Pinero spoke about panopticism and how we should rethink the use of digital criminal records to prevent discrimination.
Ian Goldberg detailed the barriers to use of some privacy-enhancing technologies and some new developments in the areas of anonymized remailers, messaging, file sharing, and the WWW. Roger Dingledine spoke about Tor as a means to remain anonymous online. Tor basically works by ensuring that each server in a circuit can see only the server it received data from and the one it passes data to, never the full path from sender to destination. Stefan Brands detailed the relationship between verifiers and identifiers in a multi-threat environment, ending with the promise of the next generation of identifiers.
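The "each server sees only its neighbours" idea behind Tor can be sketched as layered (onion) encryption. The toy below is only illustrative: the XOR stream cipher is NOT secure and stands in for the real per-hop cryptography Tor uses, and relay names and keys are invented. Each relay strips exactly one layer, learning the next hop and nothing about the rest of the route.

```python
import hashlib
import json

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher (NOT secure), standing in for real
    per-hop encryption. XOR makes it its own inverse."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def build_onion(message: str, route: list, keys: dict) -> bytes:
    """Wrap the message in one layer per relay, innermost layer first.
    Each layer names only the next hop."""
    payload = message.encode()
    next_hop = "destination"
    for relay in reversed(route):
        layer = json.dumps({"next": next_hop, "data": payload.hex()}).encode()
        payload = xor_cipher(keys[relay], layer)
        next_hop = relay
    return payload

def peel(relay: str, payload: bytes, keys: dict):
    """A relay removes exactly one layer: it learns the next hop and an
    opaque inner payload, nothing more."""
    layer = json.loads(xor_cipher(keys[relay], payload))
    return layer["next"], bytes.fromhex(layer["data"])

keys = {r: r.encode() * 4 for r in ("A", "B", "C")}  # toy per-relay keys
onion = build_onion("hello", ["A", "B", "C"], keys)
hop, onion = peel("A", onion, keys)   # A learns only that B is next
assert hop == "B"
hop, onion = peel("B", onion, keys)   # B learns only that C is next
assert hop == "C"
hop, onion = peel("C", onion, keys)   # C finally sees the plaintext
assert hop == "destination" and onion == b"hello"
```

The design point is that no single relay can link sender to destination; compromising one server reveals only its two neighbours in the circuit.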
Philippa Lawson spoke about CIPPIC’s consumer profiling research, answering the question of who is using my data and what are they doing with it? Valerie Steeves and Ian Kerr presented on virtual playgrounds and buddy bots. Val spoke about the embedding of commercialism into children at a young age through websites that invade children’s privacy. Ian spoke about bots that imitate intelligent human interactions. Children interact with these bots without realizing that they’re not people. Lillie Coney of EPIC spoke about racial profiling and the suspected terrorist rationale. Ian Spriel related his own experience with racial profiling one day while taking pictures by the docks in Seattle.
Stephanie Perrin spoke for a few minutes about RFIDs and European developments. Simon Davies followed with a chilling account of the national ID that will hold 51 separate pieces of data. Both noted the lack of democratic process. The day ended with Steve Mann talking about sousveillance and how having to show ID is evidence of being owned.
Right now we’re listening to the opening keynote debate and we’ll have another blog installment soon!| Comments (0) | | TrackBack
Keeping an Eye on the Panopticon: Workshop on Vanishing Anonymity
posted by:Veronica Pinero // 08:54 PM // April 12, 2005 // Computers, Freedom & Privacy Conference (CFP)
Good afternoon! Many members of the research group On the Identity Trail are participating in a whole-day workshop at the Computers, Freedom, and Privacy Conference that is being held in Seattle (U.S.) and that was organized by Stephanie Perrin, another team member.
Val Steeves and Ian Kerr have just presented their research, Virtual playgrounds and BuddyBots: a data-minefield for tinys & tweenies. By the way, this research is already available at www.anonequity.org.
Valerie highlighted the need to address how children are being targeted as web consumers: “the net is a place in which children spend a lot of time, and it is also a place where much of their private life is collected.” She noted that children are a “vulnerable population” and that the current model of “informed consent” does not address this characteristic. She also pointed out how all these websites that focus on children try to manipulate them by reinforcing the discourse of “friendship between the child and the product.”
Ian presented his interaction with a NativeMind bot, an example of “affective computing”, in an attempt to show his audience how emotional experience can be understood and modelled in machine behaviour.
Many interesting questions followed their presentation. While addressing one of them, Valerie noted that, with regard to protecting children from these intrusive data-collection techniques, the key point is to promote child-education material that lets children know how they are being manipulated. She also noted that members of the research group are already doing research in this area.
Keeping an Eye on the Panopticon: Workshop on Vanishing Anonymity
posted by:Veronica Pinero // 07:55 PM // // Computers, Freedom & Privacy Conference (CFP)
Just a quick message. Edward Hasbrouck, who gave an excellent presentation this morning on Travel ID, has just posted his paper on his own website: http://hasbrouck.org/blog/archives/000556.html | Comments (0) | | TrackBack
Computers, Freedom and Privacy Conference (CFP) 2005
posted by:Marty // 09:01 PM // April 10, 2005 // Computers, Freedom & Privacy Conference (CFP)
On the Identity Trail will be extremely active at CFP this year, running KEEPING AN EYE ON THE PANOPTICON: WORKSHOP ON VANISHING ANONYMITY all day Tuesday April 12.
Blog*on*nymity will be blogging the conference. See the CFP category for ongoing posts. | Comments (0) | | TrackBack