
EULAs and the Geniuses of Uninformative Dissemination

posted by: Jeremy Clark // 08:50 AM // May 09, 2006 // ID TRAIL MIX


“[C]ontrol of the Western species of the human race seems to turn upon language. Anyone who has worked with language, from the devil on, has been in the business of spreading knowledge. They are not knowledge itself. Novelists, playwrights, philosophers, professors, teachers, journalists have no proprietary right over knowledge. They do not own it. They may have some training or some talent or both. They may have a great deal of both. They will still be no more than the geniuses of dissemination. That knowledge — once passed on as the mirror of creativity or as an intellectual argument or as the mechanisms of a skill or as just plain information — may lead to increased understanding. Or it may not. So be it.” – John Ralston Saul, The Unconscious Civilization

In one unintentional way, Sony’s decision to secure a series of audio CDs with a very nasty piece of digital rights management (DRM) last fall was a partial victory for anti-DRM activists. This misstep on Sony’s part effectively catapulted a niche topic of concern to international attention and into the collective consciousness of the informed public, where it lingered for a week or two and then slipped into the chambers of recent history. The mainstream coverage largely focused, and rightly so, on how Sony’s DRM compromised the security, anonymity, and control of those who unwittingly inserted one of these audio CDs into their Windows machine. The DRM installed itself as a rootkit — a technique that allows software to run invisibly on a system. Worse still, the DRM did not merely install itself as a rootkit; it created an open mechanism that allowed itself, and by extension any other properly constructed piece of software, to run invisibly. In other words, it left an open security hole through which malware could slip and become invisible to the majority of the anti-virus and anti-spyware utilities protecting our systems. Once installed, the DRM would phone home each time the CD was inserted, and no method for uninstalling it was originally offered.

I will not detail each twist and turn of the subsequent events that eventually provoked a recall of the CDs and a series of lawsuits. I refer those interested to the blog of Mark Russinovich, who originally discovered the rootkit, and to the Wikipedia article. I call this specific case only a partial victory for those who oppose DRM because, while it temporarily caught mainstream attention and hopefully left an impression, it did little to impede the relentless movement of content creators towards DRM — it only caused them to adopt subtler, albeit equally restrictive, technologies.

However, there is another side to the Sony debacle that I want to focus on: user consent. Like most pieces of software, Sony’s DRM included an end-user licence agreement, or EULA (http://www.eff.org/wp/eula.php): that daunting piece of legalese that ends with an “I Agree” button. In this case, not only did Sony’s EULA fail to disclose the rootkit, the phoning home, and the lack of an uninstaller, the DRM installed itself before even displaying the EULA. These issues were the subject of an Electronic Frontier Foundation lawsuit, which was eventually settled out of court. While I fully applaud the efforts of the EFF, I also have an uneasy confession to make. Even if companies like Sony did fully disclose and detail all the undesirable behaviours of their software in a proper EULA, I would never know, because I never read them. And I know I am not alone.

As these events transpired, I recalled an opinion piece I read a few years ago in Wired by Mark Rasch. Rasch begins with an anecdote: “I have a recurring nightmare. Microsoft CEO Steve Ballmer shows up on my doorstep demanding my left kidney, claiming that I agreed to this in some ‘clickwrap’ contract” [link mine]. Though an attorney and security guru himself, Rasch flatly admits to never reading online privacy policies, despite writing them for clients. This confession appears to be part of a widely held consensus. According to internet legend, the software vendor PC Pitstop once buried the promise of a monetary reward in one of its EULAs for any claimant who responded through a given email address. Four months and 3,000 downloads later, the first person finally wrote in, and their diligence was rewarded with a $1,000 cheque.

Companies can be surprisingly candid in their EULAs, shamelessly detailing in plain language their intentions to install bundled tracking software, display all manner of pop-up ads, or phone home with user information that can be sold to third parties. Other companies, however, purposely obfuscate the pertinent information with impenetrable legalese, and many EULAs run to multiple pages inside a tiny window that cannot be resized and whose text cannot be copied to the clipboard. As long as we consumers remain complicit in this system, and continue to unintentionally consent to terms of service we make no effort to understand, we elevate the software vendors to the status of geniuses of uninformative dissemination. The information that is communicated through EULAs falls squarely into the latter half of John Ralston Saul’s distinction — knowledge that does not increase understanding.

In his op-ed, Mark Rasch turns to technology to aid consumer understanding. Specifically, he calls for a law robot that can be programmed with user preferences and process a licence or policy on a user’s behalf. Now suppress any visions of an artificially intelligent bot capable of comprehending a legal document for a moment, because that technology is still far in our future. Other options exist. One is to pressure vendors into offering a machine-readable summary of their contracts and policies. And ground has already been broken on this front by the Platform for Privacy Preferences (P3P).

P3P was initiated in 1997 by the World Wide Web Consortium (W3C) with the objective of developing a standardized syntax for encoding machine-readable privacy policies for web services. P3P uses a versatile mark-up language called XML; anyone reading this blog entry through an RSS feed is already making use of XML. A P3P policy draws from a set of predefined disclosures, and a company must make every disclosure that applies to its practices. The absence of a disclosure presumes the action is never taken. This transforms the policy from a one-way broadcast into a response to predetermined questions. It disempowers the vendors from being geniuses of dissemination who push their carefully constructed terms of service onto consumers, and empowers the user to pull understandable information from the vendor.
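
To make this concrete, here is a minimal sketch of the idea in Python, using a simplified XML vocabulary of my own invention rather than the actual P3P schema. The point is simply that each disclosure is either present or absent, so an agent can process the policy mechanically:

    # A toy machine-readable policy in the spirit of P3P. The element and
    # attribute names below are simplified placeholders, not real P3P syntax.
    import xml.etree.ElementTree as ET

    policy_xml = """
    <policy entity="example-vendor.com">
      <retains data="clickstream" purpose="site-improvement" recipient="ourselves"/>
      <retains data="email" purpose="marketing" recipient="third-parties"/>
    </policy>
    """

    policy = ET.fromstring(policy_xml)
    for disclosure in policy.findall("retains"):
        print(disclosure.get("data"), "is retained and may go to", disclosure.get("recipient"))
    # Any data type not listed (a physical address, say) is presumed never retained.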

From a technological perspective, a P3P policy is very elegant. It uses a hierarchical tree of assertions that require the web service to disclose its identity, the methods that are open for resolving disputes concerning the policy, and what gathered information can later be accessed by the user. It then requires the web service to explicitly detail every type of information that is retained (from a comprehensive and predefined list), what purpose the information will be used for, whom the information can be disclosed to, and how long it will be retained. A user may then specify her preferences to a mediating agent such as Privacy Bird, or use a P3P-enabled search engine, which will analyze the privacy policy of each website she visits before she actually connects to the service itself and report any discrepancies between the site and her preferences (or report that the site has no P3P policy at all).
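
As a rough illustration of what such a mediating agent does, here is a toy sketch. The preference and policy formats are simplified assumptions of mine, not the real P3P data model, and Privacy Bird itself works differently under the hood:

    # A toy mediating agent: the user whitelists the (data, recipient)
    # combinations she will tolerate, and the agent flags every disclosure
    # in a site's policy that falls outside that whitelist.

    user_preferences = {
        ("clickstream", "ourselves"),  # acceptable
        ("email", "ourselves"),        # acceptable
        # ("email", "third-parties") is deliberately absent: unacceptable
    }

    site_policy = [
        {"data": "clickstream", "recipient": "ourselves", "retention": "1-year"},
        {"data": "email", "recipient": "third-parties", "retention": "indefinite"},
    ]

    def check(policy, preferences):
        # Return the disclosures that conflict with the user's preferences.
        return [d for d in policy if (d["data"], d["recipient"]) not in preferences]

    for conflict in check(site_policy, user_preferences):
        print("Warning:", conflict["data"], "may be disclosed to", conflict["recipient"])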

The syntax of P3P could easily be modified to handle EULAs. As a rough sketch, consider anchoring the assertions in two categories: monitor and install. Because spyware monitors user traffic, the monitor category would essentially inherit all the P3P assertions specifying the information retained. It could also specify, with an action assertion, how the data is obtained (keystrokes, data scraping, packet sniffing, data interception, et cetera) and how often the information is obtained (only when the program runs, as long as the operating system is running, one time only, et cetera). The install category would disclose any third-party software that is bundled with the principal software and reference that software’s EULA. It would also include assertions concerning the actions taken by the software (rootkit, displays pop-ups, URL redirects, et cetera) and an assertion of how easily it can be uninstalled.
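
To make that sketch a little more tangible, here is one hypothetical way the monitor and install categories might be structured, written as Python data for readability (an actual deployment would presumably use XML, as P3P does). Every field name and value here is my own invention:

    # A hypothetical machine-readable EULA built on the two proposed
    # categories. All names and values are illustrative assumptions.
    eula_policy = {
        "software": "ExampleMediaPlayer 2.0",
        "monitor": [
            {
                "data": "listening-history",
                "method": "phone-home",            # how the data is obtained
                "frequency": "each-time-program-runs",
                "recipient": "vendor",
            },
        ],
        "install": {
            "bundled": [
                {"name": "AdHelper", "eula": "http://example.com/adhelper-eula"},
            ],
            "actions": ["displays-pop-ups"],       # could also list: rootkit, url-redirects
            "uninstall": "standard-uninstaller",   # versus "manual-only" or "none"
        },
    }

    # An agent could refuse red-flag answers outright before installation:
    if "rootkit" in eula_policy["install"]["actions"]:
        print("Refusing to install: the software conceals itself from the OS.")
    if eula_policy["install"]["uninstall"] == "none":
        print("Warning: the vendor offers no way to remove this software.")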

Logistically, porting P3P to handle EULAs is not as simple. The legal status of EULAs is ambiguous, and the enforceability of a machine-readable version is something I am not qualified to speculate on. There was also a need to enforce the accuracy of P3P and natural-language privacy policies, resulting in a group of non-profit seal programs that audit and certify web services’ privacy practices. Expanding the progress made with privacy policies to EULAs would require similar programs, and the process demands a massive collaboration between computer scientists, lawyers, and other disciplines. ID Trail represents a rare occasion when all the right people are sitting at the same table, and as a result I look forward to feedback concerning this problem from all angles: implementation ideas, critiques concerning its viability, opinions on its legality, and speculation on vendors’ incentives to comply. Would such an undertaking be in the public interest? Is it needed? Could it be effective?

Jeremy Clark is an MASc student at the University of Ottawa.
