Felix Stalder (2002) 'The Failure of Privacy Enhancing Technologies (PETs) and the Voiding of Privacy'. Sociological Research Online, vol. 7, no. 2, <http://www.socresonline.org.uk/7/2/stalder.html>
To cite articles published in Sociological Research Online, please reference the above information and include paragraph numbers if necessary
Received: 20/5/2002 Accepted: 9/8/2002 Published: 31/8/2002
public and private ... are best conceptualized as multidimensional (with dimensions sometimes overlapping or blurred and at other times cross cutting or oppositional), continuous and relative, fluid and situational or contextual, whose meaning lies in how they are interpreted and framed.[10]
Print was also a major factor in the development of the sense of personal privacy that marks modern society. It produced books smaller and more portable than those common in a manuscript culture, setting the stage psychologically for solo reading in a quiet corner, and eventually, for completely quiet reading. In a manuscript culture … reading had tended to be a social activity, one person reading to others in a group.[12]
where does (and should) the private person stop and the public person begin? These questions were relatively more settled before new technologies appeared that suddenly give meaning to here-to-fore meaningless [or unobtainable], and therefore inadvertently protected, personal information.[15]
P3P is a standardized set of multiple-choice questions, covering all the major aspects of a Web site's privacy policies. … P3P-enabled Web sites make this information available in a standard, machine-readable format. P3P-enabled browsers can "read" this snapshot automatically and compare it to the consumer's own set of privacy preferences. P3P enhances user control by putting privacy policies where users can find them, in a form users can understand, and, most importantly, enables users to act on what they see.[30]
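To make the comparison step concrete, here is a minimal sketch, in Python, of the kind of automated matching described above. The policy fields and preference names are invented for illustration; they are not the actual P3P vocabulary.

```python
# A site's privacy policy in a machine-readable form (illustrative fields, not real P3P).
SITE_POLICY = {
    "purpose": {"service", "marketing"},   # what the collected data is used for
    "retention": "indefinite",             # how long it is kept
    "third_party_sharing": True,           # whether it is passed on to others
}

# The user's own privacy preferences, as the browser would store them.
USER_PREFERENCES = {
    "allowed_purposes": {"service"},
    "allow_indefinite_retention": False,
    "allow_third_party_sharing": False,
}

def policy_acceptable(policy, prefs):
    """Return True only if every declared practice falls within the user's preferences."""
    if not policy["purpose"] <= prefs["allowed_purposes"]:
        return False
    if policy["third_party_sharing"] and not prefs["allow_third_party_sharing"]:
        return False
    if policy["retention"] == "indefinite" and not prefs["allow_indefinite_retention"]:
        return False
    return True

if __name__ == "__main__":
    # A P3P-enabled browser would warn the user or block the site when this check fails.
    print(policy_acceptable(SITE_POLICY, USER_PREFERENCES))  # -> False
```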
Freenet is implemented as an adaptive peer-to-peer network of nodes that query one another to store and retrieve data files, which are named by location-independent keys. Each node maintains its own local datastore which it makes available to the network for reading and writing, as well as a dynamic routing table containing addresses of other nodes and the keys that they are thought to hold.[45]
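The passage above describes two data structures per node: a local datastore and a routing table, both indexed by location-independent keys. The following is a highly simplified sketch of that arrangement, assuming content-hash keys and a naive forward-and-cache lookup; it is not the actual Freenet code, whose key types and routing are considerably more elaborate.

```python
import hashlib

def content_key(data: bytes) -> str:
    """Derive a location-independent key from the content itself (illustrative choice)."""
    return hashlib.sha1(data).hexdigest()

class Node:
    def __init__(self, name):
        self.name = name
        self.datastore = {}      # key -> data held locally
        self.routing_table = {}  # key -> neighbouring Node thought to hold it

    def store(self, data: bytes) -> str:
        key = content_key(data)
        self.datastore[key] = data
        return key

    def retrieve(self, key: str, hops_to_live: int = 5):
        """Answer from the local store, or forward the query to the neighbour
        believed to hold the key, caching any successful answer locally."""
        if key in self.datastore:
            return self.datastore[key]
        if hops_to_live == 0 or key not in self.routing_table:
            return None
        data = self.routing_table[key].retrieve(key, hops_to_live - 1)
        if data is not None:
            self.datastore[key] = data
        return data

# Usage: node B routes requests for A's document to A and caches the result.
a, b = Node("A"), Node("B")
key = a.store(b"an anonymously published document")
b.routing_table[key] = a
print(b.retrieve(key) is not None)  # -> True
```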
2. For an overview of recent approaches, see Bennett, Colin; Grant, Rebecca (eds) (1999). Visions of Privacy. Toronto: University of Toronto Press
3. For early contributions, see Westin, Alan F. (1967). Privacy and Freedom. New York: Atheneum; Rule, James (1973). Private Lives, Public Surveillance. London: Allen-Lane
4. Organization for Economic Cooperation and Development (OECD) (1981). Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. Paris: OECD
5. European Union (1995). Directive on the Protection of Individuals With Regard to the Processing of Personal Data and on the Free Movement of Such Data. Official Journal of the European Communities of 23 November 1995 No L. 281
6. Flaherty, David H. (1989). Protecting Privacy in Surveillance Societies: The Federal Republic of Germany, Sweden, France, Canada and the United States. Chapel Hill, NC: University of North Carolina Press; Bennett, Colin J. (1992). Regulating Privacy: Data Protection and Public Policy in Europe and the United States. Ithaca, NY: Cornell University Press
7. This argument is presented most comprehensively, and convincingly, in Lyon, David (2001). Surveillance Society: Monitoring Everyday Life. Buckingham, Philadelphia: Open University Press. For a more theoretical treatment of this trend, see Bogard, William (1996). The Simulation of Surveillance: Hypercontrol in Telematic Societies. Cambridge, New York: Cambridge University Press. For studies on specific surveillance techniques that reach similar conclusions, see, for example, Norris, Clive; Armstrong, Gary (1999). The Maximum Surveillance Society: The Rise of CCTV. Oxford, UK: Berg, and Garfinkel, Simson (2001). Database Nation: The Death of Privacy in the 21st Century. Cambridge, MA: O'Reilly & Associates
8. One of the few exceptions: Moore, Barrington (1984). Privacy: Studies in Social and Cultural History. New York: M.E. Sharpe
9. Quoted in: Storr, Anthony (1989). Solitude. London: Fontana, p.16
10. Marx (2001), p.160
11. This line of argument was pioneered in Innis, Harold A. [1951] (1995). The Bias of Communication. Toronto: University of Toronto Press, and in McLuhan, Marshall (1962). The Gutenberg Galaxy: The Making of Typographic Man. Toronto: University of Toronto Press. See more specifically McLuhan, Marshall; Powe, Bruce (1981). Electronic Banking and the Death of Privacy. Journal of Communication, Vol.31, No.1, pp. 164-169
12. Ong, Walter (1982). Orality and Literacy: The Technologizing of the Word. London, New York: Methuen & Co, p.130
13. On why Montaigne was so concerned with the peculiarities of different cultures, Elizabeth Eisenstein writes: "He could see more books by spending a few months in his tower study than earlier scholars had seen after a lifetime of travel. When explaining why Montaigne perceived greater 'diversity and conflict' in the works he consulted than medieval commentators in an earlier age, something should be said about the increased number of texts he had at hand" (p.44). Eisenstein, Elizabeth L. (1983). The Printing Revolution in Early Modern Europe. Cambridge, UK: Cambridge University Press.
14. Habermas, Juergen [1962] (1989). The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society (translated by Thomas Burger with the assistance of Frederick Lawrence). Cambridge, MA: MIT Press
15. Marx (2001), p.160
16. Ong (1980, p.3) writes: "The electronic age is also an age of 'secondary orality,' the orality of telephones, radio, and television, which depends on writing and print for its existence."
17. I will expand on this in the conclusion of this article.
18. See Bennett (1992) and Flaherty (1989)
19. Bogard (1996), Garfinkel (2001), Lyon (2001), Norris & Armstrong (1999)
20. Agre, Philip E.; Rotenberg, Marc (eds.) (1997). Technology and Privacy: The New Landscape. Cambridge, MA: MIT Press
21. Increasingly, there are also PETs for institutions; in the following, however, I will focus on those aimed at individual users.
22. Garfinkel, Simson (1995). PGP: Pretty Good Privacy. Sebastopol, CA: O'Reilly & Associates
23. Cavoukian, Ann (1996). Go Beyond Security -- Build In Privacy: One Does Not Equal The Other. Paper presented at Cardtech/Securtech '96 Conference, Atlanta, Georgia, May 14-16, 1996
24. There is also a Type II remailer, or MixMaster, that fragments messages into fixed-size packets, which are then bounced to different remailers in the chain. This greatly decreases the feasibility of traffic analysis.
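As an illustration of the fixed-size fragmentation mentioned in note 24, the sketch below splits a message into equal-length, padded packets so that packet size reveals nothing about message length. The 10 KB packet size and the zero-padding scheme are assumptions made for the example, not MixMaster's actual packet format.

```python
PACKET_SIZE = 10 * 1024  # assumed packet size for illustration only

def fragment(message: bytes, packet_size: int = PACKET_SIZE) -> list[bytes]:
    """Split a message into fixed-size packets, padding the last one."""
    packets = []
    for offset in range(0, len(message), packet_size):
        chunk = message[offset:offset + packet_size]
        # Pad so every packet sent through the remailer chain has identical size.
        packets.append(chunk.ljust(packet_size, b"\0"))
    return packets

if __name__ == "__main__":
    packets = fragment(b"x" * 25_000)
    print(len(packets), {len(p) for p in packets})  # -> 3 {10240}
```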
25. Grossman, Wendy (1995). alt.scientology.war. Wired, Vol.3, No.12 (December)
26. Quoted in: Lester, Toby (2001). "The Reinvention of Privacy," Atlantic Monthly (March), Vol.287, No.3
27. Company Profile, http://www.anonymizer.com [09.13.2001]
28. <http://www.peek-a-booty.org>
30. <http://www.w3.org/P3P/> [13.12.01]
31. This narrow mandate is quite deliberate. "We do not want specification and standard settings bodies determining public policy. W3C does not wish to become the forum for public policy debates. We don't want to cede the development of substantive policy to technical organizations." <http://www.cdt.org/privacy/pet/p3pprivacy.shtml> (March 28, 2000) [13.12.01] The troubled history of ICANN testifies to the difficulties that arise when technical bodies begin to set policies. See: http://www.icannwatch.org
32. See Clarke, Roger (1998a). Platform for Privacy Preferences: An Overview. Privacy Law & Policy Reporter 5, 2 (July 1998) <http://www.anu.edu.au/people/Roger.Clarke/DV/P3POview.html> and Clarke, Roger (1998b). Platform for Privacy Preferences: A Critique. Privacy Law & Policy Reporter 5, 3 (August 1998) <http://www.anu.edu.au/people/Roger.Clarke/DV/P3PCrit.html>
33. <http://www.cdt.org/privacy/pet/p3pprivacy.shtml> (March 28, 2000) [29.11.2001]
34. Opportunity costs refer to what is lost in order to gain something else, for example, the time spent finding a competing service.
35. Coyle, Karen (2000). <http://www.kcoyle.net/response.html> (May 2000) [13.12.01]
36. See <http://www.w3.org/RDF>
37. Coyle (2000)
38. Cookies are small files stored on the user's hard disk that identify him/her vis-à-vis a web service. A cookie can store access passwords or information to customize a web site. It can also be used to track the user's surfing patterns within and across web sites.
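As a minimal illustration of the mechanism described in note 38 (with an invented cookie name and value): the server asks the browser to store an identifier on the first visit, and the browser returns it with every later request to the same site, which is what makes tracking across visits possible.

```python
from http.cookies import SimpleCookie

# Hypothetical headers; "visitor_id" and its value are made up for illustration.
server_sets_cookie = "Set-Cookie: visitor_id=8f2a41c7; Path=/"  # first visit
browser_sends_back = "Cookie: visitor_id=8f2a41c7"              # every later request

# Parsing the server's header with Python's standard library cookie parser.
cookie = SimpleCookie()
cookie.load("visitor_id=8f2a41c7; Path=/")
print(cookie["visitor_id"].value)  # -> 8f2a41c7
```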
39. Schneier, Bruce (2000). Secrets and Lies: Digital Security in a Networked World. New York: John Wiley & Sons, Inc.
40. EPIC & Junkbusters (2000). Pretty Poor Privacy: An Assessment of P3P and Internet Privacy (June)
41. Garfinkel, Simson (2000). Can a Labeling System Protect your Privacy? Salon Magazine, July 11, 2000
42. Clarke, Ian (1999). A Distributed Decentralised Information Storage and Retrieval System. Edinburgh: Division of Informatics, University of Edinburgh
43. Hong, Theodore (et al.) (2001). Freenet: A Distributed Anonymous Information Storage and Retrieval System. In Federrath, H. (ed.) Designing Privacy Enhancing Technologies: International Workshop on Design Issues in Anonymity and Unobservability, LNCS 2009. New York: Springer
44. Adler, S. (1999). The Slashdot Effect: An Analysis of Three Internet Publications. Linux Gazette, Issue 38 (March)
45. Hong (2001)
46. As a comparison, the Linux project is more than 10 years old, and builds on software released in the mid-1980s.
47. Schulzki-Haddouti, Christiane (2001). Digitale Freihäfen (Digital Free Havens). Telepolis, 27.09.2001 <http://www.heise.de/tp/deutsch/inhalt/te/9657/1.html> [27.09.2002]
48. This might become less of a problem as bandwidth becomes cheaper.
49. As the Freenet FAQ explains: "Proposals for a more useful mechanism are being evaluated, and one of them will probably be implemented in an upcoming version of the protocol. For example, documents could optionally be inserted with public keys attached, and only updates signed by the corresponding private keys would be accepted. Unsigned documents would be immutable. Alternately, some type of versioning system could keep track of all previous versions of documents." <http://freenet.sourceforge.net/index.php?page=faq> [14.12.2001]
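To make the proposal quoted in note 49 concrete, here is a minimal sketch of a key/value store that accepts a replacement document only when it is signed by the private key matching the public key attached at insertion time. It assumes the third-party Python 'cryptography' package and illustrates the general idea, not Freenet's actual protocol.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

class SignedStore:
    def __init__(self):
        self.docs = {}  # key -> (data, public_key or None)

    def insert(self, key, data, public_key=None):
        # Documents inserted without a key stay immutable, as in the quoted proposal.
        self.docs[key] = (data, public_key)

    def update(self, key, new_data, signature):
        data, public_key = self.docs[key]
        if public_key is None:
            return False  # unsigned documents cannot be changed
        try:
            public_key.verify(signature, new_data)
        except InvalidSignature:
            return False  # reject updates not signed by the owner's private key
        self.docs[key] = (new_data, public_key)
        return True

# Usage: only the holder of the matching private key can replace the document.
owner = Ed25519PrivateKey.generate()
store = SignedStore()
store.insert("my-site", b"version 1", owner.public_key())
print(store.update("my-site", b"version 2", owner.sign(b"version 2")))                    # -> True
print(store.update("my-site", b"forged", Ed25519PrivateKey.generate().sign(b"forged")))   # -> False
```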
50. <http://freenet.sourceforge.net/index.php?page=philosophy> [14.12.2001]
51. Ibid.
52. This is ironic, given that many players in the PETs field hold strong libertarian beliefs.
53. For an overview of recent privacy surveys, see http://www.privacyexchange.org/iss/surveys/surveys.html
54. Concepts such as "ubiquitous computing" envision virtually every object – cars, fridges, heating systems, etc. – connected to the Internet.
55. Most prominently, Castells, Manuel (1996). The Rise of the Network Society, The Information Age: Economy, Society and Culture. Vol. I. Cambridge, MA; Oxford, UK: Blackwell
56. Brin, David (1998). The Transparent Society: Will Technology Force Us to Choose Between Privacy and Freedom? Reading, MA: Perseus Books
57. Lyon, David (1994). The Electronic Eye: The Rise of the Surveillance Society. Minneapolis: University of Minnesota Press; Lyon, David (2001)
58. Lyon, David (ed.) (in press). Surveillance as Social Sorting: Privacy, Risk, and Automated Discrimination. London, New York: Routledge
59. The most extreme negative case of the power of surveillance as social sorting is documented in Black, Edwin (2002). IBM and the Holocaust: The Strategic Alliance between Nazi Germany and America's Most Powerful Corporation. New York: Three Rivers Press, Random House