Harvard Magazine
Code is Law
On Liberty in cyberspace
by Lawrence Lessig
Every age has its potential regulator, its threat to liberty. Our founders feared a newly empowered federal government; the
Constitution is written against that fear. John Stuart Mill worried about the regulation by social norms in nineteenth-century
England; his book On Liberty is written against that regulation. Many of the progressives in the twentieth century worried
about the injustices of the market. The reforms of the market, and the safety nets that surround it, were erected in response.
Ours is the age of cyberspace. It, too, has a regulator. This regulator, too, threatens liberty. But so obsessed are we with the
idea that liberty means "freedom from government" that we don't even see the regulation in this new space. We therefore
don't see the threat to liberty that this regulation presents.
This regulator is code--the software and hardware that make cyberspace as it is. This code, or architecture, sets the terms on
which life in cyberspace is experienced. It determines how easy it is to protect privacy, or how easy it is to censor speech. It
determines whether access to information is general or whether information is zoned. It affects who sees what, or what is
monitored. In a host of ways that one cannot begin to see unless one begins to understand the nature of this code, the code of
cyberspace regulates.
This regulation is changing. The code of cyberspace is changing. And as this code changes, the character of cyberspace will
change as well. Cyberspace will change from a place that protects anonymity, free speech, and individual control, to a place
that makes anonymity harder, speech less free, and individual control the province of individual experts only.
My aim in this short essay is to give a sense of this regulation, and a sense of how it is changing. For unless we understand
how cyberspace can embed, or displace, values from our constitutional tradition, we will lose control over those values. The
law in cyberspace--code--will displace them.
http://www.harvard-magazine.com/issues/jf00/forum.html
11/17/2003
OTHER ARCHITECTURES
What makes the net unregulable is that it is hard to tell who someone is, and hard to know the character of the content being
delivered. Both of these features are now changing. Architectures for facilitating identification--or, more generally, for
certifying facts about the user (that he is over 18; that he is a he; that he is an American; that he is a lawyer)--are emerging.
Architectures for rating content (porn, hate speech, violent speech, political speech) have been described and are being
implemented. Each is being developed without the mandate of government, and the two together could facilitate an
extraordinary degree of control over behavior on the Net. The two together, that is, could flip the unregulability of the Net.
Could--depending upon how they are designed. Architectures are not binary. There is not simply a choice about
implementing an identification architecture, or a rating architecture, or not. What the architecture enables, and how it limits
its control, are choices. And depending upon these choices, much more than regulability will be at stake.
Consider identification, or certification, architectures first. We have many certification architectures in real space. The
driver's license is a simple example. When the police stop you and demand your license, they are asking for a certain
certification that you are licensed to drive. That certification includes your name, your sex, your age, where you live. It must
include all that because there is no other simple way to link the license to the person. You must give up all these facts about
yourself to certify that in fact you are the proper holder of the license.
But certification in cyberspace could be much more narrowly tailored. If a site required that only adults enter, you could--using certification technologies--certify that you were an adult, without also revealing who you were or where you came from. The technology could make it possible to selectively certify facts about you, while withholding other facts about you.
The technology could function under a "least-revealing-means" test in cyberspace even if it can't in real space.
Could--depending upon how it was designed. But there is no necessity that it will develop like this. There are other
architectures developing--we could call them "one-card-shows-all." In these architectures, there is no simple way to limit
what gets revealed by a certificate. If a certificate holds your name, address, age, citizenship, and whether you are a lawyer,
and if you need to certify that you are a lawyer, this architecture would certify not only that you are a lawyer--but also all the
other facts about you that the certificate holds. Under this architecture, more is better. Nothing enables the individual to steer
for less.
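The contrast between the two designs can be sketched in a few lines of Python. This is a toy illustration only--the attribute names, the functions, and the plaintext "certificate" are invented for the sake of the contrast, and a real certification system would rely on cryptographic credentials rather than simple lookups:

```python
# Hypothetical sketch: two certificate architectures. All names and data
# are invented for illustration; real selective-disclosure systems use
# cryptographic credentials, not plain dictionaries.

ATTRIBUTES = {"name": "A. Smith", "age": 34, "citizenship": "US", "lawyer": True}

def one_card_shows_all(certificate, requested):
    """The 'one-card-shows-all' design: any query reveals the whole card."""
    return dict(certificate)  # everything is disclosed, whatever was asked

def selective_disclosure(certificate, requested):
    """A privacy-enabling design: reveal only the facts actually requested."""
    return {k: certificate[k] for k in requested if k in certificate}

def certify_adult(certificate):
    """A 'least-revealing-means' query: disclose a derived fact, not the age."""
    return {"over_18": certificate["age"] >= 18}

# A site that needs only proof that the visitor is a lawyer:
one_card_shows_all(ATTRIBUTES, ["lawyer"])    # leaks name, age, citizenship too
selective_disclosure(ATTRIBUTES, ["lawyer"])  # reveals only that one fact
certify_adult(ATTRIBUTES)                     # reveals adulthood, not the age itself
```

The point of the sketch is the one Lessig makes in prose: both designs answer the same question, but only one gives the user a way to steer for less.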
The difference between these designs is that one enables privacy in a way that the other does not. One codes privacy into an
identification architecture by giving the user a simple choice about how much is revealed; the other is oblivious to that value.
Thus whether the certification architecture that emerges protects privacy depends upon the choices of those who code. Their
choices depend upon the incentives they face. If protecting privacy is not an incentive--if the market has not sufficiently
demanded it and if law has not, either--then this code will not provide it.
The example about identification is just one among many. Consider another, involving information privacy. RealJukebox is a
technology for copying music from a CD to a computer, as well as for downloading music from the Net to store on a
computer's hard drive. In October it was revealed that the system was a bit nosy--that it snooped the hard disk of the user and
reported back to the company what it found. It did this secretly, of course; RealNetworks didn't tell anyone its product was
collecting and reporting personal data. It just did. When this snooping was discovered, the company at first defended the
practice (saying no data about individuals were actually stored). But it quickly came to its senses, and promised not to collect
such data.
This "problem" is caused, again, by the architecture. You can't easily tell in cyberspace who's snooping what. And while the
problem might be corrected by an architecture (a technology called P3P would help), here's a case where law would do well.
If these data were deemed the property of the individual, then taking them without express permission would be theft.
In these contexts, and others, architectures will enable values from our tradition--or not. In each, there will be decisions about
how best to build out the Internet's architecture consistent with those values, and how to integrate those architectures with
law. The choice about code and law will be a choice about values.
Lawrence Lessig is the Berkman professor for entrepreneurial legal studies at Harvard Law School. His most recent book,
Code, and Other Laws of Cyberspace (Basic Books), has just been published (see http://code-is-law.org). The website of the
Berkman Center for Internet and Society at the law school (see page 50) is http://cyber.law.harvard.edu.