
Privacy Protection, Control of Information, and Privacy-Enhancing Technologies

Herman T. Tavani                          James H. Moor
Philosophy Department                     Philosophy Department
Rivier College                            Dartmouth College
htavani@rivier.edu                        james.moor@dartmouth.edu

The present study is organized into two main parts. In Part I, we respond to a recent criticism that the restricted access theory of privacy does not adequately explain the role that control of personal information plays in protecting one's privacy. In defending a version of the restricted access theory, we put forth a tripartite model that differentiates the concept of privacy from both the justification and the management of privacy. This distinction is important, we argue, because it enables us to avoid conflating the concept of privacy, which we define in terms of protection from intrusion and information gathering [Moor 1990; 1997], from the concept of control, which (a) is used to justify the framing of policies that provide privacy protection and (b) is essential to the management of privacy. Separating privacy from control is necessary, we further argue, to preserve the identity of both notions. After showing why the notion of individual control, as expressed in three different ways -- choice, consent, and correction -- plays an important role in the management of privacy, we conclude Part I with an account of why individual controls alone are not sufficient to guarantee the protection of personal privacy and why certain external controls, such as those provided by privacy policies, are also needed.

To illustrate some of the key points made in the first part of this essay, we consider examples of privacy-enhancing technologies (or PETs) in Part II. We argue that even if PETs provide individuals with a means of controlling their personal information, these tools do not necessarily ensure privacy protection. Because PETs do not provide online users with a zone of privacy protection that incorporates external controls, i.e., controls beyond those at the individual level, we conclude that the use of PETs can actually blur the need for privacy protection, rather than provide it.

PART I: THE THEORY OF PRIVACY

In this section, we defend a version of the restricted access theory of privacy [Moor, 1990; 1997] against recent attacks that such a theory does not explain the important role that one's ability to control personal information plays in protecting personal privacy [Elgesem, 1996; 1999]. We begin with a critique of privacy as understood mainly in terms of control over personal information.

The Role of Control in the Theory of Privacy

In our private lives we wish to control information about ourselves. We wish to control information that might be embarrassing or harm us. And, we wish to control information that might increase our opportunities and allow us to advance our projects. The notion of privacy and the notion of control fit together. But how do they fit together? There is a tradition, especially with regard to the privacy of information, to define privacy in terms of control. Alan Westin maintains, "Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how and to what extent information about them is communicated to others." [Westin, 1967 p. 7] Arthur Miller says, "... the basic attribute of an effective right of privacy is the individual's ability to control the circulation of information relating to him..." [Miller, 1971 p. 25] Charles Fried states, "Privacy is not simply an absence of information about us in the minds of others, rather it is the control we have over information about ourselves." [Fried, 1984 p. 209] More recently, Dag Elgesem suggests, "In my view, to have personal privacy is to have the ability to consent to the dissemination of personal information." [Elgesem, 1996 p. 51]

We believe this tradition of identifying the concept of privacy with control is misleading. Control of personal information is extremely important as, of course, is privacy. But, these concepts are more useful when treated as separable, mutually supporting concepts than as one. A good theory of privacy has at least three components: an account of the concept of privacy, an account of the justification for privacy, and an account of the management of privacy. This tripartite structure of the theory of privacy is important to keep in mind because each part of the theory performs a different function. To give an account of one of the parts is not to give an account of the others. The concept of privacy itself is best defined in terms of restricted access, not control. [Moor, 1990; Moor, 1997] Privacy is fundamentally about protection from intrusion and information gathering by others. Individual control of personal information, on the other hand, is part of the justification of privacy and plays a role in the management of privacy. Privacy and control do fit together naturally; just not in the way people often state.

These philosophical distinctions have practical import. We can have control but no privacy, and privacy but no control. We should aim to have both control and privacy. When we blur the distinctions, we are vulnerable to losing one of them. For example, as we shall argue later, providing privacy-enhancing technologies (PETs) that seem to promote individual control may actually blur the need for stronger privacy protection, not provide it.

A fundamental problem about defining the concept of privacy in terms of individual control of information is that it greatly reduces what can be private. We control so little. As a practical matter we cannot possibly control vast amounts of information about us that circulates through myriads of computer networks and databases. The current globalization of these information processes exacerbates the problem. If privacy depends by definition on our individual control, we simply don't have significant privacy and never will in a computerized world.

On the contrary, it seems more reasonable to maintain that sensitive personal information ought to be private even if its owner is not in a position to control it. A patient should not lose her right to have her medical records protected when she is under anesthesia.
A resident of the U.S. who is required to give personal information to the Census Bureau should not thereby lose his right to privacy of the personal information he has surrendered. In general, loss of control should not entail the loss of the right to privacy, which it would if individual control really were a necessary condition for the right to privacy.

Virtually all societies establish normatively private situations, zones of privacy, which limit access to people or aspects about them under certain conditions. [Moore, 1984] The details of these normatively private situations vary somewhat from culture to culture, but they are intended to protect individuals and foster social relationships whether the individuals have control in the situations or not.

We often think of normatively private situations in terms of physical locations. A house is a normatively private situation. Outsiders are expected to knock and get permission to enter. But situations other than locations, situations that involve relationships, activities, and information, can be normatively private as well. Religious confessions are typically private wherever they are given. Voting is often a private activity whether it is by paper ballot or voting machine or Internet. Medical records and information are private. All of these examples are private situations -- zones in which protection of privacy is reasonably expected and normatively protected. The normative aspect of these private situations restricts access by individuals, groups, or governments. This restricted access expresses a right of protection. It prohibits intrusion and information processing by someone or something. Of course, it is prudent to supplement normative protection with security measures. Doors have locks and databases have passwords. But, if the locks and passwords are circumvented, the right to privacy is not diminished even if the contents are disclosed.

Normative Privacy and the Restricted Access Theory

Normative privacy (the right to privacy) needs to be distinguished from natural or descriptive privacy (privacy that exists as a matter of fact). Simply being alone doesn't provide a sufficient claim to the right to privacy any more than having a right to privacy can guarantee privacy as a matter of fact. In this essay we are primarily concerned with privacy as a normative concept. It is easy to confuse the two. A critic of the restricted access theory objects, "But it seems that, on this view, we have to admit that we always have some degree of privacy since there will always be billions of people who have physically restricted access to us. But precisely because all situations are private to some degree, it is difficult to see exactly how the private situations are distinguished from the public ones on this theory." [Elgesem, 1999 p. 289] The reply to this objection is that the relevant public/private distinction is drawn normatively, not descriptively. Public streets are unrestricted normatively to virtually everyone; whereas, a house is restricted normatively to everyone except its residents. The fact that a public street is empty at night does not make it less (normatively) public, any more than the fact that a large family lives in a house in a densely populated area makes the house less (normatively) private vis-à-vis outsiders.

The restricted access model provides a framework for discussing privacy on the Internet in a way in which a control theory of privacy does not. Individuals cannot control the packet switching of their personal information or what happens to it once it arrives at a remote destination. Individual control of the flow of information is out of the question; individual protection is not. We can determine what information sent over the Internet requires protection. That is, the restricted access model does not force us to make an all or none choice such that the Internet must be either completely public or completely private. [see, for example, Tavani, 2000b] In general, diverse private and public situations can be imbedded in and overlap each other in complex ways. Consider a simple example in the real world. A woman in a public building may be having a private phone conversation while being publicly viewed holding a purse whose contents are private. We have no trouble making such public/private distinctions in ordinary life and we can designate private situations involving information on the Internet in similar ways. Information on the Web is generally public. Web sites are typically designed to solicit public attention. Nevertheless, we can be selective within this public framework. For example, we can insist that consumer and medical transactions be protected as private while allowing other Internet information to remain public. Although there is some conventionality in how we carve up zones of privacy in social situations including Internet use, overall the carving should produce a set of zones that offer sufficient protection of personal information. We need to think carefully, not so much in terms of what information we can individually control, though that is important, but about what information and activities need to be protected on the Internet.

In defining a private situation it is necessary to define who has access to what under which circumstances. The privacy of medical information in a modern hospital represents a good example of the complexity of the restrictions that must be placed on a privacy situation. Physicians are allowed to see most if not all the medical records of only a select number of patients. HMOs may see only part of the medical records of more patients. Financial officers can see the financial records for the medical services. These restrictions bar most people from gaining access and possibly nobody can see all of the records. These restrictions in access also often forbid revelation of private matters to others by those who do have access. A physician who has access to medical records is still bound by confidentiality and cannot freely reveal the contents of those records. The restricted access analysis of privacy permits a fine-grained analysis of privacy among various individuals in a situation including possible demands of confidentiality on those who do have access.
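
To make the fine-grained character of such restrictions concrete, the hospital example can be rendered as a small, purely illustrative sketch. The sketch below is ours rather than a description of any actual hospital system; the roles, record fields, and class name (ZonePolicy) are hypothetical, and the code simply encodes the idea that access is denied unless a policy explicitly grants it and that authorized access may still carry a duty of confidentiality.

    # Illustrative sketch only: a "zone of privacy" modeled as a restricted-access
    # policy. Roles, record fields, and rules are hypothetical, loosely following
    # the hospital example (physicians, HMOs, financial officers).
    from dataclasses import dataclass, field

    @dataclass
    class ZonePolicy:
        allowed_fields: dict          # role -> set of record fields that role may see
        confidential_roles: set = field(default_factory=set)

        def may_access(self, role: str, record_field: str) -> bool:
            # Restricted access: deny unless the policy explicitly grants it.
            return record_field in self.allowed_fields.get(role, set())

        def must_keep_confidential(self, role: str) -> bool:
            # Even authorized access can carry a duty not to reveal further.
            return role in self.confidential_roles

    hospital_zone = ZonePolicy(
        allowed_fields={
            "physician": {"history", "diagnosis", "billing"},  # and only for her own patients
            "hmo": {"diagnosis"},                              # only part of the record
            "finance_officer": {"billing"},                    # financial records only
        },
        confidential_roles={"physician", "hmo"},
    )

    assert hospital_zone.may_access("physician", "diagnosis")
    assert not hospital_zone.may_access("finance_officer", "diagnosis")
    assert hospital_zone.must_keep_confidential("physician")

Nobody in this sketch can see everything, and those who may see something remain bound not to reveal it, which is the point of the analysis above.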
since there will alwaysbe billions of people who have physicallyrestricted access In a similar manner, sensitive transactions on the Intemet must be identi-
to us. But precisely because all situations are private to some degree, it is fied and protected. Different parties may have different levels of access and
difficult to see exacdy how the private situations are distinguished from the confidentiality restrictions. E-commerce, to pick an obvious example, should
public ones on this theory" [Elgesem, 1999 p. 289] The reply to this objection be designated as a zone ofpriva~ Merchants should be required to treat
is that the relevant public/private distinction isdrawn normativd~ not descrip- consumer information confidentially Legal and social sanctions should be
tivdy Public streets are unrestricted normativdyto virtually everyone; whereas, established for those who do not comply In effect, this is what the European
a house is restricted normativelyto everyone except its residents. The fact that Union Directive on Privacyis aimed a t ~ t h e creation of zones ofprivacywith
a public street is empty at night doesnot make it less(normatively) public; any sanctions to protect personal information. Pressuring commercial partners,
more than the fact that a largefamilylivesin a house in a densely populated area such as the United States, to share this perspective is an attempt to extend a
makes the house less (normatively) private v/s-~-v/soutsiders. securezone of privacy: Recentl~ there are encouraging signs,such as announce-
The restrictedaccessmodel provides a framework for discussingprivacy on ments from the FederalTrade Commission and the Safe HarborAgreement,
the Intemet in awayin which a control theory ofprivacy does not. Individuals that the US is moving closer to treating more situations on the Intemet as
cannot control the packet switching of their personal information or what nonnafivdyprivate.
happens to it once it arrives at a remote destination. Individual control of the With the constant evolution of information technology new zones of
flow of information is out of the question; individual protection is not. We privacy continually need to be created and access relationships defined to

With the constant evolution of information technology, new zones of privacy continually need to be created and access relationships defined to maintain high levels of protection. In a time in which e-commerce is expanding exponentially, data mining is routine, surveillance from space at one meter resolution is commercially available, biometric identification is poised to become commonplace, and the human genome is about to be mapped and sequenced, it is imperative to rethink and revise what the zones of privacy should be. [Moor, 1999b; Tavani, 1999b, 1999c] Moreover, these new zones of privacy need to be created to protect individuals especially when individuals lack control of personal information and cannot protect themselves.

Thus far, we have argued that the right to privacy cannot be adequately conceptualized in terms of the control of information but rather is better understood in terms of a theory of restricted access. Citizens may not control whether they provide income tax information, but the information they are forced to furnish should be accessible only to tax authorities bound by confidentiality. And, we have argued that new zones of privacy with protections need to be established as technology develops because individual control by itself is not likely to be sufficient for adequate protection. Filing income tax information electronically should be normatively protected because the filer cannot control the flow of such personal information over the Internet.

The Use of Control in the Justification and Management of Privacy

We have gone to some length to separate the notion of individual control from the basic concept of privacy in order to preserve the identity of both, but now we wish to emphasize the importance of control in the other areas of the tripartite theory of privacy. Individual control plays a central role in the justification and in the management of privacy. Thus, in the overall theory of privacy, control and privacy are complementary notions that reinforce each other.

A straightforward justification for having privacy is the protection it affords us to plan our lives, to decide what benefits we wish to seek and what harms we wish to avoid. It allows us to decide what projects we will undertake and what risks we will assume. Privacy allows us to seek medical care we might not otherwise seek and to buy products without advertising our buying habits. Private situations permit intimacy and the development of close personal relationships. In short, privacy offers us control over our lives. Privacy is not an unqualified good, as people can also use zones of privacy to commit robberies and beat their spouses. But all things considered, privacy, perhaps privacy subject to carefully monitored court ordered intervention, provides protection that most of us would impartially support. This individual control generated from policies of privacy leads to greatly increased human happiness and autonomy. [Moor, 1999a]

Individual control in turn helps us to manage our own privacy. Individual control for the management of privacy typically expresses itself in one of three ways: choice, consent, and correction. We control privacy in part by choosing situations that offer the desired level of access ranging from total privacy to unabashed publicity. And in seeking a level of privacy we may not only choose the level of access but choose the level of risk as well. Two situations can have the same actual level of privacy but one may be more secure, i.e., offer less risk of access.

The management of privacy through choice need not involve normatively private situations but just prudent choosing so that the flow of information is controlled and prevents access. If people don't want others to see them jog, then they should choose to jog when others are not around. No right of privacy is at stake here, but privacy can be chosen. However, the existence of normative situations, zones of privacy, can affect what choices one has in seeking privacy. Zones of privacy offer more possibilities for protection assuming the privacy rights are honored and enforced, i.e., they are reasonable and secure. Therefore in creating zones of privacy it is important to inform people under what conditions they operate and with what level of security so that people may take advantage of them as they wish. To this end we advocate the Publicity Principle that states that the rules and conditions governing private situations can be clear and known to the persons affected by them. [Culver et al., 1994] If office e-mail is not a normatively private situation, as it is not in the U.S., then employees need to know that employers have the right to examine it. People so informed and seeking privacy in e-mail can choose an e-mail system with a zone of privacy.

Another way privacy is managed by control is through consent. In many normatively private situations individuals have the right to waive their right to privacy and allow access by others. This is sometimes thought to be incompatible with the restricted access view. Dag Elgesem explains, "There is, intuitively, a big difference between the situation where your privacy is violated, say your phone is tapped, and the situation where you tell your friend an intimate secret." To account for the difference, he says, we have to bring in "the notion of consent to the transfer of personal information, i.e. a notion of control. But the restricted access account explicitly rejects the use of notions pertaining to control in the characterisation of privacy." [Elgesem, 1999 p. 289] However, this line of criticism dissipates once we distinguish the concept of privacy, defined by restricted access, from the management of privacy. The presence or absence of consent makes a crucial difference between a proper action and a violation of a right. Giving consent is a familiar way of granting access to an otherwise restricted situation. We can invite a stranger into our private house. No incompatibility exists between the restricted access definition of privacy and the notion of consent to suspend restrictions in access. Consent is a means of control that manages privacy and justifies what without it would be an invasion of privacy.

Control also plays a role in another area in the management of privacy: the correction of personal information. A general principle that characterizes fair information practices is that data subjects should be able to access their personal information with an ability to amend if necessary. [Bennett, 1992 pp. 101-111] [Bennett and Grant, 1999 p. 6] This principle is clearly a way for an individual to control personal information and suggests a safeguard against maintaining harmful erroneous information that has been collected within a zone of privacy. Such individual control of personal information resulting in the correction of data is consistent with a restricted access account of the concept of privacy and needs to be part of good privacy management practice.

All of the aspects of individual control -- choice, consent, and correction -- are important ingredients in the management of privacy. They are important in fair information practices that characterize various national and international privacy regulations. But all have their limits. There is only so much individual choosing, consenting, and correcting that one can do. The management of privacy requires controls beyond individual control that will ensure restrictions in access and the purposes for which the normatively private situations are created. Additional controls, such as good national and international privacy policies and laws with enforcement, are necessary in order to fully protect privacy. As an example, we consider privacy-enhancing technologies (PETs) and their limitations in Part II of this essay.
PART II: PRIVACY-ENHANCING TECHNOLOGIES

We next consider the role of privacy-enhancing technologies or PETs in the protection of personal privacy. Following a brief discussion of what PETs are and why they are viewed by some as a means for resolving online privacy concerns, we examine some of the ways in which PETs enable individuals to manage personal privacy -- viz., through individual control mechanisms such as choice and consent. We then consider whether PETs, even if they provide users with a certain level of control over their personal information, actually ensure users that their privacy will be protected.

What Exactly are PETs and Why are They Appealing?

According to Burkert [1997 p. 125], PETs can be understood as "technical and organizational concepts that aim at protecting personal identity." As organizational concepts, PETs can perhaps be thought of in terms of industry-standard guidelines for privacy protection, such as those adopted by the Platform for Privacy Preferences (P3P). For example, online privacy seal programs, such as TRUSTe, can inform users of an online vendor's privacy policies and assure those users that a vendor's stated policies are backed and enforced by reputable third-parties. In their technical sense, on the other hand, PETs can be viewed as specific online tools used by individuals to control the amount of personal information they disclose in online activities. Although Burkert's definition correctly distinguishes between the technical and organizational functions that PETs perform, his definition would also seem to imply that all PETs are aimed at protecting the identity of persons. Clearly a primary function of certain kinds of PETs is to protect personal identity. However, one of the oldest, most effective PETs, encryption, can be used simply to protect the informational content of messages, not the identity of persons. Unfortunately, Burkert's definition, despite the fact that it draws a very important distinction for helping us to divide PETs into two useful categories, does not seem to take into account that PETs perform tasks other than simply protecting one's identity.

Because PETs have come to be understood and debated primarily in their technical sense, i.e., as tools that can assist users either in concealing their identity while online or in securing the content of information they communicate electronically, we will focus our discussion mainly on the sense of PETs as privacy-enhancing tools. As tools, PETs perform a host of functions. For example, Cranor [1999 p. 30] notes that some PETs can function as "anonymizing agents" and "pseudonym agents." Whereas certain PETs, such as anonymizing tools (e.g., the Anonymizer) and pseudonym agents (e.g., Lucent Personalized Web Assistant) have been designed with the goal of enabling users to navigate the Internet either anonymously or pseudonymously, other PETs have been developed to allow users to communicate online via correspondences that are encrypted with either digital-signature or blind-signature technologies. Much has been written about the technical details and nuances of various PETs, so there is no need to repeat that discussion here. Our primary concern in this essay is in determining whether PETs provide online users with adequate privacy protection.
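
As a purely illustrative aside (ours, not drawn from the tools named above), the pseudonym-agent idea can be sketched in a few lines: a secret held only by the user is combined with each site's name to derive a stable, site-specific alias, so that the user's activity at one site cannot be linked to activity at another by the alias alone. The function and variable names are hypothetical, and real pseudonym agents differ considerably in detail.

    # Minimal sketch of a pseudonym-agent PET: derive a distinct, stable alias per
    # Web site from a single user-held secret. Illustrative only.
    import hashlib
    import hmac

    def site_pseudonym(master_secret: bytes, site: str, length: int = 12) -> str:
        # The alias is stable for a given site but unlinkable across sites
        # without knowledge of the master secret.
        digest = hmac.new(master_secret, site.encode("utf-8"), hashlib.sha256).hexdigest()
        return "user-" + digest[:length]

    secret = b"kept only on the user's machine"            # hypothetical secret
    print(site_pseudonym(secret, "shop.example.com"))      # one alias here...
    print(site_pseudonym(secret, "news.example.org"))      # ...a different one there

Note, however, that a sketch like this controls only what is revealed at the moment of disclosure; it says nothing about what later happens to whatever information the user does release, which is the limitation pursued below.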

The appeal of using PETs is obvious. PETs provide users with control over their own information. PETs offer users choices about what information they wish to release. Users may consent or not to the acquisition of personal information. The fundamental PET, encryption, offers users privacy with increased security. What could be better in the management of privacy? On the one hand, certain privacy advocates and consumer groups have argued that stronger privacy legislation is needed to protect the interests and rights of online users. On the other hand, groups representing the e-commerce sector have lobbied for voluntary controls and industry self-regulation as an alternative to new privacy legislation. Generally the respective solutions proposed by one camp have been unacceptable to the other. Now, some members of each camp appear ready to embrace PETs as a compromise resolution. PETs are clearly advantageous for managing personal privacy. Nobody denies that. But we wish to argue that PETs are not formidable guard dogs of privacy but tools with serious limitations.

The adequacy of PETs can be challenged in terms of their technological effectiveness or on the basis of their security and public-policy implications. [Tavani, 2000a] With respect to issues of technical adequacy, some have noted that "anonymizing tools" do not always ensure that users will have total anonymity while they interact with the Web, whereas others have questioned the effectiveness of PETs as reliable encryption technologies. And with respect to public policy and security, some government officials and law-enforcement agents have argued that anonymity tools are potentially dangerous for national security because (a) they allow terrorists to carry out certain criminal activities online that would be extremely difficult to trace back to the party or parties responsible for those activities and (b) they allow criminals and terrorists to communicate via encrypted messages that possibly cannot be decoded by appropriate law enforcement agencies. However, we will not pursue the lines of argumentation based on either technical- or security-related inadequacies involving PETs. Rather, our interest in this essay is in whether PETs can enable individuals to protect their privacy while they are engaged in online activities. We begin by looking at PETs as tools for controlling personal information with respect to individual choice.

PETs and the Role of Individual Choice in Controlling Personal Information

Burkert [1997 p. 125] notes that, among other things, PETs "give direct control over revelation of personal information to the person concerned." Because PETs offer users a certain degree of choice with respect to disclosing personal information in online transactions, which otherwise those users might not have, it would seem that PETs provide users with much more privacy protection than was afforded them in the earlier systems of voluntary controls and industry self-regulation. But even if PETs provide users with a means of controlling information about themselves, do these tools provide users with adequate privacy protection? How, for example, are users supposed to learn about the existence of PETs in the first place? At present, responsibility for learning about the existence of these tools would clearly seem to be incumbent upon online users themselves, since there is no requirement for online entrepreneurs either to inform users of the existence of PETs or to make such tools available to users. In this sense, then, PETs would fail to satisfy the Publicity Principle, which requires that the rules and conditions governing a scheme for protecting privacy must be open and public. Because the Publicity Principle is a crucial aspect of any normative policy designed to protect personal privacy, PETs would have to meet the conditions of such a principle if they are to be considered an adequate form of privacy protection.

And, who is responsible for distributing PETs, if they are not automatically bundled with either operating-system or application software or if they are not provided as part of the Web interfaces of online vendors? Should online entrepreneurs be responsible for providing them, or should consumers be required to locate PETs and then be further responsible for installing them on their systems? Is it reasonable to expect online users to be responsible for these tasks?

Consider the case of one of the earlier and more popularly known privacy-enhancing tools, PGP (Pretty Good Privacy). PGP enabled ordinary users to send encrypted e-mail messages, and the PGP tool cookie.cutter enabled users to avoid having "cookie" files sent to their computers. Although PGP was available free of charge, the onus was on users, first to discover that PGP applications existed and then to track down the location of those tools and download them on to their computers. Currently, the latest versions of most Web browsers allow users to reject cookies without having to install separate privacy-enhancing software to do so. Of course, if the default setting on Web browsers and the default policies on Web sites were such that no information about users could be collected unless those users explicitly consented, we could ask whether tools such as PETs would even be needed.

Independent of questions about how users are supposed to find out about the existence of PETs and about how those tools should be made available to users, other problems regarding the aspect of choice need to be addressed and resolved if PETs are to ensure adequate privacy protection for users. For although PETs may allow users a certain measure of control over their personal information in an initial online transaction, they do not necessarily ensure that users will have any say (control) about how information about them is subsequently used once that information has been disclosed to an online vendor.

Consider, for example, a recent case involving the e-commerce Web site Toysmart.com. Online consumers who engaged in transactions with Toysmart were assured, via an online trust seal -- i.e., a type of PET that would seem to fall under Burkert's category of PETs as organizational concepts (see above) -- that their personal data would be protected. This vendor's policy stated that personal information disclosed to Toysmart would be used internally, but would not be sold to or exchanged with external vendors. However, in the spring of 2000, Toysmart was forced to file for bankruptcy. Ceasing operations in May 2000, Toysmart decided to solicit bids for its assets, which included the names of customers in its databases. Parties interested in purchasing that information believed that they were under no obligation to adhere to the privacy policy that Toysmart had established with its clients. So the party or parties who took over Toysmart, or who alternatively purchased Toysmart's databases, would, in principle, be free to do whatever they wished with the personal information in that online vendor's databases. Thus personal information about Toysmart's clients might no longer be protected, despite the fact that such information was given to that online vendor by clients who were operating under the belief that information about them would be protected indefinitely. And these clients would seem to have been justified in holding such a belief because of specific agreements they made with Toysmart under the provisions of a privacy policy involving a type of PET in the form of a trust seal.

The Toysmart incident illustrates a case in which individuals exercised control over their personal information in one context -- i.e., in controlling whether they would elect to disclose information about themselves to Toysmart in online transactions -- based on specific conditions stated in Toysmart's privacy policy. However, it also turned out that these individuals were not able to be guaranteed that the personal information they disclosed to Toysmart would be protected in the future. Thus it would seem that, beyond the limited control of personal information provided by PETs and by the specific privacy policies of certain online vendors, additional controls in the form of policies and laws are needed in order to ensure that a zone of privacy is established and enforced to protect individuals in subsequent uses of the personal information they disclose in one or more online activities.

PETs and the Principle of Informed Consent

Another important question involving PETs, which is also related to the control of personal information, has to do with whether individuals can make informed decisions about the disclosure of their personal data in online transactions. Traditionally, the principle of informed consent has been the model or standard for disclosure involving personal data. But in certain online commercial activities, including those involving the use of PETs, it would seem that the informed consent principle might not be adhered to as strictly as one might assume. For instance, users who willingly consent to provide information about themselves for use in one context often have no idea as to how that information might be used subsequently. That is, they do not always realize that the information they disclose for one purpose, or in one online transaction, might also have secondary uses. Although this particular problem is not unique to PETs, or for that matter to online activities, concerns about the secondary use of a consumer's personal data are nonetheless exacerbated by certain online activities, including e-commerce transactions. So it would seem that regardless of whether users consent to the initial collection of their personal information, they must also be given an explicit choice of whether or not to consent to the future use of that information in secondary applications. Unfortunately, not all PETs provide users with this explicit option.
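
The missing option can be made concrete with a small sketch of consent recorded per purpose. The sketch is ours and does not describe any existing PET; the class and purpose names are hypothetical. The point it illustrates is simply that a disclosure made for one purpose need not authorize a later, different use unless the person has explicitly consented to that use as well.

    # Illustrative sketch: consent scoped to explicit purposes, so that data
    # disclosed for one purpose is not silently available for secondary uses.
    class ConsentLedger:
        def __init__(self):
            self._consents = {}  # (subject, purpose) -> True/False

        def record(self, subject: str, purpose: str, granted: bool) -> None:
            self._consents[(subject, purpose)] = granted

        def permits(self, subject: str, purpose: str) -> bool:
            # Restricted-access stance: no explicit grant means no permission.
            return self._consents.get((subject, purpose), False)

    ledger = ConsentLedger()
    ledger.record("alice", "complete_purchase", True)

    assert ledger.permits("alice", "complete_purchase")
    assert not ledger.permits("alice", "data_mining")         # secondary use not consented
    assert not ledger.permits("alice", "sale_to_third_party") # nor this one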

One argument that has been advanced by some online entrepreneurs is that no one is forcing users to reveal personal data and that the disclosure of such data is done on a completely voluntary basis. However, even if it is granted that a user has willingly consented to disclose personal data to an e-commerce vendor for use in a specific business transaction, i.e., in some specific context, does it follow that the user has ipso facto granted permission to use that information for additional purposes (i.e., secondary uses)? Does the online vendor now "own" that information, and is the vendor now free to do with that information whatever he or she chooses? Consider the case of various data-mining activities in the commercial sector. Specific information given by a consumer for use in one context, say in an application for an automobile loan, is collected and stored in a data warehouse, and then the data is subsequently "mined" for implicit consumer patterns. As a result of data-mining activities, an individual could eventually be "discovered" to be a member of a newly created category or group -- conceivably one which the user has no idea even exists. And based solely on his or her identification in such a newly discovered category or group, that individual might be denied a consumer loan, despite the fact that this particular individual's credit history is impeccable. [Tavani 1999a]
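
The mechanism just described can be illustrated with a deliberately toy example; the data, attributes, and category below are invented and are not taken from any actual data-mining system. Information supplied for one purpose (a loan application) is regrouped on attributes incidental to that purpose, producing a category whose members never knew it existed.

    # Illustrative sketch of secondary use through "mining": records disclosed
    # for one purpose are regrouped into a newly created, undisclosed category.
    from collections import defaultdict

    applications = [  # invented data, supplied only to obtain a loan
        {"name": "alice", "zip": "03060", "recent_purchases": "sporting_goods"},
        {"name": "bob",   "zip": "03060", "recent_purchases": "sporting_goods"},
        {"name": "carol", "zip": "10001", "recent_purchases": "books"},
    ]

    groups = defaultdict(list)
    for app in applications:
        # Group on attributes that were incidental to the original purpose.
        groups[(app["zip"], app["recent_purchases"])].append(app["name"])

    # A newly "discovered" category now exists and could drive a decision,
    # even though its members have no idea they belong to it.
    print(groups[("03060", "sporting_goods")])   # ['alice', 'bob']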

Another argument that might be advanced by online entrepreneurs, especially in defense of the secondary use of personal information as in the case of data-mining practices, is: if the user has put information about him- or herself into the public domain of the Internet, i.e., disclosed the data as part of an online questionnaire for a transaction, then that information is no longer private. Of course, one response to this line of reasoning could be to question whether users, in the process of consenting to disclose personal data in response to queries in online business transactions, understood clearly all of the
conditions in which the data they had consented to reveal could be used, including certain future uses to which that data might also be put. For example, if users are queried as to whether they are willing to have their personal data "mined," many would likely be perplexed by this question since they might never have heard of the practice of data mining. Also we can certainly ask whether the businesses that collect personal data could possibly know in advance exactly how that data will be used -- viz., to which uses that data would be put in secondary and future applications. This being the case, it would seem that online businesses could not adequately inform users about exactly how their personal data will be used. What kind of informed choice, then, could these users make in such a case? Can we -- indeed should we -- assume that most consumers understand the intricacies of a technique such as data mining?

Some online entrepreneurs have responded to charges involving privacy violations by pointing out that in most cases users are now provided with the means either to "opt-in" or "opt-out" of having their personal data collected, as well as having that data made available for secondary use. Currently, however, the default seems to be such that if no option is specified by the user when that individual discloses personal data for use in one context, that disclosed personal data would also be available for secondary use. We can certainly ask whether that presumption is a reasonable one. We can also ask whether having the ability simply to opt-in or opt-out of disclosing personal information is itself sufficient to protect personal privacy.
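
How much turns on the default presumption can be shown in a few lines. In this sketch (ours, with hypothetical names), the only difference between the two regimes is the value assumed when the user has expressed no preference at all: under an opt-out regime silence is treated as permission, under an opt-in regime it is not.

    # Illustrative sketch: the same unspecified preference yields opposite
    # outcomes depending on whether the default regime is opt-out or opt-in.
    def secondary_use_allowed(user_choice, regime: str) -> bool:
        # user_choice is True (consented), False (refused), or None (said nothing)
        if user_choice is not None:
            return user_choice
        return regime == "opt-out"   # opt-out: silence counts as consent

    print(secondary_use_allowed(None, "opt-out"))  # True  - data available for secondary use
    print(secondary_use_allowed(None, "opt-in"))   # False - no consent assumed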

Because users can choose whether to grant or withhold information about themselves in online transactions -- i.e., either opt-in or opt-out -- it would certainly seem that users have at least some means of controlling their personal information, at least initially. And in some cases, users might also retain some say about how their information is used subsequently since they can elect to "sell" their personal information in return for certain financial incentives (in the form of discounts and rebates) currently offered by some e-commerce sites. Unfortunately, less affluent persons might be more inclined than would their wealthier counterparts to sell their personal data. This factor would seem to suggest that those users who are members of lower socioeconomic groups will, by virtue of their economic status, have less choice in (i.e., less control over) whether to sell their personal data. Informed consent should be free, not coerced. So the use of PETs would also seem to raise issues of social equity [Tavani, 2000c] as well as concerns involving the protection of personal information. Of course, our main concern in this section of the present essay has been with whether PETs necessarily provide adequate privacy protection for online users. We conclude that they do not.

PET Owners Beware

We have distinguished the concept of privacy from the notion of control. The concept of privacy is defined in terms of restricted access; whereas, control has a central role in the justification and management of privacy. One practical payoff in making this distinction is that one can resist the temptation to think that because one has increased control, one has increased privacy. The conclusion is not a conceptual truth as it would be on the control theory of privacy. PETs give us increased control but it remains an open question whether privacy is increased.

Moreover, although PETs give us more control through choice and consent, there are good reasons to be skeptical about how easy and effective PETs are to use. For example, the controls for cookies are more hidden than ever in the most recent versions of Web browsers. The average user may not know what choices are available or how to get into a position to make them. A user may be coerced into giving consent to accept cookies because not doing so would make his browsing activities possibly sluggish.

PETs, though important tools, are not adequate to fully protect personal privacy. National and international policies and laws are also needed to set up zones of privacy to ensure that personal information continues to be protected once it has been given in one or more online transactions. The restricted access theory of privacy, defended in this essay, describes the kinds of rules and principles -- e.g., as stated in the Publicity Principle -- of protection that need to be established.

References

Bennett, C. J. [1992] Regulating Privacy. Ithaca, NY: Cornell University Press.
Bennett, C. J., and R. Grant, eds. [1999] Visions of Privacy: Policy Choices for the Digital Age. Toronto: University of Toronto Press Incorporated.
Burkert, H. [1997] "Privacy-Enhancing Technologies: Typology, Critique, Vision." In Technology and Privacy: The New Landscape, edited by P. E. Agre and M. Rotenberg. Cambridge, MA: MIT Press.
Cranor, L. F. [1999] "Internet Privacy." Communications of the ACM, 42 (2): 29-31.
Culver, C., J. Moor, W. Duerfeldt, M. Kapp, and M. Sullivan. [1994] "Privacy." Professional Ethics, 3 (3 & 4): 3-25.
Elgesem, D. [1996] "Privacy, Respect for Persons, and Risk." In Philosophical Perspectives on Computer-Mediated Communication, edited by C. Ess. New York: State University of New York Press.
Elgesem, D. [1999] "The Structure of Rights in Directive 95/46/EC on the Protection of Individuals with Regard to the Processing of Personal Data and the Free Movement of Such Data." Ethics and Information Technology, 1 (4): 283-293.
Fried, C. [1984] "Privacy." In Philosophical Dimensions of Privacy, edited by F. D. Schoeman. New York: Cambridge University Press.
Miller, A. R. [1971] The Assault on Privacy: Computers, Data Banks, and Dossiers. Ann Arbor: University of Michigan Press.
Moor, J. H. [1989] "How to Invade and Protect Privacy with Computers." In The Information Web, edited by C. C. Gould. Boulder: Westview Press.
Moor, J. H. [1990] "Ethics of Privacy Protection." Library Trends, 39 (1 & 2): 69-82.
Moor, J. H. [1997] "Towards a Theory of Privacy in the Information Age." Computers and Society, 27 (3): 27-32.
Moor, J. H. [1999a] "Just Consequentialism and Computing." Ethics and Information Technology, 1 (1): 65-69.
Moor, J. H. [1999b] "Using Genetic Information While Protecting the Privacy of the Soul." Ethics and Information Technology, 1 (4): 257-263.
Moore, B. [1984] Privacy: Studies in Social and Cultural History. Armonk, NY: M. E. Sharpe, Inc.
Tavani, H. T. [1999a] "Informational Privacy, Data Mining and the Internet." Ethics and Information Technology, 1 (2): 137-145.
Tavani, H. T. [1999b] "KDD, Data Mining, and the Challenge for Normative Privacy." Ethics and Information Technology, 1 (4): 265-273.
Tavani, H. T. [1999c] "Privacy Online." Computers and Society, 29 (4): 11-19.
Tavani, H. T. [2000a] "Privacy and Security." Chap. 4 in Internet Ethics, edited by D. Langford. New York: St. Martin's Press.
Tavani, H. T. [2000b] "Privacy and the Internet." In Privacy and the Constitution, edited by M. Placencia. Hamden, CT: Garland Publishing, Inc.
Tavani, H. T. [2000c] "Privacy-Enhancing Technologies as a Panacea for Online Privacy Concerns: Some Ethical Considerations." Journal of Information Ethics, 9 (2).
Westin, A. F. [1967] Privacy and Freedom. New York: Atheneum.

Acknowledgments

We would like to express our thanks to Dag Elgesem, whose careful critiques of the restricted access account have led us to a more clearly developed theory of privacy. We are also grateful to Michael Scanlan for some helpful comments on a version of this essay presented at the Conference on Computer Ethics - Philosophical Enquiry (CEPE 2000), Dartmouth College, July 14-16, 2000.

An earlier version of this paper was presented at the Conference on Computer Ethics - Philosophical Enquiry (CEPE 2000), Dartmouth College, Hanover, NH, July 15, 2000. A revised version appears in Readings in CyberEthics, R. A. Spinello and H. T. Tavani, editors. Sudbury, MA: Jones and Bartlett Publishers, 2001.