
Working_Paper_March 2016

Working Title: A Legal Turn in Human Computer Interaction? Towards Regulation by Design for
the Internet of Things

Authors: Lachlan Urquhart and Tom Rodden, University of Nottingham1

ABSTRACT

INTRODUCTION

PART I: PRIVACY BY DESIGN AND THE INTERNET OF THINGS

INTERNET OF THINGS
PRIVACY BY DESIGN

PART II: LAW AND REGULATION BY DESIGN

THE ROLE OF SYSTEM DESIGNERS IN THE POST REGULATORY STATE AND CHALLENGES THEY'LL FACE
BEYOND CODE IS LAW

PART III: HUMAN COMPUTER INTERACTION AND DESIGN

HCI AS THEORETICALLY POROUS
ENGAGEMENT WITH REGULATORY ISSUES BY DESIGNERS
THE VALUE LED APPROACHES
REMEMBERING THE POLITICAL ROOTS OF PARTICIPATORY DESIGN: THE SCANDINAVIAN SCHOOL

PART IV: PROPOSALS

BEYOND SYSTEMS THEORY
BRINGING REGULATORY AND LEGAL VALUES INTO SYSTEM DESIGN?
EXAMPLE 1: SMART SECURITY CAMERAS
PARTICIPATORY DESIGN FOR REGULATION
EXAMPLE 2: SMART THERMOSTATS
CONCLUDING POINTS

Abstract:
This discursive paper explores the role of law in HCI through the concept of regulation
by design. Technology designers are increasingly being called upon by law and policy to
act in a regulatory capacity, for example in privacy by design. This is problematic as
technology designers are not traditionally involved in regulation and regulators may not
fully appreciate what these designers do. We argue that to practically and conceptually
achieve regulation by design requires greater understanding of and interaction between
the regulation and design communities.

This paper consolidates and assimilates work from the fields of human-computer
interaction and technology regulation. It is framed within the context of privacy by
design and the Internet of Things. It lays out theoretical tools and conceptual
frameworks available to each community and explores barriers and commonalities
between them, proposing a route forward.

1 Doctoral Researcher and Professor of Computing, Horizon Digital Economy Research/ Mixed Reality

Lab, School of Computer Science, University of Nottingham. If you want to get in contact, please email
lachlan.urquhart@nottingham.ac.uk.

Electronic copy available at: http://ssrn.com/abstract=2746467



It contends five main points: 1) regulation by design involves prospective, as opposed to just retrospective, application of law; 2) HCI methods need to be repurposed to engage with legal and regulatory aspects of a system; 3) the legal framing of regulation and design is still anchored in systems theory, but human computer interaction has a range of rich approaches for understanding the social, and regulation by design needs to use these; 4) designers are now regulators and this brings a range of responsibilities; and lastly, 5) design and human values perspectives in HCI need to be extended to legal values, and participatory design is a strong candidate for doing this.

Introduction:

In this discursive paper we explore the role of law in HCI through the concept of
regulation by design. Technology designers are increasingly being called upon by
regulators, law and policy to act in a regulatory capacity, for example in privacy by
design. Whilst system design is recognised as a regulatory tool, system designers are not
traditionally involved in regulation and regulators may not fully appreciate what designers
do. Accordingly, the nature of the new designer-regulator and how they fit into current
models of regulation requires unpacking. We argue that to practically and conceptually
achieve regulation by design requires greater interaction between the regulation and
design communities. In order to support them in this new role we argue for the need to
better understand the respective epistemologies of system design and regulation.

In this paper we look at theoretical tools and conceptual frameworks available to each
community in order to explore barriers and commonalities between them. We focus on
areas that we think are conceptually open to aligning the goals of regulation and system
design. They may be complementary in their goals but not currently aligned (e.g. moving from
human values to legal values in design), or modified in scope to reflect the interests of the other
community (e.g. using participatory design to engage with citizens).

To ground and frame our discussions we look to the context of privacy by design for the
domestic Internet of Things. The novelty of our paper is in the consolidation of
otherwise disparate areas of research, framing them around privacy challenges of IoT
and exploring what regulation by design might actually entail in this domain.

We want this paper to start the process of building a solid shared epistemological
foundation between system designers and state regulators. By arguing for common
theoretical tools, we can then focus on addressing the practical challenges, such as
implementing privacy by design for the domestic Internet of Things.

One challenge of cross-disciplinary consolidation is drawing the bounds of our inquiry.
Whilst we touch on many themes and areas of research, our core analysis draws on
concepts and theories from the domains of information technology law and human
computer interaction.

Another challenge is that we are not providing a top down, absolute definition of what a
system designer or a regulator is. Nevertheless, for navigating this paper, by designer we
broadly mean those who create new systems. Framed within the agile/scrum
development process, that may include those designing and engineering information
systems, from requirements engineers to system architects, software engineers, user
interface/experience designers, and testers.2 Equally, by regulators we mean
those in charge of creating and enforcing law or policy. They may be state actors, from
the various arms of government (i.e. legislature, executive and judiciary) to law
enforcement bodies and the criminal justice system. Equally, non-state actors like
multinational companies, corporate governance bodies, policymakers, the legal profession and lobbyists
(industry, NGOs, charities etc.) also have their part to play, as we discuss in Part II.

Providing a definitive list of actors is not really the point of this paper. Part of the
process of bringing together these distinct communities is reflecting on perceived
borders between them, wherever these may lie. Accordingly, we are keeping the
definitions of system designer and regulator quite open in the hope that, through a
process of discussion and reflection, fuzzy parameters may emerge from, within and, most
importantly, between the communities.

For the hurried reader, our high level arguments of this paper are as follows:
1) Regulation by design involves prospective, as opposed to just retrospective,
application of law.
2) HCI methods need to be repurposed to engage with legal and regulatory aspects
of a system.
3) The legal framing of regulation and design is still anchored in systems theory.
Human computer interaction has a range of rich approaches for understanding
the social, and regulation by design needs to use these.
4) Designers are now regulators and this brings a range of responsibilities.
5) Design and human values perspectives in HCI need to be extended to legal
values and participatory design is a strong candidate for doing this.

Structure:

In Part I we start by broadly defining a problem space of privacy by design for the
domestic Internet of Things. This provides an example of regulation by design, and the
kinds of problems designers may need to engage with.

In Part II we present a number of theories from the domain of law and regulation,
situating these in the context of privacy by design for the IoT.

- The Role of System Designers in the Post Regulatory State and Challenges They'll Face
- Moving Beyond Code is Law

In Part III we turn to theories from design and human computer interaction.

- HCI as Theoretically Porous
- Engagement with Regulatory Issues by Designers
- Value Led Approaches
- Remembering the Political Roots of Participatory Design: The Scandinavian
School

In Part IV we look to areas of commonality and crossover between these domains, in
order to define questions that need to be addressed in the future, particularly for IoT.


2 R Pressman Software Engineering: A Practitioner's Approach (2009 7th Ed McGraw Hill)


- Stuck in Systems Theory: Why Law Needs to Play Catch-up to HCI
- Bringing Regulatory Values into System Design?
- Participatory Design for Regulation
- Conclusions


Part I: Privacy by Design and the Internet of Things



We contextualise our discussions by looking at the regulatory environment of privacy by
design for domestic internet of things (IoT). This example is timely as system designers
and regulators are increasingly aligned in policy and law, and the domestic IoT market is
growing significantly. There are a number of privacy regulation challenges posed by
domestic IoT, explored below.
Internet of Things

Internet of things is sitting at the peak of inflated expectations as an emerging
technology.3 Drawing on a lineage of work from ambient intelligence,4 ubicomp,5 calm
computing,6 and home automation, hype surrounding the IoT is significant.7 Various
technology and consultancy firms predict numbers of internet connected devices over
the coming years, from Cisco at 24 billion by 20198 to Huawei at 100 billion by 2025.9
The Internet Society argues market forces like cloud computing, advanced data analytics,
miniaturisation of devices, Moore's law, dominance of IP networking and ubiquitous
connectivity have fed the growth of IoT.10

In trying to narrow down what IoT actually is, we see a range of definitions from
government, academic and technical advisory bodies. Regulatory body the EU Article 29
Working Party11 defines IoT as devices "that can be controlled remotely over the internet… most
home automation devices are constantly connected and may transmit data back to the manufacturer".12
Technical body the Internet Engineering Task Force (IETF) defines IoT as a trend where
"a large number of embedded devices employ communications services offered by the Internet protocols".


3 Gartner Hype Cycle for Emerging Technologies 2015, where we see the Internet of Things
https://www.gartner.com/newsroom/id/3114217
4 E Aarts & S Marzano. The New Everyday: Views on Ambient Intelligence (Rotterdam 2003) and more

critically, D Wright et al Safeguards in a World of Ambient Intelligence (Springer 2008)


5 M Weiser Some Computer Science Issues in Ubiquitous Computing (1993) Communications of the ACM

36(7) 75-84. p76; R. Caceres & A. Friday "Ubicomp Systems at 20: Progress, Opportunities and
Challenges"(2012) IEEE Pervasive Computing p15 and more critically S Reeves Envisioning Ubiquitous
Computing CHI 2012 and G Bell and P Dourish Yesterday's Tomorrows: Notes on Ubiquitous
Computing's Dominant Vision (2006) Personal and Ubiquitous Computing
6 M. Weiser and J. S. Brown The Coming Age of Calm Technology (1996) Xerox PARC and more
critically Y. Rogers Moving on From Weiser's Vision of Calm Computing: Engaging Ubicomp
Experiences (2006)
7 Projects like Philips HomeLab. More critically A Crabtree and T Rodden Domestic Routines and

Design for the Home (2004) Computer Supported Cooperative Work 13 191-220. p206 ; R Harper (Ed) Inside
the Smart Home (Springer Verlag: London 2003)
8 http://www.cisco.com/c/en/us/solutions/service-provider/visual-networking-index-vni/index.html
9 Huawei Global Connectivity Index
10 Rose et al/ ISOC IoT Report http://www.internetsociety.org/blog/public-policy/2015/10/internet-

society-releases-internet-things-iot-overview-whitepaper p8; See also p14-16 Blackett Review on Trends


creating growth of IoT.
11 A body providing advice to member states on issues of data protection
12 A29 WP Opinion 8/2014 on the Recent Developments on the Internet of Things (2014) Section 1.3;
the UK Government Office for Science opts for a more descriptive version: "the Internet of Things is made up of
hardware and software technologies. The hardware consists of the connected devices which range from simple sensors to
smartphones and wearable devices and the networks that link them, such as 4G Long-Term Evolution, Wi-Fi and
Bluetooth. Software components include data storage platforms and analytics programmes that present information to users"
Blackett Review (2015) p13


"Many of these devices often called smart objects are not directly operated by humans, but exist as
components in buildings or vehicles, or are spread out in the environment".13 For academic group
Cambridge Public Policy SRI, it is "sensors which react to physical signals; software in these sensors
transmitting information; a network for this information to be transmitted on; a database and control
system which receives and processes this data, and sends a message back out over the network to instruct
the initial device or another one that is networked".14

Across all groups we see elements like remote controllability, constant connectivity and
networking for data transmission, third party involvement, physical objects embedded in
the environment, backend computational power (eg databases, servers) and devices
communicating to each other without direct human input. The application areas for IoT
of built environment (smart homes/cities), energy (smart grids/metering), transport
(autonomous vehicles), health (wearables, quantified self) or agriculture (smart farms) are
growing just as significantly.15 Our interest is domestic IoT, so primarily smart homes.

The home, as a setting for domestic IoT, is important. Homes are complex social spaces
where different practices and routines persist.16 As Tolmie et al have said, "routines are the very
glue of everyday life… Routines help provide grounds whereby the business of home life gets done. Routines
mean that people can get out the door, feed themselves, put the children to bed, and so on, without
eternally having to take pause and invent sequences of action anew."17 Any technology for the
home has to reflect these diverse routines, whilst not disrupting the underlying practices.18

Furthermore, the house will not become smart overnight. As Edwards and Grinter
have long recognised, "new technologies will be brought piecemeal into the home"19 and, as Rodden
and Benford argue, domestic environments "evolve. They are open to continual change and the need to
understand and support this change will be important to ensure the successful uptake and management of
digital devices in domestic spaces."20 Homes are heterogeneous technologically as well as
socially.

From a privacy perspective, Brown sees IoT as challenging precisely because it operates
in private settings, like homes, presents an attack target that is harder to secure, and
can compromise physical safety (e.g. pacemaker hacks) and other home systems (e.g. smart
fridges as spambots).21 Profiling is also a concern.22 For example, the A29 WP fears
detailed inferences can be drawn about daily life23 where analysis of usage patterns in such a
context is likely to reveal the inhabitants' lifestyle details, habits or choices or simply their presence at

13 IETF RFC 7452
14 p8 Cambridge Public Policy The Internet of Things: Shaping Our Future (2015)
15 eg Blackett Review Chapter 4; ITU/Brown (2015)

https://www.gov.uk/government/collections/internet-of-things-review p9-11
16 A Crabtree and T Rodden Domestic Routines and Design for the Home (2004) Computer Supported
Cooperative Work 13 191-220 p211
Cooperative Work 13 191-220 p211


17 P Tolmie et al Unremarkable Computing (2002) CHI '02
18 P Tolmie et al (2003) Towards the Unremarkable Computer: Making Technology at Home in Domestic

Routine in R Harper (2003) Inside the Smart Home p203


19 K Edwards, R Grinter At Home with Ubiquitous Computing: Seven Challenges (2001) Ubicomp '01 256-272
20 T Rodden and S Benford The Evolution of Buildings and Implications for the Design of Ubiquitous
Domestic Environments (2003) CHI '03.


21 I Brown GSR Discussion Paper: Regulation and the Internet of Things (2015) ITU p25
22A29 WP (2014) p8; For more on profiling and harms see B Custers The Power of Knowledge. Ethical, Legal,

and Technological Aspects of Data Mining and Group Profiling in Epidemiology. (2004 Nijmegen: Wolf Legal
Publishers) p74-78; Section 4.4 - M Hildebrandt and J Backhouse D7.2: Descriptive analysis and inventory of
profiling practices (2005 FIDIS)
23 A29 WP (2014) Section 2.4


home.24 Deakin et al note combinations of non-personal data may create sensitive
personal data (which consequently need explicit user consent): for example, "systems that
collect data on food purchases (fridge to supermarket system) of an individual combined with the times
of day they leave the house (house sensors to alarm system) might reveal their religion".25

More generally,26 the A29 WP Opinion on IoT highlights privacy concerns regarding collected
data being repurposed, users' insufficient knowledge of data processing by physical
objects, and inadequate consent or lack of control over data sharing between such
objects.27 There is significant user concern about control of data in Europe. A recent
EuroBarometer survey of 28,000 EU citizens' attitudes to personal data protection28
showed two-thirds of respondents were concerned about not having complete control over the
information they provide online.29 Nearly 70% think prior explicit approval is necessary
before data collection and processing, and worry about data being used for purposes
different from those at collection.30 Around 60% also distrust telcos, ISPs and online
businesses. Privacy by design (PbD) is the often cited solution to many challenges of
IoT,31 and we now turn to its nature.
Privacy by Design

Privacy or Data Protection by Design (PbD or DPbD)32 is a policy tool that has been
discussed in EU and UK regulatory circles for some time33 through variations like
privacy enhancing technologies (PETs), privacy impact assessments (PIAs),34 privacy
engineering35 and usable privacy.36 Generally, it requires system designers to consider
privacy risks of their technology as early as possible, before it has been built, in order to build
in safeguards from the beginning. Adopting this approach means technologies can
account for and address privacy risks before they even hit the market. It also helps
narrow the regulatory effectiveness gap created by slow legislative change and quick
technological development.

More formally, Article 23 of the proposed EU General Data Protection Regulation37 would legally require DPbD through38 appropriate technical or organisational measures


24 A29 WP (2014) Section 1.3
25 Deakin et al p15; see also Brown ITU p26
26 See Rose et al/Internet Society (2015) p 26-29 for a good discussion
27 A29 WP (2014) p6
28 not for IoT specifically
29 European Commission Data Protection Factsheet (2015)
30 part 4
31 Across reports cited Brown, etc
32 Data Protection by Design in GDPR
33 See A Cavoukian 7 Foundational Principles of Privacy by Design; Spiekermann, Sarah. 2012. The

Challenges of Privacy by Design. Communications of the ACM (CACM) 55 (7): 34-


https://www.privacybydesign.ca/index.php/about-pbd/translations accessed 15 Sept 2015;
34 D Wright and P De Hert Privacy Impact Assessment (2012 Springer)
35 M Dennedy, J Fox and T Finneran Privacy Engineers Manifesto (2014) Apress; S Spiekermann and

LF Cranor Engineering Privacy (2009) IEEE Transactions on Software Engineering 35 (1)


36 e.g. CyLab Carnegie Mellon University http://cups.cs.cmu.edu/
37 i.e. all three texts from the European Parliament, Commission and Council; they differ and currently all
three are negotiating the final wording for the Regulation, which is due Dec 2015 - Out-Law, EU Law
three are negotiating the final wording for the Regulation which is due Dec 2015 - Out-Law, EU Law
Makers Set to Thrash out Final Version of the General Data Protection Regulation 15 Jun 2015 Out-
Law.com http://www.out-law.com/en/articles/2015/june/eu-law-makers-set-to-thrash-out-final-version-
of-the-general-data-protection-regulation/
38Art 23 GDPR http://statewatch.org/news/2015/dec/eu-council-dp-reg-draft-final-compromise-15039-

15.pdf


by data controllers to reflect risks posed by data processing and to protect the rights of data
subjects. Data controllers are "the natural or legal person, public authority, agency or any other
body which alone or jointly with others determines the purposes and means of the processing of personal
data",39 a drafting that is sufficiently broad to encapsulate a range of system designers,
from manufacturers to third party services.

Other state regulatory bodies like the UK Information Commissioner's Office,40 the
European Data Protection Supervisor (EDPS),41 the European Union Agency for Network
and Information Security,42 and the EU Article 29 Working Party also recognise the
importance of PbD/DPbD approaches.43 For example, the EDPS has stated that
"systems and software engineers need to understand and better apply the principles of privacy by design in
new products and services across design phases and technologies".44 More specifically for IoT, the
Article 29 Working Party Opinion 8/2014 on IoT recommends "Every stakeholder in the
IoT should apply the principles of Privacy by Design and Privacy by Default".45

That being said, how this might be implemented is more complex. Engineers often lack
a detailed understanding of the law and policy environment. Birnhack et al
have argued "whereas for lawyers PbD seems an intuitive and sensible policy tool, for information
systems developers and engineers it is anything but intuitive".46 Similarly, a 2014 ENISA report on
Privacy by Design tools highlighted "we observed that privacy and data protection features are, on
the whole, ignored by traditional engineering approaches when implementing the desired functionality.
This ignorance is caused and supported by limitations of awareness and understanding of developers and
data controllers as well as lacking tools to realise privacy by design. While the research community is very
active and growing, and constantly improving existing and contributing further building blocks, it is only
loosely interlinked with practice."47 Academic commentators like Koops and Leenes
echo this, arguing guidance on PbD in practice is lacking,48 as does Brown, who recently
noted "the specifics of implementation [for PbD] have so far only been developed to a limited extent".49

Building technical mechanisms into technologies from the start can assist with privacy
law compliance. Privacy engineering, usable privacy and security,50 human data
interaction,51 privacy enhancing technologies and data handling best practices like
encryption or anonymisation of data all have much to offer here. An approach focused


39 Article 4(5) GDPR
40 Information Commissioner Website Privacy By Design accessed 15 Sept 2015 https://ico.org.uk/for-
organisations/guide-to-data-protection/privacy-by-design/
41 For example in the context of mHealth (Opinion of 21 May 2015), drones (Opinion of 26 November

2014) or the eCall system (Opinion of 29 October 2013) to name a few


42 Danezis et al Privacy and Data Protection by Design from policy to engineering (2014) ENISA
43 A29 WP Opinion 8/2014 on the recent Developments on the Internet of Things WP 233 p21
44 EDPS Opinion 4/2015 Towards a New Digital Ethics (2015) section 2.3
45 A29 WP Opinion 8/2014 on the recent Developments on the Internet of Things WP 233 p21
46 Birnhack, Toch, Hadar Privacy Mindset, Technological Mindset (2014) Jurimetrics
47 G Danezis et al (2014) page iv
48 B-J Koops and R Leenes Privacy Regulation Cannot Be Hardcoded. A Critical Comment on the

Privacy by Design provision in Data-Protection Law (2014) International Review of Computers, Technology and
Law. 159-171 p161
49 Brown (2015) ITU p26
50 J Hong Usable Privacy and Security: A Grand Challenge for HCI (2009) Human Computer Interaction

Institute; G Iachello and J Hong End User Privacy in Human Computer Interaction (2007) Foundations and
Trends in Human Computer Interaction 1(1) p4
51 R Mortier et al Human-Data Interaction: The Human Face of the Data Driven Society


on IoT in the home is "personal data vaults", where users stockpile the data being generated
by their devices and control access to it.52

However, implementing privacy by design needs to go beyond merely building legal
compliance measures into the system. As we know, law is dependent upon interpretation,
and formalist attempts to build in compliance with DP rules have to contend with
subtleties and nuances of legal language. In many regards, this might not be the right
approach. As Koops and Leenes have argued, we should instead focus energy on
"fostering the right mindset of those responsible for developing and running data processing systems",
which "may prove to be much more productive than trying to achieve rule compliance by techno-regulation".53 Similarly,
Savirimuthu states we cannot just focus on data protection compliance; we must look
more holistically to principles of privacy law like protection for autonomy, liberty and
human dignity.54 Moving even further, beyond legal visions of privacy, we see
a role for ethics and human values. The EDPS Opinion55 on risks from IoT notes the
need for "dignity at the heart of a new digital ethics",56 looking beyond legal compliance
to accountability and sustainable data practices for the protection of human dignity.57

More abstractly, we may need to reconceptualise how we understand such big principles
of privacy law and ethics, and how they can be instantiated in design. The transition from
spoken laws to uniform written laws via the printing press supported the emergence of a
law based on precedent and rationality,58 where "the very being of law was altered in these new
technical forms: the media that are the whole of the legal message".59 Hildebrandt and Koops
have argued for a notion of ambient law to reflect the shift "from the use of technologies to
interaction with technologies",60 looking more broadly than "transposing legal norms into technical
architectures"61 as a channel for enforcement purposes, and instead considering how
technology can frame legal norms; for example, "the actual embodiment of a rule in the
technology of the script, or in another code, will change the nature of the rule".62 To do this

52 H Haddadi et al Personal Data: Thinking Inside the Box (2015) 5th Decennial Aarhus Conferences; Hub of
All Things http://hubofallthings.com/hatoutputs/hatacademic-publications/
53 B-J Koops and R Leenes Privacy Regulation Cannot Be Hardcoded. A Critical Comment on the
Privacy by Design provision in Data-Protection Law (2014) International Review of Computers, Technology and
Law. 159-171 p168: "privacy by design may need to be located not so much in the code section of the regulatory tool-box, but
rather in the section containing tools that regulate through communication"; see E. Luger, L. Urquhart, T Rodden and
M Golembewski Playing the Legal Card Proceedings of ACM SIGCHI '15 457-466: ideation cards to get
M Golembewski Playing the Legal Card Proceedings of ACM SIGCHI 15 457-466- ideation cards to get
system designers of ambient technology systems thinking about data protection issues; for discussions
around techno-regulation see R Brownsword Rights, Regulation and the Technological Revolution (2008 Oxford
University Press, Oxford)
54 J Savirimuthu (2013) Smart Meters and the Information Panopticon: Beyond the Rhetoric of

Compliance International Review of Law, Technology and Society Vol. 27, Nos. 1-2, 161-186
55 EDPS Website Duties (2015) https://secure.edps.europa.eu/EDPSWEB/edps/site/mySite/Duties;

This recognition of the importance of ethics, beyond legal compliance, is interesting coming from a body
largely focused on fundamental rights like privacy and personal data protection, which are legally regulated
concepts in the EU
56 EDPS Opinion 4/2015 Towards a New Digital Ethics (2015) Section 3
57 EDPS Opinion 4/2015 Towards a New Digital Ethics (2015) p14
58 "Ambient law takes into account that modern law has evolved from the information and communication
infrastructure of the printing press, creating a body of written legal rules, written case law and doctrinal
treatises that determine the substance of positive law. The systematic nature of modern legal systems builds
on the needs for systematisation, rationalisation and linear thinking that is inherent in the affordances of
the printing press" Hildebrandt (2011) Section 3
59 K de Vries and N van Dijk A Bump in the Road. Ruling out Law from Technology in Human Law and

Computer Law (2013)


60 Both written and unwritten laws play a role here (see p454)
61 p460
62 p456


Hildebrandt recognises lawyers need to "uncover what language computer systems speak", and
invites us to "come to terms with the way computer scientists who design such systems actually think".63

From an HCI perspective, Dourish and Anderson argue that when designing an
interactive system, privacy or security cannot be bolted on using an interface: "support
for effective privacy protection cannot be grafted onto a system because it is a pervasive aspect of how that
system is designed… it is a pervasive aspect of how the system will be used, the context in which it is put
to use, the values that it is used to support, the interpretations that others will make of its use".64 We
need to think of privacy and security in terms of practices involving risk, trust, identity,
morality, power etc.65 Similarly, Luger and Rodden have argued that consent giving, a critical
mechanism in privacy compliance, is not isolated to one moment in ubicomp systems,
but is a process over time, interacting with the system.66

For PbD we also need to reflect on what model of privacy we are using. We argue
Nissenbaum's privacy as contextual integrity is a suitably flexible vision of privacy that
has found traction in both communities. Her account of contextual appropriateness
centres on questions of maintenance or breach of the integrity of information flows,
as defined by users. Seeing privacy as contextual integrity provides a
flexible mechanism for analysing the context of information flows, and as such fits with
more interaction orientated views67 of how we live with technologies.68 We
argue that this more situated, interactive, practice led view is one way of moving towards a
workable vision of privacy by design for both communities.

Equally, the fundamental rights and values (e.g. liberty, autonomy, or dignity) explored in
case law (e.g. from the European Court of Human Rights69 and European Court of Justice),70
legislation (e.g. the new Data Protection Regulation or e-Privacy Directive), and expert
analysis (e.g. from experts in academia, practice and government, like the A29 WP) are an
invaluable resource to inform system designers about PbD and their role within it.
However, before entering that complexity, we need to situate the role of system
designers in regulation, foster a sensitisation to regulatory issues and find the best
methods to engage the two communities in the long term.71 We argue that front-loading
the complex legal landscape in an inaccessible way is not the best approach.

Accordingly, we present theories that frame the role of system designers in regulation,
and that present opportunities to engage with underlying human values at play on their
own terms. By delving into the background of some schools of thought in design and
regulation, we unpack the nature of regulation by design and what system designers can


63 See M Hildebrandt (2013) Chapter 0: Prefatory Remarks on Human Law and Computer Law in M
Hildebrandt and J Gaakeer (Eds) Human Law and Computer Law: Comparative Perspectives Ius Gentium:
Comparative Perspectives on Law and Justice (Springer 2013) p5
64 Dourish and Anderson (2006) p337
65 Dourish and Anderson (2006) p338
66 E Luger and T Rodden An Informed View on Consent (2013) Ubicomp 13
67 P Dourish and K Anderson Collective Information Practice: Exploring Privacy and Security as Social
and Cultural Phenomena (2006) Human Computer Interaction 21, 319-342
68 H Nissenbaum Privacy In Context: Technology, Policy and the Integrity of Social Life (Stanford:
Stanford University Press 2009)
69 ECHR Internet: Case Law of the ECHR (2015)
http://www.echr.coe.int/Documents/Research_report_internet_ENG.pdf
70 European Commission Justice Website http://ec.europa.eu/justice/data-protection/law/index_en.htm
71 e.g. E Luger, L Urquhart, T Rodden and M Golembewski Playing the Legal Card (2015) ACM CHI '15


offer as a new type of regulator in this domain through their methods, approaches and
concepts.

Part II: Law and Regulation by Design


Regulation of emerging technologies, like IoT, poses challenges for traditional models of
command and control regulation because the regulatory target is shifting and the pace of
regulation, particularly law, is slower than the development of technology. However, the
alternative of regulating through technology is itself often challenging in terms of legitimacy,
democratic accountability and transparency. Whilst our examples draw on privacy by
design, system designers have long been involved in regulation, from architects building
housing estates using situational crime prevention approaches,72 to software engineers
enforcing copyright laws through digital rights management.73 We now unpack how
regulation as a concept has been changing, with a view to understanding how design(ers)
may fit into the regulation process.

The Role of System Designers in the Post-Regulatory State and Challenges
They'll Face

It is important to recognise that the purposes of, and institutions involved in, regulation have
broadened significantly. Selznick's more traditional view of regulation as sustained and
focused control exercised by a public agency, on the basis of a legislative mandate, over activities that are
generally regarded as desirable to society74 looks outdated alongside modern versions that
reflect the role of non-state actors and the unintended purposes of regulation.

Newer definitions focus on:

- Control, as Baldwin and Cave assert: all forms of social control, state and non-state,
intended and non-intended;75
- Specific methods and purposes, as Koops argues: controlling human or societal
behaviour by rules or restrictions;76 or
- Actors, as Leenes considers: because the state and other (non-state) actors affect the
behaviour of individuals by means of intentional control and because those interventions need to
be justified, I would regard any entity engaging in social control within the scope of regulation.77

However, we contend Black's definition offers the most scope for framing the role of system
designers in regulation, i.e. regulation is the sustained and focused attempt to alter the behaviour of
others to standards or goals with the intention of producing a broadly identified outcome or outcomes,
which may involve mechanisms of standard setting, information gathering and behaviour-modification.78


72 Von Hirsch, Garland and Wakefield Ethical and Social Perspectives on Situational Crime Prevention (2004)
http://www.hartpub.co.uk/BookDetails.aspx?ISBN=9781841135533
73 N Jondet La France v Apple: Who's the Dadvsi in DRMs? (2006) SCRIPTed 3(4)
74 P Selznick Focusing Organizational Research on Regulation in R G Noll Regulatory Policy and the Social
Sciences (1985: University of California, Berkeley)
75 Baldwin and Cave Understanding Regulation: Theory, Strategy and Practice (1999: OUP, Oxford) at p91
76 B-J Koops Starting Points for ICT Regulation: Deconstructing Prevalent Policy One-Liners (2006: Asser
Press, The Hague) at 81


77 Leenes (2011) p149
78 J Black Critical Reflections on Regulation (2002) 27 Australian Journal of Legal Philosophy 1


For Black, we have moved to a post-regulatory state where there is a hollowing out of the
state through the growth of decentred regulation79 by regulatory agencies, governments,
formal or informal associations, firms and individuals, who play other roles: professional
advisers, accreditors, auditors, non-governmental organisations, charities, voluntary organisations, and so
on.80 Concurrently, there is a thickening at the centre of government to improve its
powers to steer and control these decentralised institutions.81 Government encourages
hybrid regulation between state and non-state actors in both self- and co-regulatory
setups, and increasingly we see regulation in many rooms.82 Braithwaite83 argues traditional
top-down command and control governance84 is being replaced by new strategies for
regulating already private institutions through compliance systems, codes of practice and other self-
regulatory strategies.85 For Scott, these new approaches to regulation enable us to see
beyond law, i.e. to enrich our understanding of regulation when we have better tools to understand the
pervasiveness of non-state law and non-hierarchical control processes and their effects on regulatory
processes as they are more conventionally conceived.86

We argue the broadening of regulation to incorporate such activities, aims, institutions
and methods can quite comfortably accommodate a view of system designers as
regulators.87

Importantly, as stressed above, the government still has key oversight. As Black says,
governments now steer but do not row88 by latently setting regulatory agendas.89 The
state still maintains a powerful position due to hierarchy and law. If we take the example
of Ayres and Braithwaite's pyramids of responsive regulation, we see the hierarchical


79 Defined as at its simplest a counterpoint to the notion that regulation is something delivered by the
state through the use of legal rules backed by (often criminal) sanctions: command and control or CAC
regulation p61 Black 2007; Also extensive discussion in J Black Decentring Regulation: Understanding
the Role of Regulation and Self-regulation in a Post-Regulatory World (2001) Current Legal Problems 54(1)
103-146 hereinafter Black (2001) p106-122 for discussion of the rise of decentred regulation being driven
by factors such as complexity of interactions between systems and actors; fragmentation of knowledge;
fragmentation of power and control; autonomy of social actors; interactions and interdependencies; and
collapse of private public distinction; J Black 'Critical Reflections on Regulation' (2002) 27 Australian
Journal of Legal Philosophy 1-37
80 J Black Tensions in the Regulatory State (2007) Public Law 58-73 p61-62 hereinafter Black (2007)
81 Black (2007) p58 The rise of the meta-regulatory agency is also being accompanied by a thickening at

the centre: an expansion and enhancement of the role of the core executive, notably the Treasury and the
Cabinet Office, in monitoring and attempting to direct the activities of regulators to whom it nonetheless
wants to give a reasonable degree of operational autonomy
82 Black (2007) p63; For work on co-regulation see C Marsden Internet Co-Regulation: European Law,

Regulatory Governance and Legitimacy in Cyberspace (2011, Cambridge University Press) and C Marsden and I
Brown Regulating Code (2013 MIT Press)
83 Black (2001) for in-depth analysis of self regulation
84 Black (2001) p105 with command and control regulation, rightly or wrongly, the term is used to denote

all that can be bad about regulation: poorly targeted rules, rigidity, ossification, under or over enforcement,
unintended consequences
85 Braithwaite (2000) p225
86 Scott (2003)
87 L Urquhart and E Luger Smart Cities: Creative Compliance and the Rise of System designers as

Regulators (2015) Computers and Law 26(2) http://www.scl.org/site.aspx?i=ed42790


88 Black (2001) p126
89 e.g Scott 2003 - a post regulatory state might look beyond hybrid forms which loosen command based

legal control, but as with responsive regulation, retain it in at least some residual form such that ends are
ultimately set and determined by the sovereign state.; Black describes this as decentred regulation which
sees movement in the locus of the activity of regulating from the state to other, multiple, locations, and
the adoption on the part of the state of particular strategies of regulation.


utility of law, whether or not it is actually used or effective.90 Self-regulation and more
autonomous social ordering occur at the bottom of the pyramid but, following failures,
regulatory responses can be escalated up the pyramid to stricter legal oversight and
sanctions.

Despite weak, underfunded or fragmented regulatory agencies, regulators should
normally be at the bottom of the pyramid, speaking softly while carrying a big stick,91 warning
and advising, but threatening to prosecute, fine, or revoke licences if necessary.92 This
picture fits the UK data protection regulator, the ICO, which often issues high monetary fines to
councils, hospital trusts and telemarketing firms93 while also issuing extensive technical
guidance on data protection for organisations and the public.94

[Figures: Pyramid of Regulatory Strategies95 and Pyramid of Sanctions96]

An issue for system designers as regulators is their lack of authority to regulate.97
Traditional governments have detectors (means to pull in information) and effectors (means
to change the outside world)98 to draw on information and foster change.99 As
Hood and Margetts100 highlight, the nodal, central, authoritative position of government
to oversee, accumulate and control information, and its authority officially to demand,
forbid, guarantee, and adjudicate,101 are key regulatory tools. Taking this further, authority
is what separates government from other actors. Traditionally, legitimate authority is
derived from many factors, like democratic process (eg elections) or public accountability
(eg holding politicians to account), for the different arms of government: the legislature,
executive and judiciary.


90 Scott (2003) p12 One might say that responsive regulation brings the idea of law back into governance,
irrespective of whether law is actually invoked or actually perceived as a reason for cooperating with
regulators or making self regulation work
91 Braithwaite (2000)
92 Scott (2003) p13
93 ICO Website https://ico.org.uk/action-weve-taken/enforcement/
94 ICO Guidance for Organisations https://ico.org.uk/for-organisations/
95 Ayres and Braithwaite (1992) Figure 2.3 p39
96 Ayres and Braithwaite (1992) Figure 2.1 p35
97 Hood and Margetts (2007) p126-127
98 Hood and Margetts (2007) p3 & Figure 1.1
99 Hood and Margetts (2007) p5 but it is not enough simply to know what is going on. No control system
is worthy of the name unless it is capable of taking some action on the basis of that knowledge; it must
have some means of trying to adjust the state of the system to which it relates. Here we come to the
business end of government - a range of tools which vary from the gentlest of blandishments to
extremely blunt instruments
100 C Hood and H Z Margetts The Tools of Government in the Digital Age (New York: Palgrave MacMillan
2007) hereinafter Hood and Margetts (2007)


101 Hood and Margetts (2007) p5


However, for non-state actors like system designers, this authority does not exist. For example,
Yeung questions their political responsibility,102 where opaque rulemaking and limited
stakeholder input can undermine accountability,103 and the non-state regulatory agenda may
be in conflict with state values (e.g. valuing profit over the human rights of citizens).104

As Reidenberg noted nearly 20 years ago, technological capabilities and system design choices
impose rules on participants,105 and thus [t]he technical community, willingly or not, now has become
a policy community, and with policy influence comes public responsibility.106 However, the nature of
that public responsibility has yet to be settled. System designers are not currently
held to the same principles of public accountability, oversight or adherence to the rule of
law107 as government institutions are. Whilst responsible innovation frameworks and
codes of ethics have a role to play, they do not yet settle the question of designers'
responsibilities.108

Standards of legitimacy for regulation through design differ depending on whether it is state or non-state led.
Leenes argues state use is legitimate if there is respect for human rights, choice, transparency and
accountability.109 However, for non-state actors like system designers, the picture is more
complex,110 and where a legal obligation to follow rules may be lacking, different standards of
legitimacy are required.111 For non-state actors, contracts are often used to justify how
someone interacts with a technology. Yet these terms need to be communicated
transparently to the user because the norms may be legally null and void and hence not legally bind
individuals, yet as long as the norms remain embedded in the technology they in fact do regulate
behaviour.112

Taking another angle, we briefly unpack the responsibilities of designers to maintain users'
moral freedoms. Brownsword113 argues that using design for regulation114 impacts users'

102 Yeung (2014) p12 accountability usually involves not just the provision of information about
performance but also the possibility of debate, of questions by the forum and answers by the actor, and
eventually of judgment of the actor by the forum
103 Yeung (2014) p15 (drawing on Lessig's longstanding concerns about code as law, discussed below);
we would again disagree with that point: there are methods for obtaining individual input (user centred design etc)
104 Yeung (2014) p15
105 Reidenberg (1998) p544
106 Reidenberg (1998) p584
107 M Hildebrandt Legal Protection by Design: Objections and Refutations (2011) Legisprudence 223-248
108 IEEE Code of Ethics s1; ACM Code of Ethics and Professional Conduct s1.1; Stilgoe et al
Developing a Framework for Responsible Innovation (2013) Research Policy 1568-1580


109 Leenes (2011) p159
110 R Leenes Framing Techno-Regulation: An Exploration of State and Non-State Regulation by
Technology (2011) Legisprudence Vol 5:2 143-169 at 143 hereinafter Leenes (2011)
111 Leenes (2011) p168
112 Leenes (2011) p168
113 R Brownsword What the World Needs Now: Techno-Regulation, Human Rights and Human
Dignity in R Brownsword (ed) Human Rights: Global Governance and the Quest for Justice (Oxford: Hart Publishing
2004) hereinafter Brownsword (2004); he has multiple other works in this domain: R Brownsword Code,
Control and Choice: Why East is East and West is West (2005) Legal Studies Vol 25 No 1 hereinafter
Brownsword (2005); R Brownsword Neither East Nor West: Is Mid-West Best? (2006) 3:1 SCRIPTed
hereinafter Brownsword (2006); R Brownsword So What Does the World Need Now? Reflections on
Regulating Technologies in Yeung and Brownsword (2008) hereinafter Brownsword (2008), see discussion
of state stewardship p45-47; R Brownsword Rights, Regulation and the Technological Revolution (Oxford:
OUP 2008)
114 He uses the term techno-regulation where regulators, having identified a desired pattern of behaviour
(whether morally compliant or not), secure that pattern of behaviour by designing out any option of non-
conforming behaviour... where techno-regulation is perfectly instantiated, there is no need for either


empowerment and their ability to make choices.115 Removing the ability of users to
choose to obey or disobey a rule has knock-on effects. When the rules shaping behaviour
are hidden from reflection or awareness, and the ability to justify, explain or challenge them is
removed, moral freedom is impacted. Importantly, if the system only ever lets users do
the right thing, it could remove the notion of good and bad.116

With regulation through design, rules can be perfectly enforced and non-compliance may
not even be possible. As Leenes states, where norms are embedded into technology,
sanctions do not exist; instead, enforcement and sanction coincide.117 This is in contrast to
traditional legal approaches to regulation, where enforcement often involves costly civil
litigation or criminal prosecutions.

For Yeung, regulation using design in this way means individuals are no longer being
called upon to offer an explanation of our reasons for action118 and technologies are enforcing
behaviours without relying on moral reflection.119 This is in contrast to traditional
regulation because, as Leenes has noted, citizens have to know that their behaviour is regulated
and how it is regulated,120 but unless the regulator's purposes are transparent, there can be no
meaningful debate about the acceptability of the measures taken.121 At a minimum, then, system
designers have a responsibility to provide some transparency to users about how their
systems regulate their behaviours. This lets users at least know they are being
regulated in some capacity.

In terms of retaining moral reflection by users, Yeung argues for allowing individuals to
make choices and exercise their judgement and agency.122 Thus, a combination of
showing individuals how they are being regulated and letting them choose whether they accept
this goes some way towards legitimising designers as regulators.


correction or enforcement. For more on techno-regulation see R Brownsword Rights, Regulation and the
Technological Revolution (2008 Oxford University Press, Oxford) p247; for a fantastic overview of the issues
see R Leenes Framing Techno-Regulation: An Exploration of State and Non-State Regulation by
Technology (2012) Legisprudence 5(2); L Bennett Moses How to Think About Law, Regulation and
Technology: Problems with Technology as A Regulatory Target (2013) 5(1) Law, Innovation and
Technology 1-20
115 He frames this in terms of human dignity, i.e. Brownsword (2004) p211 one's capacity for making one's
own choices should be recognised; that the choices one makes should be respected; and that the need for a supportive context for
autonomous decision making (and action) should be appreciated and acted upon
116 Brownsword (2006) - A Clockwork Orange and the Ludovico Technique
117 Leenes (2012) p147 and p168
118 Yeung (2011) p5; The basic responsibility which underpins our moral freedom is being called upon to
offer an explanation of our reasons for action.


119 Yeung uses a definition of morality as an informal public system applying to all rational persons, governing
behavior that affects others and has the lessening of evil or harm as its goal from Eshleman Moral Responsibility in the Stanford
Encyclopedia of Philosophy; Yeung (2011) p12 the importance of moral responsibility derives from that of basic
responsibility and to this extent, it can be understood as a mark of basic responsibility... a morally responsible agent is one
who acts on the basis of moral reasons, that is, reasons for action that pertain to the interest of others; see p24 Yeung
120 Leenes (2012) p159
121 The problem is twofold: 1) regulated subjects don't know the difference between right and wrong, or that they are
regulated towards right; 2) if there is no option to pick the wrong path, what is the merit in picking the right
one? Brownsword (2005) p14; this relates to his broader arguments around the value of morality and
giving moral agents choices (see Brownsword (2005) discussion p17 to 20)
122 She gives the interesting example of the Drachten experiment (a town without traffic lights) where design was
changed but still required agents to exercise judgment instead of removing it
(http://www.telegraph.co.uk/news/uknews/1533248/Is-this-the-end-of-the-road-for-traffic-lights.html)


We conclude this section by noting that definitions of regulation are becoming broader, with
a shift away from thinking just in terms of the state and towards incorporating more
actors (including non-state actors), broader goals (including non-public-policy goals) and
loosely defined approaches (including but not limited to standard setting or information
gathering). Regulation is now much more fragmented, pluralistic, and reliant on non-state
actors acting as regulators. We argue that these shifts accommodate a view of system
designers acting as regulators. How these regulators are held to account is not yet settled,
but transparency, accountability and choice for users are important elements.

Importantly, we are not arguing for self-ordering. The state retains a key role and needs
to guide and act as necessary, for example protecting fundamental rights such as privacy.
Whilst the state is reassigning central functions, it is strengthening control through
agenda setting, retaining core central authority/hierarchy and deploying meta-regulation
at a distance. By situating the role of designers within this view of regulation, we can
unpack where they may fit in. We now look in more detail at underlying theory on
how technology design is currently framed within regulation.

Beyond Code is Law

Historian Kranzberg stated back in 1986 that technology is neither good nor bad; nor is it neutral,123
and political theorist Winner argued back in 1980 that technology in a true sense is legislation,
recognising that technical forms do, to a large extent, shape the basic pattern and content of human activity in our
time.124 Similarly, legal scholars have long recognised the importance of
technology in shaping behaviour, and its use in regulation.125 Lessig's126 pioneering
aphorism code is law127 has become a central tenet128 in technology regulation circles,
recognising the regulatory power of code and those who create it. The underlying
argument is that four interdependent modalities (see diagram)129 are involved in regulating the
liberty and behaviour of individuals online: market forces, social norms, law130 and


123 M Kranzberg Technology and History: Kranzbergs Laws (1986) Technology and Culture 27
124 L. Winner, Do Artifacts Have Politics? (1980) 109(1) Daedalus 121.
125 J. Reidenberg 'Lex Informatica' (1998) 76 Texas Law Review 3 The growth of Lex Informatica, the system
of rules largely defined by the technical community, which govern online behaviour (see table on p569 for
its features), based on the analogy of Lex Mercatoria, which emerged as a body of law from traders and
governed commercial fairness in transactions.
126 L Lessig Code and Other Laws of Cyberspace (1999) hereinafter Lessig (1999); L Lessig Code Version 2.0
(New York: Basic Books 2006) hereinafter Lessig (2006); see also L Lessig The New Chicago School
(1998) 27 The Journal of Legal Studies.
127 From Lessig (1999): In real space we recognize how laws regulate through constitutions, statutes, and other legal codes.
In cyberspace we must understand how code regulates how the software and hardware that make cyberspace what it is regulate
cyberspace as it is. As William Mitchell puts it, this code is cyberspace's law. Code is Law. Ref to WJ Mitchell, City of
Bits: Space, Place, and the Infobahn (The MIT Press, Cambridge MA 1995)
128 There is a broader debate, as the Internet emerged in the mid 1990s, between the cyber-libertarians,
typified by the position of Perry Barlow, Johnson and Post, and the cyber-paternalists of Lessig, along with
Reidenberg, Wu, Goldsmith and others; see D. Johnson and D. Post 'The Rise of Law in Cyberspace' 48
Stanford Law Review (1995-96) at 1367-1402, see p1370-1375 and p1400-1402; D. Post 'Against 'Against
Cyber-anarchy'' 17 Berkeley Technology Law Journal (2002) 1365-1387; J. Goldsmith and T. Wu, Who Controls
the Internet? Illusions of a Borderless World (New York, OUP, 2008); J. Goldsmith 'The Internet and the
Abiding Significance of Territorial Sovereignty' 5 Ind. J. Global Legal Studies 475 (1998); J Perry Barlow A
Declaration of the Independence of Cyberspace (1996)
129 Lessig (2006) p123
130 Lessig (2006) p72-74 East Coast Code, in reference to the legislature/executive/judiciary being on the
US East Coast.


architecture.131 Importantly, citizens are not given agency in Lessig's model, unlike in
Murray's, which we consider below.

Architecture, or code, as the man-made system of hardware and software, has no inherent or
immutable values/features and thus can be redesigned to different ends. As he says, technology
is plastic. It can be remade to do things differently.132 Code is law expresses the importance of how
technology is built and how code can be functionally comparable to law as a regulatory
mechanism.133

How these modalities interact is of great importance. With law acting on code, Lessig is
concerned about the scope for creating perfect regulation, where control and state power
over individuals can be realised through code enforcing legal norms.134 For Lessig, Code is an
efficient means of regulation. But its perfection makes it something different. One obeys these laws as code
not because one should; one obeys these laws as code because one can do nothing else.135 Equally, he is
wary of the values being embedded into code by industry, and the disconnect in purpose
between industry and democratic values.136 As Lessig said, if code is a lawmaker, then it should
embrace the values of a particular kind of lawmaking.137

More poetically: as the world is now, code writers are increasingly lawmakers. They determine what
the defaults of the Internet will be; whether privacy will be protected; the degree to which anonymity will be
allowed; the extent to which access will be guaranteed... How the code regulates, who the code writers are,
and who controls the code writers: these are questions on which any practice of justice must focus in the
age of cyberspace. The answers reveal how cyberspace is regulated: its regulation is its code, and its code
is changing.138 This could just as easily be applied to the Internet of Things as to
cyberspace.

Given the influence of his work, many others have built on Lessig's ideas. Zittrain, for
example, focused on the impact of code on generativity.139 His140 argument is that the
original Internet architecture (PCs and end-to-end infrastructure) has been highly
generative in allowing users to tinker and create positive new innovations.141 However,
the counterpoint is that generativity also enables negative innovations such as malware.142
Accordingly, the response can be to change the code to reduce functionality, control
what users can do and ultimately create less generative spaces of tethered appliances and

131 West Coast Code, in reference to Silicon Valley being on the US West Coast.
132 See is-ism fallacy p32-37
133 Lessig (2006) p5 they carry different internal rationale for; law reflects societal interests, code does not

expressly.
134 More scenarios p74
135 L Lessig The Zones of Cyberspace (1996) 48 Stanford Law Review 1403 at 1408
136 Lessig (2006) p40-60
137 Lessig (1999) p224
138 Lessig (2006) p79
139 J Zittrain The Future of the Internet and How to Stop It (Penguin Books: 2009) hereinafter Zittrain (2009) p70
A system's capacity to produce unanticipated change through unfiltered contributions from broad and
varied audiences
140 J. Zittrain ' The Generative Internet' Harvard Law Review 119 (2006) p1974
141 Zittrain (2009) p30
142 Zittrain (2009) p54-57


walled gardens to order and retain control over a user's behaviour.143 These controlled
environments protect users but reduce their freedoms; hence his work highlights
another facet of the trade-off in regulating through code.144 The need to question the
values underpinning design is increasingly important. How might Zittrain's concerns
play out in the domain of data-driven domestic IoT, where the new walled gardens and
tethered appliances that mediate human interactions are not just our laptops, mobile
devices or set top boxes, but devices to manage temperature or home security, embedded
in our daily routines?

Another attempt to extend Lessig's work is Murray and Scott's four modalities of
control. Their modalities are broader than Lessig's,145 opting for hierarchy (law),
competition (market), community (norms) and design (code).146 Hierarchy, as opposed to
law, reflects the broader bases providing authority for control;147 community, instead of
norms,148 reflects group-based, socially driven control mechanisms (eg fear of ostracisation
or disapproval as a behavioural stimulus);149 competition, not market, reflects how rivalry
and competition provide a form of control in environments where there is no identifiable market; for
code, they prefer design, as it is broader.

Murray's network communitarianism150 takes the model in a different direction.151 In
contrast to Lessig, citizens in Murray's model are not pathetic dots152 passively
controlled by law, code, market and norms. Instead they are active participants, with
agency, in a networked environment where actions to and by them affect others in that
environment.153 These dots form a community, which determines whether or not a regulatory
intervention is successful or if it fails.154


143 Zittrain (2009) p59-60; see also Zittrain in Yeung and Brownsword (2008)
144 Zittrain's work has not been actively extended to an IoT context, but the implications of locked-down
functionality for embedded, networked technology are even greater than in the web-based world. Sales and
service contracts control our interactions with these technologies, impacting our agency.
145 A Murray and C Scott Controlling the New Media: Hybrid Responses to New Forms of Power
(2002) Modern Law Review 65:4 491-516 hereinafter Murray and Scott (2002). Systems/cybernetics approach:
these modalities consider the notion of a control system, which sets standards, gathers information and
modifies behaviour, see p493
146 See C Hood Control over Bureaucracy: Cultural Theory and Institutional Variety (1996) 15 Journal of
Public Policy 207, considering variety in how risk regulating regimes work and how the regime analysis makes it
transparent that the various functions which contribute to viable control systems can be widely dispersed among state and non-
state actors, even within a single regime, and can be assembled in mixed or hybrid forms; Murray and Scott (2002)
p502
147 The richer conception of hierarchy looks to the form of control rather than the source and thus
includes non-state but hierarchical actors
148 Which they advocate as being the generic term for standards, guidelines and legal and non-legal rules,
following P Drahos and J Braithwaite Global Business Regulation (Oxford: OUP 2000) p20
149 Murray and Scott (2002) p503
150 One aspect focuses on how the environment can be designed in a different manner to the physical
world, so-called Socio-Technological-Legal Theory - In the STL model we can exploit regulatory settlements
which design the environment. Understanding that regulatory discourse may include technology is another step in understanding
regulation p300;
151 A Murray Conceptualising the Post-Regulatory (Cyber) State in R Brownsword and K Yeung (Eds)
Regulating Technologies: Legal Futures, Regulatory Frames and Technological Fixes (Oxford: Hart Publishing 2008)
hereinafter Yeung and Brownsword (2008)
152 Lessig (2006)
153 Murray (2008) p301
154 Murray (2008) p302


Top-down, state-centric, command and control regulation is not enough. Instead, Murray
argues, regulation must understand and mimic changes in the community,155 using a
symbiotic approach. Such symbiotic regulation requires looking at how nodes156 of the
system communicate with other sub-systems.157 Information about communications
between nodes can be used by regulators in designing and targeting an intervention; in
monitoring the response of nodes (positive or not); and in altering the intervention based
on feedback from nodes.158

To put this another way, Murray, whilst still very much in the cybernetic tradition,
advocates incorporating agency into the mix, with regulation responding to the actions of
citizens. Seeing regulation as something that has to engage with citizens, who have
interests, needs and reactions, is a step in the right direction.

We conclude this section by extending Murray's recognition of the importance of user
agency in regulation further. Moving towards regulation by design, we recognise system
designers have an active role in shaping new technologies, and their methods for
understanding the interests of users are key. Accordingly, we need to move past the aphorism
of 'code is law' and abstracted systems approaches to framing the relations between
humans, technology and regulation (eg code, norms, market, regulation). If technology
design is truly a forum for regulation we need to unpack the detail and look to a more
contextual, interactional level.

Regulation theory needs a turn towards developing a rich, situated understanding of the
user in context, and learning how regulatory concerns play out in these settings. As we
will see below, HCI has developed a range of tools built for this purpose of
understanding users in everyday life. By seeking to understand the complexities of daily life
in order to design better systems, the focus is on understanding how people really interact
with technologies. System designers can expose the daily values of family life, politics in the
home, the nature of domestic social dynamics, family conflict management and how
technology may disrupt the established order in the home. The same tools used by system
designers to inform them of the social context they are designing for can also be
repurposed to consider regulatory issues.

However, currently these tools are not focused on engaging with the regulatory concerns of
users, like parental worries over who has access to audio data from a smart baby monitor in
the child's bedroom, or whether an insurance company can see historic BMI data from smart

155 Murray (2008) p309 Use contemporary modelling techniques to predict where tensions will arise within the regulatory matrix and to design a regulatory intervention to avoid such tensions and to instead harness the natural communications flows within the matrix: in other words to mimic organic regulatory developments; See A Murray The Regulation of Cyberspace: Control in the Online Environment (Abingdon: Routledge Cavendish 2006) Chapter 8
156 Murray (2008) p311 Communication in autopoietic systems is not a process directed by the actions of individuals but is rather a system in which they act as the nodes temporarily located within the communication. People are unable to alter the course of communications as they have formed a self-referential loop within which actors play their part rather than write it. In this way, social systems effectively have a life of their own that gives direction to the thought and activity of individuals
157 Murray (2008) p311 The system maintains its form by only incorporating information from communications of other sub-structures that pertain to itself. Through self-reference, and the memory of previous selections, a subsystem focuses on only specific communications as among the possible social connections there are only a few that are relevant or compatible with its identity. Functionally differentiated subsystems within the social systems are thereby concerned and can only be concerned with communications that are relevant to their functioning, autonomous of one another. Thereby communicative acts effectively say nothing about the world that is not classified by the communication itself.
158 Murray (2008) p309-315 Murray draws on Luhmann; J Mingers Self-Producing Systems: Implications and Applications of Autopoiesis, Contemporary Systems Thinking (New York: Plenum 1995); J Forrester Industrial Dynamics (Waltham MA: Pegasus Communications 1961) etc


bathroom scales. Understanding the home setting, the associated human values and the
impact of a technology is necessary to realise this notion of regulation by design,
especially for PbD and domestic IoT.

Now that regulation has moved from top-down, state-centric models to non-hierarchical,
polycentric models, we need to draw on this rich understanding of the social. It is no
longer enough for regulatory approaches to exist purely at the abstracted level of
cybernetic modalities. Regulation by design through technology brings in a new set of
challenges. Equally, it brings in a new set of regulators, but luckily they might have
suitable tools for the job. The complex part is how to begin the process of repurposing
these tools. We turn to two frameworks that could be useful candidates: participatory
design and value led approaches.

Part III: Human Computer Interaction and Design



HCI as Theoretically Porous

Firstly, we look at the boundaries between regulation/law and design/HCI to see how
fuzzy they are. Accordingly, we consider to what extent HCI as a discipline has already
engaged with regulatory issues.

As a discipline, Rogers argues HCI has incorporated and borrowed theory from many
other areas as it has grown.159 Cognitive and ecological approaches;160 activity theory;161
external cognition;162 distributed cognition;163 situated action;164 and ethnomethodology
have all been fruitful resources.165

She also notes it takes time for theory to transfer to practice, and many theories are very
hard to apply in practice, e.g. activity theory.166 Furthermore, theory often has an indirect
role because system designers have their own practical approaches, and thus the value of
theory-informed approaches must be seen in relation to current design practice.167 Dialogue between
communities, insight into supporting design practice and more innovative knowledge
transfer through developing a common vocabulary are necessary. Metaphors like Star's

159 Y Rogers New Theoretical Approaches for Human Computer Interaction Annual Review of Information Science and Technology 38:1 (2005) 87-143, p88; cognitive sciences were an early resource (see discussion p91-98)
160 Gibson The Ecological Approach to Visual Perception (Boston: Houghton Mifflin 1979) for articulating certain properties about objects at the interface in terms of their behaviour and appearance Rogers (2005) p105
161 Analytic theory that originally looked to explain cultural practices (eg work, school) in the developmental and historical context in which they occurred, by describing them in terms of activities Rogers (2005) p103; see S Bødker A Human Activity Approach to User Interfaces (1989) Human Computer Interaction 4(3) 171-195; A N Leontiev Activity, Consciousness and Personality (Upper Saddle River: Prentice Hall 1978)
162 See Rogers (2005) p106-112
163 See E Hutchins Cognition in the Wild (Cambridge: MIT Press 1995) looking at socio-technical systems broadly, including agents and environment interaction, eg in air traffic control; see Rogers (2005) p112-115
164 L Suchman (1987) p71 considering accounts of relations among people, and between people and the historically and culturally constituted worlds they inhabit; see Rogers (2005) p115-118
165 Garfinkel (1967); R Anderson Representations and Requirements: The Value of Ethnography in System Design Human Computer Interaction 9, 151-182 (1994)
166 Rogers (2005) p131
167 Rogers (2005) p129


boundary objects168 and pattern languages are means Rogers suggests for theories to
gain practical utility.169 Consequently, fuzzier boundaries between law and HCI need
lightweight theories, the development of a common language and an understanding of the
practices of designers. We are working towards this in our arguments presented here.

HCI as a discipline is also undergoing an expansion through an epochal shift of sorts.

Back in 2006,170 Bødker asserted we are now in the third wave of HCI.171 Second
wave theory considered work settings and interaction within well-established communities of
practice. Situated action, distributed cognition and activity theory were important sources of theoretical
reflection, and concepts like context came into focus of analysis and design of human computer
interaction.172 However, in the third wave, whilst these second wave concepts and issues
persist, the use context and application types are broadened, and intermixed. Computers are
increasingly being used in the private and public spheres. Technology spreads from the workplace to our
homes and everyday lives and culture. New elements of human life are included in the human computer
interaction such as culture, emotion and experience, and the focus of the third wave, to some extent, seems
to be defined in terms of what the second wave is not: non-work, non-purposeful, non-rational...173
Much like the broadening of regulation as a practice, HCI is similarly growing.

We also find Harrison, Sengers and Tatar's notion of the third paradigm useful here.174
They argue a range of epistemological perspectives within HCI, like critical design,
participatory design and value sensitive design, cumulatively form a single epistemological
framework, which treats interaction as a form of embodied meaning making in which the artefact, its
context and its study are actually defining and subject to multiple interpretations.175 Importantly,
Sengers et al argue that critical reflection is a central part of HCI, and technology design
practices should support both system designers and users in ongoing critical reflection about technology and
its relationship to human life.176
They use critical in the philosophical, critical theory sense177 of bringing unconscious aspects
of experience to conscious awareness, thereby making them available for conscious choice,178 thus there
is a role for highlighting and questioning the assumptions, ideologies and beliefs of design.
For them, system designers need to holistically reflect on their role and support users'
reflection.179 Similarly, Grimpe et al argue reflexivity of designers (i.e. the ongoing


168 G Bowker and SL Star Sorting Things Out: Classification and Its Consequences (Cambridge MA: MIT Press 1999)
169 C Alexander The Timeless Way of Building (1979)
170 S Bødker Third Wave HCI, 10 years later Participation and Sharing ACM Interactions 22(5) 24-31, p30
171 S Bødker When Second Wave HCI Meets Third Wave Challenges NORDICHI 2006; See J Grudin The Computer Reaches Out: The Historical Continuity of Interface Design CHI 1990 261-268, on the historical trajectory of computing leading to the emergence of the user interface
172 S Bødker When Second Wave HCI Meets Third Wave Challenges NORDICHI 2006
173 p1-2
174 S Harrison, P Sengers, D Tatar Making Epistemological Trouble: Third Paradigm HCI as Successor Science (2011) Interacting With Computers 385-392
175 p385
176 p50 for those concerned about the social implications of the technologies we build, reflection itself should be a core technical design outcome for HCI
177 Their definition is at p50: Critical theory argues that our everyday values, practices, perspectives, and sense of agency and self are strongly shaped by forces and agendas of which we are normally unaware, such as the politics of race, gender, and economics. (eg Frankfurt School, Marxism, Feminism, Gender Studies, Media Studies etc)
178 Sengers et al (2005) p50
179 p53-55; The author ran a workshop at CHIPlay 2015 called Interdisciplinary Reflections on Games and Human Values which draws on the critical approach outlined above; see D Darzentas and L Urquhart DIGIMINT Interdisciplinary Reflections on Games and Human Values CHIPlay 2015


situated process of reflection on position, knowledge, impact etc) is a key mechanism in
fostering responsibility during development of new technologies.180

We'd argue for adding a regulatory interpretation to the mix. This would require
designers to become more reflective on their role as regulators.181 As Sengers et al state,
A critical designer designs objects not to do what users want and value, but to introduce both system
designers and users to new ways of looking at the world and the role that designed objects can play for
them in it.182 If critical reflection by designers is part of their practice, this should extend to
the regulatory implications of their work. During this transition in HCI to engagement with
assumptions and deeper epistemological questions, we'd argue now is the time to reflect
on how knowledge from the traditional regulatory community may, or may not, influence
the field. This reflection process should not be a one-off event, and to some extent we
already see reflection leading to practical engagement with social issues and public policy,
as we discuss below.

Engagement with regulatory issues by designers

Back in 1990 Shneiderman called on the influence, moral leadership and responsibility
for the technological future of researchers, system designers, managers, implementers, testers and
trainers of user interfaces and information systems. 183 His focus was on finding ways to enable
users to accomplish their personal and organisational goals whilst pursuing higher societal goals and
serving human needs.184

The HCI community has actively engaged with broader societal concerns over the years,
from development,185 world peace186 and activism187 to universal usability,188 accessibility189
and privacy.190 Hochheiser and Lazar's191 analysis of the field frames engagement in social
issues from the HCI community as influences prompting responses.192 For example,
influences could be accessibility requirements (evaluation and design of interfaces for a
spectrum of disabilities, both physical and cognitive);193 or universal usability (designing for

180 B Grimpe, M Hartswood, M Jirotka Towards a Closer Dialogue Between Policy and Practice: Responsible Design in HCI (2014) CHI 14
181 For a recent consideration of the term see J Pierce et al Expanding and Refining Design and Criticality in HCI (2015) CHI 15 http://dl.acm.org/citation.cfm?doid=2702123.2702438
182 P Sengers, K Boehner, and J Kaye Reflective Design (2005) 4th Decennial Conference on Critical Computing, 49-58, p50
183 B Shneiderman Human Values and the Future of Technology: A Declaration of Empowerment ACM SIGCAS Conference (1990)
184 Shneiderman (1990) My concern is on how users are empowered by new technologies, how they apply their growing power, and the choices that researchers and developers can make to influence user interfaces. I believe that we can choose to build a future in which computer users experience competence, clarity, control and comfort and feelings of mastery and accomplishment (p2)
185 S Dray et al Human Computer Interaction for Development: Changing HCI to Change the World in J Jacko The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications Third Edition (CRC Press 2012) p1369-1394
186 http://www.hciforpeace.org/
187 Busse et al CHI at the Barricades: An Activist Agenda? (2013) CHI 13
188 B Shneiderman Universal Usability (2000) CACM 43(5), 84-91
189 J Lazar
190 V Bellotti and A Sellen Design for Privacy in Ubiquitous Computing Environments (1993) ECSCW 93 77-92
191 H Hochheiser and J Lazar HCI and Societal Issues: A Framework for Engagement (2010) International Journal of Human Computer Interaction, hereinafter H&L (2010); they look at social, political, ethical and societal issues, particularly privacy, accessibility, universal design and voting usability.
192 They do provide a rather broad reaching summary, so for space we can only paint the big picture.
193 H&L (2010) p342


diverse users, technologies, levels of understanding, cultural contexts). Responses could
be design and development (building solutions to problems); evaluation (across different
domains to learn lessons, eg usable privacy and security); models and theories (like
participatory design, value sensitive design, participatory action research);194 reports,
testimony and related public policy activities (working with policymakers and institutions);
and calls to action (prompting the community to make change).

We concur with their argument that the HCI community can play a constructive role in
responding to concerns and questions raised by policymakers, citizens and other stakeholders. However,
proactive engagement aimed at addressing concerns before technologies are widely developed and
implemented will arguably have greater impact.195 Indeed, this is the premise of privacy by
design approaches.
However, as with privacy by design, implementation does not follow an easy path.
Despite willingness,196 the community may not be fully prepared to engage.

Firstly, Shneiderman argues keeping designers' interest is complex, as hard-core computing
professionals often have little patience with grand social visions. To capture their hearts and minds
requires practical and realisable steps. This expectation is legitimate and even helpful.197 Secondly,
engaging in political, normative discussions and evidence-based policymaking can court
controversy, something designers aren't used to. As Flanagan, Howe and Nissenbaum
argue, anyone can be political: the question is whether it is in one's capacity as a designer that one is or
is not. We hold not only that system designers are system designers but that it is their duty as good system
designers to embrace this dimension of their work, even if they are not able to prevail against the tide of
countervailing forces.198 Separately, Nissenbaum has gone as far as arguing for engineering
activism, as to ignore such implications of work is untenable because systems and devices
will embody values whether or not we intend or want them to. Ignoring values risks surrendering the
determination of this important dimension to chance or some other force.199 Lastly, Nissenbaum also
highlights epistemic challenges of unfamiliar terms, approaches and theories, meaning
even system designers who support the principle of integrating values into systems are likely to have
trouble applying standard design methodologies, honed for the purpose of meeting functional requirements,
to the unfamiliar turf of values.200 We return to concerns around values below.

For now, we argue that if the HCI community can come to terms with these issues, they
have much to offer. Hochheiser and Lazar argue more contentious issues might involve wading
into charged debates, but appropriate empirical studies and innovative designs can play a constructive role
in providing guidance to policymakers,201 and separately Lazar concludes, our research, our


194 H&L (2010) s3.3
195 H&L (2010) p340
196 J Lazar et al HCI Public Policy Activities in 2012: A 10 Country Discussion Interactions May and June 2012, 78-81; Back in 1999 there was a panel discussing the role of SIGCHI in shaping public policy, debating whether CHI should align with USACM attempts, or work separately; see J Johnson et al SIGCHI's Role in Influencing Technology Policy SIGCHI (1999) 349. Longstanding ACM Interactions column http://interactions.acm.org/archive/view/september-october-2015/public-policy-and-hci
197 Shneiderman (1990) p5; p4 in short, expanding our philosophical horizon can lead to better science
198 See Flanagan, Howe and Nissenbaum Conclusions
199 p119
200 p lxvi
201 Section 4; they also state unpopular stands on controversial topics may involve risks to research funding and career advancement. Professional standards and conduct can help in these circumstances, as analyses grounded in successful research frameworks, experience in design and implementation and empirical analyses can provide a solid scientific basis for participating in policy discussions. (Section 6)


data and our design expertise to help drive public policy: those things should drive policy, not politics or
public opinion.202

The willingness to engage with the kind of issues regulators deal with is a huge step in
the right direction. But we argue that system designers themselves are now regulators;
hence, instead of just supporting or engaging with traditional public policymakers, we
want to understand how they can be guided in their role as regulators.

Accordingly, we look to the role of values in HCI and design, and unpack models that
allow system designers to engage with these values. Our argument is for the extension of
these concepts to reflect values underpinning law and regulation, for example those raised
in Part I: transparency, accountability and choice.

The Value Led Approaches

In this section we consider existing work on human values in design, before looking at
the framework of value sensitive design, and our inclusion of legal values.

Cockton argues for the centrality of value in HCI, where it is a binding and motivating
element in the design process, and where a designer's job should be to understand what is valued
by a system's stakeholders and support them in delivering this value.203 Such value is not just
monetary but may be political, personal, organisational, cultural, experiential or spiritual.

More specifically, for the kind of values we are interested in, Nissenbaum204 argues
technology should not just focus on instrumental values like effectiveness, efficiency,
safety, reliability and ease of use, but also substantive values like liberty, justice, privacy,
security, friendship, comfort, trust, autonomy and transparency.205 She argues that
engineers/scientists need to look beyond technical concerns to consider social, ethical
and political criteria, and, equally, humanists/social scientists now have to go beyond
theory and consider intricate technical details, and how they interact with
values.206 Knobel and Bowker state the values of designers should not be imposed on
others, as the key features of values is that different people hold different values, and often hold to those
values very strongly.207 Similarly, Muller et al208 argue that instead of the dominance of values
like productivity and efficiency as metrics of success, alternatives based on human needs, like
impacts on health, quality of life or user empowerment, should be considered.209


202 p81
203 G Cockton Value-Centred HCI (2004) NORDICHI 04 p155
204 H Nissenbaum Values in Technical Design in C Mitcham Encyclopaedia of Science, Technology and Ethics (New York: MacMillan 2005) lxvi-lxx; See also C Knobel and G Bowker Computing Ethics: Values in Design Communications of the ACM (2011) 54:7; Nissenbaum Values in Design Council Bibliography - http://www.nyu.edu/projects/nissenbaum/vid/library.html
205 Embody or at least not undermine such values - at lxvi
206 H Nissenbaum How Computer Systems Embody Values Computer March (2001) 118-120
207 C Knobel and G Bowker Computing Ethics: Values in Design (2011) CACM 54:7 26-28
208 M Muller, C Wharton, W McIver and L Laux Toward an HCI Research and Practice Agenda Based on Human Needs and Social Responsibility (1997) Proceedings of SIGCHI 1997, Atlanta GA, 155-160
209 p158


More holistically, Sellen, Rogers, Harper and Rodden210 outline an influential future
vision of HCI211 where they define three practical ways for HCI to engage with human
values. Firstly, folding human values into the design and research cycle by bringing values into
studying, designing, building and evaluating technology with users, in addition to an explicit
preliminary step of understanding to focus on choosing the values being designed for.212
Secondly, new cross-disciplinary partnerships are needed because the nature of values is
too much for HCI alone to deal with.213 As they say, HCI professionals need to engage in
discourses that may at one time have seemed distant, if not entirely alien to them.214 Lastly, they
advocate redefining the H, C and I in HCI. With (H)uman, definitions of users need to be
broadened as computers are now used by many for a multitude of reasons beyond just
improving workspaces. With (C)omputers, a physical-digital ecosystem now exists where
technologies are embedded in our daily lives. With (I)nteraction, the spaces and forms of
interaction have changed vastly, thus so must the meaning of the term.215 Development
of a common language between researchers is important, as is considering users more
holistically. Therefore, HCI must also take into account the truly human element, conceptualising
users as embodied individuals who have desires and concerns and who function within a social, economic
and political ecology.216 We argue this should be extended to the legal ecology.

Similarly, Flanagan, Howe and Nissenbaum217 have proposed four practical
considerations for getting values into design. Discovery requires listing the values involved
in a project; translation looks to practically operationalise such concepts for design;
resolution looks to use philosophy to address conflicts between values; and lastly verification
checks if values have actually been included adequately in the final design. This
methodology is to supplement normal design methods, improving the moral credentials
of designs and putting values alongside functional efficiency, reliability, robustness,
elegance, and safety.218

However, one of the most well-developed approaches to thinking about how to
practically incorporate values into design is value sensitive design (VSD). For Friedman
et al, VSD is a theoretically grounded approach to the design of technology that accounts for human

210 A Sellen, Y Rogers, R Harper and T Rodden Reflecting Human Values in the Digital Age (2009) Communications of the ACM 52(3) 58-66; R Harper, T Rodden, Y Rogers and A Sellen Being Human: Human Computer Interaction in the Year 2020 (Microsoft Research: Cambridge 2008)
211 The HCI of today is exploring diverse new areas beyond the workplace, including the role of technology in home life and education even delving into such diverse areas as play, spirituality and sexuality. HCI is now more multidisciplinary than ever, with a significant percentage of the community coming from the design world. This shift has caused the field's practitioners to think more broadly about their design goals, taking into account not just how technology might be functional or useful but also how it might provoke, engage, disturb or delight p60
212 p64-65; Key here is that the analysis should not just take into account people's interactions with computer technology but also with the environment, with everyday objects, with other human beings and with the changing landscape that the new tech brings into the world p65
213 p65
214 p65; hence why we now have work like this thesis analysing how law can inform design
215 for example, interactions on and in the body, interactions between bodies, interactions between bodies and objects and interactions at the scale of kiosks, rooms, buildings, streets and other public spaces p66; A Greenfield Everyware: The Dawning Age of Ubiquitous Computing (Peachpit Press: New York 2008) on sites of interaction
216 p66
217 For a more detailed version of these arguments, see M Flanagan, D Howe and H Nissenbaum Embodying Values in Technology: Theory and Practice in J Van Den Hoven and J Weckert Information Technology and Moral Philosophy (Cambridge: CUP 2008) hereinafter F, H and N (2008)
218 F, H and N (2008) p329-330


values in a principled and comprehensive manner throughout the design process.219 By comparison,
Van Den Hoven defines it as a way of engaging ICT that aims at making moral values part of
technological design, research and development. It assumes that human values, norms, moral
considerations can be imparted to the things we make and use.220

In terms of what values actually are, for Friedman et al they are what a person or group of
people consider important in life, so-called values with ethical import. These values centre
on human well-being, human dignity, justice, welfare and human rights.221

The practicalities of doing value sensitive design are intricate, but broadly involve
establishing risks, benefits and costs; finding direct and indirect stakeholders; and looking at
value conflicts and their resolution in a specific context.222 VSD uses conceptual,
empirical and technical investigations in an iterative way to unpack how values are
involved in a system.223 Conceptual work looks more abstractly at philosophical questions
of establishing values, balancing competing values and potential impacts of the system;224
empirical work grounds analysis with specific examples, looking at individuals, groups, or
larger social systems that configure, use or are otherwise affected by the technology;
whereas technical work examines the technology context.225

Having recognised the importance of thinking about values in design, how they are
actually embodied in a technology is also important; Friedman and Kahn argue for three
positions.226

Firstly, the embodied position holds that system designers inscribe their own intentions and values into the
technology, with a hard version holding the very meaning and intentions of system designers and
builders bring to their task literally become part of the technology, and the soft version holding that objects themselves
do not literally embody an intention or value...system designers themselves are shaped by organisational,
political, and economic forces.227

The exogenous position is that societal forces that involve, for example, economics, politics, race,
class, gender and religion significantly shape how a deployed technology will be used.228 Their

219 B Friedman, P Kahn and A Borning Value Sensitive Design and Information Systems in K Himma and H Tavani The Handbook of Information and Computer Ethics (Wiley and Sons 2008) hereinafter F, K & B 2008; See also Friedman Value Sensitive Design: A Research Agenda for Information Technology: A Report on the May 20-21 1999 Value Sensitive Design Workshop, August 23 1999
220 J Van Den Hoven ICT and Value Sensitive Design (2005) in Goujon et al The Information Society (Springer 2006) p67 and it construes information technology (and other technologies for that matter) as a formidable force which can be used to make the world a better place, especially when we take the trouble of reflecting on its ethical aspects in advance
221 Values with ethical import are distinct from usability; whilst the two might coexist, they can also oppose (see discussion p1180)
222 F, K & B 2008, see detail in section 4.6 p87-94
223 F&K 2006 p1187
224 Friedman and Kahn characterise ethical questions in HCI as falling into either what ought I to do and what sort of person I ought to be, with the former deontological and consequentialist approaches being an obligatory theory of right, and the latter virtue ethics approach being a discretionary theory of good (p1181). In addressing challenges of universal morality they state generally designs need to be robust enough to substantiate the value under consideration and yet adaptable enough so that different cultures (or subcultures) can use the designs in their own way (p1183)
225 F, K & B 2008 p71-73
226 B Friedman and P Kahn Human Values, Ethics and Design in A Sears and J Jacko The Human Computer Interaction Handbook 2nd Ed (CRC Press 2006) hereinafter F&K 2006
227 F&K 2006 p1179
228 F&K 2006 p1179


favoured interactional position, which aligns with other perspectives in this chapter, is that
whereas the features or properties that people design into technologies more readily support certain values
and hinder others, the technology's actual use depends on the goals of the people interacting with it,
where the properties of the technology and the socio-organisational context will determine how
they are interacted with.

VSD has also been subject to critique, largely focusing on what (or whose) values are
considered, and how they are formulated. Alongside arguing for the reflexivity of researchers
and greater reflection of the views of participants in reporting VSD studies, Borning and
Muller argue VSD often over-claims.229 For example, they say it should not impose a
position of universal values as the default, a discussion point in philosophy, and instead
should be more pluralistic and open to different value systems from different cultural
contexts, eg Alsheikh's work on the use of technologies in Arabic countries.230

Le Dantec et al231 and Borning & Muller have also raised the question of how the values
VSD protects are formulated. Acknowledging where these values come from is an important step: in this case, Western liberal democracies with commitments to human rights and civil liberties. Similarly, listing values can make them seem overly prescriptive.
The non-exhaustive list of 12 values Friedman et al propose include human welfare,
ownership and property, privacy, freedom from bias, universal usability, trust, autonomy,
informed consent, accountability, identity, calmness and environmental sustainability.232
These values are not meant to be a definitive list, and as Borning and Muller recognise,
pragmatism dictates the need to situate the concept of VSD within some concrete values.
Interestingly, the high-level values law seeks to protect are not too dissimilar to those listed by Friedman and Kahn. Law is a mechanism for protecting many human values; consider, for example, a user of a smart watch:
- Ownership and property: protected through intellectual property law in the device design and the contract for its sale;
- Privacy: data protection, consumer protection and human rights law seek to ensure transparency, accountability and fairness in how user data is handled, used, traded etc;
- Environmental sustainability: international trade laws control finite rare earth minerals, and planning laws control where smart watch retailers can build stores.

Importantly for us, Le Dantec et al argue that whilst abstract lists of values sensitise designers to values generally, early empirical observation of local values is necessary to really discover what values are at play in situ, i.e. where VSD has taken a discursive approach
to values, we argue for an exploratory approach where empirical investigations treat values as local
phenomena expressed in a local vocabulary. This in turn enables the development of a local classification
of values which begets a local heuristic against which to evaluate systems and social interactions that arise
from their use.233


229 Borning and Muller Next Steps for Value Sensitive Design (2012) CHI 12
230 Alsheikh et al, http://research.microsoft.com/apps/pubs/default.aspx?id=149025
231 C Le Dantec, E Poole and S Wyche Values as Lived Experience: Evolving Value Sensitive Design in Support of Value Discovery (2009) CHI 09; see also Halloran et al The Value of Values (2009) CoDesign 1-29, looking beyond generic values to how complex, dynamic, emergent etc local values of users
232 B Friedman, P Kahn and A Borning Value Sensitive Design and Information Systems in Himma & Tavani The Handbook of Computer and Information Ethics (Wiley and Sons 2008)
233 Le Dantec et al (2009)


Building on the importance of human values in design, and critiques of VSD, we
recognise regulatory and legal values are informed by a European philosophical and legal
tradition. The nature of these values will be dependent on legal jurisdiction and differing
underlying value systems (eg religious, cultural, social, economic, political, etc).
Consequently, we are not arguing for universalism, but acknowledging a pluralistic
position.

With law in particular, reflection and questioning of the sources, institutions and
processes is possible through more critical methods of law eg socio-legal, Marxist,
postmodern, feminist etc, but nevertheless law has practical importance by virtue of being law. The need for compliance, the fear of sanctions and existence within a legal system mean law cannot be ignored, no matter how much, on reflection, we disagree with its foundations.234 It has to be considered, and accordingly the philosophical values
underpinning it have a means of being brought into the design process. Law is not just a
static box of rules and regulations that can be translated directly into a system. Instead it
is a set of, often conflicting, high level values, rights and responsibilities that need to be
balanced (eg in law, in balancing freedom of the press and individual privacy rights,
unpacking the public interest is a key process). Indeed, legal processes for balancing
such competing rights could help resolve and manage conflicts of values in design.235

We'd argue for a hybrid approach to dealing with legal and regulatory values in design. Given law is underpinned by human values, a more high-level, abstracted sensitisation is needed for designers initially. But, as Le Dantec et al argue for understanding human values in design, we argue that to realise regulation by design in practice, a more situated, local understanding of how these legal values exist (or not) is needed. How best to obtain such an understanding needs discussion, development and action from both communities over time. For now, however, we tentatively propose participatory design, specifically the Scandinavian School, as a means of practically considering legal and regulatory values in design.

Remembering the Political Roots of Participatory Design: The Scandinavian
School

The political roots of the Scandinavian School of participatory design (PD) are particularly valuable for us, as they draw on regulatory requirements and values to inform design, i.e. the goals of democracy and improving quality of life (for workers). Whilst PD has since become a broader field, we argue that remembering these political roots, and the mechanisms for accommodating regulatory considerations, albeit in a specific context, is invaluable for a more situated understanding of legal values. Furthermore, whilst a focus on values guiding designers is important, PD is also a mechanism for incorporating the views and values of users within the process of design. We argue that PD is useful as it keeps the participation of end users central and recognises that end users are citizens with legal rights too.

Whilst these theories emerged from formal work contexts, as Törpel et al say, we live in an age of embedded computing, to focus exclusively on waged work as the context of design and


234 Although of course people challenge or disobey law, as an institution it continues to exist despite
challenge.
235 See L Urquhart Privacy and Freedom of the Press from 2004-2015: From Campbell to Leveson In L

Edwards Law, Policy and the Internet (Hart Publishing: Forthcoming)


use might mean to miss relevant practices, meanings and relations. Different kinds of work, unwaged
work, leisure, and recreation are now potentially relevant design spaces involving both individuals as well
as formal and informal groups of people.236 Kyng has also noted PD has moved beyond workers and workplaces.237 Thus we'd argue these theories are more broadly relevant than they first appear, eg in the context of designing for the home.

In general, participatory design, as outlined by Törpel et al, is about the direct participation of those whose (working) lives will change as a consequence of the introduction of a computer application. Participation potentially relates to all aspects, phases and activities of development, for example, decision making, designing, developing, deployment and the further development in use.238 PD is used for pragmatic reasons (eg workers are experts in their work and hence should be part of the process), analytical reasons (eg using ethnomethodological approaches), and political reasons (eg the Swedish School on industrial democracy and worker labour rights).239

Bansler's table shows the divisions between the Scandinavian theoretical schools and their underlying attributes. Our focus here is the critical model, or collective resource approach, but it is important to note the shift away from more systems-led approaches, and why it happened. As seen in Part II, technology regulation models are still very much in the systems theory/cybernetic mould.
Table 1: Bansler240

                          Systems Theoretical     Socio-technical         Critical

Knowledge interest        Profit maximising       Job satisfaction,       Industrial democracy
                                                  participation
Notion of the             Cybernetic system       Socio-technical         Framework for
organisation                                      system                  conflicts
Notion of the             Objects (system         Subjects                Subjects (groups)
labour force              components)             (individuals)
Notion of capital/        Common interests        Common interests        Opposing interests
labour relations

The systems school looked to rationalise work processes by using computer based information
systems to eliminate waste in labour, time and materials.241 The work of Langefors242 was
influential here, viewing organisations as machines made of components that should
behave in an orderly way according to prescribed rules. Humans, being the less certain component, are given minimal influence and agency, instead being programmatically factored into the overall functions of the machine.243 As Bansler highlights, this shares similarities with Taylorism,244 and thus draws similar criticisms: treating humans in a mechanised manner is bad for health, minimises job satisfaction, and ignores human intelligence and initiative.245 In the context of regulation, the cybernetic models described in Part II, whilst lacking the Taylorist intent, apply a

236 Törpel et al Section 2.5
237 Kyng (2010) p57
238 B Törpel, A Voss, M Hartswood and R Procter Participatory Design: Issues and Approaches in Dynamic Constellations of Use, Design and Research in A Voss et al Configuring User-Designer Relations: Interdisciplinary Perspectives (Springer: London 2009)
239 Section 2.2
240 Bansler (1989) p5
241 Bansler (1989) p5
242 B. Langefors Theoretical Analysis for Information Systems (1966)
243 Bansler (1989) p8-9
244 F W Taylor Principles of Scientific Management (1911)
245 Bansler p9 and p10


similar reductionism of society to mechanistic modalities, where the role of the human in
the system is similarly limited.

Cases of rejection of, and hostility towards, new payroll/accounting systems among workers in the 1960s246 drove a recognition that humans' job satisfaction and knowledge had to be considered as a meaningful part of new technical organisational systems, leading to the socio-technical school.247 Törpel et al define its aim as for social and technical systems to be locally optimised within a specific organisation, especially in their interplay, and in the ways that are beneficial for the workers/employees, for example in terms of work satisfaction and working conditions.248
Unlike the collective resource approach, it assumes societal harmony in the
organisation between workers and employers, which is a source of criticism.

As a response, critical researchers wanted to strengthen the position of employees and unions vis a vis managers and capital owners,249 leading to the collective resource approach.250 As Törpel et al state, the approach is built on the assumption of inherent and pervasive conflict, struggle and unequal power relations, where workers' rights need to be supported to seek industrial democracy and enable them to push their agendas forward against exploitative capital endowed with far greater power and resources.251 The goal was to obtain workplace democracy
and improve quality of workplace life, but reflecting the conflict driven nature, Bansler
noted a truly democratic working life will only become a reality after a prolonged struggle by the unions
against the dominance of capital over labour.252

Bødker et al's outline of four key principles for doing PD is useful for showing how committed the process is to looking beyond top-down interests.253 Firstly, defining the vision of change by reflecting the technology, the organisation and, importantly, the people; secondly, securing genuine user participation, not just relying on the company's vision for the system;254 thirdly, getting hands-on experience with work practices on the ground to understand them; and lastly, maintaining continuity with the established vision, not changing it, in order to keep stakeholders on board.255

The commitment to such overt political goals within organisations has not always been
retained in PD generally. As Bjerknes and Bratteteig highlighted 20 years ago, practicing
PD was getting further from politics and instead was being seen as ethics. 256 Their


246 Bansler p10
247 E Mumford The story of socio-technical design: reflections on its successes, failures and potential
(2006) Information Systems 317-342
248 Törpel et al Section 2.3.1
249 Bansler p6
250 P Ehn and M Kyng The Collective Resource Approach to System Design (1987) in Bjerknes et al

Computers and Democracy - A Scandinavian Challenge. (Aldershot, UK: Avebury) 1758


251 2.3.2; C Floyd et al Out of Scandinavia: Alternative Approaches to Software Design and System

Development (1989) Human Computer Interaction 4 253-350. A lengthy analysis of the Scandinavian
Approach to computer system development.
252 Bansler p16
253 K Bødker, F Kensing and J Simonsen Participatory Design in Information Systems Development in H Isomaki and S Pekkola (Eds) Reframing Humans in Information Systems Development (Springer London 2011)
254 See J Blomberg and A Henderson Reflections on Participatory Design: Lessons from the Trillium Experience (1990) CHI 90 353, which explores divisions between users and system designers in a project that looked like PD but was not in practice, as it did not involve iteration or proper collaboration, nor improve quality of work life
255 Bødker et al (2011) p123-128
256 G Bjerknes and T Bratteteig User Participation and Democracy: A Discussion of Scandinavian

Research on System Development (1995) Scandinavian Journal of Information Systems 7(1) 73-98


depiction of the role of the system developer in PD reflects this: the political system developer is an emancipator, carrying out an action programme to give the weak parties knowledge they can use to increase their power, whereas the ethical system developer is mainly responsible towards their own individual ethical codex which might happen to be political. Ethical individuals act morally in the particular work situations in which they find themselves, promoting workplace democracy through engagement in systems development situations.257

It is important to point out that trade unions and law were important drivers of the Scandinavian research agenda. In Norway, for example, the Worker Cooperation and Working Environment Act had provisions on keeping workers informed, educating them about systems, and giving them control of and responsibility for their work.258 Furthermore, trade unions were influential funders of research investigating alternative systems that better reflected worker interests and an adequate quality of work.

Such factors led many to question how far the Scandinavian School of PD has traction beyond the Nordic context.259 Floyd et al noted that whilst the trade union element is important to consider, it does not limit the approach, as the theoretical and methodological approaches, the system development models, and the concepts encouraging user participation with a view toward humanising technology and work design have broader appeal.260 Interestingly, Floyd et al also highlighted that whilst computer science in the US has largely been driven by military research, in Scandinavia the focus was on using computers for human benefit,261 thus changing the underlying ideological commitments. Kraft and Bansler argued that the environment of the US, where trade unions have limited power, means the collective resource approach cannot translate.262 In contrast, for Muller, Wildman and White, claims of its limited spread outside Scandinavian legal environments are wrong, although commitments to workplace democracy are certainly less prominent outside Scandinavia.263

Similarly, Blomberg argued that by focusing on the goals of PD (e.g. improving work life, involving users in design, iterating designs), as opposed to procedural intricacies, its appeal could spread beyond Scandinavia.264 Both perspectives seem to have been right: user involvement in design is now a central tenet of system design. However, the historical trajectory shows that the commitment to the political, conflict-driven roots of the model has not persisted in the same way.


257 Bjerknes and Bratteteig p85 more detail in paper
258 Bjerknes and Bratteteig p76; P Kraft and J Bansler The Collective Resource Approach: The
Scandinavian Experience (1992) Proceedings of the Participatory Design Conference MA, US 6-7 Nov 1992 also
stress the importance of Norwegian Labour law
259 M Muller et al Participatory Design in Britain and North America: Responses to the Scandinavian Challenge (1991) CHI 91 389-392


260 Floyd et al (1989) p340-341
261 Floyd et al (1989) p264
262 P Kraft and J Bansler The Collective Resource Approach: The Scandinavian Experience (1992)

Proceedings of the Participatory Design Conference MA, US 6-7 Nov 1992 p132 For different reasons, notably
the disintegration of the US trade union movement and the nearly unchallenged position of US managers
to control the work place, CRA seems even less likely to serve as a useful model of genuine worker
empowerment in the United States.
263 M Muller, D Wildman, E White Taxonomy of PD Practices: A Brief Practitioners Guide (1993) Communications of the ACM 36(4) 26-28, p27


264 M Muller, J Blomberg, K Carter, E Dykstra, K Halskov, J Greenbaum Participatory Design in Britain and North America: Responses to the Scandinavian Challenge (1991) Proceedings of CHI 1991 389-392, p390


More recently, Halskov and Hansen's extensive review of PD literature led them to claim five elements typify contemporary PD: the subtle politics of multiple voices being involved in design, eg civic engagement; the framing of PD as being for people generally, not just users; looking beyond traditional workplace contexts of PD to healthcare settings, hackerspaces etc; a continued heavy focus on methods for PD; and lastly a shift towards PD improving quality of life generally, not just through products/technologies.265 As Kyng stated in 2010, values like democracy are less prominent in PD, and instead elements like user involvement and prototyping dominate. Trade unions' role has diminished, and the contexts of PD have broadened from workers/workplaces to society at large.266

The point about politics is important, because the conflict-driven model seen in the collective resource approach has not really persisted. Beck called for an increased political focus in PD, engaging with the ethical, social and cultural implications of computer design, moving away from PD as just a method and looking to its roots of challenging power asymmetries or dominance.267 Whilst our focus differs, Beck's call recognises that technologies do not exist in a vacuum: they have social implications, and for designers not to engage with the politics is unsustainable. Equally, systems exist in a regulatory context, and accordingly designers need to engage with that domain more actively.

We argue that the Scandinavian School of PD's recognition of law, politics, conflict and human values is a powerful means of bringing legal and regulatory values into design. Key to this is how PD views users: they are citizens with legal rights, eg workers with labour rights. Understanding and responding to the legal rights of users is key in the Internet of Things, for example. Repurposed to focus on the regulatory concerns of users in a setting, for example around control over IoT device data flows, trust in backend infrastructure, or concerns about the opaqueness of an IoT device's functionality, PD could give designers invaluable insight into how regulatory issues manifest in situ, letting them respond accordingly. Whilst such perspectives are not currently a key focus during system design, designers are in a powerful position to refocus on these issues and realise a model of regulation by design that understands and empowers users' rights.

Part IV: Proposals



Following this discussion, we conclude with three proposals for action and a brief summary of our key points. As this is an emerging area, much of this is framed as questions and possible routes forward for action.

Whilst coming at these from different directions, traditional regulators and system designers both have the interests of individuals at their core. Traditionally, for regulators it is the citizen and their fundamental human rights; for system designers it is the user and their context. Acknowledging this common goal, and reiterating that users don't exist in a legal vacuum but are citizens with a raft of legal rights, is a useful step.

265 Halskov and Hansen The Diversity of Participatory Design Research practice at PDC 2002-2012
(2015) Int. J Human Computer Studies 81-92
266 M Kyng Bridging the Gap Between Politics and Techniques (2010) Scandinavian Journal of Information

Systems 22(1) 49-68


267 E Beck P is for Political: Participation is not enough (2002) SJIS 02 13: 7-20


Implementation may differ, and interests may not be captured or acted on, e.g. if stakeholders like citizens or civil society groups are not consulted by regulators, or the right users, such as elderly or disabled people, are not spoken to by designers. Nevertheless, for system designers acting as regulators, this commonality helps show the move into the new role is not such a leap, and can draw on their existing skills for understanding and reflecting the interests of their users.

Beyond Systems Theory



Social contexts are messy, and system designers may want to create order and manage the social complexity. The process of formulating an understanding of the social dynamics can clearly have significant impacts on how the system works in practice (as we saw in the discussions of PD, where systems that neglect the interests of users/workers are unlikely to be welcomed or to fit with actual work practices).

HCI has seen a shift from systems theory to more situated perspectives. We see this in many of the discussions of Part III, but a key example that highlights this shift is Suchman's objection to Winograd and Flores's268 system for holding people accountable to tasks in the workplace by formalising what they said as indicative of action.269

For Winograd,270 building a system relies on pragmatism, forming typologies and order, such as taking speech as indicative of action.271 Suchman272 argued against this, and instead for obtaining a more situated273 picture of a social setting, to challenge assumptions (e.g. the assumption that a plan equals action): a system should instead reflect how people actually use plans to formulate actions, perhaps as resources. This shift from more top down abstracted
approaches to more situated, nuanced understandings of the contexts of design has had
impacts across HCI. Design ethnographies, in the wild approaches and user centric
methods feed into designers moving to more situated approaches to understanding the
social.

Technology regulation models are dominated by systems theory/cybernetic approaches (eg Lessig, Black, Scott and Murray). Whilst these have been influential in sensitising the regulatory community to the importance of design, it is time to move towards the detail

268 T Winograd and F Flores Understanding Computers and Cognition: A New Foundation for Design (Norwood,
NJ 1986); F Flores, M Graves, B Hartfield, T Winograd Computer Systems and the Design of
Organisational Interaction ACM Transactions on Office Information Systems 6(2) 153-172 (1988)
269 The Coordinator used speech act theory (i.e. taking what people say as indicative of action, a dominant method in CSCW systems at the time) to make workers accountable for what they said they would do: The Coordinator would remedy the carelessness of organization members regarding their commitments to each other through a technologically based system of intention accounting (p180)
270 T Winograd Categories, Discipline, and Social Coordination (1994) Computer Supported Cooperative Work 2: 191-197
271 p193
272 L Suchman Do Categories Have Politics? The Language/Action Perspective Reconsidered (1994)

Computer Supported Collaborative Work 2. 177-190. p178; L Suchman Speech Acts and Voices: Response to
Winograd et al (1995) Computer Supported Cooperative Work 3 85-95 where she reiterates her argument in
response to Winograd and other critics
273 L Suchman Human Machine Reconfigurations (Cambridge: CUP 2007). It is an ethnomethodologically driven approach that underscores the view that every course of action depends in essential ways on its material and social circumstance. Rather than attempt to abstract action away from its circumstances and represent it as a rational plan, the approach is to study how people use their circumstances to achieve intelligent action (Suchman 2007 p70)


of what designers actually do. Limitations of systems approaches, for example the lack of user agency in Lessig's model, are significant. To understand the behaviours, practices or beliefs of citizens/users, we need tools from HCI. Design long ago recognised the limits of a top down systems theoretical model; more situated methods for understanding the needs of users, and for designing technologies that reflect those needs, serve the design goal of better understanding contexts to build better systems for users (i.e. citizens). Regulation works within these same social systems, and any system of regulation by design needs to incorporate how designers really work. Regulatory models need to recognise this reality in order to move regulation by design beyond rhetoric and into practice. A first step is for regulation models to move past systems theory, and its view of the world in terms of four modalities, and to delve into the detail. In this regard, system designers are well placed to take this transition forward.

Bringing Regulatory and Legal Values into System Design?

Drawing on trends in HCI towards engaging with public policy, recognising the responsibility of system designers, and the importance of reflecting human values in design, we stress the importance of system designers bringing legal values to the fore.

The legal values system designers need to be sensitised to will differ greatly depending on
context, hence creating a tick list of all regulatory values would add little, and likely
overwhelm system designers. Instead, fostering an awareness and engagement with
regulatory issues by system designers is the starting point. The design community's existing work on bringing human values into design is a suitable Trojan horse for bringing in regulatory and legal values too. With this in mind, we look to a brief example from IoT and PbD.
Example 1: Smart Security Cameras

The market for connected home entry and security systems is growing.274 These are often
combinations of internet connected door locks, home cameras, motion sensors and
phone apps. Tracking and logging who enters and leaves the home over time or
monitoring for unexpected movement in and around the home and notifying the
homeowner are two possible use cases. The GSMA/KRC report cited above showed that owners of connected security systems rated benefits such as peace of mind (69%), a feeling of protection against theft/hazards (67%) and automatic alerts to a phone (64%) highly.275

As discussed above, the home is a complex space of social relations, power dynamics and routines. Ur et al's study of teen and parent attitudes to the use of home security cameras and smart locks to audit home entry/exit, with particular reference to privacy and surveillance, is a useful example. They interviewed 13 teens and 11 parents, finding broad support for connected locks due to remote control, improved safety and convenience.276 They also found trust between teens and parents could be damaged by increased monitoring, and that teens would find ways to resist the monitoring (eg many

274 The GSMA/KRC recently surveyed 2000 technology enthusiasts across the US, Germany, UK and
Japan in 2015.
275 http://www.gsma.com/newsroom/press-release/internet-of-things-is-transforming-family-life/
276 Eg see B Ur, J Jung and S Schechter Intruders Versus Intrusiveness: Teens and Parents Perspectives on Home-Entryway Surveillance (2014) Ubicomp 14 p129


teens said a security system would not deter them; they would leave the door or a window unlocked the entire time they were out of the house).277 Clearly, social dynamics can be altered by such technological interventions.

If we bring in regulatory and legal values, we see other issues. Firstly, if children are
bringing friends back to the home, their images are being collected on the cameras too.
Are they informed of this? EU DP laws require data processing to be communicated to
children in a way that they can understand.278 With consent, for example, under new EU DP laws parents may need to consent for children under 16 to use certain online services,279 and children have a particularly strong claim to the right to be forgotten.280 Whilst many of these issues will be managed by the homeowner deciding how the cameras are used domestically, the images may equally be relayed to cloud storage or to manufacturers for analytics, so the manufacturer needs to think about such issues too.
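To make the design implication concrete, the Article 8(1) default could be expressed as a gating rule checked before any child's footage is uploaded for analytics. This is an illustrative sketch only, not legal advice or a real product API: the function name and structure are our own, and member states may lower the 16-year threshold to as low as 13.

```python
# Illustrative sketch (not legal advice): gating consent-based processing of a
# child's data on the GDPR Article 8 parental-consent rule. The default age
# threshold is 16; member states may set it as low as 13.

PARENTAL_CONSENT_AGE = 16

def may_process(subject_age: int, has_own_consent: bool,
                has_parental_consent: bool) -> bool:
    """True if consent-based processing may proceed under this sketch of the rule."""
    if subject_age >= PARENTAL_CONSENT_AGE:
        return has_own_consent
    # Below the threshold, the child's own consent is not sufficient:
    # authorisation by the holder of parental responsibility is needed.
    return has_parental_consent

# A 14-year-old visitor's consent alone should not trigger cloud analytics:
print(may_process(14, has_own_consent=True, has_parental_consent=False))  # False
```

Trivial as the rule looks, encoding it forces the design question of how a camera system could ever know a visitor's age, which is itself a useful prompt for data minimisation.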

Importantly, following the Ryneš case,281 homeowners can no longer rely on the household exemption from data protection laws where home CCTV also monitors public spaces (eg an adjacent street or shared garden).282 Accordingly, they are now subject to DP laws in the same way as companies, meaning the parents of children visiting a home with CCTV, or playing in front of that house, may request the data.

The broader ethical questions of recording children entering/leaving the home, what is done with their data, who has access to it, how it is secured from hackers and so forth are plainly problematic.

By thinking about these issues during design, perhaps safeguards can be built into the system, such as privacy preserving video data (as we see with some public space CCTV).283 Reflecting on legal values like the proportionality or necessity of data processing, or data minimisation, may raise questions such as: what level of detail about home movements is necessary (hourly, daily, weekly or monthly logs)? How might the data be stored in a temporary way? Should it be remotely deletable by parents of visiting children?
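A minimal sketch of how such safeguards might be encoded, assuming a hypothetical entry/exit log (the classes, labels and retention period below are our illustration, not any vendor's API): events are stored as coarse who/when records rather than video, expire automatically, and can be erased remotely per person.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

# Hypothetical sketch of data minimisation and storage limitation for a smart
# entry camera: coarse who/when events instead of frames, automatic expiry,
# and remotely invokable erasure (eg for a visiting child's records).

RETENTION = timedelta(days=7)  # a design decision: how long is "necessary"?

@dataclass
class EntryEvent:
    timestamp: datetime
    person: str                        # pseudonymous label, eg "visitor-a"
    expires: Optional[datetime] = None

    def __post_init__(self):
        if self.expires is None:
            self.expires = self.timestamp + RETENTION

class EntryLog:
    def __init__(self):
        self._events: List[EntryEvent] = []

    def record(self, person: str, when: datetime) -> None:
        # Data minimisation: only who/when is kept, no video for visitors.
        self._events.append(EntryEvent(when, person))

    def purge_expired(self, now: datetime) -> None:
        # Storage limitation: drop events past their retention period.
        self._events = [e for e in self._events if e.expires > now]

    def erase_person(self, person: str) -> int:
        # A remotely invokable erasure request; returns how many events went.
        before = len(self._events)
        self._events = [e for e in self._events if e.person != person]
        return before - len(self._events)

    def events(self) -> List[EntryEvent]:
        return list(self._events)
```

Here `erase_person` could back a "remotely deletable by parents" feature, and varying `RETENTION` is where the hourly/daily/weekly question above would be settled as an explicit design decision rather than a default.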

Reflecting on these legal values may mean systems can be designed and installed in a manner that respects the rights of children and the dynamics of family life. Sensitising system designers to the values the law protects, like the autonomy and dignity of children or their right to form an identity, may influence how a system is designed.

Participatory Design for Regulation


277 Consequences of Deployment Section
278 Recital 46 GDPR and Art 12
279 Article 8(1) GDPR
280 Recital 53 GDPR and Art 17
281 http://curia.europa.eu/juris/liste.jsf?num=C-212/13
282 Consequently the operation of a camera system, as a result of which a video recording of people is

stored on a continuous recording device such as a hard disk drive, installed by an individual on his family
home for the purposes of protecting the property, health and life of the home owners, but which also
monitors a public space, does not amount to the processing of data in the course of a purely
personal or household activity... (Paragraph 35)
283 ADD-PRIV Project


How could the Scandinavian School of participatory design be put to work as a tool to help system designers engage with legal values, like privacy? Its established lineage in considering values like workplace democracy, and its interaction with politics and law, like the trade unions and workplace laws of Scandinavia, make it a model for thinking about regulatory and legal values in context. By reframing its focus and sensitivities towards privacy by design, in settings like the home, and engaging with data protection and privacy laws, it may foster privacy by design on system designers', and users', own terms. The best methods for ensuring participation are beyond the scope of this paper; for now we simply pose this as a route for action between the system design and legal communities. To make the point further we look at another brief example from IoT and PbD.

Example 2: Smart Thermostats

Smart thermostats are increasingly popular for managing energy use in the home. A
leading example, the Nest Learning Thermostat, monitors the temperature of a room,
builds up a picture of the behavioural patterns of the home occupants (through
movement sensing) and creates a schedule to adjust temperatures and switch heating on
or off.284 It is remotely controllable (e.g. from a smartphone), interacts with the boiler and
can communicate with other IoT technologies in the home, like smart lights, plugs,
wearables and washing machines, through the 'Works with Nest' ecosystem. Nest devices
like the Cam home security cameras are already designed to work together, for
example Cam recording constantly when the Thermostat is set to 'Away from
home' mode.285 For third party devices, Nest uses a networking protocol called Weave286
so devices communicate peer to peer, and Nest speaks to its ecosystem of internet-
connected services like device management apps or the Nest Cloud.

The sharing of, access to and flows of data to third parties are mediated by terms and
conditions homeowners likely do not read. The complexity of the ecosystem grows as
new services and devices are added. Accordingly, whilst consent may legally be obtained,
homeowners are likely uninformed about the extent of data movement and cannot
exercise rights like the right to object to data processing or the right to be forgotten.287
These rights are again underpinned by values like human agency, autonomy and dignity.
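One way a smart home hub might make such rights exercisable in practice is to record consent per data flow, rather than under blanket terms and conditions, so that the homeowner can object to any individual flow later. The sketch below is purely illustrative; the `ConsentLedger` class and its methods are hypothetical and do not describe Nest's or any vendor's actual implementation:

```python
class ConsentLedger:
    """Hypothetical per-flow consent record for a smart home hub.
    Each third-party data flow must be individually authorised, and
    the homeowner can later object to (revoke) any flow."""

    def __init__(self):
        self.flows = {}  # (device, recipient) -> consent granted?

    def grant(self, device, recipient):
        # Homeowner explicitly authorises this specific data flow.
        self.flows[(device, recipient)] = True

    def object(self, device, recipient):
        # Exercising the right to object: the flow is switched off
        # rather than continuing silently under blanket T&Cs.
        self.flows[(device, recipient)] = False

    def may_share(self, device, recipient):
        # Default-deny: no recorded consent means no sharing.
        return self.flows.get((device, recipient), False)
```

The default-deny rule reflects the legal value at stake: absent an informed, recorded decision by the homeowner, no data leaves the home.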

Participatory design mechanisms could be refocused on ensuring that homeowner rights
are reflected in the system design, and on involving those who will live with these IoT
devices in the design process. By reframing focus on this aspect of the setting, system
designers can become informed about the regulatory concerns of users in a very
grounded way, because their tools actually involve users in the design of systems that
have impacts on their rights. System designers have extensive tools, like design
ethnographies, prototyping or 'in the wild' approaches, that can provide a rich picture of
a setting, user practices and attitudes, and technological impacts. These are not currently
geared towards understanding the regulatory dimensions of technologies; we argue that,
as system designers already have these tools in their armoury, they are well placed to
repurpose them towards regulatory ends.


284 Nest website
285 https://nest.com/uk/support/article/Learn-how-Nest-products-work-together
286 http://threadgroup.org/Join.aspx - Samsung
287 See proposed General Data Protection Regulation Article 17


Concluding points

To conclude, we argue five overarching points:

1) Regulation by design involves prospective, as opposed to just retrospective,
application of law.

2) HCI methods need to be repurposed to engage with the legal and regulatory
aspects of a system.

3) The legal framing of regulation and design is still anchored in systems theory.
Human computer interaction has a range of rich approaches for
understanding the social, and regulation by design needs to use these.

4) Designers are now regulators, and this brings a range of responsibilities.

5) Design and human values perspectives in HCI need to be extended to legal
values, and participatory design is a strong candidate for doing this.

The shift towards addressing regulatory issues before a technology is built is one way of
engaging with the challenge of the lag between the pace of legal change (slow) and of
technological development (fast). As this paper uncovers, the shift from seeing law as
purely retrospectively applied, e.g. through litigation, to seeing it as a mechanism that
informs the design of a system from inception is important for understanding regulation
by design.

System designers are now regulators, and they have a rich set of tools and resources to
use in carrying out their role, particularly participatory design and value led approaches.
The systems theory approaches that dominate the technology regulation literature are
overused and not fit for the purpose of regulation by design. The design literature offers
more sophisticated means of understanding the social, through more situated
perspectives. Accordingly, for design and regulation to interact we need to look to these
situated models to understand regulation in context. A significant element of this is
using and extending existing design frameworks, like values in design, for regulation and
law. The goals are often common, with law as a mechanism for realising underlying
philosophical values.

We see increased reflection within HCI about engagement with regulatory issues,
awareness of responsibility and the importance of human values. We build on this to
argue that designers should engage directly with regulatory and legal values,
incorporating these into their work and moving forward together with the legal
community to build a shared epistemic foundation that will inform practical action. By
presenting theory in Part II, we highlight how design is understood in regulation and,
secondly, surface how this understanding of design needs to advance. System design
needs to be seen for all its intricacies, divisions and methods, and not just brought under
the cybernetic aphorism of 'code'. Through Part III we show how designers have the
tools to move the practice of regulation by design onwards. In HCI, a range of subtle,
nuanced methodologies exist for understanding the social and designing for it. The field
has moved past systems approaches and formalism. Any cooperation between law and
design needs to reflect such shifts. We suggest that value led approaches and
participatory design are suitable mechanisms for reflecting these shifts and bringing
regulatory and legal values into system design.


We recognise that designers are in a very strong position as regulators because their
methods can be adapted to regulatory ends. Importantly, the ability of design to engage
with values in a local, situated way, when extended to legal values, can only strengthen
the effectiveness of regulation at protecting the interests of individuals. However,
designers are not alone in defining the regulatory agenda. State regulators define and
steer non-state regulation, and provide designers with the forum to be involved. The
state and law are key components in ensuring that legal and human values are brought to
the fore in design, and these need to be maintained, avoiding a position of unfettered
self-regulation. We recognise that these new regulatory actors, designers, have great
potential in regulation. However, to be legitimate regulators, their practices need to be
guided and informed by values like human rights, democracy, justice and freedom. At a
high level the state and law ensure enforcement and oversight of these values. Without
such interaction between state regulators and designers, the legitimacy of designers as
regulators is challenged.
