
Original Research Article

Big Data & Society
January–June 2018: 1–11
© The Author(s) 2018
DOI: 10.1177/2053951717751552
journals.sagepub.com/home/bds

Algorithms as fetish: Faith and possibility in algorithmic work

Suzanne L Thomas1, Dawn Nafus1 and Jamie Sherman2

1Intel Labs, Intel Corporation, USA
2Intel Corporation, USA

Corresponding author:
Suzanne L Thomas, Intel Corp, 2200 Mission College Boulevard, MS: RNB 6-61, Santa Clara, CA 95054-1537, USA.
Email: suzanne.l.thomas@intel.com

Abstract
Algorithms are powerful because we invest in them the power to do things. With such promise, they can transform the
ordinary, say snapshots along a robotic vacuum cleaner’s route, into something much more, such as a clean home.
Echoing David Graeber’s revision of fetishism, we argue that this easy slip from technical capabilities to broader claims
betrays not the "magic" of algorithms but rather the dynamics of their exchange. Fetishes are not indicators of false
thinking, but social contracts in material form. They mediate emerging distributions of power often too nascent, too
slippery or too disconcerting to directly acknowledge. Drawing primarily on 2016 ethnographic research with computer
vision professionals, we show how faith in what algorithms can do shapes the social encounters and exchanges of their
production. By analyzing algorithms through the lens of fetishism, we can see the social and economic investment in some
people’s labor over others. We also see everyday opportunities for social creativity and change. We conclude that what is
problematic about algorithms is not their fetishization but instead their stabilization into full-fledged gods and demons –
the more deserving objects of critique.

Keywords
Algorithm, fetish, ethnography, social contract, exchange, culture

This article is part of a special theme on Algorithms in Culture. To see a full list of all articles in this special theme, please click here: http://journals.sagepub.com/page/bds/collections/algorithms-in-culture.

Creative Commons CC-BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License (http://www.creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage).

Introduction

Fetish discourse always posits this double consciousness of absorbed credulity and degraded or distanced incredulity. (Pietz, 1985: 14)

If fetishism is, at root, our tendency to see our own actions and creations as having power over us, how can we treat it as an intellectual mistake? Our actions and creations do have power over us. This is simply true . . . The danger comes when fetishism gives way to theology, the absolute assurance that the gods are real. (Graeber, 2005: 431)

Algorithms in recent years have become a catchword: a focus of public fascination, a rarified artifact that commands extraordinarily high salaries for those who make them, a lightning rod for business secrecy and "a magic black box" for those who use them. At the same time, we take them for granted. They order our news feeds, personalize our wish lists, turn on our heaters and optimize our fitness routines. They are the very stuff of everyday life.

This slippage between algorithms as shiny objects of value, their taken-for-granted ubiquity and their signature technical complexity challenges social scientists who research them. Paul Dourish (2016) calls for more precise use of the term algorithm within software studies and notes that in technical circles it has a specific meaning even if its implementation makes it nearly indistinguishable from code. An algorithm, he clarifies, is an "abstract, formalized description of a computational procedure" whose on-the-ground effects depend
on how it is written into code, with what infrastructures and with what data (Dourish, 2016: 3). Samir Passi and Steven Jackson (2017) add that algorithms are rule-based, not rule-bound, procedures and therefore their enactments are inextricably situated and contingent. Jenna Burrell (2016) highlights the particular opacity of deep learning algorithms given unpredictable input data and a gap between how people value these data versus how the code handles them. All recommend social scientists be empirically precise and careful when talking about what algorithms are and what they do in the world.

We agree. Taking "the algorithm" or "algorithms" as givens blurs the details of how algorithmic capabilities register as outcomes, how these outcomes generate promise and how these promises invite new possibilities. With each step, we lose sight of so much – the data preparation, the coding, the learning and application of all those rules, the repeated experimentation and testing and the debates over what the algorithms actually do and then what they do in someone else's hands and with someone else's intentions.

We social scientists are not the only ones at risk. In fieldwork with computer vision professionals and the Quantified Self (QS) community, our research participants also slipped between referring to algorithms as reified things with promise and then recounting the trickier nuts and bolts of how to work with them. They spoke of false promises and lost opportunities when algorithms did not deliver, and of the magic and faith necessary when they did. This blurring of what algorithms did and what was promised caught our attention.

In the following paragraphs, we revisit our two field studies to ask who and what are involved in rendering algorithms (and ultimately data) as tangible and tradeable objects of promise. Our discussion centers on Thomas's study of computer vision developers, rooted in 43 in-situ, semi-structured interviews with computer vision professionals across North America and East Asia.1 We also draw on Nafus and Sherman's ethnographic research in the QS community, a three-year project including participant observation, semi-structured and open-ended interviews and the ongoing co-design and development of biosensor data sense-making tools (see Nafus and Sherman, 2014).2

In neither study did we set out to investigate algorithms. Thomas focused on the changing work practices of computer vision development. Nafus and Sherman documented the cultures and practices of self-tracking. Yet in both studies, our research participants wrestled with claims to algorithms' powers – what algorithms could do both in fact and in potential. Questions about who and what constituted algorithms' efficacy surfaced when QS participants debated what sort of "nudge" might make them floss their teeth more, or when a senior technologist admitted surprise that the medical robot his team managed to build after six years of work actually surpassed their expectations.

Their mix of belief and disbelief in what algorithms could do led us to the fetish. In 20th-century anthropological thinking, fetishes are not indices of false thinking, as they are in vernacular usage. They are, rather, material objects that stabilize complex and ongoing social relations because people invest them with this effect. What made the fetish so apt in anthropological thinking is how this investment is also marked by what William Pietz described as "this double consciousness of absorbed credulity and degraded or distanced incredulity" (Pietz, 1985: 14) and what David Graeber (2005) then summed up as a socially generative leap of faith. The value of analyzing algorithms as fetishes, then, goes beyond understanding how people invest algorithms with efficacy to understanding that they also do so productively.

In the next paragraphs, we start with a brief primer on fetishism and explain how we use the concept as a heuristic for analyzing contemporary algorithms. Then, we turn to our fieldwork. In the case of computer vision professionals, we show how specialized algorithms stand front and center as traded emblems of only their inventors' disciplinary expertise. Yet fierce debates over what makes a computer vision algorithm good enough point to shifting social and professional contracts between who makes and who uses them. In our second case, that of the QS community, no one technology or technique defines self-tracking as a practice. Instead, QS participants ask what outcomes algorithms can really bring about. How might desired outcomes best be effected – as an algorithm to count steps, a sensor that increases energy levels, a "nudge"? Their experiments with what "works" in practice are critical unpaid intellectual labor, as significant to making self-tracking algorithms work in the real world as the paid design and test cycles of formal device production. We then conclude by calling attention to what we gain and lose in the slippages between what algorithms do, what they promise and the faiths and possibilities that can result.

Why fetishes?

We did not start our research looking for the fetishization of algorithms. It emerged as we sought to explain how our research participants granted algorithms powers – the capacity to act in the world, to "know" things, and to make things happen. In our interviews, calling algorithms "magic black boxes" had become an accepted fact more than an accusation. Computer vision algorithm developers, in effect, admitted this when they dubbed their work a "black art," as did one interactive game developer, or relished their
rarified expertise, as did many PhD research scientists. It is true, the latter would equivocate, that their algorithms had yet to mature, or their deep learning neural nets remained opaque, but these were simply calls for more work, not dismissal. Early QS participants, who had little access to the analytical workings of self-tracking devices, debated which devices and what output resulted in a useful insight or a company's wresting away of control. While QS participants and computer vision professionals might quibble about the pros and cons of a particular analytic process, few questioned that, in general, algorithms did and should work.

More difficult to explain was how research participants slipped between talking about algorithms' technological efficacy and social accountability. In one example, we asked Gerald, an interactive virtual reality product manager at a multinational corporation, what more he wanted from computer vision algorithms. He retorted, "Intellectual honesty." He went on to explain,

When you're talking about stuff that takes place in a black box, the agendas of the people that are controlling the activities of the black box are all the more important. You [algorithm developers] know what's possible, what's not possible, what can be done, what cannot be done. Really, be honest with yourself, 'cause there's the right approach and the wrong approach.

Gerald described a crisis of belief, not the breakdown of an algorithm. The algorithm that technically worked for him constituted evidence of the good faith of the developers, not just their skillful production. Repeatedly, we heard that working with algorithms required trust in those who made and used them. When that leap of faith backfired, the black boxing, more than the black boxes, was to blame.

A term with baggage

In contemporary writing about algorithms and data, the term fetish is rarely used with more than a glancing look at its historiography (Chun, 2008 is an important exception). Yet, it is the history of the term's usage that we find so useful. Pietz (1985) traces the term's origins to the 16th- to 18th-century Portuguese and Dutch traders who went to West African trading towns looking for gold. There they also found what they called "fetish." For them, fetishes were objects of nominal material value imbued by their African trading partners with magical powers. As they saw it, fetishes' powers were capricious, arbitrary and constructed: in short, products of false beliefs and mistaken attributions. Yet, they also acknowledged how well the promise of these powers facilitated trade both locally and across continents, and traders used them to their own ends.

By the 18th century, African fetishes were landing in private collections and museums in the West. The histories of their creation and use were largely erased by equivocal claims to magical powers in the writings of prominent 19th- and 20th-century critical thinkers. Karl Marx saw fetishism in "the whole mystery of commodities, all the magic and necromancy that surrounds the products of labour on the basis of commodity production" (Marx, 1977[1887]: 169). Sigmund Freud (1961[1927]) used it to explain sexual deviance. Fetishism came to describe a socio-cultural mechanism through which objects accrued value, meaning and efficacy through a process of substitution and misrecognition. As the term gained widespread adoption, naming a thing a fetish came to carry with it a provocative ambivalence, a simultaneous affirmation and doubt about its effects in the world.

More recently, African art collectors and historians have sought to recover lost histories of fetish objects' creation and use. In a description of a Congolese nail fetish statue, Thompson (1987) explains,

To decode the meaning of the blades and nails is to expand our understanding of the world of the famed lawcourts of Kongo. Each blade or nail is a mambu. A mambu is a legal matter or problem, nailed-in, literally and metaphorically, in the search for restitution of what is right and just, between two or more parties.

Based on work such as Thompson's, David Graeber (2005) amends Pietz's (1985) historiography to locate the powers of fetishes in the hands of their makers as well as their traders. In Congolese communities, fetishes acted as social contracts and justice systems encapsulated in wood and metal. Their ability to contractually bind also made them good to trade with for both Africans and Europeans. Europeans could sidestep the arbitrariness of their desire for gold by pointing to what they saw as the fetishes' erroneous values. Per Graeber (2005), this misconstruing enabled commensurability and, as a result, an exchange between quite different peoples. Graeber (2005) was most fascinated by this last move, the capacity for misrecognition to catalyze social possibility, in this case, exchanges otherwise unimaginable.

Fetishes as good to think with

Pietz (1985), Marx (1977[1887]), Freud (1961[1927]) and Graeber (2005) might disagree on exactly which qualities of fetishism are heuristically most important. Pietz (1985), for example, insisted on highlighting the replicability of a fetish's effect and the singular accountability of the creator. Fetishes were crafted objects, not idols. Marx (1977[1887]: 163–177),
in contrast, wanted to explain how commodity exchange became a social fact and how the unique values of labor were erased by the exchange value of their product. Freud (1961[1927]: 147–157), in turn, focused on the mechanisms of such misrecognitions, but this time a body part (a nose or foot) as something erotic and phallus-like. Graeber (2005), likewise, explored the agency of misrecognition, but in generating social creativity, not social deviance.

From their work, we distill four attributes of the fetish and how they distribute power as capability, promise, faith and possibility:

1. The fetish is a material object imbued with capabilities that are not inherently properties or functions of the object itself.
2. These excess capabilities are generated at the point of contact between differently positioned people and thus widen the scope of their outcomes to the social, cultural and economic.
3. These social, cultural and economic outcomes are misrecognized or substituted as belonging to the fetishized object as its promise.
4. This substitution or misrecognition is itself efficacious: it enables something to take place that might otherwise not happen.

These four attributes organize our discussion of computer vision professionals, QS participants and algorithms. As a set of attributes, they also dispel other claims to power that haunt the algorithm. If we were to stop our analysis with the first two attributes of the fetish, algorithms could be imbued with a technological determinism, where the technology itself is credited with capabilities that directly lead to change. If we were to focus only on the latter two attributes, we might conclude that algorithms act as a kind of technological sublime, granted that people revel in their god-like promise and possibility. By analyzing algorithms using all four attributes, we see a fuller picture of how algorithms as material objects are invested with and thereby gain powers as they change hands. We can see algorithms as traded talismans that invite slippages between their effect, promise and possibility, and not as artifacts of a technological determinism or sublime.

There is some disagreement about what constitutes the materiality of algorithms, particularly if we compare them to Pietz's (1985) and Freud's (1961[1927]) fetishes (noses, amulets and so forth). Algorithms sometimes manifest as math, sometimes as lines of code and sometimes as the parsed data or visualizations they produce. They are, as Josh Berson (2015) puts it, representationally promiscuous. They shapeshift in the hands of those who design and engineer them and especially those who use them. In our research, algorithms more often were defined by what they did than by what they were. Their workings were undeniably concrete. Lights on a screen. A robot arm moving. A sensor-triggered video of one's child. A haptic vibration to prod someone to start moving.

By starting with how and when our research participants materialize algorithms, we echo the call for more emic understandings of algorithms as practice (Dourish, 2016; Kitchin and Lauriault, 2014; Passi and Jackson, 2017). We agree with Wendy Chun that algorithms (in her case source code) must be considered "in media res" (Chun, 2008: 323) rather than as reifications of what they "really are," an analytical mistake she calls fetishization. We extend Chun's argument to focus on the labor relations and contracts that make code "work" both in terms of machine execution and in terms of what it does for humans. We also build on the prior work of critical social scientists who identify how algorithmic work structures power relations by enacting discrimination and social sorting (Barocas and Selbst, 2016; Pasquale, 2015), promulgating labor inequalities (Gray et al., 2016; Irani, 2015) or shaping cultural production through algorithm design choices (Gillespie, 2014; Hamilton et al., 2014; McKelvey, 2014).

By analyzing algorithms as materialized and misrecognized social contracts, we also can tackle less straightforward enactments of power, ones where accusations of magic and black boxing come into play. With fetishism as a heuristic, we can empirically disentangle how algorithms materialize as things in themselves, how people invest them with powers to do things and how the promise of these powers ultimately valorizes some people's work and opportunity at the expense of others.

Finally, as we analyze algorithms as fetishes, it is not to say that other people are naive to believe in algorithms' efficacy while we remain wiser. Rather, it is to say that people position algorithms in ways that make algorithms promise more than they can deliver in strictly material terms. For Graeber (2005) and us, this is the moment of social creativity when faith in a promise delivers possibility.

Computer vision expertise and algorithms that can "see"

Computer vision remains a nascent, but not new, disciplinary field. It marries sensors and image signal processing with a growing and diverse span of human digital work to make sense of light for humans and machines. Although it emerged in academia over forty years ago, the technical challenges remain daunting. There is much room to invent new mathematical
and computational ways to envision light and even more room to make these algorithms relevant and useful in commercial and industrial products. For those we interviewed, computer vision is simultaneously an ambitious technical project inspired by notions of artificial intelligence, an academic discipline, an increasingly in-demand technological capability, a qualifier for a well-paying job and the belief that computers can, one day, "see" as well as or better than humans.

The materiality of computer vision algorithms

By definition, vision algorithms digitize and analyze light captured by cameras and make it available to do a particular task, such as identify a landmark, towards a particular end, say, locate a robot. In this way, they act like other algorithms (Barocas et al., 2014; Dourish, 2016). Yet computer vision professionals use the term more loosely. It interchangeably indicates a single mathematical or logical step as well as a series of such steps, as in an imaging or vision processing pipeline. One computer vision PhD explained that she cobbles together prior algorithms, as units and as pipelines, to tackle whatever vision problem she has at hand. For her, all algorithms are amalgamates of prior algorithms.
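What this cobbling together can look like in practice is worth making concrete. The following minimal sketch, written in Python with the OpenCV library that our interviewees themselves named, chains several prior, named algorithms (grayscale conversion, Gaussian blur, Canny edge detection, contour extraction) into one small pipeline. The file name, parameter values and area threshold are our illustrative assumptions, not any participant's production code.

# A minimal, hypothetical vision pipeline composed from prior algorithms,
# in the spirit of the researcher quoted above. Requires the opencv-python
# package; the input file and all parameter values are illustrative.
import cv2

def landmark_candidates(image_path):
    image = cv2.imread(image_path)                    # decode the image file
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)    # prior algorithm 1
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)       # prior algorithm 2
    edges = cv2.Canny(blurred, 50, 150)               # prior algorithm 3
    # OpenCV 4.x returns (contours, hierarchy)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # The area cut-off deciding what counts as a candidate "landmark" is a
    # tunable judgment call, not a property of any algorithm in the chain.
    return [c for c in contours if cv2.contourArea(c) > 500.0]

print(len(landmark_candidates("scene.jpg")), "candidate regions")

Each call in the chain is itself a published algorithm with its own history; which thresholds make the resulting pipeline "good enough" is exactly the kind of judgment the debates described below turn on.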
In this way, vision algorithms are both the media for and products of computer vision work. Charles Goodwin (1994) distinguishes between what professionals work with and how they materially represent that work to others. In computer vision, algorithms are both. In this tight, competitive world of using algorithms to create new ones, careers hinge on staking claims to the novelty of algorithmic invention, particularly for academics as well as corporate research and development scientists. These so-called "pure" algorithm developers race to benchmark the comparative performance of their algorithms against same-class but prior art. Presentations of these state-of-the-art vision algorithms are white-knuckled affairs, with audience members often challenging the testimony to competitive performance or unique design. At stake is not just what the algorithm is or does but what it promises to do.

The materiality of the algorithm gate-keeps who can build on its promise. A holographic display engineer complained that some computer vision academics only published their algorithms as mathematical proofs. He needed working C/C++ code, the tools of his trade. Gale, a PhD computer vision research scientist, explained that by writing algorithms as math in Matlab, she broadens their industrial applicability even if she must then optimize them for each application. Graduate students typically eschew such expensive algorithm development toolchains and turn to prototyping computer vision algorithms in the more commonly available Python instead. For-profit and nonprofit organizations increasingly curate algorithms into libraries, be they open-source, such as OpenCV, custom-built pipelines or included as part of product software development kits (SDKs) or development toolchains, such as Matlab. In all cases, each algorithmic unit and pipeline comes with user, licensing and intellectual property agreements.

Vision algorithms materialize as artifacts for and outcomes of computer vision professionals' labor. They are products whose uses are contractually governed and whose promises build careers and reputations. In the past, some algorithms were named in honor of those who created them, but now they more often are named for what they achieve, such as the simultaneous localization and mapping (SLAM) or structure from motion (SfM) algorithms. Here, we begin to see how algorithms gain their powers. Like magic incantations, these names spell out the promise of what the lines of code or series of mathematical functions should do once they leave the hands of their creators to those chartered with their use.

Algorithm makers versus algorithm users

Theorists concur that the fetish gains its powers in the encounter and exchange between diverse peoples, such as the mid-century African and European traders (Pietz, 1985), the laborer and the capitalist (Marx, 1977[1887]) or the child and his mother (Freud, 1961[1927]). In the case of computer vision algorithms, their promises materialize in the change of hands from makers to users. Careers, reputations and commerce divide those who create and those who use computer vision algorithms. Yet, significantly, the distinction between the two is more often social and economic than pragmatic.

When describing their day jobs, computer vision professionals parse their work into a series of familiar tasks: make image data accessible and available, curate images into data sets, design algorithms, code algorithms, optimize algorithms (which means to ensure they run on particular hardware and software systems), make them do something for someone and use the completed solution with its vision and other capabilities. As activities, it is difficult to know where exactly the labor of making ends and using begins (as per activity theorists Engeström and Miettinen, 1999; Goodwin, 1994; Suchman, 1987). Add to this that the tasks are, in practice, portable. In some cases, PhD algorithm developers spend months hand-tagging video. Under deadline, software developers will scrabble together sextant algorithms to automate horizon detection. Even system engineers find themselves curating and logging image data sets.
Yet those we interviewed assign the tasks to specific job roles in an idealized structure with socio-economic implications. Interns and crowdsourced workers, the "data janitors," do the onerous and generally unrecognized creative work of gathering, cleaning, labelling and curating data sets for algorithmic modeling or neural network training (Gray et al., 2016; Irani, 2015; Lohr, 2014). Algorithm developers generate state-of-the-art or reliably "good enough" vision algorithms and are typically the stars of the show, commanding salaries that Peter Lee, head of Microsoft Research, once compared with those of top NFL quarterbacks (Vance, 2014). Some of these turn to software developers to translate the step-by-step series of their mathematical functions into programming languages. Finally, system engineers make sure these algorithmic steps work optimally on the hardware and software technology at hand, such as a medical robot or car.

This social and economic division of labor further crystallizes as the algorithm comes into focus. On one side are the algorithm makers who have a vested interest in claiming the invention and therefore intellectual ownership of a state-of-the-art algorithm. Their work includes, but rarely credits, the work of those who collect and curate data sets as well as those who code the more mathematical algorithms. Instead, they nod to their academic training and fiercely defended disciplinary specialization. For them, vision algorithms remain crafted objects, not Marx's fetishized commodities (Marx, 1977[1887]: 163–177). Their novelty, and therefore the incommensurability of their labor, stems from the still rarified craft of their making. On the other side are the algorithm users: system engineers, application developers and software developers. Positioned as the inexpert wielders of "magic black boxes," they hold the creators accountable for the promise of an algorithm's performance, as Gerald articulated earlier.

The injustice of who does and who does not get credit or compensation for what an algorithm does is clear. Lilly Irani (2015) and Mary Gray et al. (2016) eloquently make this argument, although we would add contract software developers into the mix of disenfranchised professionals. But the degree of vitriol that accompanies the distinction between algorithm makers and users suggests more is at play. In a mid-project interview, we suggested to one well-known computer vision academic that a team of PhD-trained vision experts designing localization and mapping algorithms for consumer robots did work very like his. He snapped, "That's not computer vision, that's system engineering." When we recounted this story to the founder of an interactive game development studio, he laughed and then quipped, "We don't hire prima donnas." He proceeded to explain that he only hires computer vision algorithm developers willing to "get their hands dirty" and both design and code algorithms that do something, not just promise to do something.

We are at a time when the unquestioned prestige and expertise granted to algorithm developers is beginning to unravel. Vision product teams at large multinational companies and startups alike liken the trade in state-of-the-art algorithms to the older "waterfall" work practices vilified by software development norms like the agile movement. Managers talk passionately about how they have reorganized their teams to pair algorithm developers with software developers and system engineers. Their goals are to accelerate design, development and deployment by making all algorithmic and engineering work collective and collaborative. As a result, the algorithm dissolves into a common code base. Accountability for the algorithm's performance and optimization becomes shared. The distinction between algorithm maker and user further blurs. Software developers gain near equal status on the team (although not necessarily an equivalent rise in pay). Significantly, these small but nimble teams also bring back in-house the "dirty" work of collecting, annotating and curating data. Not all firms thrive by introducing these agile- or lean-startup-inspired work practices, and some, notably the algorithm developers, protest and in private admit to us researchers that they miss the familiar jostling for prestige and resources.

Not surprisingly, we also see the valuation of algorithms shifting away from general-purpose or "pure" algorithms to the proven performance of an algorithm over time and at scale. Many we interviewed still welcome "pure" algorithm invention but only if those algorithms also can reliably and robustly repeat the same effect across their specific product lines. One medical imaging startup founder joked that his relationship with academic computer vision was parasitic. He had no desire to pay the salaries of algorithm developers; he just wanted someone, an academic or a large corporate R&D arm, to deliver him predictable, affordable and usable vision outcomes.

As the professional distinctions between making and using vision algorithms blur, the vision algorithm loses its luster. With the skyrocketing popularity of deep learning methods, some professionals hazard that the magic of computer vision might better reside in a well-curated dataset. What makes the lens of fetishism so revealing is its ability to track in whose hands and to whose advantage the materialization of professional privilege occurs.

Faith in algorithms

For Graeber (2005), the social creativity of fetishism arrives at the last misrecognition, the vesting of the
algorithm (or the dataset it produces) with possibility. No one in our conversations ever doubted the efficacy of algorithms, be they mathematical functions, code or parsed data. But when we look at how people believe in the promise of algorithms, we hear echoes of Pietz's (1985) simultaneous belief and disbelief. Jules, the COO of a medical robot manufacturer, explained,

When I look back to six years ago, and look at the system we have today, I could not have imagined then what it could do today. It's so much better than what I thought. Because we didn't cast ourselves into an idea of that's what we going to have. We instead cast ourselves into 'we're going to do the best we can, every time, in incremental steps.'

Jules called these incremental steps "leaps of faith" and the heartbeat of their collaborative work. In doing so, he granted both the leaps of faith and the resulting product, here a surgical robot, the ability to surprise him and his team. We cannot but agree with Graeber (2005) that there is an everyday magic to how we, as humans, grant our creations the capacity to do things.

To believe that vision algorithms do things vests them with social power beyond their capabilities as math or code. It obfuscates some labor to the credit of others by vesting the algorithm, not its trade, with function and effect. It also secures a broader social and economic commitment to vision as a promise endemic to the algorithms. With this commitment, vision algorithms gain materiality and agency to operate outside of and independently from the professionals who design, develop and deploy them. This is how Jules can be surprised by what he and his team built.

The promise that computers can "see" fuels professionals to continue their work, to be a part of making this magic happen. It allows them to forget, for a moment, the hours of labor onerously drawing bounding boxes on video footage or rewriting camera APIs to ensure different camera feeds can be similarly analyzed. It allows them to mistake their and others' labor for the workings of a truly powerful, awe-inspiring algorithm (even when it does not work, as Gerald reminded us). But as they do so, they (and we) risk forgetting the specificities of their work in the name and promise of algorithms that can "see."

Efficacy and awareness in the Quantified Self community

Where computer vision professionals lose sight of the particularities of some people's labor in the broader social and economic commitment to vision, the QS community, in contrast, celebrates these particularities. Participants publicly register the specific effects of various self-tracking technologies through their experiments on and through their bodies. Their attention to exactly where, how and what "works" for whom widens the space for many faiths in many different kinds of technologies, sometimes algorithms and sometimes not. In these venues, the efficacy of algorithms alongside that of spreadsheets, sensors and data is questioned and debated.

QS participants' commitment to self-experimentation and learning slows down the contractual clarity that underwrites full-scale belief in a particular self-tracking technology, like a mindfulness pill, a behavioral nudge or an algorithm that counts steps. In the QS community, few technological promises or market values rest assured. QS experimentation instead exposes the fragile consensus that underwrites the emerging self-tracking consumer commodity market. It muddies "the whole mystery of commodities, all the magic and necromancy" (Marx, 1977[1887]: 169) that companies use to sell commodity products. It is ironic, then, that some critics accuse QS participants of fetishizing self-tracking technology (see Sharon and Zandbergen, 2016 for an excellent overview). We counter that QS community attentiveness to and faith in the particular and diverse possibilities of self-tracking as technology and practice actually temper the pace of what Marx (1977[1887]) would call commodity fetishism.
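To see how little machinery "an algorithm that counts steps" can denote, and how many judgment calls even that little machinery embeds, consider a deliberately minimal sketch in Python. This is our illustration, not any vendor's method: the threshold and refractory period below are assumptions a designer must simply commit to.

# A deliberately minimal, hypothetical step counter: count accelerometer
# magnitude peaks that cross a threshold. Commercial trackers filter far
# more aggressively; the point here is how much designer judgment even a
# "simple" counter embeds.
import math

def count_steps(samples, threshold=11.0, refractory=10):
    # samples: (x, y, z) accelerometer readings in m/s^2 at a fixed rate
    # (say 50 Hz); threshold and refractory (the minimum number of samples
    # between counted steps) are assumed, tunable constants.
    steps, cooldown = 0, 0
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if cooldown > 0:
            cooldown -= 1
        elif magnitude > threshold:   # a "step" is whatever crosses this line
            steps += 1
            cooldown = refractory     # ignore the remainder of the same impact
    return steps

Whether 11.0 m/s^2 registers a shuffle, a stride or a bus ride as a "step" cannot be settled from the code alone; that gap between what the counter does and what it promises is precisely where QS experimentation works.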
The QS community consists of people who gather online and in major cities around the world to discuss what they can learn by collecting and analyzing data about themselves. Some participants join out of curiosity, and some to tackle a medical problem. Some join because they (also) work at a technology manufacturer, academic research institute or medical organization interested in productizing self-tracking technologies. Despite their differences, they meet with the explicit purpose of discussing the use and efficacy of self-tracking techniques and technologies, such as activity tracking, stress detection, microbiome tests and more. To keep the focus on the practices of self-tracking, meeting protocol requires participants to speak about what they learned as individuals and not deliver product pitches or make broad scientific claims (Berson, 2015; Sharon and Zandbergen, 2016). In this way, QS meetings are more than communities of practice interested in furthering a rich body of shared knowledge (Lave and Wenger, 1991). They are communities of encounter, much like Pietz's (1985) 17th-century Gold Coast trading towns, that trade in ideas, methods and claims about the potential worth of this or that technology.

Recalibrating efficacy

In an early paper (Nafus and Sherman, 2014), we argued that QS participants creatively reworked the
capabilities proposed by existing self-tracking devices, like algorithmically parsed steps or sleep, as well as their promises, such as "improving health." They interrogated manufacturers' marriage of product and promise. We called this reworking a "soft resistance," in short a process of disentangling what algorithm-containing products can do materially from what they promise to do bodily, mentally or socially.

QS participants chipped away at products' promises by asking two questions: does a particular technology work, and does it work for me? To explore the former, participants compared, head-to-head, products that claimed to sense the same thing and then measured how well they actually did so. To get at the latter question, they pitted the output of the technology, whether it worked via an algorithm or not, against whatever purposes the self-tracker defined. These purposes could have little to do with what the product and algorithm engineers intended as outcomes. With these dueling questions, QS participants effectively broadened the scope of what self-tracking technologies could and should do beyond what product designers and manufacturers proposed.

Participants turned to terms such as "mindfulness" and "awareness" to locate the embodied effectiveness of self-tracking (see Sharon and Zandbergen, 2016). Consider, for example, a talk given by Nancy Dougherty (2011), who created what she calls "mindfulness pills." Dougherty made her own blister pack of sugar pills, each containing an ingestible sensor. Not coincidentally, Dougherty worked for an ingestible sensor company at the time. She labelled each pill with a mental state she desired, such as "energy." As she took a pill, the pill's ingested sensor sent to her phone biostatistics about her body that revealed that yes, indeed, she did bicycle much harder shortly after taking the "energy" pill, which she herself knew to be a placebo from the outset. "In fact all of my biggest heart rate spikes were after 'energy' pills," she commented.

By including "mindfulness" as a measure of her placebo's efficacy, Dougherty unabashedly queried the effects of belief and disbelief in her experimentation. Upon taking her pill, she tracked that her attention was drawn to energy levels. That recognition was perceived as in and of itself effective in changing her energy levels. She was aware that her pills were placebos, therefore ineffective by definition. Yet, she accepted that the effects registered in the mind were real. The pills were not ineffective in that the sensors in them corroborated or refuted the existence of a mental effect. The pills (of her own design) invited her to believe that what the sensors and algorithms measured, such as an elevated heart rate, could be translated into higher "energy."
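The kind of check Dougherty reported (were the biggest heart-rate spikes the ones that followed "energy" pills?) takes only a few lines to express. The sketch below is our hypothetical reconstruction: the timestamps, labels and 30-minute attribution window are invented for illustration and do not come from her talk.

# Hypothetical reconstruction of the kind of check Dougherty described:
# attribute each large heart-rate spike to the most recent pill taken
# shortly before it. Times are minutes from the start of the experiment;
# all values are invented for illustration.
pills = [(30, "energy"), (200, "calm"), (420, "energy")]
spikes = [45, 215, 430, 600]   # times of the largest heart-rate spikes

def label_spike(spike_time, pills, window=30):
    # Return the label of a pill taken within `window` minutes before the
    # spike, or None. The window size is an assumption of this sketch.
    for pill_time, label in reversed(pills):
        if 0 <= spike_time - pill_time <= window:
            return label
    return None

for t in spikes:
    print(t, "->", label_spike(t, pills))
# prints: 45 -> energy, 215 -> calm, 430 -> energy, 600 -> None

Even this toy attribution embeds the leap of faith the article describes: the window length decides which spikes "belong" to a pill at all.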
In some domains, this relationship between heart rate and energy might be plausible, but in the QS community, a different experimenter might not accept this correlation and instead register energy levels in terms of the capacity to write many words on a page. Another experimenter might argue that the absence of a high heart rate be considered evidence of a conditioned heart, as is the case with "energetic" athletes who have low heart rates. In Dougherty's case, she created a situation in which she had to believe in the efficacy of the placebo and believe in the correspondence between her specific mental constructs and the sensor data in order to arrive at a conclusion about her self-tracking experiment's efficacy. She also had to put faith in the sensors themselves. It would have been a failed experiment if the sensors produced no data at all, or if their algorithms parsed it into implausible data, as often happens with bodily sensors.

These were fine and calculated parsings of belief and disbelief, more scrutinized than those that Pietz (1985) associated with the trades in Gold Coast fetishes. Dougherty worked through this play of credulity and incredulity by temporarily granting the pill an excess of capability she knew it did not have. By design, her experiment helped her pinpoint the effectiveness of self-tracking as the method for "gaining energy" by using sensing technologies, here ingested sensors. Dougherty's work is an extreme example of the more common discussions of whether keeping a daily step count really "made" someone take more steps or seeing streaks of consistent behavior displayed on a screen produced the desire to continue that behavior. It demonstrates how carefully QS participants reflected on exactly what works, how, why and for whom. Such reflections were necessary in part because the techniques and technologies often came from other, less familiar domains – medical, alternative health, sensors, algorithms and more. As a result, the rules governing their use were not self-evident. Making a placebo do work in the context of one's personal life was not a self-evident maneuver to make. Migrating technologies and techniques from established to new domains of use did open up new sites for realizing the efficacy of the algorithms that occasionally played a role. They also required, as we saw in Dougherty's case, carefully tested leaps of faith.

Commodifying promise

This faith takes a different valence in the neo-behaviorism that informs the production of self-tracking technologies. Silicon Valley companies position themselves not as mere producers of data but as producers of their users' behavior change (Schüll, 2016). While QS participants asserted dominion by creatively repurposing
technologies, self-tracking companies seek to effect more controllable and measurable outcomes in their users, in particular algorithmically timed "nudges" toward "healthy" choices, such as a fork that vibrates when it infers a person eats too quickly. While this trope of nudge-based "awareness" has made its way into the QS community, in commercial circles the active learning and more expansive questioning that QS participants value all but disappear. Products like activity trackers are sold with the promise of keeping the user "on track." Schüll describes these nudges as a "curious mechanism, for it both presupposes and pushes against freedom; it assumes a choosing subject, but one who is constitutionally ill equipped to make rational, healthy choices" (Schüll, 2016: 12).

To achieve the kind of social and economic consensus necessary to trade these nudges on a commodity market, firms seek to solidify exactly what nudging technologies do to whom and how. Gone is the experimentation. To nudge people "for life," as Schüll (2016) puts it, eclipses the cognitive work and social negotiations that consumers must do to make a call to action, like "time to get more exercise," plausible and effective. Instead, the trope of the nudge offers manufacturers a technologically buildable and institutionally scalable, in short trade-able, response to the QS debates over who or what has the power to change me and my body.

As we saw in Dougherty's case, QS self-tracking and experimentation rely on leaps of faith similar to the leaps of faith embedded in the commercial commodities that Schüll critiques. In both cases, people build on the possibility that a technology could "make" someone do something, even if they have radically different views of why or how. This suggests an uncomfortable prospect. The self-tracking world trades in non-commodified and commodified technologies as well as in technologies that will commodify or de-commodify over time. Commodity fetishization in the full Marxian sense – the systematic erasure of human labor so that commodities can be traded and capital accrued – is never inevitable, but also never very far away. When self-trackers tinker with commodities in hopes that the objects actually fulfill their promises, or at least make them available for repurposing, the broader cultural injunction to suspend disbelief about the conditions of their production becomes a prospect, too.

In the QS case, the proposed social contract between nudge-ers and nudge-ees is still nascent enough to make room for QS participants' soft resistance. Here, we remember Graeber's (2005) suggestion that the occasional leap of faith, taken with talisman in hand, is less socially concerning than full-blown mythologies, such as the constant references to "healthy living" that we see in advertisements for various self-tracking devices. There is a budding theology forming in the consumer health technology marketplace that proposes the gods of behavior change are real and that someone could prove it if only they knew what talisman to bring and what effect to register. The QS injunction to experiment in a personal setting works against the formation of a totalizing theology of "health."3 But it, too, begins with a leap of faith that some technology somewhere could work. This faith is not an intellectual mistake. Self-trackers are giving technologies and their producers the benefit of the doubt, selectively and for a time. It is, however, an intellectual mistake for commentators to claim that only naïve Others do so.

Conclusion

In our two ethnographic cases, people divide and are divided into algorithm makers and algorithm users. Sometimes the divide is blurred, and sometimes sharply defined. The division is social in that it generates claims to status, expertise and community. It is cultural in that each side crafts its own practices, rituals and knowledges. As Marx (1977[1887]), Pietz (1985) and Graeber (2005) remind us, the divide is economic in that it makes possible the exchange of algorithms (and data) as materialized artifacts. These algorithms, like fetishes, enable parties to productively misrecognize what the technology is and does and, as a result, invite them to engage in their trade of promise and possibility.

Using the fetish as a heuristic, we can lay out the steps by which people vest algorithms with promises and possibilities that extend beyond what the math, lines of code, steps or ingested sensors can do. It explains how slippages occur between human practice and human possibility, algorithmic capability and algorithmic promise and how they tender the algorithms' exchange.

The twist of fetishism that Graeber (2005) so artfully reveals is that this economic segregation of us and them happens at those historical moments when in practice the division barely holds. Accusations of "fetish!" are caustic in that they insist on the discriminatory act, a critical stance that purports to unmask and yet reveals its own anxieties in the process. Yet in technology as elsewhere, there is also a creativity sparked by the leap of faith that fetishism allows. Call it magical realism in practice. Novice and expert vision product teams are building computational systems that can "see," although perhaps only to distinguish a baby from a dog. QS participants do effect new modes of knowing about bodies in the name of awareness. These steps are possible, in part, thanks to the agency we grant algorithms.

Lest we believe the idols of our own making, we urge caution. Too easily the work of purported "users"
of algorithms becomes lost in the noise of capitalizing on invention and innovation. We do not think that algorithms, and those who can claim to have invented them, deserve all the credit for what algorithms can do. Our fieldwork testifies to the generative possibility of believing for a moment in the powers of algorithms, but only if we stop short of demonizing or deifying them.

Declaration of conflicting interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

This research and article were completed while the authors were affiliated with Intel Corporation. We are grateful for the Algorithms in Culture project at UC Berkeley, our anonymous reviewers and, most importantly, all who so graciously shared their time and wisdom during our research.

Notes

1. In 2016, Suzanne L. Thomas conducted one-on-one, in-situ and semi-structured interviews with 43 computer vision professionals spanning computer vision design, development and deployment, employed at 13 companies ranging in size from multinational corporations to small startups. The companies spanned five industry verticals: robotics, autonomous vehicles, medical imaging, interactive media development and various security and surveillance systems, from home and pet monitoring to industrial perimeter surveillance. Her goal was to identify the changing social and cultural practices of computer vision development work across industries. Interviews lasted from 60 to 120 minutes. Thomas was joined by computer vision and deep learning colleagues. With additional consent from participants, interviews were audio recorded and transcribed. All names used are pseudonyms.
2. Between 2012 and 2016, Dawn Nafus and Jamie Sherman conducted participant observation within the QS community. They participated in QS events in the US and Europe, learned how to self-track using other participants' methods, became involved in community debates, shared a software prototype with the community in order to elicit feedback and conducted individual, unstructured interviews with self-trackers in Oregon, California, Pennsylvania and the Netherlands.
3. The QS community did, of course, emerge in a context that has other totalizing ideologies to contend with, such as Western individualism.

References

Barocas S, Rosenblat A, Boyd D, et al. (2014) Data & Civil Rights: Technology Primer, 30 October. Available at: http://datasociety.net/output/data-civil-rights-technology-primer/ (accessed 13 November 2016).
Barocas S and Selbst AD (2016) Big Data's disparate impact. California Law Review 671(104). Available at: http://www.californialawreview.org/wp-content/uploads/2016/06/2Barocas-Selbst.pdf (accessed 9 January 2017).
Berson J (2015) Computable Bodies: Instrumented Life and the Human Somatic Niche. London, UK: Bloomsbury Academic.
Burrell J (2016) How the machine 'thinks': Understanding opacity in machine learning algorithms. Big Data & Society 3(1). Available at: http://journals.sagepub.com/doi/abs/10.1177/2053951715622512 (accessed 25 December 2017).
Chun W (2008) On "sourcery," or code as fetish. Configurations 16(3): 299–324.
Dougherty N (2011) Mindfulness pills. In: QuantifiedSelf.com, 7 August. Available at: http://quantifiedself.com/2011/08/nancy-dougherty-on-mindfulness-pills/ (accessed 11 November 2016).
Dourish P (2016) Algorithms and their others: Algorithmic culture in context. Big Data & Society 3(2): 1–11. Available at: http://journals.sagepub.com/doi/abs/10.1177/2053951716665128 (accessed 11 March 2017).
Engeström Y and Miettinen R (1999) Introduction. In: Engeström Y, Miettinen R and Punamäki RL (eds) Perspectives on Activity Theory. Cambridge: Cambridge University Press, pp. 1–16.
Freud S (1961[1927]) Fetishism. In: Strachey J (trans) The Complete Psychological Works of Sigmund Freud, Vol. XXI. London: Hogarth and the Institute for Psychoanalysis, pp. 147–157.
Gillespie T (2014) The relevance of algorithms. In: Gillespie T, Boczkowski P and Foot K (eds) Media Technologies: Essays on Communication, Materiality, and Society. Cambridge, MA: MIT Press, pp. 167–194.
Goodwin C (1994) Professional vision. American Anthropologist 96(3): 606–633.
Graeber D (2005) Fetishism as social creativity: Or, Fetishes are gods in the process of construction. Anthropological Theory 5(4): 407–438.
Gray ML, Suri S, Shoaib S, et al. (2016) The crowd is a collaborative network. In: Computer-supported cooperative work and social computing, San Francisco, CA, 27 February–2 March 2016.
Hamilton K, Karahalios K, Sandvig C, et al. (2014) A path to understanding the effects of algorithm awareness. In: Human Factors in Computing Systems (CHI), Toronto, Ontario, Canada, 26 April–1 May 2014, pp. 631–642. New York: ACM Press.
Irani L (2015) Justice for 'Data Janitors.' In: Public Books. Available at: www.publicbooks.org/nonfiction/justice-for-data-janitors (accessed 11 November 2016).
Kitchin R and Lauriault T (2014) Towards critical data studies: Charting and unpacking data assemblages and their work. In: Eckert J, Shears A and Thatcher J (eds) Geoweb and Big Data. Lincoln, NE: University of Nebraska Press. Pre-print version of chapter, 30 July 2014. Available at SSRN: https://ssrn.com/abstract=2474112.
Lave J and Wenger E (1991) Situated Learning: Legitimate Peripheral Participation. Cambridge: Cambridge University Press.
Lohr S (2014) For big-data scientists, "janitor work" is key hurdle to insights. New York Times, 19 August 2014.
McKelvey F (2014) Algorithmic media need democratic methods: Why publics matter. Canadian Journal of Communication 39(4): 597–613.
Marx K (1977[1887]) In: Fowkes B (trans) Capital: A Critique of Political Economy, Volume One. New York: Vintage Books.
Nafus D and Sherman J (2014) This one does not go up to 11: The quantified self movement as an alternative big data practice. International Journal of Communication 8(2014): 1784–1794.
Pasquale F (2015) The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge: Harvard University Press.
Passi S and Jackson S (2017) Data vision: Learning to see through algorithmic abstraction. In: CSCW'17 proceedings of the 2017 ACM conference on computer supported cooperative work and social computing, Portland, Oregon, 25 February–1 March 2017, pp. 2436–2447. New York: ACM Press.
Pietz W (1985) The problem of the fetish, I. RES: Anthropology and Aesthetics 9(Spring): 5–17.
Schüll N (2016) Data for life: Wearable technology and the design of self-care. BioSocieties 11(3): 1–17.
Sharon T and Zandbergen D (2016) From data fetishism to quantifying selves: Self-tracking practices and the other values of data. New Media & Society. Available at: http://journals.sagepub.com/doi/10.1177/1461444816636090 (accessed 25 December 2017).
Suchman L (1987) Plans and Situated Actions: The Problem of Human–Machine Communication. New York: Cambridge University Press.
Thompson RF (1987) Kongo power figure. In: Perspectives: Angles on African Art. New York: The Centre for African Art.
Vance A (2014) The race to buy human brains behind deep learning machines. Bloomberg, 27 January 2014.