
IMPOSSIBLE RECOLLECTIONS

THE TROUBLED IMAGINARY OF MEDIATED MEMORY

By

Giovanni Tiso
Contact: giovanni (dot) tiso (at) gmail (dot) com Research blog: http://bat-bean-beam.blogspot.com/

A thesis submitted to the Victoria University of Wellington in fulfilment of the requirements for the degree of Doctor of Philosophy in English Literature

Victoria University of Wellington 2006

Abstract
This study is grounded in the belief that memory is one of the key areas of contestation in the current debates about technology and society. Its redefinition following the introduction of new technologies, the latest of which is the digital computer, has generated a landscape of dreams and anxieties that underlies complex attitudes towards which cultural products can or cannot be committed to memory, and who can or cannot have access to them. On the one hand, digitisation and the dissemination of information through networks such as the World Wide Web offer an infrastructure that appears on the verge of being able to make the sum of human knowledge available to all; on the other, the realisation of the strains, both cultural and technological, which are exerted upon this infrastructure gives way to visions of an impending breakdown of our ability to preserve, let alone transfer, this knowledge. These anxious imaginings are charted firstly along the axis that links the extremes of total recall and equally total forgetfulness, with an emphasis on the way in which these two narratives are played out against each other. A further exploration leads from the resonant notion of digitally documented life that informs so many current social practices to the idea that we might one day be able to upload our minds onto computer networks, only to find in that seemingly confident scenario another significant reservoir of anxiety, as well as a prime instance of the binary logic of exclusion that governs the construction and in part also our understanding of digital subjectivity. The figure of the excluded, undocumented person introduces in the last chapter an examination of the perceived threats to the functioning of collective memory and to its ability to fulfil the duty of remembering and passing on the most important events in our history. Finally, the study argues that the imaginary of anxiety just explored should be viewed not solely as a conservative reaction to social and technological change, but also as the means of grounding a more inclusive understanding of a society that is significantly inhabited, but not exhausted, by the digital.

Acknowledgments
Above all, this work benefited greatly from an ongoing conversation with Brian Opie and Linda Hardy, and it was a conversation I was privileged to be part of. I am grateful for the generosity of the folks at rec.arts.sf.written, it.cultura.fantascienza and it.cultura.libri, whose help with the initial gathering of texts was invaluable. I am similarly indebted to the very many friends who pitched in with suggestions and titles and plot summaries, but since it would be impossible to name them all I am going to limit the shout out to Marco Cultrera, who went above and beyond. Dougal McNeill, before he whisked himself off to Australia, was always there to remind me that I am a pre-post-Marxist, while Giacomo Lichtner was a great source of help and encouragement. Joseph and Lucia put it all in context (what do you mean you can't play now?). Finally, my thanks go to Victoria University, The JL Stewart Scholarship fund and the Georgetti Scholarship fund for their very generous financial support; and to Justine, for every other kind of support one could possibly wish for.

Table of Contents

INTRODUCTION ... 9
    Alien Marks ... 9
    Prosthetic Memories ... 18
    Of Cyborgs and Butlers ... 24
CHAPTER ONE - PLACES OF AMNESIA ... 34
    Into a Digital Dark Age ... 34
    Migration ... 38
    Dissemination ... 40
    Bouvard and Pécuchet in Cyberspace ... 47
    Art in the Age of Digital Reproduction ... 51
    Metaphors of Memory ... 54
    The Soldier of the Mist ... 58
    Winter Sleepers ... 61
    Memento ... 64
    Remember Sammy Jankis ... 67
    www.otnemem.com ... 70
    One Frame a Day ... 76
    Noise ... 78
    The Amnesia Epidemic ... 84
    The Metavirus ... 87
    The story, Lost ... 91
CHAPTER TWO - THE HORROR OF THE TOTAL LIBRARY ... 95
    My mind, Sir, is a garbage disposal ... 97
    Can It Be Done? ... 104
    The Stickiness of the Database ... 107
    Perec and the Confines of the Archive ... 115
    Saving the Present ... 123
    Documented Lives ... 126
    The Glut of Information ... 135
    The Library of Babel in Cyberspace ... 139
    Information and Metaphor ... 147
CHAPTER THREE - THE NEW HOME OF MIND ... 151
    Recording Humans ... 162
    The Forever Network ... 174
    The Anti-Moravec ... 176
    Restoring the Body of the Machine ... 181
    Undocumented Persons ... 184
CHAPTER FOUR - THE TRANSMISSION OF MEMORY ... 192
    From Being Digital to Being Postmodern ... 192
    Memoricide ... 197
    Informatics, the Archive and the Holocaust ... 206
    The Great Moon Hoax ... 224
    Of Recovered, False, Post- and Prosthetic Memories ... 229
    A New Breed of Cartesian Demons ... 238
    Deconstructing Lilies ... 243
    Do You See What I See? ... 247
BIBLIOGRAPHY ... 257

List of illustrations
Figure 1 - Frame from the movie Memento ... 65
Figure 2 - Frame from the movie Memento ... 67
Figure 3 - Banner heading of the Memento website (www.otnemem.com) ... 72
Figure 4 - Screen capture of a Flash animation from the Memento website (www.otnemem.com) ... 73
Figure 5 - Screen capture of a Flash animation from the Memento website (www.otnemem.com) ... 73
Figure 6 - Screen capture of a Flash animation from the Memento website (www.otnemem.com) ... 74
Figure 7 - Example of machine-readable character set ... 79
Figure 8 - Frame from the movie The Matrix Reloaded ... 90
Figure 9 - Banner advertisement for ContentAudit (from http://www.contentwatch.com/ as of August 20, 2002) ... 108
Figure 10 - Photograph of Winston Churchill as a toddler (from www.europa-infoshop.de) ... 127
Figure 11 - Photograph of my grandmother, Ermes Magnoni ... 129
Figure 12 - Frame from the movie The Forgotten ... 134
Figure 13 - The cover of Paolo Cherchi-Usai's The Death of Cinema ... 138
Figure 14 - A computer artist's impression of the Library of Babel in cyberspace (from http://www.caad.ed.ac.uk/~richard/MSc99/) ... 141
Figure 15 - Frame from the movie Tron ... 165
Figure 16 - Photograph of the National Library of Sarajevo from El País of October 9, 2004 (from www.elpais.es, article search El memoricidio de Sarajevo) ... 205
Figure 17 - The same photograph of Joseph Stalin with and without the chief of the secret police Nikolai Yezhov, from an electronic copy of Robert Conquest's Inside Stalin's Darkroom, published in The Los Angeles Times Book Review of January 4, 1998 ... 215
Figure 18 - The footprint of astronaut Buzz Aldrin on the Moon, from the website of the American Patriot Friends Network (http://www.apfn.org/apfn/moon.htm) ... 226
Figure 19 - Example of Bert is Evil photomontage (http://files.myopera.com/bcdc/albums/35555/osama%20bin%20laden%20(bert%20is%20evil)_jpg.jpg) ... 243

Introduction

Alien Marks
Since the dawn of culture, wielding a novel tool has always signified a sense of greater power eroded by an anxiety. Thus in the myth of Prometheus, as narrated by Hesiod in Works and Days, the Titan's gift to the mortals of the secret of fire leads to the unleashing among them of a plague of a much larger scale, the opening of Pandora's box. As a result, the uncomplicated life of recreation and abundance enjoyed by the pre-technological humans is upturned, and they are plunged into an existence dominated by scarcity and toil. From the point of view of today's highly industrialised (if not altogether postindustrial) West, this parable is of course deeply counterintuitive: is not technology designed on the contrary to cure Pandora's evils, to liberate us from want and help ease our labour? Is not the very knowledge that had us banished from Eden the value that we pursue in our attempts to create a better habitat for ourselves, the horizon fixed on the dream of building a New Jerusalem, or a socialist utopia? Perhaps. But to this day the modern myth is traversed by the older and highly resilient counter-myth. At the root of this enduring dissonance is the problem of perceived boundaries. Any new technology which enters into a relationship with the body questions previous conceptions of where the limits of the human lie. The blind man's cane, following Descartes' famous analogy in his Dioptrics (1637), is perhaps the most classic example of how an inert tool can be thought of as an extension of the organic body, leading Gregory Bateson to ask, in Steps to an Ecology of Mind (1972), if we should not properly

consider the cane part of the person who carries it. A dramatically growing sense of the high degree of reciprocal permeation between the human and the technological domains has helped make the cyborg one of the central characters of contemporary cultural narratives, ushering in what Katherine Hayles has termed the age of the posthuman. And nowhere is the problem of boundaries more keenly felt and yet more difficult to locate than in those technologies that are designed to amplify the mind, instead of the muscle. After all cyborg bodies would be nothing without cyborg minds, cyborg identities. And this, again, is nothing especially new. As Erik Davis writes,
[f]rom the moment that humans began etching grooves into ancient wizard bones to mark the cycles of the moon, the process of encoding thought and experience into a vehicle of expression has influenced the changing nature of the self. Information technology tweaks our perceptions, communicates our picture of the world to one another, and constructs remarkable and sometimes insidious forms of control over the cultural stories that shape our sense of the world1.

Plato had already framed the problem in precisely these terms. He understood that the great information technology of his time, writing, would not simply extend the mind, but enter with it into what nowadays we would call a feedback loop, changing it in subtle and largely unforeseeable ways. Most famously, in the Phaedrus he has Socrates recount the story of the meeting between king Thamus and the Egyptian god Theuth, inventor of many arts, including writing, whose virtue in his view would be to make the Egyptians wiser and improve their memory (274e). To which Thamus replied:
Most scientific Theuth, one man has the ability to beget the elements of a science, but it belongs to a different person to be able to judge what measure of harm and benefit it contains for those who are going to make use of it; so now you, as the father of letters, have been led by your affection for them to describe them as having the opposite of their real effect. For your invention will produce forgetfulness in the souls of those

1 Erik Davis, TechGnosis: Myth, Magic, and Mysticism in the Age of Information (London: Serpent's Tail, 1999), p. 4.


who have learned it, through lack of practice of using their memory, as through the reliance on writing they are reminded from outside by alien marks, not from inside, themselves by themselves: you have discovered an elixir not of memory, but of reminding. To your students you give the appearance of wisdom, not the reality of it; having heard much, in the absence of teaching, they will appear to know much when for the most part they know nothing, and they will be difficult to get along with, because they have acquired the appearance of wisdom instead of wisdom itself. (274e-275b)2

This overtly cautionary tale rests on a strong affirmation of where the proper boundaries of the human should lie. To call the written word an alien mark3 implies that the spoken word is natural, organic, embedded in the mind; whereas the written word is artificial, external, separated from the body. Choosing to draw the line between nature and technology at the point when word is written, rather than uttered, enables Plato to fix in time a precybernetic state worthy of being preserved, defended against the threat of contamination by this alien and alienating practice. Memory in particular, a key to cognition and the construction of the self, is seen as the exposed, contested area where the contamination would take place. Fast forward to July 1945 and to the publication of As We May Think, the essay in which Roosevelt's chief scientific advisor, Vannevar Bush, laid the conceptual foundations of what would eventually become the World Wide Web. Writing on the eve of Hiroshima, Bush believed that the global scientific community would soon be liberated from the bounds of military secrecy and in need of an environment that could facilitate the timely, effective and free interchange of ideas. The individual documents involved in this traffic, mostly textual4, would be fed by each user into a workstation,

2 Plato, Phaedrus, translation by C.J. Rowe (Warminster: Aris & Phillips, 1984).

3 A direct translation of the original. (My thanks to Giacomo Lichtner for his help with the Greek source.)

4 Bush gave the memex some multimedia capabilities (it could store pictures and diagrams as well as text) but did not make provisions for the inclusion of audiovisual information. However, his comments quoted on page 13 about intercepting the sensory flow imply that he regarded this limit as inherent in the (optical) technology at his disposal.


the memex, enabling them to be compressed optically on improved microfilms and marked so as to establish connections within single documents and between one document and another (or many others). Over time, these connections would grow and generate an intricate system of associations, or web of trails, analogous in operation and complexity, according to Bush, to the process of memory formation and retrieval in the human mind. While this idea would not be realised in practice until the nineteen-eighties, when Ted Nelson, who counted Bush among his main influences, developed a working prototype of this hypertextual information environment on a digital computer, Bush saw the eventual fruit of his intuition in prescient detail:
Wholly new forms of encyclopedias will appear, ready-made with a mesh of associative trails running through them []. The lawyer has at his touch the associated opinions and decisions of his whole experience, and of the experience of friends and authorities. The patent attorney has on call the millions of issued patents, with familiar trails to every point of his client's interest. The physician, puzzled by its patient's reactions, strikes the trail established in studying an earlier similar case, and runs rapidly through analogous case histories, with side references to the classics for the pertinent anatomy and histology. The chemist, struggling with the synthesis of an organic compound, has all the chemical literature before him in his laboratory, with trails following the analogies of compounds, and side trails to their physical and chemical behavior.5

The webs of useful, meaningful knowledge spun in Bush's imagining provide a direct counterpoint to Plato's bleak vision of the future of a humanity decerebrated by writing. These diametrically opposite pictures, both predicated on the centrality of memory, are reflected in Bush's refusal to patrol the boundary between the human and the technological domains. Not only does he regard the memex as functionally equivalent to the human
5 Vannevar Bush, As We May Think, Atlantic Monthly (July 1945), pp. 101-108. This is the version of the article quoted throughout the chapter and available electronically at http://sloan.stanford.edu/mousesite/Secondary/Bush.html.


mind, an enlarged intimate supplement to [] memory, but he even envisages that the technology could one day become one with our organic instruments of sensation and cognition, eliding the boundaries altogether. All our steps in creating or absorbing material of the record, he writes,
proceed through one of the senses - the tactile when we touch keys, the oral when we speak or listen, the visual when we read. Is it not possible that some day the path may be established more directly? The impulses which flow in the arm nerves of a typist convey to her fingers the translated information which reaches her eye or ear, in order that the fingers may be caused to strike the proper keys. Might not these currents be intercepted, either in the original form in which information is conveyed to the brain, or in the marvelously metamorphosed form in which they then proceed to the hand?

Implicit in this elision of boundaries is that memory consists in the storage of the datastream that our senses are exposed to, rather than in a complex process of interpretation of experience followed by digestion, manipulation and revision, not to mention obfuscation and repression, as the theories influenced by Freud would rather propose. Having reduced memory to a few bare constituents suited to be mechanically transmitted and reproduced, Bush was able to externalise the workings of the mind into an optical-mechanical apparatus without having to worry, as Plato did, that the value of the knowledge that this apparatus would channel might be in any way degraded6.

***

With the benefit of twenty-four centuries of hindsight, it would be tempting to claim outright that Bush was right, and Plato was wrong. ITs have been a key engine of progress and culture for as long as anybody can remember, and it is in fact almost solely thanks to them that cultural and social memory
6 For a more detailed analysis of the analogy between the memex and human memory in Bush's essay, see Chris Locke, Digital Memory and the Problem of Forgetting, in Susannah Radstone (ed.), Memory and Methodology (Oxford and New York: Berg, 2000), pp. 25-36.


on such a scale is even possible. Some, like neuropsychologist and cognitive scientist Merlin Donald (of whom more below), even include external forms of memory encoding in the definition of what constitutes our species, homo sapiens sapiens7, while Walter Ong put it perhaps most succinctly when he wrote that Technologies are artificial, but [] artificiality is natural to human beings8. Besides, we know that Plato must have felt ambivalent, at best, about the position he articulated through Thamus. On the one hand, as Jay David Bolter reminds us, he considered the alphabet itself a techne9. On the other, if he really aligned himself so strongly against the new technology, as not only the Phaedrus but also the Seventh Letter articulate, he would presumably have opted, like Socrates before him, to shun it altogether and formulate and disseminate his philosophy through speech alone. In his Preface to Plato, Eric Havelock follows this contradiction all the way to the surprising and contrary conclusion that it conceals a strong rejection on Plato's part of oral poetry in favour of literate philosophy10, a position that Havelock locates at the conclusion of The Republic. According to Jay David Bolter's interpretation, on the other hand, Plato was acutely aware of the tension between oral and written discourse, and this led him to create
a genre of writing that both embodies and profits from that tension. Platos dialogues combine the permanence of writing with the apparent flexibility of conversation. Each is the record of an impossibly artful philosophical discussion, and whatever its proposed subject, each dialogue is also about the difference between philosophy as conversation and philosophy as writing. (p. 110)

7 Or should it be Homo Sapiens Sapiens Cyber? Cf. João Pedro de Magalhães' essay by that name at http://author.senescence.info/thoughts/hcyber.html.

8 Walter J. Ong, Orality and Literacy: The Technologizing of the Word (London and New York: Methuen, 1982), pp. 83-84.

9 Jay David Bolter, Writing Space: The Computer, Hypertext and the History of Writing (Hillsdale: Lawrence Erlbaum Associates, 1991), p. 35.

10 Eric Havelock, Preface to Plato (Oxford: Basil Blackwell, 1963).


It is tempting to side with Bolter and give some credit to Plato in this. In his speech Thamus makes an impassioned defence of the primacy of the acts of memory performed by the mind over those mediated by technology, but if we turn to some of the metaphors regarding memory found elsewhere in Plato's work, we find a range of different perspectives on the embeddedness of technology in mental processes. In particular in the Theaetetus, 191a-200d, Plato has Socrates put forward two images: in one of them memory and its content, knowledge, are respectively an aviary and the birds inside it; in the other, the mental storage is a block of wax in which memories are etched, some deeply, others only superficially. According to the first model, one might attempt to recall a given piece of information, represented by a bird, but in trying to grasp it seize upon the wrong one, because of the chaotic movements of the birds inside the cage; in the second model, accurate memories are distinguished from vague or wholly forgotten ones by the different kinds of impressions made by the original perceptions. While the idea of the aviary ties in with the idea of techne in the sense of an organised human activity, ornithology, the second model is among the first in a long line of important analogies (whose lineage would include Aristotle's clean slate and Freud's mystic writing pad) in which the activities of the mind, and the encoding of experience in particular, are associated with pictorial technologies and writing. By drawing the analogy, Plato undermines the demarcation proposed in the Phaedrus and projects human memory into an external space of interaction, the space of society and technology. But if Thamus' view is so mired in ambiguity, what is it that still draws so many to it? Besides the quite specific domain of the Derridean critique in Plato's Pharmacy, which goes to the heart of the impossibility of definitively locating where the inside ends and the outside begins, what strikes contemporary critics appears to be not so much Thamus' conclusion, a hopeless invective against writing that was implicitly defused by Plato at the very moment when he chose to write it down, but rather his articulation


of the transformative power of technology, a power that goes beyond the predictive ability of the god of invention itself. Thus in Technopoly cultural critic Neil Postman regards the parable as a key to the understanding of the power of technologies to cause shifts in the meanings of words (in this case, chiefly of the word memory) that are prelude to subtle but powerful reconfigurations of the societies in which these technologies are deployed; while Walter Ong, in Orality and Literacy, proposes that what makes Thamus' position so resonant is its remarkable resemblance to the most common arguments against the uncritical acceptance of digital computing today. What has enabled literate cultures to cease to regard writing as an alien practice, claims Ong following Havelock, is that they deeply interiorized11 the technology, made it so much part of themselves that they ceased to regard it as such, in the same way that Thamus appears to have interiorised speech to the point of thinking it natural, relocating the boundary to suit. At the same time as he claims that computerization is the dominant inflection of contemporary social life12, Darren Tofts also argues with Ong that its interiorisation is far from complete, which would explain why the boundaries which are being reconfigured in the process appear to be much more fiercely contested than the language used by the marketers of the digital would suggest. Hybridity, in place of the either/or logic of the myth of Thamus, emerges both as a powerfully anxiogenic symbol of loss of control and mastery of the human over the inhuman and non-human domains, and, through the aggressively resonant imagery of Donna Haraway's Cyborg Manifesto, as the more-than-outside chance that the age might in fact be presenting us with the means of crafting a new language in which to frame emancipatory political projects. Significantly, in that it connects with the imaginary of anxiety I am going to discuss in the next few

11 Ong, Orality and Literacy, p. 81.

12 Darren Tofts and Murray McKeich, Memory Trade: A Prehistory of Cyberculture (North Ryde: Interface, 1998), p. 18.


chapters, Haraway's cyborgs are the product of profound cultural trauma, imploded germinal entities, densely packed condensations of worlds, shocked into being from the force of the implosion of the natural and the artificial, nature and culture, subject and object, machine and organic body, money and lives, narrative and reality13. These quintessentially posthuman creatures of the imaginary are unleashed at such times of implosive reconfiguration, when we look at ourselves and ask what it is that we see, and one of their functions is to sensitise us, make us aware of the otherwise self-effacing intervention of new media and technologies on culture. But they also ask to be creatively appropriated as dense symbolic instruments of reinvention of the self (more specifically, of the female self) and of the social, in a move that has the potential to placate (by offering a possibly more productive alternative) the fear of otherness that is still an important feature of our contemporary encounters with technology14. The principal line of argument of this dissertation is that memory, the faculty whose technologically mediated transformation worried Plato as much as it exhilarated Bush, has become a crucial battleground in the highly charged process of rethinking boundaries in the age of digital computing, for it bears the full transformative power of the technology and is implicated at the same time in the unsettling of the pairs enumerated by Haraway (natural and artificial, subject and object, narrative and reality) whose reciprocal interactions are called into question whenever a new invention engages the culture so profoundly. Much recent work in cultural theory has been devoted to the important task of highlighting the entanglement of these terms, the refusal of each pair to organise around a stable binary axis. By exploring the narratives that offer a resistance or articulate a set of fears
13 Donna Haraway, Modest_Witness@Second_Millennium.FemaleMan_Meets_OncoMouse (New York: Routledge, 1997), p. 14.

14 For a chronicle of the cultural import of Haraway's manifesto, see Zoë Sofoulis, Cyberquake: Haraway's Manifesto, in Prefiguring Cyberculture: An Intellectual History, edited by Darren Tofts, Annemarie Jonson and Alessio Cavallaro (Cambridge: MIT Press, 2003), pp. 84-103.


with respect to the computerisation (and more generally the mediation) of memory, I am going to map an imaginary that is rather less hopeful than Haraway's, but that nonetheless identifies a set of significant problems and excluded subjects that our theories are duty-bound not to ignore.

Prosthetic Memories

This is not like TV only better. This is life, a piece of somebody's life straight from the cerebral cortex... I mean, you're there, you're seeing it, you're doing it.
Lenny (Ralph Fiennes) in the movie Strange Days

For Plato, as is evidenced in the Meno, to seek the truth meant to un-forget, to recall the things known by the soul before being turned from pure form into a material instantiation. To know was then to re-learn things once known; to be struck by beauty, an act of remembrance of the beauty once beheld. Memory was therefore for Plato the key to cognition and to the work of the philosopher. But memory in Greek and Roman antiquity, as indeed in many oral or early literate cultures, was also a precious faculty to be carefully exercised: a crucial component of one's ability to participate in debates, tell and understand stories, function in complex social environments. The Rhetorica ad Herennium, an anonymous Latin work from the first century B.C. which used to be attributed to Cicero, lays out the tenets of a mnemonic system based on the loci, places to be imagined in meticulous detail and adorned with symbolic objects designed to stand for, say, parts of a speech to be delivered. At the appropriate time, the person giving the speech would re-enter the memory palace and pick up the mnemonic objects one by one, in their original order, triggering the desired recall. Albeit internalised and not reliant on external media, at the same time as it takes advantage of the concreteness of physical objects, this art of memory, to


borrow the title of Frances Yates' seminal study on the subject, could appropriately be called a technology, insofar as the Greek root techne, as Bolter again points out, could be an art or a craft, a set of rules, system or methods of making or doing, whether of the useful arts, or of the fine arts15. The distinction matters here because it further complicates the task of locating the boundaries. If language needed to be invented, then it is artificial, no matter how innate the mental modules which enable Homo Sapiens Sapiens to acquire it may be16. Furthermore, as everyday experience in highly symbolic societies comes to be couched more and more in language and other forms of symbolic representation, the content of one's mental repository could also be said to be a function of the same class of instruments of representation that generate photographs and police reports, documentary films, abstract painting and so forth. I will return to the problems inherent in defining what constitutes direct, lived experience later in this introduction. For now, I want to remark that in spite of how sensitive the culture has become regarding the embeddedness of media technologies in the processes that go on inside the mind, the distinction between inside and outside, between what is human and what is artificial, has not ceased to have meaning and is obstinately pursued in the face of the difficulties I have just described. As Katherine Hayles observes in My Mother Was a Computer, [b]oundaries are both permeable and meaningful; humans are distinct from intelligent machines even while the two are [sic] become increasingly entwined17. It is within this dynamic of entanglement and severance that in this dissertation I am going to write about mediated memories at the same time as I argue that the exact nature

15 Bolter, Writing Space, p. 35.

16 Steven Pinker reminds us that Darwin regarded language ability as an instinctive tendency to acquire an art. Quoted in Steven Pinker, The Language Instinct: The New Science of Language and Mind (Harmondsworth: Penguin Books, 1995), p. 7.

17 N. Katherine Hayles, My Mother Was a Computer: Digital Subjectivity and Literary Texts (Chicago: The University of Chicago Press, 2005), p. 242.


and precise coordinates of the mediation do not obey fixed structural laws, but are subject to a continuous and shifting process of negotiation. An exemplary discipline in which this always arduous and sometimes anxious work of demarcation is carried out is evolutionary psychology, for whose practitioners the problem of how to articulate the split between genes and social characteristics, nature and nurture, is a crucial matter in any attempt to describe the modalities of our transformations as a species. An example that puts a particular emphasis on memory is Merlin Donald's Origins of the Modern Mind, a broad interdisciplinary study which attempts to bring together the biological and technological factors involved in the cognitive evolution of humankind. Donald's theory highlights three cultural stages, corresponding to different steps in the cognitive and especially mnemonic development of our evolutionary ancestors: the episodic culture of primates, who could only retain memories of specific events; the mimetic culture of Homo erectus, a species capable of intentional generative representational language, in the form of gestures and mime; and finally the theoretic culture of late Homo Sapiens Sapiens, made possible by the invention of symbolic language and propelled by external forms of memory storage. What was truly new in the third transition, writes Donald, was not so much the nature of basic visuocognitive operations as the very fact of plugging into, and becoming a part of, an external symbolic system18. The ensuing discussion gets to the heart of the problem of how, and if, we should distinguish between biological and artificial memory. While Donald maintains a separation between internal memories (those that reside inside the brain) and external ones (memories that reside in a number of different external stores, including visual and electronic storage systems, as well as culturally transmitted memories that reside in other individuals), he makes a crucial claim regarding the nature of the pictorial and written symbolic exchange: Visuosymbolic invention is inherently a method of external

18 Merlin Donald, Origins of the Modern Mind (Cambridge: Harvard University Press, 1991), p. 274.


memory storage. As long as future recipients possess the code for a given set of graphic symbols, the knowledge stored in the symbols is available, transmitted culturally across time and space. This change, in the terms of modern information technology, constitutes a hardware change, albeit a nonbiological hardware change (p. 308). Donald weaves this computer metaphor together with Leibniz's notion of the monad in order to argue that external memory systems enable individual minds to form networks which effectively expand the cognitive power of each participant, to an extent that the earlier oral networks could not achieve. The networked minds of literate individuals, according to Donald, share a common memory system; and as the data base in that system expands far beyond the mastery of any single individual, the system becomes by far the greatest determining factor in the cognitions of individuals (p. 311). To this shift corresponds, as Thamus had envisaged, a shift in the locus of performative memory, and hence of reasoning itself, but one that results in an enhancement rather than a weakening of the biological counterpart: Humans do not think complex thoughts exclusively in working memory, at least not in working memory as traditionally defined; it is far too limited and unstable. In modern human culture, people engaged in a major thought project virtually always employ external symbolic material, displayed in the EXMF [External Memory Field], as their true working memory (p. 311). The development of this process of externally encoded cognitive exchange and discovery (p. 343) is an innovation that in Donald's view surpasses the invention of the alphabet itself, and that he attributes precisely to the Greeks. These, according to Donald, are the origins of the modern mind, the apex of the parallel evolution of humans and human-made information processing tools. The resulting picture is quite orderly, the subdivision more or less aligned to the one proposed by Plato. It is intelligible to us, for it speaks to the intuitive knowledge that such a demarcation might exist. But when


Donald moves on from the landscape of the origins, and consequently has to delve more deeply into the particularities of his architecture, he finds that the boundaries are no longer fixed. Over time, the interaction between internal and external mnemonic elements has produced a feedback loop, scrambling the divide. Visuographic invention and the resulting growth of external symbolic memory media have altered the nature of working memory and the role of biological memory in humans (p. 358). As a result, the modern mind has become a hybrid, and the individual mind has long since ceased to be definable in a meaningful way within its confining biological membrane (p. 359). In a seemingly peculiar act of theoretical hara-kiri, a hypothesis predicated on the existence of a boundary has generated a theory that negates that very boundary. When it comes to this kind of exercise, however, demarcations that undercut their own conditions of possibility are the norm rather than the exception. Take for instance Alison Landsberg's definition of prosthetic memories, that is to say, memories which do not come from a person's lived experience in any real sense19, and the slippery related notion of memory of the unexperienced. Except in a very literal, and as yet science-fictional, sense (Landsberg's initial example is a 1908 Edison film, The Thieving Hand, in which a prosthetic arm which used to belong to a thief keeps stealing after having been passed on to someone else), how can someone's memory not be part of that person's lived experience? I was certainly alive last night as I sat in front of my TV, watching a documentary on the habits of ancient Egyptians, and yet Landsberg claims precisely that [b]ecause the mass media fundamentally alter our notion of what counts as experience, they might be a privileged arena for the production and circulation of prosthetic memories (p. 176). Couched in the metaphor is an awareness that while the distinction between the experienced and the unexperienced, the

19 Alison Landsberg, Prosthetic memory: Total recall and Blade runner, in Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, edited by Mike Featherstone and Roger Burrows (London: Sage, 1995), p. 175.


authentic and the inauthentic, the simulated and the real may be ultimately meaningless, or impossible to draw, conceptual labels such as Landsberg's can still help to focus one's thinking on the strongly perceived sense that there is a difference, if only of a matter of degree, in the experiential content of our memories, a modulation in the (geographical, temporal, cultural) distance between the subject who does the remembering and the event that is remembered. Much the same judgment is applied by Andreas Huyssen to his own notion of imagined memory. This, writes Huyssen, is problematic to the extent that all memory is imagined, and yet allows us to distinguish memories grounded in lived experience from memories pillaged from the archive and mass-marketed for fast consumption20. The work of these critics shows again that, in spite of the impossibility of locating an actual boundary line, it is still helpful to think along the lines of an internal / external dichotomy, or to focus one's thinking by means of a metaphorical tool such as the prosthesis, when the aim is to make sense of something as profoundly opaque and intricate as the relationship between biological memory and memory technologies. A possible way of analysing the various attempts to articulate this relationship would be in fact according to the different degrees of permeation they propose in relation to each area of demarcation, no matter how tentative or provisional. For Theuth, for instance, writing is an extension of memory, a way of enhancing the mind. For Thamus, on the contrary, it is an alien and alienating device which burdens memory, and will ultimately cripple it. In the interstice is a kind of membrane, to use Donald's richly suggestive word, variously defined and adjusted according to how one chooses to conceptualise the inside versus the outside. Joining in the metaphorical fun, we could say that Plato's view, as it emerges from the Phaedrus, is that speech has passed through the membrane osmotically, without piercing it, and has become an organic

20 Andreas Huyssen, Present Pasts: Media, Politics, Amnesia, Public Culture 12.1 (2000), p. 27.


component of the rational mind21. Writing, on the other hand, is a material practice whose implements threaten to hack through the membrane and damage natural memory. Theuth proposes rather a prosthetic relationship: the mind reaches out to manipulate writing, and be manipulated by it, and the membrane readjusts itself around the interface, without tearing. Writing is still on the outside, it is only symbolic language that filters through; but this conceptualisation allows for a point of interface and the possibility that technology and the human might coexist productively and enrich one another. It is the model of the Donalds and the Bushes, and in spite of its apparent optimism, as we shall see, it does generate monsters of its own.

Of Cyborgs and Butlers


When we say that computer user plus computer equals a cyborg, we are also stretching the membrane to envelop the machine, and questioning the nature of the boundary22. Science-fiction is one of the most fertile locations for articulating the discourse around the interaction between technology, society and culture, and offers a rich variety of perspectives on the varying degrees of permeation between the mind and memory technologies, from

21 A discussion of the extent to which biological memory depends on the internalised medium of language should include a mention of the person known to psychology as Brother John, a sufferer of paroxysmal aphasia brought about by sporadic epileptic seizures. During the attacks Brother John effectively lost the gift of language, and yet managed to develop ways to cope with complex situations (most notably, his arrival at a Swiss resort during an overseas trip), suggesting that sophisticated procedural thinking, even in a highly symbolic society, can be carried out at least in part independently of language. Crucially, Brother John also appeared to remember the events that occurred to him at the peak of each attack in such a way that he was able later to describe them through language. See Andre Roche Lecours and Yves Joanette, Linguistic and Other Psychological Aspects of Paroxysmal Aphasia, Brain and Language 10 (1980), pp. 1-23.

22 Writing in 1990, at a time when computer viruses were a relatively novel concept and scientists debated whether exposure to computer monitors and electronic machinery might cause changes at the level of the DNA among users, Mark Poster proposed that a symbiotic merger between human and machine might literally be occurring, one that threatens the stability of our sense of boundary of the human body in the world. What may be happening is that human beings create computers and then computers create a new species of humans. Mark Poster, The Mode of Information (Cambridge: Polity Press, 1990), p. 4.


prosthesis to implant, from useful means of augmentation to effective weapon of annihilation. Philip K. Dick's stories of memory manipulation, to be examined in greater detail in the fourth chapter, constitute a prime demonstration of the degree of complexity attained by this branch of narrative fiction in treating changes at the level of the psyche driven by new technologies. Dick is not interested in the detail of the specific mechanisms by which memories are manufactured, implanted and exchanged, but rather in the reconfiguration of the self and society that these machineries enable. In his fictions the membrane has all but disappeared, and mnemonic identity flows freely inside and outside of bodies. This is true for instance of the novel Ubik, in which imaginary worlds are consensually created by the brains of people whose bodies are dead, but whose minds are kept in a frozen state from which they can from time to time be briefly recalled. Here the apparatus which keeps the brains alive inadvertently creates a medium for them to travel through, communicate and even mesh with one another, so that at the end the boundaries between each mind and the machine are hopelessly breached. Other authors, rather than proposing a radical configuration of the self in the Platonic vein, envisage more strictly functional and supplementary roles for memory technologies which follow a trajectory closer to Bush's vision. One such function is the repository, for instance in John Varley's Overdrawn at the Memory Bank and in Walter Jon Williams' Voice of the Whirlwind, where devices are invented that allow the mind to be backed up and eventually restored into a new body as a way of cheating death. A variant of this pattern occurs in William Gibson's short story The Winter Market and in Vernor Vinge's novella True Names, where the backing up occurs in cyberspace. Again, a more extensive treatment of these ideas, in and out of fiction proper, will be carried out in chapter three in a separate discussion of cyberspace and digital transcendence.


A further coupling of biological and artificial memory is achieved through another standard sci-fi issue, the implanted memory chip; this is generally not a carrier of pre-existing information, but an extension of the clean slate, a storage space enmeshed in the brain. Except that things do not always work as advertised. In Gibson's short story Johnny Mnemonic, for instance, the set-up offers a delicious irony, in that the chip does not even communicate with the brain, but rather inhabits it like a parasite; it is, in a very literal way, on the inside, and yet remains stubbornly alien. I had hundreds of megabytes stashed in my head on an idiot/savant basis, information I had no conscious access to23, says Johnny, a data courier, in a quip that captures the peculiar alienation of the mind coping with a proliferation of digital streams. Here too, as in Dick, technological pollination happens crossways, a process Johnny is very aware of: We're an information economy. They teach you that in school. What they don't tell you is that it's impossible to move, to live, to operate at any level without leaving traces, bits, seemingly meaningless fragments of personal information. Fragments that can be retrieved, amplified... (p. 130). This inverse flow complicates the task of locating the boundary, by switching the perspective to the point that the human can be perceived as an extension of the machine, rather than vice-versa. This is an idea that is not entirely novel: writing in the pioneering days of computer communication, J.C.R. Licklider referred for instance to humanly extended machines, giving the example of a large information and control system in which the human operators are responsible mainly for functions that it proved infeasible to automate24. But it is in the last, informationally charged fin de siècle that it has found its deepest instantiations to date, culminating in the granting of US patent 6,754,472 Method and apparatus

23 William Gibson, Johnny Mnemonic, in Burning Chrome (London: Harper Collins, 2000), p. 15.

24 Quoted in Rhodes, Visions of Technology, p. 214.


for transmitting power and data using the human body, which gives to the Microsoft Corporation exclusive rights to the applications aimed at using the human body not as a sender or recipient of information, but as a conductive medium25. Even when loss of control is not inscribed quite as intimately in the makeup of the cyborg, things can and will go wrong in ways that highlight how problematic boundary-crossing is. In Lois McMaster Bujold's Miles Vorkosigan novels, for instance, chief of imperial security Simon Illyan appears at first to carry his memory chip with self-assured nonchalance, and employs his truly eidetic memory in overseeing many a successful mission. The tables turn against him, however, in the novel entitled Memory. Here we learn, first of all, that the successful installation of the chip had come at a human cost during the experimentation phase, when 90% of the trialists had developed iatrogenic schizophrenia26; later, the chip is used as a carrier for a virus meant to kill Simon or at least disable him permanently. A character explains the nature of the attack:
[T]he problem's not in Illyan's brain, it's in the damned chip. It's doing uncontrolled data dumps. About every five minutes it floods his mind with a new set of crystal-sharp memories from random times in the past. The effect is . . . hideous. Cause unknown, they can't fix it, removal will destroy the data still on it. Leaving it in will destroy Illyan.27

25 Williams et al., abstract of US Patent number 6,754,472, Method and apparatus for transmitting power and data using the human body. The full text of the patent is accessible on a number of archives, for instance by entering its progressive number in the search facility at http://patft.uspto.gov.

26 There are indeed a number of examples in current fiction of great mnemonic powers coming, like prescience in classical mythology, at the expense of something else, usually sanity as in this case or one's self-knowledge, as for the characters of Commander Data in Star Trek: The Next Generation and John Doe in the TV series by the same name. For the character of Latro in Gene Wolfe's Soldier of the Mist, conversely, the ability to communicate with the gods comes at the expense of the ability to remember (this novel will be examined in Chapter One).

27 Lois McMaster Bujold, Memory (New York: Baen Books, 1996), p. 246.


This is but a slightly less metaphorical way of framing the warning of Thamus against the corrupting power of technology. One that operates on a cultural level, via informational overload, and on a physical level, insinuating itself into the thick meshing of brain matter and silicon implant via a bioengineered apoptotic prokaryote designed specifically to eat neurochip proteins (p. 336). In other words, via a synthesis of the biological virus and the computer virus, the perfect means of attacking the cyborg body. In these narratives, the removal of mnemonic prostheses is often associated with death, a total loss of pre-existing function, or at best intense physical pain; as if to say that the traumatic coupling of the human and the machinic has to be mirrored by an even more traumatic separation, underscoring the impossibility of switching the direction of progress. Pandora's box, once opened, cannot be sealed again. In the case of Andrew Worth, the protagonist of Greg Egan's Distress, the forceful separation is described in terms suggestive of a grotesque birth in reverse:
I wrapped the fiber around my hand and started hauling the memory chips out of my gut. The wound left by the optical port was too small, but the chips' capsule-shaped protective casings forced it open, and they emerged into the light one by one, like the gleaming segments of some strange cybernetic parasite which was fighting hard to stay inside its host. The farmers backed away, alarmed and confused. The louder I bellowed, the more it dulled the pain. The processor emerged last, the buried head of the worm, trailing a fine gold cable which led to my spinal cord, and the nerve taps in my brain. I snapped it off where it vanished into the chip, then rose to my feet, bent double, a fist pressed against the ragged hole.28

This tearing asunder mirrors the moment of insertion of the technological appendage into the bio-port, which at times is highly carnal and sensualised, as in the movie eXistenZ (1999, dir. David Cronenberg), and at
28 Greg Egan, Distress (New York: HarperPrism, 1998), p. 340.


times overtly reminiscent of an act of rape, as in The Matrix (1999, dir. Larry and Andy Wachowski), and offers a remarkable contrast with the sanitised, streamlined couplings portrayed in the glossy world of the marketers of the digital. Here the end-user, a self-confident, empowered individual who derives power and/or pleasure from his or her technoappendages without fear of pain or loss, is the protagonist of a starkly different brand of fiction. Its overarching philosophy, brilliantly encapsulated in Nicholas Negroponte's Being Digital (1995), is unashamedly utilitarian in nature, and surprisingly timid in its imaginings. In spite of grandiose pronouncements such as the one implicit in Negroponte's title, which promises nothing less than a rewriting in binary language of what it means to be human, these narratives reduce technology to an inert tool, an engine of change but merely of lifestyle, advancing the reductionist notion that our lives will carry on just as they do now, only a lot more so. In the digital age, everything will be amplified, and we will all lead super-lives with the infinite resources of the world constantly at our fingertips. This is Negroponte at his most lyrical:
Early in the next millennium your right and left cuff links or earrings may communicate with each other by low-orbiting satellites and have more computer power than your present PC. Your telephone won't ring indiscriminately; it will receive, sort and perhaps respond to your incoming calls like a well-trained English butler. Mass media will be redefined by systems for transmitting and receiving personalized information and entertainment. Schools will change to become more like museums and playgrounds for children to assemble ideas and socialize with other children all over the world. The digital planet will look and feel like the head of a pin.29

Leaving aside why one might actually want one's cufflinks or earrings to communicate with each other, let alone via low-orbiting satellites, the revealing key to this passage is the reference to that old, amiable fictional character, the English butler. This eminently reassuring figure, perfect

29 Nicholas Negroponte, Being Digital (New York: Knopf, 1995), p. 6.


antithesis to the cyborg, is the marketers' answer to the dark imaginings coming from quarters such as science-fiction and belligerent critics of technoscience such as Donna Haraway. His subservience and discretion, not to mention his ability to render himself invisible when the master requires it30, reassure us that the integrity of our individual identities will never be threatened, even by those consumer products, such as the personal digital assistant, that are specifically designed to deal with the intimate details of our day-to-day lives. By way of further contrast, what little information reaches the general public regarding IT advances that are not aimed at immediate consumption offers instead a glimpse of a research community that is unabashedly serious about blowing apart once and for all, some 5,000 years after the invention of writing, what remains of the distinction between internal and external memory. Two stories that circulated in the media over the last couple of years suggest that the American military, for one, may be intent on doing just that. Among the projects currently funded by DARPA, the defence research agency that helped build the Internet, are the development of LifeLog, a device for capturing concurrently the most disparate streams of personal data, from body temperature to sensory inputs, and of a microscopic memory chip the size of a human cell31. The aims of these inventions are downright murky or vague at best: one of the stated purposes of LifeLog is to serve as a tool to study human behaviour (whatever that may mean), while the chip is said to have a variety of applications, including its inoculation in the blood stream for medical purposes. But nowhere does one get the feeling that a meaningful public debate might be underway regarding the staggering Orwellian vistas that these advances open up. As in the case
30 According to Negroponte, the principal aim of the developers working at the Media Lab in those days was to make the interface go away (Being Digital, p. 93).

31 Cf. Michael J. Sniffen, Pentagon tool records every breath, Australian IT (http://australianit.news.com.au/articles/0,7204,6536889%5e15841%5e%5enbv%5e,00.html) and David Derbyshire, Chip is 400th the size of grain of salt, The Daily Telegraph Online, http://www.telegraph.co.uk/news/main.jhtml?xml=/news/2003/02/15/wsci315.xml, filed on February 15, 2002.


of the shift from the industrial to the postindustrial society, so in the case of the shift from the analogue to the digital, or, as Mark Poster would rather put it, from the mode of production to the mode of information, the dominant view is that a massive transformation is underway (or has in fact already occurred), fuelled by the juggernaut of economic and technological progress, and that it can at best be described and analysed, perhaps even critiqued and creatively inhabited, but certainly not halted, much less reversed. I think there is an implicit risk, in studying science-fiction narratives, or cyberculture, or the works of futurologists, or all these things at once, of being drawn into thinking that future ways of constructing subjectivity and conceiving of individual and collective memory will take the shape that the people who engage in these modes of enquiry, and who are located decidedly at the forefront of the digital revolution, are feverishly imagining and to some extent enacting. I resist the strong periodisations of some of the most sophisticated theoreticians, such as Poster, not because I do not think that there is such a thing as digital subjectivity or memory, but because in the process of attempting to argue that each constitutes a clean break the critic can overlook not only their genealogy but also the forces of resistance, the social inertia that works against the wholesale substitution of new modes for the old ones. What one can lose sight of is the tension, the friction generated where the enthusiastic and often creative adoption of new tools meets the reluctance, or the outright refusal (or inability) to take part. But these contrary forces do exist, and can be seen reflected in the marked anxiety that circulates in the culture regarding the effects (or, as it is often said and written, in a metaphor that is itself anxiogenic, the impact32) of these technologies on both individual and collective memory. It is an anxiety that stretches from amnesia to hypermnesia, from the collapse of the

32 For a critique of this ballistic metaphor, see the first chapter of Pierre Lévy's Cyberculture, translated by Robert Bononno (Minneapolis: University of Minnesota Press, 2001).


structures of signification to a total informational overload; that looks with horror at the time when the digital age might eventually halt our analogue history altogether, signalling the transition of humanity to another order of being; and that despairs of even being able to hold the self together within the boundaries of personal memory, fearing the trade in false memories and the construction of false identities.

Structure of the work

In this dissertation I will explore the imaginary of anxiety I have just outlined in what I believe to be its key dimensions. The principal of these spans the axis from the collapse of memory to its paralysing excess, and is the subject of the next two chapters, where I will tell two opposite stories that unexpectedly reach the same endpoint. At one pole is the first of these stories, which finds in the amnesiac Leonard Shelby its everyman and describes a society that knows nothing, whose ability to remember, to turn experience into lasting knowledge, has been obliterated; at the opposite pole is the second story, encapsulated by the character of Funes the Memorious and inhabiting the Library of Babel, which describes a society that knows everything, can retain every single bit of record, but cannot formulate thoughts for precisely that reason. These paradigmatic and contrasting fears lead to the same end result, a paralysis of the mind, and I will comment on this convergence. Elaborating on the notion of too much memory, and yet constituting a second dimension requiring separate treatment, is the subject of chapter three and perhaps the most exalted of digital narratives: the one that proposes that we might one day all become computer dwellers, there to find an end to disease and want, as well as the means to immortality. However, under this optimistic veneer too we shall discover a festering anxiety, the sinking feeling that while that future might be the future of humanity, it will not be we who will inhabit it, for we will have had to forget too much about


what constitutes the human, and the social in particular, in order to prepare ourselves for the transition. Finally, in the fourth and concluding chapter I will explore the social exchange of memory as the last key dimension of anxiety: for it is by means of their dysfunctional transmission that memories can be implanted, falsified, or wilfully erased, and identities can be stolen; but it is also in the social that the negotiations which are destined to shape our analogue and digital futures are taking place. In this regard the memory exchange that takes place in the social also produces the means to cut through the disquiet that permeates the culture, and offers a number of vantage points from which to critically regard the nexus of memory and humanity in the digital age. I plan therefore to conclude my analysis by proposing that the imagery of anxiety and resistance to the digital mediation of memory can be viewed, like Plato's cautionary tale, both as a reflection of the problematic process of coming to terms with profound change and redrawing of boundaries, and as an implicit attempt to claw back some ground from the more enthusiastic proponents and theoreticians of the digital, asking that the new modes of subjectivity and memory be viewed from a broader, more inclusive perspective than theirs alone.


Chapter One
Places of Amnesia

Into a Digital Dark Age

Digital information lasts forever – or five years, whichever comes first.
Jeff Rothenberg

Historians will consider this a dark age. Science historians can read Galileo's technical correspondence from the 1590s but not Marvin Minsky's from the 1960s.
Stewart Brand

Issues of digital preservation, previously the domain of the many experts and professionals who debated them with great intensity throughout the 1990s, have more recently gone mainstream. How to store and preserve digital cultural artifacts, as well as whether and within what kind of frameworks we should digitise our analogue heritage, are questions that have entered the arena of public debate and political action, often with some urgency. The sets of problems involved are partly familiar to non-specialists in that possessing a personal computer, as very many of us do, pretty much guarantees that at some stage we shall lose some personal information, quite possibly something that is dear to us or that is difficult to replace. The sometimes traumatic experience of losing data is felt, in other words, both by

the individual and by society, and it reverberates through, at the same time as it is coloured by, the contemporary imagery and narratives of amnesia. Imagery of course is not something that occurs in fiction alone, as Stewart Brand's suggestion in the quotation above that we inhabit a digital dark age makes abundantly clear. So before I turn to fictional accounts proper, where a greater depth of perspective is to be found, I want to set the scene in which they are constructed and in which they circulate, inspired by a deliberately skewed outlook: for the talk about digital preservation has its own biases and trajectories, and in this chapter I am going to concentrate on its (apparently) darker, more pessimistic side, inhabited by those like Brand who argue that the age of computing is to some extent synonymous with cultural loss. Anyone who has had the experience of losing data stored on portable media, such as the CD- or DVD-ROM or on a hard drive, knows that digital information has a way of being there one day, gone the next. In most cases, when one finds that a file all of a sudden cannot be opened, it means that it is irretrievably lost. Not misplaced, like a diary left on a bus; not temporarily rearranged, like a book whose pages have come off the spine but can still be placed in their original order; not partially unreadable, like a page stained with coffee; but actually vanished. By comparison, a print document is a lot harder to disintegrate; when we actually wish to do it, and to make sure it stays gone, we have to use a special shredding machine, or employ the expensive services of a document destruction company. Portable media, which can be scratched or left too long in the sun, are objects that we are familiar with and by manipulating them we become aware that they can be damaged in several ways. The hard drive of a PC, on the other hand, hidden as it is from view save for its indicator light and familiar buzz, can be more easily mistaken for something immaterial, as if it too were made of bits of information and not subject to decay. But, as its


name suggests, it is in fact an all-too-material object, one of the very few inside personal computers still to feature old-fashioned moving parts: metal-coated platters operated by a motor and connected to read/write heads. The heads are not meant to touch the surface of the disk, but rather glide on it and change the polarity of the metal particles of the coating (the pattern of positive and negative charges represents the stored data in binary code); however, if the heads lose their alignment even fractionally they begin to scratch the surface, effectively mangling the data as they read it. Of the two most common ways for a hard drive to malfunction, the other being a fault in the motor, this is the one that best highlights the fragility of the most common digital memory medium, for it has an almost primitive, exquisitely mechanical flavour; it recalls the act of scribes scratching the surface of clay tablets in order to trace cuneiform characters. Except that here the scratch is one that reverses the flow of information, that undoes language. Nor is physical decay the only threat to digital documents: in a paper entitled Ensuring the Longevity of Digital Information, Jeff Rothenberg also cites loss of information about the format, encoding, or compression of files, obsolescence of hardware, and unavailability of software1. A print document in English that still exists in but one surviving copy will be readable for as long as humanity retains its ability to comprehend the English language as it is spoken and written today, and the paper and ink hold together; a digital document in English whose only surviving copy is encoded in Word Perfect 2.0 on a 3½ inch floppy disc formatted by a 1989 Apple Macintosh, on the other hand, requires knowledge of English, a 1989 Apple Macintosh, and a copy of Word Perfect 2.0. The software in particular is a key element, so much so that it becomes difficult even to think of the document as being distinct from it. Rothenberg explains:
As documents become more complex than simple streams of alphabetic characters, it becomes increasingly meaningless to think of them as existing at all except when they are interpreted by the software that created them. The bits in each document file are meaningful only to the program that created that file. In effect, document files are programs, consisting of instructions and data that can be interpreted only by the appropriate software. That is, a document file is not a document in its own right: it merely describes a document that comes into existence only when the file is run by the program that created it. Without this authoring program – or some equivalent viewing software – the document is held cryptic hostage to its own encoding. (pp. 10-11)

1 Jeff Rothenberg, Ensuring the Longevity of Digital Information (originally published in Scientific American in January 1995; all the quotations are taken from the expanded version available as a pdf file from www.clir.org/pubs/archives/ensuring.pdf), p. 3.

The first-level solution, storing the software along with the documents we wish to preserve, presents significant difficulties: software applications are written for specific operating systems, which are also subject to obsolescence, and these in turn are designed for specific hardware systems; we would therefore have to preserve the obsolete machines along with the documents, and be sure to always maintain them in working order, which compounds the problem. The requirement to preserve the software could be bypassed if successive versions of Word Perfect turned out to remain upwardly compatible, but even the compatibility within a single proprietary software package is unlikely to be reliable and consistent over long periods of time, say four or five releases down the track. Moreover, hardware changes are likely in the meantime to jeopardize our ability to read the document; versions of the Apple Macintosh as at the time of writing, for instance, no longer feature floppy disc drives, which have to be purchased separately as external units, and in PCs too they have become largely optional and soon conceivably will not be produced or sold at all. One needs only think in fact of the staggering number of media that have vanished over the last twenty years, many of which are enumerated by the Dead Media Project2, to come to the realisation that to store and forget is simply not an option in the way that it often used to be with older

2 This impressive inventory of dead media, which saw the involvement among others of Bruce Sterling, can still be browsed at www.deadmedia.org but has, somewhat fittingly, ceased to exist except as a by now outdated archive.


technologies. This incessant procession of storage media that promise ever more durable and faithful storage and yet come in and out of existence in the space of a few years is mocked, quite unwittingly for certain, by the title of CD-ROM: The New Papyrus, a ponderous and, needless to say, overoptimistic manual published by Microsoft in 1986 in order to promote and support this new technology. Papyrus was the dominant storage medium for written texts for approximately five thousand years. CD-ROMs, initially marketed in the mid-eighties, are now near-obsolete thanks to the DVD-ROM, which exists in two not entirely compatible formats and will soon in turn be replaced by either the Blu-ray Disc or its chief rival, the HD-DVD. Together, these two new media threaten to re-enact the format wars between Betamax and VHS of the early nineteen-eighties. The record of the papyrus seems quite safe.

Migration
Problems with the compatibility and longevity of software and storage media suggest a second-order answer to the problem, namely that digital content needs to be periodically copied onto newer physical media and software formats in order to stave off obsolescence. This process, generally referred to as migration, is fraught however with difficulties of its own. Firstly, as Rothenberg points out, it often entails a translation of the data, a process in which some information pertaining to the content or structure of the original document may be irretrievably lost. The migration of a scholarly essay from one word processor to another, for instance, could cause the loss of its captions, or the scrambling of its footnotes. Short of comparing the two documents word for word before destroying the older one, a task which would make the process even more resource-demanding than it already is, one has no way to be sure of what and how important the losses have been. Secondly, in order to be effective the migration has to be systematic and continual. It cannot skip a generation of software tools, or leave files behind; nor can it afford to underestimate the length of the next

obsolescence cycle, for a single break in the chain is likely to render the information inaccessible forever. As Stewart Brand writes in Written on the Wind, an article first published in Civilization Magazine in 1998:
[f]ixing digital discontinuity sounds like exactly the kind of problem that fast-moving computer technology should be able to solve. But fast-moving computer technology is the problem: By constantly accelerating its own capabilities (making faster, cheaper, sharper tools that make ever faster, cheaper, sharper tools), the technology is just as constantly self-obsolescing. The great creator becomes the great eraser.3

It may follow from this line of argument that tying the existing technology into a single all-reaching network, the Internet, will compound the problem by several degrees of magnitude. Aligning himself with this hypothesis, Brand continues:
We are in the process of building one vast global computer, which could easily become The Legacy System from Hell4 that holds civilization hostage – the system doesn't really work; it can't be fixed; no one understands it; no one is in charge of it; it can't be lived without; and it gets worse every year. (p. 72)

Contrary to this bleak pronouncement there are those who think – like the Librarian of Babel, a character we will encounter in the next chapter – that the Internet is doing precisely the opposite; that it can be an instrument not only for the propagation of content, but also for its preservation. In fact propagation and preservation can be regarded as two sides of the same coin: it was in order to enable the long-distance sharing of data over different platforms that the common protocols which constitute the Internet were created in the first place. These protocols make it possible for people using all sorts of different computers and programmes to look at the same information. But – and here is the link between propagation and storage – in

3 Stewart Brand, Written on the Wind, Civilization (Oct/Nov 1998), p. 72.
4 In the jargon of the computer industry, a legacy system is an outdated hardware or software system which has to remain in use because of the time and money invested in it and because of the difficulties involved in migrating the data it contains.

order to look at this information, each computer needs first of all to retrieve it from its original location. Every time we look at a web page, in other words, it is because we have downloaded it first. At that moment, our computer is offering itself as a new repository, as an alternative place to store the data. The moment is fleeting, however: for the cache memory allotted by the browser to store each page we visit is periodically freed up in order to make space for new pages. If we wish to become custodians, we need to turn the cache into permanently saved files with appropriate names and locations, so that we may know that the information is there and how to access it again in the future. This is the flip side of what I will discuss in the next chapter under the label of the stickiness of the database; that is to say, the fact that computers have a way of holding on to information beyond the point when it has ceased to serve its intended use. Similarly, a Web page visited does not disappear as soon as we move on to another, but survives in the computer's cache. And although indiscriminate caching is often perceived as a problem – as testified by the myriad of products which offer to rid us of the potentially insidious or embarrassing data – this fleeting repository can help turn one's machine into a tool for capturing and preserving the information that we deem to be of most value.
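The gesture involved is a modest one, and can be made concrete with a small sketch. The Java fragment below is not taken from any of the tools discussed in this chapter; it is simply a minimal illustration, with a placeholder web address and file name, of what it means to turn a fleeting visit into a deliberate act of custodianship: the page is fetched much as a browser would fetch it, but the bytes are written to a permanently named file rather than left to the whims of the cache.

    import java.io.FileOutputStream;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.URL;

    public class PageArchiver {
        public static void main(String[] args) throws Exception {
            // Placeholder address and destination, chosen for illustration only.
            URL address = new URL("http://www.example.org/");
            OutputStream saved = new FileOutputStream("example-org-homepage.html");
            InputStream in = address.openStream();

            // Retrieve the page as a browser would, but keep the bytes in a
            // deliberately named file instead of the browser's transient cache.
            byte[] buffer = new byte[4096];
            int read;
            while ((read = in.read(buffer)) != -1) {
                saved.write(buffer, 0, read);
            }
            in.close();
            saved.close();
        }
    }

Trivial as it is, the sketch makes visible the burden discussed below: nothing in it happens automatically, and someone has to decide that this page, under this name, is worth keeping.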

Dissemination
This characteristic of networked computers and of their underlying protocols provides a third-order solution to the problem of digital obsolescence: dissemination. Dissemination crosses the boundaries of hardware platforms, spreads the data over different software applications and creates copies that multiply the chances of the data in question of surviving the next obsolescence cycle. The term copy needs to be treated with a degree of caution, of course, because dissemination involves a migration and therefore, in most cases, a translation. But we may choose to 40

liken this state of affairs, perhaps a little romantically, to what used to happen at the times of the copyists working in monasteries, and opine that the data considered by most people to be of most value will be more likely to survive in its purest form; indeed, the dissemination of different versions of a certain file may give rise to a new form of philology, one that concerns itself solely with the textual information encoded therein – as Lorenzo Valla did with the Donation of Constantine – rather than with the physical medium as well, as in the case of handwritten and printed documents. One can observe this philological sensibility at work in the so-called version histories of many files and applications that circulate on the Web, in which often the minutest changes between one version and the next are carefully documented. Predictably, however, dissemination too is embroiled in a series of complex and shifting negotiations. Firstly, for dissemination to work, and for the loss of information involved in the data exchange to be minimised, the environment in which the process takes place needs to feature the highest compatibility between various software tools. The Internet has indeed put itself forward as one such environment over the last few years, and HTML, the language used by Web browsers to deliver the content across every conceivable platform in more or less the same form5, has not been superseded yet. It does however have its detractors, critics who in the main believe it is simply not robust enough to fulfil indefinitely its role of lingua franca for Web designers. The main criticism is that HTML defines what the various elements in a page should look like, but not what they mean. A heading in HTML is defined for instance as having a certain aspect, but not
5 For a discussion of the problems faced by Web designers in ensuring that their pages will be displayed correctly by at least the leading browsers on the market (at the time of writing these are Internet Explorer, Netscape Navigator, Firefox and Opera), see Warren Baker, Cross Browser Compatibility, webpronews.com, 17 January 2006. http://www.webpronews.com/expertarticles/expertarticles/wpn-6220060117CrossBrowserCompatibility.html. Baker observes that 100% compatibility is impossible to achieve, but that by producing squeaky-clean code that abides with all the standards of the W3C consortium and by working around some of the key idiosyncrasies of the leading Web browsers, there will be a good chance of producing text that can be viewed by most.


as fulfilling the role of introducing its topic. This becomes an issue when an HTML page needs to be updated or repurposed, for instance in order to be made accessible by the relatively new generation of browsers for mobile and handheld devices. To overcome this significant obstacle to establishing the Internet as a site for the creation of a universal or near-universal repository of content, the World Wide Web Consortium and other interested parties have had to put in a great amount of work over the last few years in order to promote and strengthen XML. XML is a meta-language, that is to say, a language used to describe other languages. The XML tags structure and describe the content, providing a meta-level for its translation into other mark-up languages used for its actual delivery. This means that every time the leading Web browsers stop supporting a once-leading version of the mark-up language used to deliver a given Web page, such as HTML, it will not be necessary to retag the page, but only to update the translation tool that translates the deep XML tags into those understood by the delivery language. The key difference is that the task of updating the translation tool can be carried out by a single vendor or organisation, without having to actively involve each content provider. In these conditions, migration becomes a far more automated – and therefore less expensive and time-consuming – business, thus increasing the chances of survival of digital content created in XML. What the supporters of XML can be said to be advocating is sustainable development in a digital environment. They take a longer view. To use XML means to slow down the production of content – because the intelligent tagging has to be largely carried out by human operators, rather than by more or less automated authoring tools – in exchange for the likelihood of a significantly longer life span. In this way XML offers itself as a remedy for the ever shorter cycles of built-in obsolescence described by Brand and others.
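The division of labour described in the preceding paragraph can be glimpsed in a few lines of code. The example below is a deliberately simplified sketch, not part of any preservation system mentioned here: it uses Java's standard XML transformation facilities (the javax.xml.transform package) and assumes two hypothetical files, an XML document whose tags describe meaning (essay.xml) and a stylesheet that maps those tags onto the current delivery language (essay-to-html.xsl).

    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class DeliverContent {
        public static void main(String[] args) throws Exception {
            // Hypothetical file names, for illustration only: the XML source
            // records what the elements mean; the stylesheet decides how they
            // are to be rendered in today's delivery language.
            StreamSource content = new StreamSource("essay.xml");
            StreamSource stylesheet = new StreamSource("essay-to-html.xsl");
            StreamResult delivered = new StreamResult("essay.html");

            // When the delivery format becomes obsolete, only the stylesheet
            // is replaced; the semantically tagged source survives unchanged.
            Transformer translator = TransformerFactory.newInstance().newTransformer(stylesheet);
            translator.transform(content, delivered);
        }
    }

The point of the sketch is not the particular tool but the shape of the arrangement: the costly, human work of intelligent tagging is done once, while the mechanical work of translation can be redone as often as the obsolescence cycle demands.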


In spite of these advantages, XML is not without its drawbacks. For one thing, its higher demands on the creators of content mean that only those who can afford to subscribe to its philosophy will produce durable artifacts. But if one agrees with the widely held position that the documents produced in this pioneering phase of worldwide digital networks are of particular cultural and historical value6, then conservation initiatives cannot be targeted only at the texts produced using XML, because the sample would not be representative. It would be like a library that chose to acquire luxury editions made with expensive, long-lasting paper, and nothing else. The high demands of XML may be driving parties involved in promoting worthwhile digitisation initiatives to turn to less expensive, proprietary formats that share few of its benefits. Besides, XML is very good at handling plain text, but does not offer a real solution to the proliferation of sound, picture and video formats. This means that the XML description of a document may survive while some or all of its actual contents have become unreadable because they are encoded in a format which has gone out of use.

6 Close to home, Penny Carnaby, National Librarian and Chief Executive of the National Library, has been a vocal supporter of the need for comprehensive digital preservation in order to stave off the spectre of a dark age. When the consultation process was at its beginnings, she declared for instance: Libraries have always been places for storing information on paper about our world. But now that so much of our heritage is electronically created, libraries have to gear up to store multi-media digital assets. After all, in an environment where the average life cycle of a web site has been variously estimated as 44 or 75 days, the potential for loss of the nation's digital cultural heritage is enormous if no one does any electronic harvesting […]. The National Library has begun work to ensure that the threat of a digital dark age does not eventuate. (National Library to lead electronic harvesting, National Library Media release, 26 September 2003, http://www.natlib.govt.nz/bin/media/pr?item=1064531843). In the next chapter I will attempt a critique of the logic of total acquisition and storage, with reference for instance to the endeavour by the Library of Congress in the United States to take monthly snapshots of the whole of the Internet. For now it is worth emphasising that the threat of disappearance of a significant (in every sense of the word) portion of the texts circulating in the first decade of the Internet is very real, and – as the US task force on the artifact in library collections reminds us – has a disturbingly close precedent in the history of another recent medium. Writes the group that produced the report: 80 percent of all silent films made in the United States are gone without a trace. Fifty percent of films made in the nitrate era (that is, before 1950) have also perished. Among those extant, a significant portion is not well preserved. Given that the materials that have vanished were not well documented at the time of their creation, the full extent of this loss will never be known. The Evidence in Hand: Report of the Task Force on the Artifact in Library Collections, edited by Stephen G. Nichols and Abby Smith (Washington: Council on Library and Information Resources, 2001), p. 5.


Lastly, XML data are of the standard digital kind, meaning that they have to be stored onto physical media that are prone to decay, on board systems that are prone to becoming obsolescent. The future of XML (or possibly Java, according to some commentators) may well hold the key to avoiding or at least deferring digital oblivion; but for now, we have come full circle: the survival of information which only exists in digital form is predicated to a large extent on acts of migration and dissemination, a constant and exhausting process of cultural death and rebirth. Only by incessantly copying the data onto new formats, and by distributing them across several platforms, can we hope to ensure their survival7. But every act of migration and dissemination requires a will: files do not spread themselves, only viruses do that. This extra burden on cultural subjects adds an important social layer to the problem of preservation: in order to survive, a digital artifact has to be deemed to be relevant by as many people as possible, as frequently as possible. In this way the digital creates the conditions for a much greater involvement than in the past of people outside the main cultural institutions – libraries, universities, publishers – in the transmission and preservation of cultural knowledge. This is not to say that the efforts of traditional institutions have become less important, except in relative terms, but rather that they now operate alongside a peer-to-peer exchange which simply did not exist in this form when mechanical reproduction depended on large-scale efforts and was therefore rigidly commodified. This social reconfiguration, with its emphasis on the formation of groups of shared interest devoted to the circulation and maintenance of digital artifacts, adds a further dimension to Walter Ong's intuition that electronic
7 In the course of the last decade many other recipes have been proposed to this end, along broadly similar lines to the one proposed by Rothenberg. Giulio Blasi's summary of the literature on the topic points to five fundamental strategies for digital preservation, namely refreshing (that is, the simple substitution of the physical memory support), migration, emulation, standardisation and conservation. See his introduction to Giulio Blasi (ed.), The Future of Memory (Turnhout: Brepols, 2002), pp. 10-11.


media may be ushering in an age of secondary orality. This idea has since taken on a complicated life of its own, partly due to the fact that Ong himself treated it only in passing in Orality and Literacy, and left it to other scholars to pick it up and run with it, often in different directions. The sense in which I find Ong's dictum particularly suggestive in the context of the present discussion is linked to how the contemporary mnemonic practices described above are in some ways strikingly similar to those of primary oral cultures as he characterised them. Those too depended on networks of people, linked not by electronic means but through language and a highly sophisticated array of formulaic mnemonic techniques, common places for which Ong coined the magnificent epithet of mnemonically tooled grooves8. According to Havelock, the Muses were in fact not the daughters of inspiration or invention, but basically of memorization. Their central role [was] not to create but to preserve9. To which Ong adds the reminder that their mother was Mnemosyne, the titaness who embodied memory (p. 169). The moment of creation, in other words, was (and is) surpassed in oral cultures, according to this hypothesis, by the complex system of techniques that enables the future moments of recreation. This gives the oral text a foundational instability and a degree of shared authorship and interactivity that are arguably also found in electronic texts, particularly those that circulate on networks such as the Web. As Bolter writes:
The electronic reader plays in the writing space of the machine the same role that the Homeric listener played as he or she sat before the poet. Electronic text is, like an oral text, dynamic. Homeric listeners had the opportunity to affect the telling of the tale by their applause or disapproval. Such applause and disapproval shared the aural space in which the poet performed and became part of that particular performance, just as today the applause of the audience is often preserved in the recordings of jazz musicians. The electronic writing space is also shared between author and reader, in the sense that the reader participates in

8 9

Ong, Orality and Literacy, p. 35. Havelock, Preface to Plato, p. 100.


calling forth and defining the text of each particular reading. The immediacy and flexibility of oral presentation, which had been marginal in ancient and Western culture for over two millennia, emerges once again as a defining quality of text in the computer.10

This shared role applies not only to the performance of a text, but also to its transmission and preservation. After all, in order to become a Homeric bard, one needed to have been a Homeric listener first. Along similar lines, Erik Davis draws a detailed and convincing comparison between the ars memoriae of Greek and especially Roman antiquity and the organization of memory as a space of information in cyberspace, a three-dimensional realm that's outside ourselves while simultaneously tucked inside an exploratory space that resembles the mind,11 and reminds us that the anonymous author of Ad Herennium wrote specs for the visual imagery of one's internal memory places that seem tailor-made to describe the computer icons in use nowadays. Finally, Leah Marcus finds a closer analogy in time to our contemporary condition in the patterns of knowledge sharing and creation of early modern Europe, describing for instance Machiavelli's and Milton's highly theatrical modes and auditory ways of relating with previous scholarship12 and claiming that [t]he computer and the noisy world of cyberspace allow us to recapture some of the sociable auditory elements of early modern reading and memory that the modern archive and library have suppressed under the caveat of Silentium (p. 27).

10 11 12

Bolter, Writing Space, p. 59. Davis, TechGnosis, pp. 198-199.

Leah S. Marcus, The Silence Of The Archive And The Noise Of Cyberspace, in The Renaissance Computer: Knowledge Technology in the First Age of Print, edited by Neil Rhodes and Jonathan Sawday (London and New York: Routledge, 2000), p. 26.


long time to stabilise and be understood, but in the meantime analogies that look to more or less distant pasts, such as the ones suggested by Ong, Davis and Marcus among many others, are useful to illuminate aspects of these present times we so struggle to comprehend.

Bouvard and Pécuchet in Cyberspace

Bouvard is what is called a doclet, i.e. a Javadoc tool extension. Javadoc is a standard component of the JDK and is used to automatically generate documentation from Java source code files. It usually relies on a standard doclet to produce a series of HTML files illustrating the requested packages. These files are then read with a standard HTML browser. Bouvard is a substitute for the standard doclet that generates, starting from analogous commands, a few files in a custom format readable by the Pécuchet browser. Pécuchet is a Java Swing application expressly designed to offer a more effective exploration tool. Together they form a tool that tries to improve the understanding of software written in the Java language.
Bouvard and Pécuchet homepage, http://web.tiscali.it/farello/bp/intro.html
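For readers who have never met the mechanism the homepage is describing, the sketch below may help. It is not Bouvard's actual source code, only an illustration of the general shape of a custom doclet under the older com.sun.javadoc API: the javadoc tool parses the source files, hands the resulting description of the program to the doclet's start method, and the doclet decides what to write out – HTML in the case of the standard doclet, a custom format in the case of Bouvard. This minimal version merely lists the documented classes and their comments.

    import com.sun.javadoc.ClassDoc;
    import com.sun.javadoc.RootDoc;

    public class MinimalDoclet {
        // javadoc calls this method and passes in the parsed source tree.
        public static boolean start(RootDoc root) {
            for (ClassDoc documentedClass : root.classes()) {
                // A real doclet would write these out in its chosen format;
                // here we simply print each class name and its comment.
                System.out.println(documentedClass.qualifiedName() + ": "
                        + documentedClass.commentText());
            }
            return true;
        }
    }

It would be invoked along the lines of javadoc -doclet MinimalDoclet -docletpath . -sourcepath src some.package, which is, broadly, the kind of command the homepage alludes to when it speaks of analogous commands.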

That Bouvard and Pécuchet – the foolhardy copy-clerks of Flaubert's eponymous novel – should lend their names to software tools for use with Java, another key programming language for the delivery of content over the Internet, is an oddly fitting tribute. Their obsession with exploring every conceivable field of study down to the level of encyclopaedic minutiae (one is reminded of the infinite depths of narrowness13 that Ronald Deibert attributes to the Internet), their haphazard, supremely amateur-like approach to human lore, make them model cybernauts. But it is a model that is rather more pessimistic than the one which harks back to functioning communities of living knowledge in our earlier pasts. Yet, if we accept that Web surfing

13 Ronald J. Deibert, Parchment, Printing and Hypermedia: Communication in World Order Transformation (New York: Columbia University Press, 1997), p. 198.


is one of the key cultural practices to have emerged in the last ten to fifteen years, we have to look at Bouvard and Pécuchet as prototypical cybernauts, for what they do throughout the novel is in effect to surf the accessible knowledge of their time, in the form of journals and treatises instead of websites and information portals, embarking on a series of endeavours that provide ample justification for Thamus's worries about a technology which favours the acquisition of the appearance of wisdom rather than true wisdom. (This is a point that they themselves seem at times on the verge of grasping; after one of their most spectacular failures, in which attempts to develop a revolutionary culinary sauce end in an explosion that devastates their laboratory, the two wretches dejectedly ask themselves what might be the cause of so many misfortunes, and of the last above all. 'It is, perhaps, because we do not know chemistry!'14 ventures Pécuchet.) Flaubert's satire is not, cannot be aimed just at his contemporaries; its pessimism is far too cosmic. I would rather propose that it is a satire of any society that allows the free circulation of knowledge, and thus in essence a rewriting of the myth of Thamus. The folly of Bouvard and Pécuchet is made possible by the cultural and technical apparatus that brought to nineteenth-century France more affordable books, the beginnings of a public education system, encyclopaedias and the popularisation of science. It is an indictment of the systems – social, cultural, economical – which enabled more and more of Flaubert's contemporaries to acquire what he clearly regarded as the appearance of wisdom. Bouvard and Pécuchet are victims of this technocultural behemoth, helpless in that they cannot discriminate between useful and useless information, between raw data and knowledge. They have heard many things in the absence of teaching, as Thamus would put it, and as a result their heads have filled up compulsively, as if by virtue

14 Gustave Flaubert, Bouvard and Pécuchet, translated by D. F. Hennigan (London: H. S. Nichols, 1986), p. 81.


of a painful addiction ('the more ideas they had, the more they suffered', p. 18) with notions whose value cannot be determined, for they turn into junk by sheer accumulation. Nothing describes this mental landscape better than its external manifestation in their house-turned-museum:
As soon as you crossed the threshold, you came in contact with a stone trough (a Gallo-Roman sarcophagus); the ironwork next attracted your attention. Fixed to the opposite wall, a warming-pan looked down on two andirons and a hearthplate representing a monk caressing a shepherdess. On the boards all round, you saw torches, locks, bolts, and nuts of screws. The floor was rendered invisible beneath fragments of red tiles. A table in the centre exhibited curiosities of the rarest description: the shell of a Cauchoise cap, two argil urns, medals, and a phial of opaline glass. An upholstered armchair had at its back a triangle worked with guipure. A piece of a coat of mail adorned the partition on the right, and on the other side sharp pikes sustained in a horizontal position a unique specimen of a halberd. […] The back of the door was entirely covered by the genealogical tree of the Croixmare family. In the panelling on the return side, a pastel of a lady in the dress of the period of Louis XV made a companion picture to the portrait of Père Boucard. The casing of the glass was decorated with a sombrero of black felt, and a monstrous galoche filled with leaves, the remains of a nest. (pp. 140-141)

and so forth. This description takes the best part of four pages because, in order for the satire to work, the narrative space has to mirror the (mental, cultural, physical) space occupied by the protagonists. The novel is obsessive, redundant, chaotic, so that it can resemble the world of Bouvard and Pécuchet and replicate the conditions that have made them what they are. Flaubert's death prevented him from finishing the novel, but his working notes end with a paragraph that suggests that, having abandoned their studies (perhaps for no other reason than that they have by now tried their hand and failed at every conceivable human discipline), Bouvard and Pécuchet will retire to copy as in former times. For this purpose they ask a joiner to


help them build a special bureau with a double desk, then purchase the books, writing materials, sandaracs, erasers, etc., and finally (these are the last words in the author's draft) sit down to write (p. 458). The wording of this embryonic epilogue calls attention to the gestures, the apparatus, the technology of copying: special equipment is built, all the necessary tools are bought and laid out; at last the copying can begin. From this perspective it is easy to see Bouvard and Pécuchet as mere appendages, tools of their tools – and indeed, modern mechanical reproduction techniques have made the profession of copy-clerk obsolete. But the synergy between human and machine cuts both ways; it gives flesh to the machine at the same time as it mechanises the flesh. Thus again it seems compelling to view the pair, even as they return defeated to their old profession, as inhabitants of cyberspace avant la lettre, prefiguring this time the many software agents and automatic routines that carry out so much of the work of assembling information and knowledge on the Internet. To picture Bouvard and Pécuchet in cyberspace is easy because of the way in which their endeavours mirror certain characteristics of the Web: insofar as the technology that underlies the Internet has favoured certain applications and types of social interaction – by privileging interconnectivity and thus favouring the multidirectional and nonhierarchical flow of information – it has contributed to turning its users into (among other things) so many Bouvards and Pécuchets. When we pursue specialised hobbies and eccentric interests; when we copy and circulate text documents, photos and videos; when we contribute to non-traditional bodies of knowledge such as the Wikipedia; when we send (often unwittingly) all kinds of information to a proliferation of databases, we enter the world of Flaubert's novel. It is the technology that enables and to an extent compels us to do so.


I take this seemingly deterministic position because the act of copying and distributing our cultural artifacts in the digital environment is not, as we have seen, an option, but rather an imperative. As more and more of our once-analogue heritage becomes digital, and as new content is simply born in that form, the passive approach to its preservation is just not viable. We will not be able to lock a digital artwork away and hope to find it intact after ten or one hundred years. Nor should one be tempted to think that it is only the obscure artifacts that are threatened by oblivion. Paolo Cherchi-Usai reminds us for instance that when the creators of Toy Story went to make a DVD version of the film, they found that twelve percent of the digital masters had already vanished15. It is natural to assume that if it had not been for the studio's act of migration (Bolter and Grusin would call it remediation, as we shall see), the remaining eighty-eight percent would have suffered a similar fate. But if a successful and expensive product of the entertainment industry is not safe from the ravages of not-so-long a time, then what, if anything, is?

Art in the Age of Digital Reproduction


Here is a little twist in our story: for one of the factors that endangered those digital masters is precisely that they were masters – that is to say, something which by definition cannot be copied (for if it were, it would cease to be a master). Not only the aura but the very notion of what constitutes the original, which had already come under serious attack in the age of mechanical reproduction, ceases to have a meaning with the shift to digital. For the computer-inhabiting artifact, one would have to apply instead Baudrillard's category of the simulacrum, the copy for which there is no original16.

15 Cf. Paolo Cherchi-Usai, The Death of Cinema: History, Cultural Memory and the Digital Dark Age (London: bfi, 2001), p. 100.
16 Cf. Jean Baudrillard, The Precession of Simulacra, in Simulacra and Simulation (Ann Arbor: The University of Michigan Press, 1994), pp. 1-42.


A corollary of this state of affairs is that digital artifacts that are bound by exclusive or privileged ownership rights have a lesser chance of surviving. Give someone a copyright and you will by definition hinder everyone else's ability to copy the piece of intellectual property thus protected. But without copying, longevity is at risk, for the necessary acts of migration will only be permitted to happen as a result of a commercial transaction, placing further strain on an already fragile process. Will the Star Wars fans who bought the boxed VHS collectors' set of the original trilogy also buy it on DVD? Many of them have. But the DVD is not just a copy of the film onto a new medium: it is a remediation, an enhanced version which exploits certain characteristics of the new medium, chiefly its ability to contain larger amounts of information (in the form of more highly defined pictures and sound but also of interviews, documentaries, multi-language soundtracks and so forth) and to enable the user to access this information selectively rather than in a fixed order, much like the files in a computer directory. The efforts shown by film distributors in this area strongly suggest that they do not consider the greater quality of sound and picture in itself enough to convince consumers to upgrade. In a similar vein, early audio CDs used to offer bonus tracks that took advantage of the greater storage space of the new medium, or software applications which exploited the ability of audio CDs to double into CD-ROMs, in order to encourage consumers to abandon the older technologies of magnetic tape and the LP. Jay David Bolter and Richard Grusin define remediation as the act of representing one medium in another17. Remediation does not strive to make the medium transparent, but rather to assert it and emphasise its ability to enhance the experience of accessing content. Thus a virtual tour of a museum will not simply consist of a gallery of images to be viewed on the screen (that would achieve a degree of immediacy and transparency), but

17 Jay David Bolter and Richard Grusin, Remediation: Understanding New Media (Cambridge: MIT Press, 2000), p. 45.


will include a map of the museum – be it a real one, like the Louvre18, or a wholly imagined one existing solely in cyberspace, such as the MoWa (Museum of Web Art)19 – and offer hypertext navigation with links to outside resources. Similarly, most e-books are careful to preserve and refashion certain aspects of print books, such as the front cover, the subdivision of the text into numbered pages, and so forth, while emphasising the new medium's ability to offer different ways of reading the text, either thematically or by searching for particular words. By remediating old content, the new media give the illusion of cultural continuity and at the same time vehemently reassert their prerogative of becoming the message. The trouble is that remediation happens constantly within different kinds of digital media, between different versions of the same software package, and as we view the same content with different browsers. Thus compounded, the problem of what we mean by original in the age of digital reproduction becomes quite absurd: the master copy is what is lost as soon as you switch on a computer, any computer. The widely articulated crisis of the original encourages again the drawing of analogies with historical periods that preceded literacy or the invention of print. In fact the pronouncements of both Ong and Brand on, respectively, the advent of secondary orality or of a digital dark age are grounded in times past, and share, it seems to me, much the same method of analysis of the present, including a high sensitivity for materiality which carries an implicit rejection of some of the conclusions of post-structuralism vis-à-vis the absolute primacy of the text. Yet their enduring tension, which does not seem likely to resolve itself soon, causes wild mood swings in our day-to-day dealings with the digital, in spite of differences of perspective that seem largely to depend on whether one chooses to see the proverbial glass as half-

18 http://www.louvre.or.jp/louvre/QTVR/anglais/
19 http://www.mowa.org


full, or half-empty. Both positions describe the workings of networked societies in broadly the same terms, and agree chiefly on one thing: that there is a lot of information out there, in constant and vastly chaotic flow. What differs is the basic value judgment: is this a good thing? The predominantly positive talk about a secondary orality has it that the newly instantiated, less hierarchical modes of information selection and transmission are a form of progress, and that, further, our inability to capture the flow in its entirety and reproduce it with fidelity is in fact a kind of healthy forgetfulness20. For without it we would suffocate in an ocean of information, falling into the unbearable state of possessing too much memory that will be the subject of the next chapter. Conversely, to suggest that we live in a digital dark age means claiming that digital memory technologies are in fact threatening our ability to remember. These technologies, as we have seen, demand that the things remembered be constantly remodelled, causing periodic shifts in the contents of our memory, and all the while threaten to push these recollections into an oblivion that is at once sudden and complete. In this highly charged context, we see forgetfulness emerge as a cultural topos, echoed in discussions about the Net, the prevalence of Alzheimers disease in a first-world population whose average life expectancy is well over seventy, and popular fictions that reveal an endless fascination for amnesia and false memory syndromes.

Metaphors of Memory
In one such fiction, Banana Yoshimoto's Amrita, the protagonist is a young woman, Sakumi, who suffers an attack of amnesia after a fall down the stairs of her house. When she regains her memories, they do not seem truly hers; her past feels like a story someone told her in extraordinary detail.

20

See for instance Marc Aug, Oblivion, translated by Marjolijn de Jager (Minneapolis:

University of Minnesota Press, 2004).


Throughout the novel, Yoshimoto describes this second memory through an elaborate series of very deliberate technological metaphors:
My memory would eventually come back to me, but most of it happened gradually, like the words of a letter written in invisible ink slowly seeping through the lemon juice.21
[…]
I felt like putting the events of that day away on a disk and saving the file forever. Yes, it was that sort of feeling. (p. 57)
[…]
Oh, where was I... I sat thinking to myself, trying to recall the events of the night before. The information came to me slowly, like data retrieved from a floppy disk. (p. 74)
[…]
After scanning the room, I confirmed to myself that it really was my friend in the corner smiling. I had an embossed image of her in my mind, and at first it felt weird seeing her there, so near to me now. Unsure if I would recognize her, it wasn't long before her eyes and nose snapped in my mind at tremendous speed to fit in the picture in my head, like when I watch quiz shows and the answer to a question pops into my head. Because of her showiness she stood out even more. Somewhere behind the newly formed, thickened image was a softly drawn sketch of the Eiko I fondly remembered. (p. 75)
[…]
Certainly I was exposed to the ocean's scent now, in that time and in that place. It rushed through me, covering every inch of my behind, and I decided to file it away in my mind as one pure recollection. (p. 164)

In these passages one gets the sense that recollections that do not feel truly experienced – a textbook case of prosthetic memories, in Landsberg's terms – cannot be described except by borrowing from the discourse of various technologies of representation but also of memory (handwriting, etching,

21 Banana Yoshimoto, Amrita, translated by Russell F. Wasden (New York: Grove Press, 1997), p. 37.


drawing, electronic writing). Of these technologies, the digital has a privileged place and is climactically called upon to convey Sakumi's feelings when her memories rush back to her. It is worth again reproducing lengthy extracts from the novel, for together they build up to an impressive metaphoric crescendo.
One memory after another, a deluge rushing in. I was doing more than just articulating what had happened; the things I remembered couldn't be described with mere words. The flood of memories was more like perceived, sensual experiences. It was as though a soft spirit had crept into my soul and, with a miraculous touch of her hand, opened the database that was blocked in my head. Now information just surged through my mind. Within seconds the objects around me, which I knew really hadn't changed, seemed different. It was as though separate databases had formed in my mind, each containing information about a different item, and I suddenly had access to them all […] All those simple, minute things were popping back into my brain. I felt like my mind was the internet, and I'd just gone on a random surf under the heading, http://www.bookcase-in-the-corner. My head was filled with so much information, both of quality and of quantity, that I couldn't keep it all straight. Everything was the same: http://www.scissors-in-my-desk; http://www.hallway.near.my.room; http://www.door-to-the-bedroom; http://www.pencil-in-my-hand. (pp. 244-246)

Yoshimoto's insistence on these metaphors nicely illustrates the point raised by Postman in Technopoly, namely, that new technologies not only introduce new words, but also change the meaning of old ones. This, according to Postman, was the fundamental truth glimpsed by Thamus: that writing would change the meaning of the words memory and wisdom, and that this change would be obscured by language itself, precisely because the words still look the same, are still used in the same kinds of sentences. But they do not have the same meanings; in some cases, they have opposite

meanings22. To explore this semantic rift one needs to ask a number of questions which again are implicated in the setting and conceptualisation of boundaries. As we increasingly transfer to our machines the burden of remembering, how do our memory habits change? How does the perception of what it means to remember change? How do we draw the line between memory and intelligence, if there is a line to be drawn? And how does this boundary-drawing affect the notion of whether or not machines are intelligent, since in very meaningful ways they appear to be able to remember? The relevance of these questions, the place that they occupy in the Zeitgeist, is reflected in a peculiar strand of fictions that have been popular in the last two decades. These do not just happen to revolve around characters that suffer from amnesia, but are in fact concerned more precisely with the effects of the memory technologies employed to alleviate the affliction. By setting the variable of human memory to zero, novels like Soldier of the Mist and films like Wintersleepers and Memento paint complex and progressively darker pictures of the interrelation between technology and remembering where there is a failure at the level of the hardware, albeit the biological hardware of the mind. It is to these stories that I shall now turn, before examining in the later part of this chapter the failures of memory which reside at the level of the code, that is to say of language itself.

22 Neil Postman, Technopoly (New York: Alfred A. Knopf, 1992), p. 8.


The Soldier of the Mist

My name is Latro. I must not forget. The healer said I forget very quickly, and that is because of a wound I suffered in a battle. [...] He said I must learn to write down as much as I can, so I can read it when I have forgotten. Thus he has given me this scroll and this stylus of heavy slingstone metal.
Gene Wolfe, Soldier of the Mist

The first in our gallery of amnesiacs is Latro, the protagonist of Gene Wolfe's novel Soldier of the Mist. A soldier in ancient Greece who has lost both the knowledge of who he is and the ability to form new memories due to a wound suffered in battle, Latro travels with a slave, Io, and carries a scroll on which he has been instructed to write the events of each day, so that he will still be able to read about them after he has forgotten them. The quotation at the top of this section is from the book's opening paragraph, which is also the beginning of the scroll, and contains the first, most important pieces of information that Latro has to re-learn with each new day: who he is, why he is different, how he must operate in order to remain a subject, a sentient being capable of rational and informed decisions. Drawing an analogy with the personal computer, this information is equivalent to a start-up sequence: during start-up, the central processing unit of a computer turns to the BIOS (Basic Input Output System), which in turn tells the CPU where to find the operating system and how to communicate with the peripherals. The information is checked against what is found in memory, and if everything is in order then the computer can proceed to load into its working memory the operating system and user applications. The start-up sequence (also called bootstrap, as in pulling oneself up by the bootstraps, because it is a self-contained, recursive process which needs no further decoding) contains basic information which gives the computer – to stretch the analogy a little further – a rudimentary sense of identity. Whatever parts of the human mind snap into action to perform

similar tasks as we wake up to face a new day, they have ceased to function in the case of Latro. He must therefore turn to the scroll for his own bootstrap, but he also needs further help: for the computer is designed to always access its own BIOS first thing in the morning, so to speak, whereas Latro needs the slave Io to remind him that he has to read the scroll, and to write something in it. Insofar as the scroll becomes an extension of Latro's defective long-term memory, the soldier of the mist is at the most basic level a cyborg. He looks at the scroll as an extension of his own memory, and treats it sparingly, as a precious but finite resource: 'I have written very small, always, not to waste this scroll. Now my eyes sting and burn when I seek to read it in the firelight, and yet nearly half the sheets are grey with my writing.'23 The coupling of soldier and scroll finds its modern equivalent in the coupling between the end-user and their computer; for this figure too uploads to an external memory device what s/he cannot commit to memory – streams of images and sounds, flows of textual information coming so thick and fast that not even a rhetorician of Cicero's ilk could hope to remember them without some external help. Thus the ancient soldier who, with his diminished faculties, struggles to function in a text-poor environment, becomes the oddly fitting counterpart of the modern netizen whose intact mental faculties alone barely allow him or her to function in a text-rich one. And it is an analogy that highlights how fragile the relationship is, how finite and precarious the material resources, how open to manipulation the process of memory inscription, how important the bias of the technology in selecting what will be saved and what will be lost. Narratively speaking, there is an equivalence between the novel and the scroll: what we read are supposedly the contents of the scroll, Latro's first-hand account as translated by Gene Wolfe, finder of the manuscript. But Latro is not the narrator, the scroll is. It is the scroll which retains the
23 Gene Wolfe, Soldier of the Mist (London: Futura Publications, 1987), p. 151.

It is the scroll which retains the information which Latro forgets in the very act of writing it down; it is the scroll which lays out, for any reader but its intended one to see, associations and causal connections between points of the narration too distant for Latro to be able to connect them. Latro realises that the adventure that he is living and recounting may be intelligible to everyone but himself. "Not having memory, I lose myself," he writes. "This day is like a stone taken from a place and carried far away to lands where no one knows what a wall may be" (p. 223). The scroll is no real help to Latro because as soon as it is written upon, it becomes a repository of meanings that Latro cannot retrieve, a situation that recalls the transfer (as opposed to the copy) of information between computers, and the peculiar alienation of Gibson's data couriers. There is an instance in which the severed link between Latro and the scroll is especially obvious, and it is the chapter written by Eurykles, one of Latro's companions. The Spartans have captured the travelling party which includes Latro, and seized the scroll. But since it is written in a language they cannot understand, they become suspicious of it and forbid him from writing in it. Therefore Io asks Eurykles to write the day's entry. Eurykles begins thus:
As requested by your slave, Io, I shall describe the events of the past night and day, turning her words into such as may be properly set down. She asks this because Eutaktos the Spartiate has forbidden you should have this book, thinking that writing in it as you do has disordered your mind. She wishes a record to be kept that she may read it to you when this book is restored to you, and I form the letters better than she, and smaller (p. 188)

The ensuing entry is easily construed by the novel's reader as an open attempt by Eurykles to ingratiate himself with Latro in order to win his protection and future favours. But Latro may not be able to decode it as such, for, unable to reread the whole scroll (which at this point of the novel would take too long for a single sitting), he cannot bear in mind at the same time both Eurykles' chapter and the impressions of Eurykles he had previously gathered and recorded, which would enable him to take a critical stance toward the other's account. The episode reiterates the importance (rather obvious, and yet so easy to understate) of just who has access to the writing technology, and is therefore in charge of producing a document; but it also illustrates that possessing a working memory function is necessary to the document's interpretation. Thamus' belief that writing would impair the faculty of memory in its users stems from the realisation that making memories is an intelligent process, and that to relinquish it means to lose the ability to infuse recollections with meaning.

Winter Sleepers
In Tom Tykwer's Winter Sleepers (Winterschläfer, Germany, 1997) the scroll is replaced by the darkroom. The film's amnesiac, a cinema projectionist by the name of René, has lost his ability to form new memories as a result of an accident which occurred during his military service. His way of compensating for the handicap is to take a small automatic camera wherever he goes and shoot pictures that he later develops and collects, alongside explanatory captions, into dozens of albums. This external system, as well as providing an indispensable running account of his life after the accident, sometimes enables René's brain to fix a fleeting image or sensation into a kind of new memory, alleviating the disorientation caused by the amnesia. Far from the ragged landscapes of an ancient, war-torn Greece, René leads, thanks to these expedients, a more or less normal existence in a quiet mountain village where daily life appears to repeat itself monotonously enough anyway. Until another accident takes place. Coming home drunk after a night at the pub, René passes a sports car left by ski-instructor Marco in front of his lover's house, the keys still in the ignition. He jumps into the car and takes off for a joyride, after having snapped a picture of it, of course. Further down the snowy road he crosses paths with a station wagon carrying a man and his young daughter. The two vehicles swerve to avoid one another and end up in opposite ditches by the sides of the road. When René wakes up, the car is almost covered in snow; confused and, at this stage, made twice as forgetful by the trauma, he manages to get out and starts to walk down the road toward the village, oblivious of what actually happened, not noticing the other vehicle lying on its side. The man inside it is regaining consciousness, sees the figure of a man leaving the scene, realises his daughter is badly hurt. By the time he finally manages to get her to a hospital, she is in a coma.

The rest of the film takes the form of a rather unusual story of misplaced retribution. Marco's sports car lies under the snow, where nobody will find it until the spring. The father of the little girl insists that the accident was in fact a hit-and-run, but his memories of the event are so vague that nobody believes him. Finally, the girl dies. René, in the meantime, carries on with his life, in spite of the niggling feeling that something very important has happened to him. He studies the pictures of the car, of the chalet in front of which he stole it, but the memory cannot be restored: the record is insufficient, the snapshots have no apparent order or meaning. Oblivious that he has even committed a crime, he cannot step forward when the car is finally found. As a consequence it is Marco, instead of him, who becomes the object of the dead girl's father's thirst for revenge, and who is finally killed. The conclusion presents an ironic touch in that the film by now has set up René as a quiet, kind and sympathetic character, and Marco as a manipulative, cheating, utterly unpleasant one. Retribution, in other words, strikes the innocent we side against, while the guilty we empathise with will live happily ever after. While it may be to his ultimate benefit, the fact remains that René's external memory system has failed him. Without a context, without a proper ordering sequence, the pictures are just scattered traces, and no record at all.


In the debate about the semiotic nature of photography, René's story suggests that we should reject the view, famously espoused by Susan Sontag, that the photographic sign is governed by conventions and presents a coded version of reality to the same degree as drawing or painting. It would rather appear that the photographic sign is iconic, or even indexical, bearing a straight simulacrum or a trace of reality, presenting it rather than representing it. The photographer can choose the angle, alter the light and contrast, apply a given depth of field, but ultimately not distort or omit existing elements of the scene, or introduce new ones (without entering the quite different realms of trick photography or photomontage techniques). Consequently, without an added context, without a spoken or written explanation, a photograph does not narrate. As Barthes writes, "[s]how your photographs to someone; he will immediately show you his: 'Look, this is my brother, this is me as a child,' etc.; the Photograph is never anything but an antiphon of 'Look,' 'See,' 'Here it is'; it points a finger at a certain vis-à-vis, and cannot escape this pure deictic language."24 By contrast, Latro's narrative account was a highly coded one, and with all its inaccuracies and contradictions it did constitute a record. Among other things, it could be read in search of meaning, of connotation, and validated or refuted by comparing it with other records. A common observation that emerges from the disparate disciplines that study human memory is that the act of committing to memory complex events or procedures involves a process of selection and association. Human memory is not eidetic; learning every speech by rote and storing every image or sensation down to the minutest detail is beyond the capacity of the human brain. Like a computer, or in fact language itself, a brain faced with the task of storing large amounts of information needs to compress it, a goal achieved by leaving out the inessential and finding patterns of regularity. Selecting information and finding patterns is an intelligent process, the kind that enables us to remember the gist of a discussion rather than every word that was uttered.

24 Roland Barthes, Camera Lucida: Reflections on Photography, translated by Richard Howard (London: Jonathan Cape, 1982), p. 5.

Extraordinary mnemonists such as Solomon Shereshevsky, the subject of a study by the Russian neurologist Alexander Luria, or his fictional counterpart penned by Borges, Funes (both of whom we shall encounter in the next chapter), did not in fact need to practice this kind of selection; but because they could memorise everything, they ultimately lost the gift of intelligence. To say that recollections are filtered by intelligence and subjectivity amounts to saying that memory is a narrative process. What makes René's photographs meaningless to him is that he has forgotten the narrative principle which made him decide to take those particular shots, and not others.
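The compression analogy drawn above, of leaving out the inessential and finding patterns of regularity, can be made concrete with the simplest possible illustration: a run-length encoder, sketched below, which stores a repeated pattern once, together with a count, rather than storing every instance. The example is my own addition, not drawn from the works discussed here.

```python
# Minimal run-length encoding: repetition is stored once, with a count,
# instead of being stored in full. A crude illustration of "finding patterns of regularity".

def rle_encode(text):
    encoded = []
    for char in text:
        if encoded and encoded[-1][0] == char:
            encoded[-1] = (char, encoded[-1][1] + 1)   # extend the current run
        else:
            encoded.append((char, 1))                  # start a new run
    return encoded

def rle_decode(encoded):
    return "".join(char * count for char, count in encoded)

sample = "aaaabbbcca"
assert rle_decode(rle_encode(sample)) == sample
print(rle_encode(sample))   # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
```

Unlike this lossless toy, human memory compresses with loss, keeping the gist and discarding the detail; the sketch only makes the point that any such economy of storage depends on recognising regularities in the first place.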

Memento

"It's different. I have no short-term memory. I know who I am and all about myself, but since my injury I can't make any new memories. Everything fades. If we talk for too long, I'll forget how we started. I don't know if we've ever met before, and the next time I see you I won't remember this conversation. So if I seem strange or rude, that's probably... I've told you this before, haven't I?"
Leonard (Guy Pearce) in the movie Memento

Leonard Shelby, the protagonist of Christopher Nolan's Memento (2000), suffers from an even more severe case of short-term memory loss. He can retain information for no more than a few minutes at a time, and only by forcing himself to concentrate; the slightest distraction makes his mind go blank. What drives him to overcome this most disabling of handicaps is a ferocious determination to find the man who raped and killed his wife, caused his brain damage and managed not to be apprehended by the police.


To maintain a coherent grasp on his day-to-day life, Leonard too relies on memory technologies. He takes pictures, Polaroids, to which he adds captions and which he keeps in his pocket for quick reference. This enables him to remember where he lives, the car he drives, who the people in his life currently are and whether or not they are to be trusted. Like Latro, he also needs to be constantly reminded about his condition and what caused it. To have quick access to this basic information (you suffer from amnesia, your wife was murdered, here is what you know about the murderer, and so on) he has had it tattooed all over his body.

Fig. 1 - Frame from Memento

In his pursuit of revenge, Leonard becomes the pawn of Teddy, a policeman, and Natalie, the girlfriend of a drug dealer, who use his disability to manipulate him into serving their own ends. They achieve this by learning the workings of his external memory system, the portable database he takes everywhere he goes, and making plausible but deceitful causal connections between its data. Realising this, but knowing he will not be able to remember coming to this realisation, Leonard decides to also exploit the vulnerability of the system he has established in order to create another deceitful connection. This will eventually lead him to convince himself that it was in fact Teddy who killed his wife, and that he should murder him. The portion of Leonard's story narrated in the film in a backward progression, from the epilogue to the beginning, coincides with the investigation which leads Leonard to find the clues he himself planted in order to find Teddy guilty of his wife's murder.

What leaves Leonard open to such deception and self-deception is the trust he places in the systems which supplant his defective memory. His belief in the exactness of the record, in the existence of hard facts that, unlike the memories he cannot make anymore, are not tainted by subjectivity, drives him to put his faith in a technology that is all too open to distortion and manipulation. "You cannot trust a man's life to your notes and pictures," complains Teddy, sensing perhaps that the next shuffling of roles in Leonard's investigation might cause him one day to become the target of the other's revenge. "Memory's not perfect. It's not even that good," replies Leonard. "Memory can change the shape of a room or the colour of a car. It's an interpretation, not a record. Memories can be changed or distorted and they're irrelevant if you have the facts." The problem, of course, is that to make sense of facts also requires an act of interpretation, and the selection of a narrative schema: the same records, the same iconic signs, can fit into different patterns of meaning. They can tell one story, or its exact opposite. On some level Leonard appears to realise this. As soon as he takes a Polaroid, he writes a caption on its back, instantly providing it with a context. Some of these captions are denotative, and serve only to give the picture a name or establish a direct association (NATALIE; MY CAR); others are connotative, the result of inferences drawn by Leonard regarding the subject of the picture. When Leonard first takes Teddy's picture, for instance, he writes the other's name underneath it, thus manufacturing the simplest of memories, equivalent to "this is what Teddy looks like". Later, after an incident that makes him doubt Teddy's trustworthiness, he adds a value judgement in the form of an instruction, DON'T BELIEVE HIS LIES. And finally, having reached the (false) conclusion that Teddy is the person he has been looking for all along, he completes the caption with the words HE IS THE ONE, KILL HIM. Denotation has turned into the most dramatic of connotations; what was a simple associative memory is now a complete narrative, a terse murder story.

Fig. 2 - Frame from Memento

Leonard's system even has some room for testing inferences against other inferences made on previous occasions. When Teddy warns him that Natalie may take advantage of his handicap in order to find ways to manipulate him (as Teddy himself has been doing for some time) and pressures him to write a caption to that effect behind her photograph, Leonard appeases him by writing "Don't trust her" on Natalie's photo, but does so in cursive, rather than in his block capitals, in order to warn himself that the oddly written information may come from an unreliable source. Later he reads on the back of Teddy's picture DON'T BELIEVE HIS LIES and, taking his usual writing style as a sign that the information is accurate, erases the note on Natalie's picture.

Remember Sammy Jankis


In "Memento Mori", the short story by Christopher Nolan's brother Jonathan that inspired the film, the protagonist, Earl, wakes up each morning to a different message written in block letters and taped to the ceiling, which reminds him where he is and what he is doing there. For Leonard, the message is always the same and is marked indelibly onto the skin on the back of his left hand. It reads "Remember Sammy Jankis" and serves to remind Leonard of a person he met before his accident, when he was working as a claims investigator for an insurance company. Sammy was a client who, following a car accident, appeared to have lost his ability to form new memories, a fact that Leonard was called upon to verify. Sammy's name triggers both the memory of the nature of the condition and the realisation that he also suffers from it. It is the beginning of Leonard's daily bootstrap. By tattooing this coded message on his body, Leonard makes a literal transition to the status of cyborg; he turns his skin into a book, or a bank of computer RAM. The rest of Leonard's tattoos are a mix of instructions (PHOTOGRAPH: HOUSE, CAR, FRIEND, FOE, BUY FILM), rules of behaviour (MEMORY IS TREACHERY; DON'T TRUST YOUR WEAKNESS; NOTES CAN BE LOST; THE CAMERA DOESN'T LIE; CONSIDER THE SOURCE), notes about past events (JOHN G. RAPED AND KILLED YOUR WIFE; SHE IS GONE) and pieces of information regarding his wife's killer that Leonard has gathered during his investigations, and that he hopes will eventually lead to finding him (DRUG DEALER; FIRST NAME JOHN OR JAMES; SURNAME G____; CAR LICENCE NUMBER SG1371U).

The tattoos too, like the photographs, are a way of manufacturing memories, but altogether more violent and disfiguring, a way of forcing Leonard's body to remember the things that his mind cannot. At the same time as they are written by Leonard, the tattoos write Leonard's character by forcing him to accept the truth of their statements, in a textual disfigurement that is reminiscent of Kafka's short story "In the Penal Colony", except that in this case it is self-inflicted. While his suspiciousness sometimes protects him from the risk of incorporating false statements made by others into the database, as when Teddy tried to influence his opinion of Natalie, Leonard is ultimately defenceless against his own deception. The film ends its backward journey with the opening scene, in turn the epilogue of a previous, untold story. Leonard has just killed a man, but realises it is someone else's enemy. For weeks, perhaps months, he has been chasing the false clues planted by Teddy in order to point him to a drug dealer that Teddy wanted to eliminate.


Teddy himself admits as much. But he also forces Leonard to face the possibility that his wife's murder and the revenge plot may be a fabrication, and Leonard Shelby a fictional character he himself created in order to have a puzzle to solve, a reason to carry on. Searching into his own memories of before the accident, Leonard cannot come up with a definitive answer: the amnesia has stretched like a shadow further into the past, to call into question the accident, and all that happened before. The fiction is temporarily exposed. But Leonard knows he will forget all this. To make sure of it, he burns the Polaroid he took of the dead man, and spins the wheel again by writing a note in which he instructs himself to add the registration number of Teddy's car to the tattooed list of things he knows about the mystery man, "John G.". A new story begins, of which he has already written the conclusion. In terms of cosmic pessimism, Memento packs a rather heavy punch. The amputation of memory becomes the creeping malaise which corrupts Leonard's universe, forcing him through a cycle of despair and murder destined to repeat itself after each apparent resolution. But Leonard also embodies the ultimate anxiety of homo technologicus: that of inhabiting a world replete with splinters of information that defy interpretation and fade incessantly into oblivion. His answer is to manufacture a world governed by few but immutable facts, those that define the identity of a killer who may have never existed. Such is his longing for constant, fixed meanings that he resorts to the disfiguring practice of etching them onto his own skin, as if to become a living carrier of stable truths. It is only by reducing the number of variables to the barest of minimums, by training himself to see only a tiny portion of the spectrum of possibilities which a chaotic reality presents to him, that he can maintain the role of actor in spite of his handicap. As I shall argue in the next chapter, this shutting out of the unbearable richness of reality is a coping strategy that ties the amnesiac to the hypermnesiac, for these apparently opposite conditions produce in fact the same inability to process the real, either because it is apprehended insufficiently, as in the case of Leonard, or excessively, as in the case of Borges' Funes.


Ultimately, both narratives can be interpreted as commentaries on the shattering of the semiotic foundations of a highly symbolic culture, where memory exists only in the form of a technology in charge of collecting data, and the ability to infuse meaning into these data, to turn experience into knowledge, is lost.

www.otnemem.com
On another level, Memento is a parable on the inadequacy of memory technologies to support subject-binding narratives, unified streams of autobiographical consciousness of the kind produced and sustained by the mind. The problem with Leonard's pictures is not that they are deceitful (they are not), but that they do not tell the whole story, or perhaps that they can be used to tell too many stories, which ultimately amounts to the same thing. In a sign of the informational feedback loop that confuses the distinction between the bodily organ and the technological prosthesis, this shortfall stretches back into the time before Leonard's accident, when his memory function was supposedly intact. Hence his most shattering moment of doubt, during a conversation with Teddy: could it be that Leonard's wife, not Sammy's, was diabetic, and that she caused him to kill her in order to test the veracity of his amnesia? This knowledge, rooted in long-term memory, should be readily available to Leonard. And yet he has to test it, try it out, see how it fits into the puzzle of his life story. This takes the form of a very brief flashback, framed by a non-subjective camera angle. Leonard and Catherine are lying on their bed. Leonard pinches her, as in a previous memory already rehearsed in front of the camera. Except this time the cause of the pinch is a hypodermic needle. We cut back to the conversation between Teddy and Leonard.


Leonard shakes his head, dismissing the possibility he has just had to enact in his imagination. This is the chief moment when the film articulates its more profoundly disquieting vision: it is not just that Leonard suffers from amnesia, and that his naïve reliance on inadequate prostheses makes matters a good deal worse. It may in fact be that the internal diary of his life, and by extension of anyone's life, is ultimately just as unreliable as the bundle of snapshots that Leonard carries in his pocket: a work of fiction, a partial and arbitrary selection of the many possible life stories that could be told using the same information. This is but a splinter, a momentary detour on the way to the piecing together, in traditional whodunit fashion, of the actual story, which is achieved in the concluding scene not by the film's perplexed protagonist, but by the viewer. And yet, like the Sammy Jankis plot as a whole, this splinter produces an interference that clashes with the linear reconstruction of the events, and prevents the comforting act of making sense of the film from being fully realised. This interference is worth mentioning because it represents an unusual turn. It is extremely rare nowadays for mainstream, heavily industrialised cinema to question the safest of assumptions regarding the means of construction and the unified nature of the subject.25 True, there are a number of filmmakers, most famously the Wachowski brothers in the Matrix trilogy, who tackle the Cartesian problem of illusionary realities, but they invariably assume the existence of a kernel of identity, a tightly bound Self, which is present at a deeper level for those who know how to dig.26

25 By mainstream I mean included in the main channels of theatrical distribution, outside of art houses and film schools or museums. Memento is a so-called independent production, and the common estimate of its cost (marketing and distribution excluded) stands at five million US dollars. (As reported by the Internet Movie Database at http://www.imdb.com/title/tt0209144/business and by Andy Klein on Salon.com; see footnote 27 below.)

26 At least two more recent films, Vincenzo Natali's Cypher (2002) and John Woo's Paycheck (2003), play with identity shifts caused by brainwashing technologies, but they too refrain from calling into question the basic tenet that a true identity exists, and can be recovered.

What I have in mind is more akin to the Lacanian cinema that David Lynch practices from a well-established (and pretty much exclusive) place on the fringes, and that used to be the bread and butter of auteurs of old such as Alain Resnais, Alfred Hitchcock and Luis Buñuel. In Last Year at Marienbad (L'Année dernière à Marienbad, 1961), Resnais and novelist Alain Robbe-Grillet exploded the notions of character, memory and time by telling a story whose past and present obstinately refused to converge. In Vertigo (1958), Hitchcock gave his female lead two concurrent identities, of which one was apparently false, but in fact prefigured and brought about the all-too-real death of the other. Buñuel answered in perplexing kind by having two different actresses play the same female character in That Obscure Object of Desire (Cet obscur objet du désir, 1977). At the same time as they reflected on the past and identity, these films brought to the fore the formal dimension of cinematic narration and how it is implicated in constructing such categories. But that was then, and filmic forays into the unconscious capable of problematising the notion of the self, while at the same time reflecting on the medium's role in the development of subjectivity, have become rather less common.

Fig. 3 - Banner heading of the Memento website

In Memento, this foray is also characterised by a strong formal dimension, in that it is produced by the overlapping contributions of various technologies that are at once of memory and of representation.

The Polaroid snapshots are one example, the tattoos another; but so is film (including its flashbacks, an important convention/technique for representing memory), and so is the film's website. This nexus of competing narrative perspectives is deftly orchestrated by Nolan in the construction of a mystery that refuses to stabilise into a single, coherent solution. Interestingly, the website opens with a picture, above, of burning media:27 a memory technology in the very moment of its destruction. This is followed by a Flash animation that plays on the same oscillation between recollection and erasure. The first frame is this:

Fig. 4 - Screen capture of a Flash animation from the Memento website

and it quickly morphs into this:

Fig. 5 - Screen capture of a Flash animation from the Memento website

and then back again, so rapidly in fact that the phrase "some memories are best forgotten" may never be entirely on screen at any one time (the screenshot reproduced above is the most complete one I was able to capture). Then the disjointed word "m em en t o" disappears, and what remains is the piece (of paper?), now blank, on which the letters "en" were printed. This expands and turns into a newspaper clipping, at first glance a fairly commonplace teaser designed to give some idea about the plot of the film without spoiling the ending. If we turn however to this sentence: "Little is known about Shelby himself, but a man by the same name was reported missing from a Bay-Area psychiatric facility in September of 1998", we notice that the clipping does give away the other, lesser ending, the one that subverts the foundations of Leonard's life story (as told by himself), by suggesting that Sammy and Leonard are in fact the same person, as the devious Teddy (DON'T BELIEVE HIS LIES) had maintained all along.
27 An image we are going to encounter again on the cover of Cherchi-Usai's The Death of Cinema. See Figure 13 on page 138.

As well as pushing this interpretation from the margins of the film proper, that is to say from the no-man's land of promotional material, or "extra content", which in the era of Web presence and DVD makes a strong case for inclusion in the exegesis of the main text, this piece of quasi-filmmaking calls attention to that which by definition cannot call attention to itself: subliminality. Before the clipping comes into sharp focus, seven words in it (the first to burst into view for a fraction of a second during the animation) appear highlighted, for a duration far too short to be consciously apprehensible. Here is, again, the best snapshot I was able to capture:

Fig. 6 - Screen capture of a Flash animation from the Memento website

I cannot make much of the choice to emphasise those seven particular words, but this too is an instance of the perplexing mode of cinematic narration, subliminality, that is instrumental to this alternative reading of the film's plot. In a long dissection/review of Memento on Salon.com, film critic Andy Klein enumerates the scenes and visual clues which buttress this alternative reading.28


At least one of them, in which Sammy Jankis is replaced for a split second by Leonard in a flashback of the scene at the psychiatric hospital where Sammy resides after the death of his wife, is so fleeting that one cannot be sure of having actually seen it without resorting to a freeze-frame. This is clearly a clue of tremendous revealing force, but this force is almost entirely blunted by its ephemeral duration on-screen. This delicate balance of force and counterforce is a key to the remarkably complex task of interpreting Memento. If one takes theatre projection as the primary mode of viewing of a film, it makes sense to ask if the image is actually there, or if we should think of it instead as the proverbial tree falling in the wood where nobody is there to hear it. Does the tree make a noise? Does an image that nobody can apprehend create meaning? In one sense, the image is unquestionably there; in another, its being there is dependent on the audience's ability to perceive it. The ontological status of the image, in this regard, is partly dependent on its highly specific context, that is to say, a scripted and tightly crafted feature film. Suppose we were talking instead about the film of the Kennedy assassination shot by Abraham Zapruder: each single frame in this case could be (as it has been) magnified and subjected to minute inspection in search of clues that would be invisible if the film were shown at normal speed, and the circumstances do not pose issues regarding the legitimacy of such manipulation. But this forensic mode of reading is based on the assumption that, in this instance, the cinematic real corresponds entirely with the real real of November 22nd, 1963. Memento, on the other hand, is a piece of cinematic fiction, and as such it requires an audience to view and interpret it. What of the subliminal clues, then? We know they cannot but have been deliberately inserted, "put-in-the-scene", as a French person would say. But a mise-en-scène is there to be (re)activated by the spectator.
28 Andy Klein, "Everything you wanted to know about Memento", Salon.com, 28 June 2001, http://dir.salon.com/ent/movies/feature/2001/06/28/memento_analysis/index.html?pn=1.

An image produced in a feature film of this kind does not register at any level of the scale of cinematic reality unless a spectator can see it. Or, to put it another way, see it consciously, which is, of course, a notion that is embroiled in the very concept of subliminality.

One Frame a Day

"Tom claimed that one book was enough to last a lifetime anyway, especially if you read it in the special way." "What do you mean?" I'd asked him. "What special way?" "One word a day," he'd answer.
Jeff Noon, "Crawl Town" (from Pixel Juice)

Enter the DVD player, and the perfect freeze-frame. The home spectator can now slow down a film to the point of dispensing with subliminality altogether, pausing on a single frame in the same way that the reader of a novel can gaze at an individual sentence or word for as long as s/he likes. Films made since the mass commercialisation of VCRs were already partly exposed to this phenomenon, but those made in the (brief) age of laserdiscs and since the advent of DVDs are totally open to near-infinite modulations of viewing duration. An entry-level DVD player can slow down a film to the level of single frames29 or accelerate it by as many as thirty-two times its normal speed. Except that there is less and less ground to consider "speed one" the norm, of course. For one thing, it is far more problematic than in the past to claim that cinema projection is the primary, intended and ideal site in which a feature film should be viewed. For another, there are growing signs that filmmakers are becoming more conscious of the modulation of the viewing speed allowed by these new instruments.
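A quick back-of-the-envelope calculation, using the frame rates cited in the note below and a notional two-hour running time (my figure, chosen for illustration only), gives a sense of the durations involved:

```python
# Back-of-the-envelope arithmetic on viewing durations; the figures are illustrative only.

frame_rates = {"35mm projection": 24, "PAL": 25, "NTSC": 29.97}   # frames per second

for name, fps in frame_rates.items():
    print(f"{name}: a single frame is on screen for about {1000 / fps:.1f} milliseconds")

running_time_minutes = 120          # a notional two-hour feature
for speed in (1, 2, 32):
    print(f"at {speed}x speed the film runs in about {running_time_minutes / speed:.1f} minutes")
```

A single frame at projection speed thus lasts roughly a twenty-fourth of a second, which is the scale on which the near-subliminal clues discussed above operate; at thirty-two times normal speed, a two-hour feature is over in under four minutes.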

29 This does not exactly match the frame rate used in films shot with traditional cameras or transferred onto film stock. The standard exposure rate of 35mm cameras is 24 frames per second, whereas the NTSC and PAL systems used for playback operate at 29.97 and 25 frames per second respectively.

The freeze-frame is being recognised as a normal (hence normative) viewing practice, and is henceforth factored in, made part of the conscious creative process. This complicates even further the thematic relationship in Memento between memory technologies and interpretative practices. Leonard's readings are fragmentary, and ultimately misguided (as were René's and Latro's), because what he looks at are snapshots and short statements, texts that aspire to transparency and perfect referentiality (one snapshot plus one caption equals one incontrovertible fact), but fail to achieve it. By obsessing over small details and sparse connections, Leonard loses track of the whole picture. But obsession over small details also happens to be a fairly normal behaviour in the age of DVD and home theatres, and has led to such phenomena as the cult status achieved by Figwit, alias Wellington comedian Bret McKenzie, for his three-second appearance as an extra in the background of The Lord of the Rings: The Fellowship of the Ring (2001).30 In this way, the added layer of subliminal and near-subliminal clues in Memento invites a reflection on the nature of the several media, including film itself, which encode them, and on their ability to carry and communicate meaning. The divergence between the two alternative readings of the film suggests that the defect may not be just in Leonard's brain, but intrinsic to all these technologies. From this perspective, the loss of memory function would not be so much the result of a physical blow to the head or a crack on the surface of a CD-ROM, but of shifts at the level of the production and circulation of information and meaning, in other words at the level of the code.

30 For a peek at Figwit's fandom, see http://www.figwitlives.net/.

Noise
Ever since the discovery of the structure of DNA in 1953, Darwin's theory of evolution has been framed in terms veering more and more closely towards information theory: to wit, the natural selection of replicators is made possible by random errors in the reproduction of the genetic codes of living organisms. Most of these errors result in specimens that are less fit than their predecessors, but every once in a while a mutation works the opposite way, producing outcomes that favour the survival and further reproduction of the mutated individual. Given the prodigious strength of the theory, the notion that errors will periodically occur during replication is currently held as a law of nature, and is in fact pretty much the sole recipe for evolution. And yet we still talk of "errors", perhaps because, according to the dominant paradigm of information theory, it is even rarer (and in fact hardly conceivable) that an error in the reproduction of a message could lead to an improvement of the message itself. While evolution calls for us to mix up the genetic pool and views the excessive similarity engendered by inbreeding as a calamity, Claude Shannon invoked redundancy and high fidelity as the means for preserving the integrity of the message. His brainchild, information theory, stressed in this way the ideal of purity, already inherent in the black-or-white nature of the digital, with its lack of intermediate and ambiguous states. In a nice counterpoint to the very first printed books, which attempted to reproduce the handwriting of scribes, the mid-twentieth century gave us the machine-readable alphabet, designed to be more easily scanned by a computer by reducing what letters and numbers have in common and highlighting the unique aspects of each. This alphabet is an icon of the space age, used to signify anything to do with a science-fictional future, in spite of the fact that its use in the present is, rather more prosaically, practiced almost exclusively by banks.


Fig. 7 - Example of machine-readable character set
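Shannon's prescription of protective redundancy, invoked above and returned to below, can be illustrated with the crudest conceivable scheme, which is not one of Shannon's own codes but the simplest possible instance of the principle: repeat every bit three times, and let a majority vote undo any single flipped bit per triplet. The sketch below is offered only to make the idea concrete.

```python
import random

# Triple-repetition code: the bluntest conceivable form of protective redundancy.

def encode(bits):
    return [b for b in bits for _ in range(3)]            # send each bit three times

def decode(received):
    triplets = [received[i:i + 3] for i in range(0, len(received), 3)]
    return [1 if sum(t) >= 2 else 0 for t in triplets]    # majority vote per triplet

def noisy_channel(bits, flip_probability=0.05):
    return [b ^ 1 if random.random() < flip_probability else b for b in bits]

message = [random.randint(0, 1) for _ in range(1000)]
received = decode(noisy_channel(encode(message)))
errors = sum(m != r for m, r in zip(message, received))
print(f"residual errors after decoding: {errors} out of {len(message)} bits")
```

The price is that two thirds of what travels down the channel is redundancy; Shannon's theorems establish how little of it one can in principle get away with for a given level of noise.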

Informational purity, that is to say, communication without ambiguities, is a key to the integrity and successful operation of the command and control infrastructure. It is also a pervasive cultural emblem, a notion that imbues many of the fictions explored in the course of this dissertation. As we have seen, for digital information to be preserved, it must be distributed, and each of these transmissions involves in effect a copy. Shannon observed that errors and degradation are intrinsic to communication, but also that it is possible to guarantee the correct replication of a message by calculating and encoding the right amount of redundancy. Thus informational purity is at once always achievable and forever imperilled, the result of the cancelling out of the effects of noise and redundancy as mandated by Shannon's theorem. In the first half of this chapter I discussed instances of fragility of the informational infrastructure that are physical in nature: like the blow to Leonard's head, so too a hardware failure in the absence of adequate backup, the obsolescence of a memory medium, or the scratch on the surface of a CD act on the symbols encoded in these structures by rendering them physically inaccessible. It is now time to turn to the threats to the information flow that operate at the level of the code, within the confines of the symbolic. These are the most insidious, for they strike at the very core of the edifice of the information age, by questioning the conditions of realisation of its foundational purity.


Noise, as defined by Claude Shannon, is one such threat. It interferes with communication, and yet is an intrinsic element of it. It can be accounted for, but never eliminated. Don DeLillo's White Noise is the most notable example of how the concept has been elevated, in the societies that feel the greater impact of the information age, to the status of cultural topos. White noise is a particular kind of noise, characterised by a flat frequency spectrum. It is sometimes used to filter out other types of noise, notably in architectural acoustics and music recording; but it is also found in nature, where it takes the guise of all kinds of steady background hums, such as that marvellous soundtrack to the universe, the cosmic microwave background radiation.31 In DeLillo's novel, white noise is the incessant hum of a media-saturated culture, and is generated by a constant flow of information whose intensity is inversely proportional to its value. Scarce attention is paid to the noise: it exists in the background of radios and televisions left on around the house, or in the visual and aural assault of commercial messages at the local supermarket. And over it all, or under it all, "a dull and unlocatable roar, as of some form of swarming life just outside the range of human apprehension".32 Sometimes a snippet pierces through and reaches the level of apprehension, much as when one tunes into a steady sound s/he was not previously aware of. This rising to the surface of consciousness is echoed by the interspersion in the narrative of decontextualised pieces of media-generated noise ("The TV said: 'Until Florida surgeons attached an artificial flipper'" (p. 29); "The radio said: 'It's the rainbow hologram that gives this credit card a marketing intrigue'" (p. 122), and many others). But the noise is not benign: it acts parasitically on the mind, festers in social environments, amidst conversations that are mired in equivocation and ambiguity.

31 Discovered almost by accident by two radio technicians in 1965, the Cosmic Microwave Background (CMB) is a radiation without an apparent source (in that it appears at the same intensity everywhere we look), and is thought to be an echo of the Big Bang. The idea that noise may be a property of the universe amplifies even further the remarkable cultural import of Shannon's theory.

32 Don DeLillo, White Noise (London: Picador, 1999 [1984]), p. 36.

The family, observes DeLillo's narrator, Jack Gladney, is "the cradle of the world's misinformation". It is worth quoting the passage in full, for it works by piling misapprehensions one upon the other, in the best satirical tradition of Flaubert.
"What do you know about Dylar?"
"Is that the black girl who's staying with the Stovers?"
"That's Dakar," Steffie said.
"Dakar isn't her name, it's where she's from," Denise said. "It's a country on the ivory coast of Africa."
"The capital is Lagos," Babette said. "I know that because of a surfer movie I saw once where they travel all over the world."
"The Perfect Wave," Heinrich said. "I saw it on TV."
"But what's the girl's name?" Steffie said.
"I don't know," Babette said, "but the movie wasn't called The Perfect Wave. The perfect wave is what they were looking for."
"They go to Hawaii," Denise told Steffie, "and wait for these tidal waves to come from Japan. They're called origamis."
"And the movie was called The Long Hot Summer," her mother said.
"The Long Hot Summer," Heinrich said, "happens to be a play by Tennessee Ernie Williams."
"It doesn't matter," Babette said, "because you can't copyright titles anyway."
"If she's an African," Steffie said, "I wonder if she ever rode a camel."
"Try an Audi Turbo."
"Try a Toyota Supra."
"What is it camels store in their humps?" Babette said. "Food or water? I could never get that straight."
"There are one-hump camels and two-hump camels," Heinrich told her. "So it depends which kind you're talking about."
"Are you telling me a two-hump camel stores food in one hump and water in the other?"
"The important thing about camels," he said, "is that camel meat is considered a delicacy."
"I thought that was alligator meat," Denise said.
"Who introduced the camel to America?" Babette said. "They had them out west for a while to carry supplies to coolies who were building the great railroads that met at Ogden, Utah. I remember my history exams."
"Are you sure you're not talking about llamas?" Heinrich said.
"The llama stayed in Peru," Denise said. "Peru has the llama, the vicuña and one other animal. Bolivia has tin. Chile has copper and iron."
"I'll give anyone in this car five dollars," Heinrich said, "if they can name the population of Bolivia."
"Bolivians," my daughter said. (pp. 80-81)

It could be countered that this is in fact a good-humoured, almost affectionate depiction of the "brain fade" effect (p. 65)33 generated by the noise. There is however an escalation in the satirical scope of the novel, which becomes rather cosmic, also à la Flaubert, with the introduction of the Airborne Toxic Event: a massive leak of the (made up) chemical Nyodene Derivative, or Nyodene D, that prompts the evacuation of the town where the novel is set. The effects of the toxic substance, which Jack becomes exposed to while refuelling the family's car at a gas station, perplexingly include a sense of déjà vu (p. 116) and appear in fact to be a function of what is known about Nyodene D at any one time. This knowledge is carried in turn through the airwaves by radio and television. "They get them only when they're broadcast," (p. 133) says Jack's wife Babette of the symptoms of the intoxication. The automated assessment of Jack's prognosis following the exposure is even more tightly wrapped up in the medium that produces the information.
"You're generating big numbers," [the technician] said, peering at the screen.
"I was out there only two and a half minutes. That's how many seconds?"
"It's not just you were out there so many seconds. It's your whole data profile. I tapped into your history. I'm getting bracketed numbers with pulsing stars. [...] The whole system says it. It's what we call a massive data-base tally. Gladney, J.A.K. I punch in the name, the substance, the exposure time and then I tap into your computer history. Your genetics, your personals, your medicals, your psychologicals, your police-and-hospitals. It comes back pulsing stars. This doesn't mean anything is

33 White Noise, p. 65. The concept is borrowed from Ted Mooney's 1981 novel Easy Travel to Other Planets, which includes among its central themes an "information sickness" generated by the incessant information flow.

going to happen to you as such, at least not today or tomorrow. It just means you are the sum total of your data. No man escapes that." (p. 140)

Thirty years, the life-span of Nyodene D inside a human body, becomes the projected horizon of Jack's technodeath: a hybrid event, partly natural and partly cultural, induced by a man-made chemical and symbolically traced, or quite possibly effected, by its very diagnosis, the naming of it. Running parallel to Babette's memory lapses, another illness that gathers momentum as the novel progresses, Jack's slow poisoning becomes a metaphor of the more sinister and incapacitating effects of the noise of culture. The two conditions are in fact intertwined, as evidenced by the recourse of both Jack and Babette to Dylar, a black-market psychopharmaceutical designed to remove the fear of death, and whose side-effect, tellingly, is a warping of the ability to process the symbolic. Insanity as a silencing of the noise.34

34 Conversely, of course, insanity, and paranoid schizophrenia in particular, has become a source of paradigmatic contemporary reading practices. The phenomenon of hearing voices works, according to the current prevailing hypothesis, by virtue of the mind converting noise into intelligible speech (and can be alleviated by excluding outside noise via other aural stimuli, for instance by wearing a walkman). Works such as Umberto Eco's The Name of the Rose and Foucault's Pendulum, and Thomas Pynchon's The Crying of Lot 49 variously delve into the problem of what constitutes a paranoid reading that gives meaning to noise, as opposed to a correct, rigorous, well-supported interpretation of a text. In so doing, they also ask the question of what a text is. As in the case of the paranoid schizophrenic, it may be a morbidly heightened sensitivity to certain details (on the part of Pynchon's Oedipa Maas for postmarks, for instance) that leads the characters in these novels to connect and bind a nexus of disparate signs into texts, by linking them in ways that a person who is sane (or not appropriately tuned in, depending on your point of view) would not see. But what of the ontological status of the resulting text? Do connections breed meaning, bring the text into reality? The Name of the Rose suggests as much, when the plot seemingly uncovered (but in fact imagined, written) by the detective protagonist is appropriated and advanced by his antagonist, who proceeds to kill his fellow monks according to the script.

The Amnesia Epidemic

"Alzheimer's is like when they switched to metric. When everything you'd known for years suddenly got mixed up. The numbers weren't the same, they didn't mean the same thing. You didn't really know how much things cost, or what the temperature was, or how far it was to the next town."
Jeffrey Moore, The Memory Artists

"Forgetfulness has gotten into the air and water. It's entered the food chain," (p. 52) says Jack to Babette, at once trying to make her feel less alone in her strife and encapsulating one of the overarching themes of the novel. Forgetfulness in White Noise is a cultural condition engendered by technology, an epidemic that spreads like pollution and is equally a by-product of a society dominated by the twin, out-of-control forces of consumption and waste. The din of information that provides the soundtrack to this society fuels the entertainment and marketing machine that supports it, but is otherwise of no value (hence equal to noise, in Shannon's terms). Nobody seems to know the important things: how to build a refrigerator, how a radio works, even how to eat and drink properly, the basic parameters of everyday care of one's body. In Jonathan Franzen's The Corrections, this amnesia epidemic assumes a more literal form in the onset of dementia suffered by one of the protagonists, Alfred. The novel has much in common with DeLillo's work: its narrative is also centred around a family of the American Midwest and is similarly driven by satire and humour, with some of the best pot shots aimed at academia, and the cultural studies industry in particular. But the strongest link resides in the way that the two works bind the theme of forgetfulness to culture and technology. Here too, much is made and then unmade of an experimental drug, Corecktall, a revolutionary neurobiological therapy initially aimed at sufferers of degenerative neurological diseases such as Parkinson's and Alzheimer's, but ultimately offering for the first time the possibility of "renewing and improving the hard wiring of an adult human brain".35

The cure to dementia becomes the avenue to put an end to all social ills: "Of all the potential applications of Corecktall, this is the most humane. This is the liberal vision: genuine, permanent, voluntary self-melioration" (p. 209). But this of course is the hype of marketing, the obligatory hyperbolic fanfare that precedes an IPO. As to reality, Alfred in the end is denied access to a phase II trial of Corecktall, an insult that compounds the injury of having had a patent crucial to the drug's development stolen from him. What remains is the condition, its cultural resonance. "Like a rapidly deteriorating Alzheimers [sic] patient, New Zealand is losing its recent memory."36 Thus the opening of an article in the New Zealand Herald on the preservation of the country's digital heritage, and the turn of phrase is by no means unusual. Alzheimer's has become a buzzword, a commonplace. Along with AIDS, it is one of the defining diseases of our times. Loss of memory function is a more profound handicap in those societies that have come to depend more and more on information flows; these also happen to be the same societies in which technological progress and the monopoly on the world's resources have increased the life expectancy of individuals to levels that have dramatically raised the rates of senile dementia. In other words, the same technological imperatives that demand greater and greater information processing skills create, in the societies that have fostered them, the conditions for the loss of those skills. This double bind emerges quite aptly in one of the principal storylines in the sixth series of the US television drama ER, centred on the onset of Alzheimer's disease in a prominent physician (played by Alan Alda).

35 Jonathan Franzen, The Corrections (New York: Picador, 2002), p. 118.

36 Andrew Janes, "Library captures digital heritage", New Zealand Herald, 29th March 2005.

The drama here does not reside in the rather hackneyed device of inflicting a serious disease on a medical doctor, but in the fact that Alzheimer's literally takes the medicine away from the man, erodes his knowledge to the point that he can no longer be trusted to practice his profession. This exclusion (which is in effect a killing off of the character as well) is symptomatic of the barring of similarly handicapped persons from participation in society at large, in a country that more than any other embodies the notion of a technological progress that cannot and will not be reversed. The character's response, his way of taking leave, is to summon his residual mnemonic resources for the recitation of a poem by Wendell Berry, full of melancholic nostalgia for the peace of wild things, the silence that precedes the din of technology.
When despair for the world grows in me
and I wake in the night at the least sound
in fear of what my life and my children's lives may be,
I go and lie down where the wood drake
rests in his beauty on the water, and the great heron feeds.
I come into the peace of wild things
who do not tax their lives with forethought
of grief. I come into the presence of still water.
And I feel above me the day-blind stars
waiting with their light. For a time
I rest in the grace of the world, and am free.37

In the societies where the memory of this peace is more remote, and the wish to recapture it more hopeless, Alzheimer's is the grave extreme of a universal cultural condition. As DeLillo writes, "people's eyes, ears, brains and nervous systems have grown weary".38

37 The poem's title, "The Peace of Wild Things", is also the title of the ER episode that first aired in the United States on November 11, 1999.

38 DeLillo, White Noise, p. 67.

The Metavirus
"That's a hypercard." "I thought you said Snow Crash was a drug," Hiro says, now totally nonplussed. "It is," the guy says. "Try it." "Does it fuck up your brain?" Hiro says. "Or your computer?" "Both. Neither. What's the difference?"
Neal Stephenson, Snow Crash

More literally true to the definition of an informational epidemic, the computer virus opens up much the same symbolic vein as Alzheimer's disease for those who wish to reflect on notions of mediated remembering and forgetting. Different in that it is programmed, malicious, self-replicating, the virus nonetheless carries the same ability to wipe the mind clean and impair its ability to function. And since the mind in question, as we have seen, is the custodian of ever-increasing portions of the knowledge and memories that are most useful and dearest to us, it does not have to matter very much that it belongs to a computer: the loss will be felt just as keenly. Or will it not? The story of the computer virus, it could be argued, is in fact one of constant overstatement, in which each threat turns out to be a scare, and each impending datapocalypse gets postponed until the following week, like in a cheap serialised thriller.39 After all, did not the Y2K Bug, the mother of all informational pandemics, fizzle out along with all the leftover champagne some time in the afternoon of January 1st, 2000? Perhaps it did. But perhaps the point is precisely that we do get scared, we do panic.

39 Notable for its zeal among the organizations that document the myths and hoaxes surrounding the outbreak of computer viruses is VMyths (www.vmyths.com), and it takes its mission very seriously: "The media suffers from a short memory. So does the antivirus industry. Six months is ancient history to them -- and 1999 was a millennium ago. Reporters and experts alike will drag out old things and call them brand new. Vmyths remembers the past simply because someone's got to." "A beginner's guide to Vmyths", http://vmyths.com/resource.cfm?id=56&page=1.

Whether it is well-founded fear or neurosis, we do dread the loss of data, and renew at the same time every year our subscription to the appropriate anti-virus software, the flu shot of the information-dependent. Besides, even if what is real were in fact solely the anxiety, the symbolic richness of the concept would hardly diminish. For it is from the outset, by its very denomination, that the computer virus presents itself as a hybrid, half informational entity, half biological disease-carrier, a threat that infects the network in order to harm the human. Nor could the harm really come to anyone else, of course: a database does not suffer for the loss of information; changing the recipes of 1s and 0s on a hard drive does not make it unhappy. And yet the metaphor is so ingrained, its terms so complex and opaque in their reciprocal actions and reactions, that choosing to be more rather than less literal can help to gain some perspective on the lot. PCMag.com did as much recently in a piece entitled "Human Contact Spreads PC Viruses", which opened in perfect scaremongering style by claiming that "[t]he federal Centers for Disease Control (CDC) and National Science Foundation (NSF) have issued a stunning joint announcement: PC viruses, worms, and spyware can now be transmitted via human contact."40 As one's eye runs to the date, April 1st, the question nonetheless lingers: how else would these ills travel? The computer is no alien technology, the offending programmes are written by humans, and most data is about people. This human dimension, so easy to obscure and forget, is made thicker and more visible by Neal Stephenson in Snow Crash by calling constant attention to the mechanics of the interplay between online and offline selves. The chief vehicle of this toing and froing between the actual and the virtual, the literal and the metaphoric, is the eponymous snow crash, both a computer virus and a drug. It is the polysemic nature of this metavirus that allows Stephenson's characters to say things like "[h]is software got poisoned. Da5id had a snow crash last night, inside his head."41

40 A.C. Feafunnoll, "Human Contact Spreads PC Viruses", PCMag.com, April 1st, 2005, http://www.pcmag.com/article2/0,1759,1781208,00.asp.


poisoned. Da5id had a snow crash last night, inside his head."41 The drug messes with the information inside the body, at the level of DNA; the virus with the bodies of information that circulate in cyberspace. The upshot is that the victims lose the ability to speak except in tongues, tapping into a primeval language that is not intelligible to them. At the same time as this mysterious language (itself an ancient form of virus) speaks through those affected, snow crash exposes them to a form of mind control that bypasses the higher linguistic functions of the brain. "Someone who knows the right words can speak words, or show you visual symbols, that go past all your defenses and sink right into your brainstem. [...] Once a neurolinguistic hacker plugs into the deep structures of our brain, we can't get him out because we can't even control our own brain at such a basic level" (pp. 368-369). As the century of mass media, fascist propaganda machines and viral marketing was drawing to a close, Snow Crash wove Burroughs' pronouncement that language is a virus into an environment marked by the deeper and deeper enmeshing of humans and their technologies of communication. What cyberpunk literature articulated verbally in the last two decades of the twentieth century, science-fiction cinema translated visually, producing such iconic images as the green rain of code in the Matrix trilogy. Standing in front of screens showing only the incessant and seemingly chaotic downward motion of fluid, changing characters, the humans in these films, unplugged and aided only by their eyes and brains, read whole landscapes and events in what appears to be a boundless richness of detail. These are scenes that take place in an online world, a simulation that for the majority of the human population is all there is, and is apprehended, from the inside, in the same way as the physical world we are used to. But those few, enlightened people who have learnt to recognise it (and reject it) as an illusion have also developed the uncanny ability to see it on screen at a much higher level of symbolic abstraction, turning themselves into real-time
41 Neal Stephenson, Snow Crash (London: ROC, 1993), p. 186.


compilers of the most complex of machine codes. In spite of the sometimes raging technophobia of the film, this is a brilliant portrayal of the merging, mediated by language and code, between humans and machines.

Fig. 8 - Agent Smith (Hugo Weaving) as metavirus in The Matrix Reloaded (2003)

Riding the boundary is Agent Smith, a software construct designed to hunt and destroy the avatars of the human rebels who have lifted the veil but still travel inside the Matrix in order to spread their gospel and induce others to see past the simulation. For the first incarnation of Agent Smith, humans are a disease, a virulent infection, and the agents are the antibodies.42 But something happens in the showdown against the leader of the rebels, Neo, at the end of the first film. A cross-pollination occurs, or, as Smith puts it, a mixing of the codes. As a result, he becomes unplugged, and strays from his mission. In this new hybrid form he acquires the power to duplicate and absorb other avatars until, by the end of the third instalment of the series, Matrix Revolutions, the whole of the Matrix population consists solely of Agent Smiths, raising the possibility that, in true metavirus form, the offline humans still connected to the system may also have been infected and transformed. Thus Smith comes to embody another key anxiety of the information age, the fear that machines will in time not only control but

42 "You move to an area, and you multiply, and multiply, until every natural resource is consumed. The only way you can survive is to spread to another area. There is another organism on this planet that follows the same pattern. A virus. Human beings are a disease, a cancer of this planet, you are a plague, and we are the cure." The Matrix (Andy and Larry Wachowski, 1999).


effectively mimic and replace the human race, reducing the whole of culture to a spectral automated conformity43.

The story, Lost

Johnny takes Metaphorazine. Every clockwork day. He says it burns his house down, with a haircut made of wings. You could say he eats a problem. You could say he stokes his thrill. Every clingfilm evening, climb inside a little pill. Intoxicate the feelings. Play those skull-piano blues. Johnny takes Metaphorazine. Jeff Noon, Metaphorazine (from Pixel Juice)

Hybridity, if not downright promiscuity, between information, language, machines and humans is one of the strongest themes in the work of English novelist Jeff Noon. Noon enjoys getting these concepts to mate, to breed, and shares with many a contemporary cultural scholar (notably Erik Davis in his important work TechGnosis) a telling taste for the compound word. One such newfangled semiotic creature is nymphomation, as in the title of his fourth novel: a word born of the collapse of information and sexuality, signifying the cross-pollination between the worlds on either side of the interface that is a constant of all of Noon's fictions. From the dream-recording feathers of his debut novel, Vurt, to the virus unleashed by a virtual gaming world that makes people sneeze to death in Pollen, and ever since, Noon has steadfastly refused to locate the boundary, the point of interface, opting to play card tricks with it instead. Hence his obsession with the mirror, a surface which seems to open up onto another space but is in fact a reflection, sometimes distorted, of what is on the side of the onlooker. Information-scrambling disease is another recurring theme,
43 There are variations of this vision that are not tied to the concept of the information flow. The threat of communism (or, as some would rather interpret it, of social conformity spreading from within) is the source of the viral substitutions in Don Siegel's Invasion of the Body Snatchers (USA, 1956). The storylines involving the alien race of the Borg in Star Trek: The Next Generation also raise this spectre, turning into a threat the societal drive towards a subjugation of the good of the individual in favour of the good of the collective.


introduced in Automated Alice by Newmonia, a virus that creates biological and linguistic hybrids such as the zebra man and the civil serpent. It is in his most recent novel, Falling Out of Cars, that Noon articulates his deepest reflection to date on the reciprocal permeation of information and the physical real. The premise revolves again around a mystery virus, a condition that amplifies noise, strictly in the sense antagonistic to information meant by Shannon. Whether this affects the humans who carry the disease, or in fact the world that surrounds them, is strangely unclear, as the boundary between perception and objective experience is constantly shifted. "The whole country is fraying at the edges,"44 quips one of the characters, and he does not mean it metaphorically: the very coastline is losing its contour. The poisoned software of Stephenson's metaphoric landscape is replaced here by the sick car, as the disease moves from the strictly informational to the mechanical as well. But the most radical effects are still felt at the level of the code. Rule-based concepts and systems, such as the game of chess, are among the first to go. The digital age is swiftly undone ("I had a computer," said Henderson. "Once." "What happened to it?" "I threw it away, over a bridge. Mercy killing." (p. 122)). Phone lines are rendered useless by the hiss and crackle. Language suffers, too, and so does the novel's ability to generate stable meanings, in that it takes the form of the diary of one of the diseased. The unreliability of the narrator, Marlene, is therefore radicalised by the fact that her story is not told in a disembodied narratorial voice but produced using one of the technologies under attack. Recall the scroll minutely inscribed by Latro, which he could not apprehend except in fragments, or the snapshots shuffled and rearranged by René and Leonard in their frustrating attempts to construct an overarching narrative. Marlene's amnesic condition goes even deeper, for it infects the operation of writing itself.
44 Jeff Noon, Falling Out of Cars (London: Doubleday, 2002), p. 130.


This was the story, I thought; this was everything that had happened already, all the things I had seen, and done. This was the secret, the box of clues. All I had to do was look through the pages, read the words; I would start to make sense of myself once again, find the proof of myself, by doing this, this reading. And yet, the book was filled with some kind of alien language, page after page of indecipherable markings; all the words crawling around the paper, merging together, separating, and all the time losing themselves before my cold staring eyes. Even the pages I had been working on just a short while before, they too were cast in this strange manner, and lost. The story, lost. Only here and there amongst the pages would a few lines of text emerge clearly from the black, smoky mess of ink, words I could not recognize as my own. Some other person had taken over the writing of the book. And then I bent my head a little, and brought the book closer, towards my face. There it was, on the edge of silence, a gentle fizzing sound. The book was making a noise. (p. 212)

Loss of information content becomes a condition so pervasive as to affect humans simultaneously at the level of biology, culture and technology. Another drug, Lucidity, goes the way of Dylar and Corecktall, and its power to attenuate the noise quickly wears off as the organism grows accustomed to it. More effective are stratagems that limit the semiotic richness of human environments. In these places, "[o]nly the most important road signs could be seen, and these limited to half a dozen basic symbols. LEFT, RIGHT, GO, STOP, YES, NO. Very few of the stores had proper names to them. They were called things like BUTCHER, or else BAKER, or even PRODUCT. There was more than one shop called, quite simply, SHOP. This was safety, of a kind: information held at bay" (p. 48). It is again the strategy adopted by Leonard in Memento. By contrast, the opposite movement of seeking in greater recourse to technology a cure for the disease, by building the much hyped Lucidity Engine, dissolves in shambolic rituals that are grotesque parodies of engineering. There is, however, no suggestion in the novel that one

should seek to restore the peace of wild things, strip the world of human media and technologies. What Noon gives us instead is the deeply resonant expression of an end of the world brought about by forgetfulness.
The further we went, the more I seemed to lose all sense of myself, and my surroundings. Now, in recollection, I can use these various words: horse, tree, gun, flower, clock, Marlene. But just then I no longer knew the name of the creature that roamed ahead; I no longer knew the name of the strange objects that grew all around me. I no longer knew what it was I should call myself, not even this; neither my being, my species, nor my given name. (p. 374)

Hybridisation, metaviruses and information diseases inform the most radical contemporary narratives of amnesia. The scrambling of the inside/outside dichotomy that these texts perform creeps inside the orderly edifice of the digital, producing sceptical dystopias in which the cultural equivalent of the Alzheimer epidemic severs the connection between signification and the real, thus neutralising both individual and collective agency. Whether it is possible to impart a more positive spin to techno-human hybrids and to spillages of memory and identity will be discussed in the fourth chapter. But first we need to rewind the story so far and see how the counterargument to the caveat of Thamus has been recast in the age of informatics, only to generate another equal and opposite anxiety, that of an excess of memory. Within this account the hypermnesiac will stand, in antithesis to the amnesiac, as the opposite pole in the angst-ridden narrative of our relationship with memory technologies; for it represents both the attainment of a dream, that of a flawless memory filled with impossibly vivid recollections, and the realisation that the gift can lead back to forgetfulness and a state of no knowledge at all.


Chapter Two - The Horror of the Total Library

I have tried to rescue from oblivion a subaltern horror: the vast, contradictory Library, whose vertical wildernesses of books run the incessant risk of changing into others that affirm, deny and confuse everything like a delirious god. Jorge Luis Borges, The Total Library

In his study of mnemonist Solomon Veniaminovich Shereshevsky, the great Russian psychologist Alexander Luria reports that his subject was vexed by a most unusual problem. Namely, that "[t]he big question for him, and the most troublesome, was how he could learn to forget."1 Haunted by a near-perfect recollection of events stretching back to his infancy, including the innumerable lists of numbers, foreign words and nonsense syllables that he had been fed over the years by his audiences, Shereshevsky found he had difficulties in isolating essential details and making generalizations that could help him to understand, as opposed to just memorise, even the simplest of stories. As his career progressed, so did the clutter in his mind increase, and such became his frustration that he began to develop forgetting techniques, in an ironic reversal of earlier attempts to improve his already

1 Alexander Luria, The Mind of a Mnemonist: A Little Book about a Vast Memory, translated by Lynn Solotaroff (Harmondsworth: Penguin, 1975), p. 55.


remarkable mnemonic powers. His experiments included writing things down in the hope that his mind would let go of them. Seeing that this would not work, he resorted to discarding and then burning the slips of paper on which he had jotted down the things that he wished to forget. But
the magical act of burning he tried proved of no use to him. And after he had burned a piece of paper with some numbers he wanted to forget and discovered he could still see traces of the numbers on the charred embers, he was desperate. Not even fire could wipe out the traces he wanted to obliterate! (p. 57)

At a first reading, this would seem a rather dramatic illustration that, contrary to what Thamus argued, external means of remembering never do double as means of forgetting, even when one actually wishes them to operate in that way. But of course Plato's critique was rather more pointed. It suggested that the quality of the content of memory would change, and furthermore that the new information technology would impair the faculty of discerning that transformation. Similarly, throughout his study Luria insists that Shereshevsky (or S, as he calls him) had a very poor capacity to think logically, abstract, generalise, or understand figurative language (key ingredients for the high-level processing of information) and that, moreover, the more items he committed to memory, the more these faculties were impaired. I know of no evidence that Jorge Luis Borges knew specifically of Shereshevsky, although it seems most likely, given his profound life-long interest in the workings of memory, that he would have read about other mnemonists of his ilk. Be that as it may, in a short story first published in Argentina in 1942, "Funes el memorioso", he wrote his own account of what it would be like to possess total recall.2 His insights have much in common with Luria's and are often cited today in attempts to illuminate the

2 Luria's study was also carried out in the 1940s, but would only be published two decades later.


difference between information and knowledge and articulate a cogent critique of the paradigms that inform the current technoscape.

My mind, Sir, is a garbage disposal


Following an injury suffered in a horse-riding accident, a Uruguayan peasant by the name of Ireneo Funes develops the gift of perfect memory: an ability to preserve intact in his mind every single shard of experience, every minute detail of every instant of every day of his life. Here is a character, then, for whom the dream of a perfectly documented life is a staggering reality; one who can not only retain a seemingly limitless amount of information, but also recall it at will. The narrator reports that Funes characterised his old self before the accident as blind, deaf-mute, somnambulistic, memoryless: he looked without seeing, heard without hearing, forgot everything, almost everything.3 After the accident, what he had described as an infirmity of the mind is replaced and externalised in the form of a paralysis of the body, a fact that scarcely interested him, for he felt that it was a little price to pay in exchange for his new powers of perception and memory. Contempt for the body and for materiality as a whole, a state of mind that we shall encounter in the next chapter in many stories set in cyberspace, is reflected in Funes' heightened and morbid sensitivity for the tranquil advances of corruption, of caries, of fatigue, for the incessant progress of death, of moisture (p. 104). Meanwhile, on the inside looking out is a mind fervently at work. Funes' pastimes include the attempt to solve the problem of semiotic regression, by assigning to each conceivable experience one symbol and one only, since
[i]t was not only difficult for him to understand that the generic term dog embraced so many unlike specimens of differing sizes and different forms; he was disturbed by the fact that a dog at three-fourteen (seen in

3 Jorge Luis Borges, "Funes, the Memorious", in Fictions (London: John Calder, 1965), p. 101.


profile) should have the same name as the dog at three-fifteen (seen from the front). (p. 104)

Critically, Borges recognised how excessive memory could impair one's ability to extrapolate or generate knowledge. "To think is to forget a difference," observes the story's narrator, "to generalize, to abstract." In the overly replete world of Funes "there were nothing but details, almost contiguous details" (p. 104). Thus the art of memory, practiced in antiquity, kept alive throughout the Renaissance by the Gnostic tradition chronicled by Yates and that so fascinated Borges,4 and still a source of direct influence for many contemporary mnemonists,5 finds in modernity and postmodernity an unlikely double in the art of forgetting (Luria's own expression). The practice of this art is made necessary by the realisation that information does not equal value after all, but rather requires an effort to discriminate, to single out the meaningful and discard the meaningless. In the introduction to a collection of essays that bears, doubtless as an homage to Yates but also to Luria, the title of The Art of Forgetting, Adrian Forty points out that the functioning of whole societies is predicated on acts of voluntary amnesia, such as amnesties; but, rather surprisingly, nowhere in the ensemble work does there emerge a direct engagement with the book's daring title, or with some of its apparent sources of inspiration, such as the quotation from Georges Perec which introduces the collection: "Remembering is a malady for which forgetting is the cure".6 To explore this

4 Darren Tofts exposes an interesting blind spot in Yates's work when he points out that Borges is not cited in The Art of Memory (Memory Trade, p. 68) in spite of the detailed knowledge of the topic and remarkable insight that he manifests in Funes as well as outside of fiction (by the time he wrote the story he had already published a very Yatesian essay on Ramón Llull's thinking machine). This is likely a symptom of the unaccountably long time taken by the English-speaking literary scene to recognise the importance of Borges's work in spite of the relatively early translation of some of his writings.
5 Such as Italian mnemonist Gianni Golfera, who alongside instruction books explaining his memory techniques has written a book about Bruno entitled L'arte della memoria di Giordano Bruno (Milan: Anima Edizioni, 2005).


counterintuitive proposition one needs to get to the heart of the cult of information and question the foundations of a project that, while it is by no means historically new, has found in the digital computer an immensely powerful engine. The sometimes timid, certainly marginal but at the same time stubborn contemporary talk of an information glut, or overload, is framed in terms that connect directly with Luria's observation of S and Borges's portrayal of Funes. Could it be that the great wealth of information that is being accumulated thanks to new and ever more powerful archiving technologies does not in fact constitute wealth? Our archetypal amnesiac, Leonard Shelby, clearly knew too little; lacking the means to recall, he could not learn. But is it possible for an individual or a society to develop hypertrophic memories, and come to know too much? Perhaps Vannevar Bush would have answered in the negative. After all, he saw forgetting as a problem, and sought to solve it through nothing less than a mind machine. But he dealt in knowledge, not information, and this makes a rather important difference. He maintained that by developing the right technology, a worldwide network of memex workstations, it would be possible to reproduce and grow externally the system of causal/temporal hierarchies and lateral associations that links the contents of human memory. The input and the architecture of the memex were designed with the aim of embodying intelligence, and creating the conditions for the production of more knowledge. Bush's machine represents the means, for those not plagued with an excessively sticky and persistent memory, to avoid ending up like Shereshevsky or Funes, by relieving the mind of the need to remember disparate pieces of information and their complex interrelationships. With this system in place, individuals and organizations all over the world would be able to manipulate this highly processed information, and use it to fuel progress.
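What such a system of lateral associations amounts to is easier to picture with a toy model in hand. The sketch below is purely illustrative: the class name, the sample items and the trail label are inventions of mine, and nothing in it pretends to capture the analogue, microfilm-based machine Bush actually described. It only makes concrete the idea of items of knowledge joined not just by a catalogue but by named, user-built trails of association that can be replayed in the order they were laid down.

from collections import defaultdict

class Memex:
    """A toy associative archive: documents plus named trails of links."""

    def __init__(self):
        self.items = {}                  # item id -> document text
        self.trails = defaultdict(list)  # trail name -> ordered item ids

    def add_item(self, item_id, text):
        self.items[item_id] = text

    def link(self, trail, item_id):
        # Appending to a trail records an association, not a hierarchy.
        self.trails[trail].append(item_id)

    def follow(self, trail):
        # Replaying a trail retrieves the documents in the order in which
        # the association between them was originally built.
        return [self.items[i] for i in self.trails[trail]]

m = Memex()
m.add_item("bow", "Note on the properties of the short bow")
m.add_item("elasticity", "Article on the elasticity of materials")
m.link("bow-research", "bow")
m.link("bow-research", "elasticity")
print(m.follow("bow-research"))

The point of the toy is only that the trail, rather than the individual record, is the unit of memory.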
6 In The Art of Forgetting: Materializing Culture, edited by Adrian Forty and Susanne Küchler (Oxford: Berg, 1999), p. 1.


This is not to say that the premises of his project were not steeped in an excessive reliance on the power of the machine, as many have observed. As Chris Locke explains,7 and as I have already partly argued in the introduction, a key to Bush's proposition was his reductionist belief that the memex mirrored exactly the workings of human memory. Nonetheless, I want to stress here that Bush never meant to give birth to an omnivorous beast bent on gobbling up all the information it could find. At a time when the critics and supporters of Bush alike point out that the memex machine has been finally developed and deployed, under the guise of the World Wide Web, one has to remember that when the original blueprint was conceived, information was not generally held among intellectuals, or anyone else for that matter, to be a value in and of itself.8 But then of course Claude Shannon's paper "A Mathematical Theory of Communication" was yet to be published (it would be in 1948) and the twin forces of information theory and cybernetics had yet to start shifting paradigms. Information theory, as conceived by Shannon, took a rather peculiar view of the question of value, defining the informative content of a message, or amount of information conveyed, as being inversely proportional to the predictability of the message. Which of course is very apposite in a telecommunications-engineering environment, but tells us nothing

7 Chris Locke, "Digital Memory and the Problem of Forgetting", in Susannah Radstone (editor), Memory and Methodology (Oxford and New York: Berg, 2000), pp. 25-36.
8 As Ronald E. Day points out, the history of the term information as it is documented by the Oxford English Dictionary suggests that its use as a noun synonymous with fact or knowledge is relatively new. "Until very recently," he writes, "information had the sense of imparting knowledge (in the sense of telling someone something) or of giving sensory knowledge (in the way that our senses inform us of some event)." Ronald E. Day, The Modern Invention of Information: Discourse, History, and Power (Carbondale and Edwardsville: Southern Illinois University Press, 2001), p. 1.

Katherine Hayles explains that Shannon was conscious of this departure from the common understanding of information as a vehicle of meaning. See N. Katherine Hayles, Boundary disputes, in Robert Markley (ed.), Virtual Realities and Their Discontents (Baltimore and London: Johns Hopkins University Press, 1996), p. 18, and cf. also her How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics (Chicago & London: The University of Chicago Press, 1999), pp. 55-56.


whatsoever regarding what the information is about, or what it is good for. After all, a telephone directory is a highly unpredictable text, in large part due to the arbitrariness of the numbers assigned to each subscriber; Dante's Comedy markedly less so, if only because it consistently rhymes when we expect it to. But to suggest that a telephone directory is always more informative than Dante's Comedy, regardless of context or purpose, is a bit of a stretch. Nonetheless, the prospect of being able to quantify information soon started to exert an inordinate fascination over people well outside of Shannon's original readership, which comprised the purveyors of the Bell System Technical Journal. In no time, as Erik Davis writes, "[t]he paradigm of information began to invade humanist discourses, promising efficiently to clean up all sorts of messy problems concerning communication, learning, thought, and social behaviour -- all of which could now be seen as depending on more or less efficient systems of information processing. [...] All of this set information on a collision course with meaning."9 As a result of this powerful shift, knowledge is no longer equal to power. Information is. And instead of meaning, or even informative content, nowadays we have the bit. Those who argue, as Nicholas Negroponte does at length in Being Digital, that the world is no longer made of atoms, but of bits, do the dominant world societies and classes a great service by obfuscating what little there remains in the collective consciousness of the connection between matter, information and value. Hakim Bey showed no little measure of frustration at having to point out, in tones that verged on the shrill, that "you cannot eat information".10 Far from being post-industrial, he went on to observe, the American and other first-world societies have relocated the old sites of industry and agriculture out of sight, but still depend on

Davis, TechGnosis, p. 83.

10 Hakim Bey, "The Information War", C-Theory.net, 1996, http://www.ctheory.net/text_file.asp?pick=64.


a vast substructure of old-fashioned material production. Mexican farmworkers grow and package all that Natural food for us so we can devote our time to stocks, insurance, law, computers, video games. Peons in Taiwan make silicon chips for our PCs. Towel-heads in the Middle East suffer and die for our sins.

Such is the pervasiveness of the information fetish that it can create the illusion that value resides in symbols, such as money, independently of all else. And that the force of the argument is compelling, at least from the point of view of the haves, is undeniable: first-world lives are steeped in symbols and ever more sophisticated media; even fruit and vegetables on the supermarket shelves appear iconically identical to one another, and take forever to rot. Used as we are to measuring wealth in dollars, and to shifting numbers across the phone lines that connect electronic points-of-sale to bank accounts when we pay for groceries and other things, it is perhaps not surprising that we should have started to measure knowledge in bytes, and come to be possessed by a desire to accumulate as much of it as we possibly can. Think of the cultural ubiquity of memory technologies such as Napster, the iPod and the DVD burner, and the latest wave of digital storage mania a biased phrase I am borrowing from a monographic issue of Mediamatic11 will begin to delineate itself, with its many coded practices, daily demands and habits of mind. The power to download, share, burn and play at speeds unprecedented has gripped consumer culture, with no apparent end in sight. But the will to information, not unlike the will to knowledge or power, can never run very far from its own dark shadow. Thus, as I shall argue in the remainder of this chapter, the exhilaration of storage mania is threatened not only by the fear of digital amnesia, but also by the realisation that its ultimate goal, to possess all of the data, is only desirable until such moment

11 Mediamatic 8.1, 1994, available at http://www.mediamatic.net/set-8421-en.html


as it can be attained. Then it poses an entirely new and perplexing set of problems. Thus, literally possessing all of the data, the richest catalogue of lived experience, Funes begins unexpectedly to question its value. "My mind, Sir, is a garbage disposal" (p. 102), he concedes at one point. From that moment on, his extraordinary memory emerges as the root of a crippling neurosis. Finally he becomes the unfortunate Ireneo, almost literally squashed under the heat and pressure of reality, left exhausted and yearning for forgetfulness and oblivion. In a dream, he gets his peace:
It was very difficult for him to sleep. To sleep is to be abstracted from the world; Funes, on his back in his cot, in the shadows, imagined every crevice and every moulding of the various houses which surrounded him. [...] Toward the east, in a section which was not yet cut into blocks of homes, there were some new unknown houses. Funes imagined them black, compact, made of a single obscurity; he would turn his face in this direction in order to sleep. He would also imagine himself at the bottom of the river, being rocked and annihilated by the current. (p. 102)

The information age, having approximated total recall, or at least claiming to have done so, finds that its unconscious is increasingly populated by such dreams.


Can It Be Done?
In The Order of Books, Roger Chartier attempts to trace a genealogy of the drive to capture and catalogue the information flow in its entirety. The dream of a library that would bring together all accumulated knowledge and all the books ever written can be found throughout the history of Western civilization, he claims, and in fourteenth and fifteenth century Europe, following the invention of print, this became one of the major tensions that inhabited the literates of the early modern age and caused them anxiety:
A universal library (or at least universal in one order of knowledge) could not be other than fictive, reduced to the dimensions of a catalogue, a nomenclature, or a survey. Conversely, any library that is actually installed in a specific place and that is made up of real works available for consultation and reading, no matter how rich it might be, gives only a truncated image of all accumulable knowledge. The irreducible gap between ideally exhaustive inventories and necessarily incomplete collections was experienced with intense frustration.12

Here Chartier is talking about knowledge, of course, not information. But once the task of selecting this knowledge has been relinquished, the two cannot be easily prised apart. The project of collecting all the books ever written is indiscriminate (a librarian would say comprehensive): it seeks to quantify and amass rather than qualify and select, which arguably is also what contemporary storage mania is all about. Anyone who wished to build such a library nowadays would be faced by a number of texts several orders of magnitude greater, and distributed across a wider and more complex variety of media. But there is room to argue that the gap between the ideally exhaustive inventories and the necessarily incomplete collections has actually shrunk (some would say disappeared, as we shall see) thanks to equal if not greater leaps in the technologies that the would-be universal librarian can call upon. The great multimedia machine

12 Roger Chartier, The Order of Books (Cambridge: Polity Press, 1994), p. vii.


claims to have the power to perform the remediation of all of these texts, so that they can be stored and shared in a single medium; and by linking the machines on which these texts are to be found, worldwide networks appear able to create a vast decentralised library that can be accessed simultaneously regardless of location. Both claims need to be examined and qualified, of course. But not before a third, and perhaps more extraordinary one, has been added: that the overall digital storage capacity has exceeded, or is about to exceed, or could conceivably exceed within our lifetimes, the amount of information in the world. This is a sort of ritualistic prediction that surfaces periodically in discussions in and around the Net. While it often appears garbed in formulas, graphs, algorithmic laws and other paraphernalia from the world of hard sciences, its essence is irrational, its aura shamanic. In rhetorical complexity and scope, it ranges from the heady heights of the "Singularity Math Trialogue" enacted by Ray Kurzweil, Vernor Vinge and Hans Moravec,13 to the prosaic, commonsensical approach of Michael Lesk's widely quoted "How Much Information Is There in the World?".14 How does Lesk (a professor at Rutgers University and the author of Understanding Digital Libraries) come to the conclusion that, all told, "[t]here may be a few thousand petabytes of information in the world"? It is quite simple, really. He starts off by quoting an estimate according to which there are 20 terabytes of print information (remember, information comes in multiples of the bit) in the Library of Congress, and adds to that the photographs, maps, movies and audio recordings contained therein (another 2,000 terabytes or so). Then he works out how much information is encoded in the 38 million tons of paper sold for writing and printing in the United States every year. By now, his numbers have started to become more
13 Ray Kurzweil, Vernor Vinge and Hans Moravec, "Singularity Math Trialogue", KurzweilAI website, http://www.kurzweilai.net/meme/frame.html?main=/articles/art0151.html?
14 Michael Lesk, "How Much Information is There in the World?", http://www.lesk.com/mlesk/ksg97/ksg.html


than a little suspect, but he keeps at it until he has come up with the amount of information currently produced every year in the US, which he then multiplies by four, reasoning that the total world GDP is roughly four times larger than that of the United States alone. This latest equivalence between dollars and bytes is of course rather revealing of the ideology that permeates the cult of information, but the process whereby pounds of paper are translated wholesale into information is no less salient. The aim of Lesk's exercise is to see if this information could be made to fit into the existing worldwide digital storage media. Estimating this storage capacity could only lead to an informed approximation, but is of course a far less thorny undertaking. After all, we already buy and sell hard drives according to how many bytes can be crammed into them. After a few technical and semantic distinctions, and a fair arsenal of statistical references, the author concludes that
[i]n the year 2000 [the paper was written in 1997] we will be able to save in digital form everything we want to -- including digitizing all the phone calls in the world, all the sound recordings, and all the movies. We'll probably even be able to do all the home movies in digital form. We can save on disk everything that has any contact with professional production or approval. Soon after the year 2000 the production of disks and tapes will outrun human production of information to put on them. Most computer storage units will have to contain information generated by computer; there won't be enough of anything else.
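The back-of-envelope arithmetic that yields figures of this order is easy to restate. The sketch below re-uses the quantities quoted above, while the paper-to-bytes coefficients, which are not spelled out in the passages cited here, are placeholder assumptions of my own; the point is the shape of the calculation, not its outputs.

TB = 10**12   # a terabyte, in bytes
PB = 10**15   # a petabyte, in bytes

# Library of Congress figures as quoted above.
loc_print = 20 * TB      # estimated print holdings
loc_other = 2000 * TB    # photographs, maps, movies, audio recordings

# US paper output translated into text; both coefficients are assumptions.
paper_tons_us = 38_000_000   # tons of writing and printing paper sold per year
sheets_per_ton = 200_000     # assuming roughly 5 g per sheet
bytes_per_sheet = 4096       # assuming roughly 2 KB of plain text per side

us_paper_info = paper_tons_us * sheets_per_ton * bytes_per_sheet
world_info_per_year = us_paper_info * 4   # world GDP taken as roughly 4 x US GDP

print(f"Library of Congress: {(loc_print + loc_other) / PB:.1f} PB")
print(f"US paper output, one year: {us_paper_info / PB:.1f} PB")
print(f"Scaled to the world: {world_info_per_year / PB:.1f} PB")

Every step in the chain is a translation of physical stuff (shelves, tonnage, GDP) into bytes, which is precisely the move that the rest of this chapter takes issue with.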

That the singularity should occur in the year 2000, in perfect millenarian fashion, is perhaps nothing more than a delicious coincidence. The important lesson is that yes, it can be done, we can remember everything. And if we can, does it not mean that we should? As Mathias Fuchs incisively puts it:
[a]mnesia angst, the fear of memory loss, the programmer's neurosis turned into software, has become firmly installed in most computer programs. The user is allowed to store unhesitatingly, to duplicate documents or make hundred-fold copies of them, but as soon as he wants


to delete a document, there immediately and inevitably appears a dialogue field with the question: "Do you really want to delete this?"15

The Stickiness of the Database


As the name of a well-known Norton Utility, now built into both the Windows and Mac operating systems, pointedly reminds us, even deleted information can in many cases be undeleted, brought back from digital oblivion. This is typically possible because deleting a file removes only its entry from the file system's index, while the underlying data remains on the disk until the space happens to be overwritten. This suggests, in direct contrast with the views examined in chapter one, that digital technologies display a strong bias for retaining information and a corresponding aversion for all acts of deletion, no matter how deliberate. This bias is perfectly consistent with a cultural paradigm that places wholesale value in information, of course, but some of its implications are far from straightforward. The fact that computers remember by default, but need to be explicitly instructed in order to forget, means that they often hold on to information beyond the point in time when it ceases to be functional to a particular task, and this can be perceived as a problem on a number of levels. Many large organizations have systems in place to reduce the clutter, both in order to allow the most relevant and up-to-date information to be located more quickly and to spare memory resources, in what is a fairly straight cost-benefit exercise. More interestingly, perhaps, a whole industry has sprung up over the last few years in the field of personal computing whose products purport to rid one's machine of information that is not only surplus to requirements, but potentially embarrassing or damaging. Email spam, always a good indicator of what fears exist out there to be exploited, peddles a whole range of wares in this category.

Mathias Fuchs, Are you sure you want to do this?, Mediamatic 8.1 (1994), http://www.mediamatic.net/article-5887-en.html.

15


From: ContentWatch <offers@jumpjive.com> Subject: Advisory: Hidden file danger Did you know that your computer AUTOMATICALLY SAVES every PICTURE from every WEBSITE you visit? This function helps your PC access your favorite websites more quickly, but it also allows EXPLICIT adult pictures to be saved to your computer if adult websites have been accidentally or intentionally visited! FREE, confidential online PC check! Safely detect offensive files in just minutes!16

This piece of direct marketing, dating from 2002, is far from subtle in targeting the (predominantly male) computer users who visit pornographic websites at work or at home, but are not savvy enough to delete the traces left by such visits, therefore exposing themselves to the danger that others might stumble upon the evidence. The company's website carried at the time a banner which made the threat all the more insidious, by implying that such material could be lurking around even on the PCs of those who thought they knew how to erase it:

Fig. 9 - Banner advertisement for ContentAudit

ContentAudit scans the contents of one's hard drive while the user is online, with the aim of finding files with a high likelihood of containing objectionable material. Unfortunately, this is just as tricky for a machine to

16

Spam email received in August 2002.


accomplish as it is for the human operator, although for quite different reasons. The software has a very limited ability to scan for meaning, and is not capable of looking at a picture or video and telling if it is pornographic;17 it can only hope that the filename will contain suggestive keywords, which is of course not always the case (especially for those stored in cache files automatically generated by the web browser, and identified by automatically generated names). The human operator, on the other hand, is generally capable of telling whether an image or video is pornographic, but finds it a lot harder to locate the files and scan a large number of them. With text files the software has a little more luck, for the right balance of keywords generally locates pornography, but the vast majority of the material one is trying to find is likely to consist more of pictures and videos than of files containing words (and after all the customer is enticed by an advertisement that mentions nothing but pictures). As well as missing out on a number of files which are likely to be very relevant to the search, the procedure requires more than a little human intervention, for the list of text files still needs to be expunged of many false positives (say, legitimate files containing the word "sex", or even "cheerleader"18). The added work involved in this procedure does not raise the overall success rate of the scan, for it merely eliminates the wrong hits rather than filling in the gaps in the results. And so the doubts about what may be lurking inside one's own computer are not truly allayed, but actually enhanced by the realisation that information is not only good at sticking, but also at hiding.
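A toy version of such a scan makes both failure modes visible at once. The snippet below is not ContentWatch's actual method, which is not documented here; the keyword list and the file names are invented for illustration. Matching bare substrings finds nothing in an image saved under a machine-generated cache name, and it happily flags a harmless Italian sentence through the gambling keyword "dice", the very false positive described in note 18.

# Hypothetical keywords standing in for the "sex" and "gambling" categories.
KEYWORDS = {"sex", "dice"}

def flag(filename, text=""):
    """Return the keywords found in a filename or in the body of a text file."""
    haystack = (filename + " " + text).lower()
    return [k for k in KEYWORDS if k in haystack]

# An image stored by the browser cache: its content is invisible to the scan.
print(flag("cache/4F09A2B1.jpg"))                           # []

# A harmless Italian text file: here "dice" simply means "says".
print(flag("lettera.txt", "Maria dice che arriva domani"))  # ['dice']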

17 I am not interested here in problematising what may or may not constitute pornography or objectionable material, given that if the user is searching for it with a tool of this nature s/he probably buys at least to a certain extent into the conventionally shared notion of what these categories should encompass. What I find more interesting is that the software fails to spot the glaringly obvious, and often includes the totally harmless.
18 The problem has a further layer of intricacy: namely, the fact that the software works only on English texts. When I tried it on my own computer, ticking along with "sex" the categories of "drugs", "violence" and "gambling", I obtained a huge list containing all the files in my hard drive containing the word "dice", which in Italian is the third person singular of the verb to say.


The stickiness of the database, this memorisation by stealth that goes on in the background in many of our dealings and transactions involving electronic media, has significant implications with regard to the construction of subjectivity in the digital age, leading to an anxiety-inducing phenomenon of its own known as identity theft, which I will examine in chapter four. For now, I want to emphasise the process whereby information accumulates steadily, often without explicit purpose, in the vast and ever growing expanses of the digital database, there to lie dormant, forgotten, in a sense, but nonetheless intact, waiting to be recalled and possibly put to a new use. Like Poe's purloined letter, the information never really went away. It simply stopped calling attention to itself, thus ceasing to be visible. It blended in with all the other 1s and 0s. *** Over the last twenty years, the great ease of acquisition and storage that the digital computer affords, coupled with the deep inscription of the programmer's neurosis into its procedures in the form of the stickiness of the database, has furnished the means to reframe the culturally familiar project of collecting all of the information out there. A representative and influential figure in this movement is "evangelical librarian"19 Brewster Kahle, founder of Alexa Internet and the Internet Archive. Since 1996, Kahle has been working to preserve the World Wide Web in its entirety by taking monthly snapshots of it, totalling some 50 terabytes each month. The resulting data is stored in a few rooms at the Presidio, in San Francisco, and is backed up in a mirror located in Alexandria, Egypt, on the site of one of the most ambitious and ultimately catastrophic archival endeavours of all time: the great library which, according to legend, was founded by Alexander the Great in the fourth century BC and later completely

19 A handle given by Paul Boutin in his profile of Kahle entitled "The Archivist", Slate (April 7, 2005), http://www.slate.com/id/2116329/.


destroyed, we still do not know whether by accident or design. As well as cultivating a partnership with this most illustrious forerunner, Kahle's project is linked to another monumental ark of knowledge, the Library of Congress in Washington, DC, which accepted from Alexa in 1997 its first large-scale digital donation. The installation, still on display in the library's lobby, is entitled World Wide Web 1997: 2 Terabytes in 63 Inches and comprises 44 digital tapes alongside four red computer monitors that intermittently flash 10,000 Web pages, two every second, from the 500,000 sites gathered and stored by Alexa Internet.20 In the long term, the aim of the Internet Archive is to allow scholars and anyone else who might be interested to travel back in time (the site that allows us to navigate the archive is known as the Wayback Machine) and surf the Web as it existed at the time when the snapshot was taken. Currently, and for the kind of reasons that should by now sound familiar, the ride is not altogether smooth. The Archive has difficulties in accessing certain types of pages, especially those that link with their hosts in real time, as in the case of streaming content and many Java applications. More significantly, it has already lost, compared to earlier snapshots, a great deal of material as a number of organizations (media companies in particular) have since built their own archives and blocked the company's crawlers, which has an immediate retroactive effect. Other content had to be removed under American anti-terrorism law.21 As a result, the phenomenon tagged with metaphorical brilliance as "link rot" has set in, and clicking on the hyperlinks in the archived Web pages brings up more and more often the well-known "404 Page not Found" notice, a familiar emblem of the Web's capacity for amnesia.
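The time travel, it should be said, is quite literal at the level of the address bar: every capture held by the Archive is reachable through a timestamped URL. The snippet below merely illustrates that publicly visible addressing scheme; the target address is a stand-in, and nothing here reflects the Archive's internal workings.

def wayback_url(target, timestamp):
    # timestamp has the form YYYYMMDDhhmmss; a shorter prefix, such as a bare
    # year, is redirected to the capture closest to that date.
    return f"http://web.archive.org/web/{timestamp}/{target}"

print(wayback_url("http://www.example.com/", "19970101"))
# http://web.archive.org/web/19970101/http://www.example.com/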

20 A demo can be viewed at http://www.archive.org/web/LoC_sculpture.php.
21 See "Inside the Wayback Machine", .net 112 (July 2003), now available at http://www.maxpc.co.uk/features/default.asp?pagetypeid=2&articleid=18463&subsectionid=736&subsubsectionid=609.


While its record is far from complete, however, the Internet Archive is still able to turn back the clock on a large portion of the Web's content and in some cases bring back from oblivion some of the millions of pages that have long since disappeared from the network. Web sites, as it is routinely remarked, have a very short average lifespan. Sometimes their disappearance is brought about in deliberate and dramatic fashion, as when George W. Bush took over the American presidency in early 2001 and wiped clean the contents of the www.whitehouse.gov website and of its companion archive, www.pub.whitehouse.gov;22 more often, they vanish with a whimper, due to lack of care, or unpaid bills. But it need not be so, and people like Kahle will make sure of that. Put in these terms, it does seem like an undertaking of potential social, cultural and historical value, made possible by a medium that (contrary to the views examined in the first chapter) does deliver an unprecedented capacity for content preservation, so long as a sufficient and properly orchestrated effort is made. The discourse of the great Internet archive, be it Alexa's or one of its many variants, uses in fact the pronouncements of the likes of Rothenberg, Brand and others concerning the ushering in of a digital dark age as a rhetorical springboard in order to call for gargantuan efforts of preservation. But these efforts, it turns out, are driven by the very same kind of faith in the digital that those critics are calling into question. In the preface of the document in which the Library of Congress outlines its digital preservation strategy, chief librarian James H. Billington writes for instance that "[t]he new digital technology offers great promise, but it also creates an unprecedented surfeit of data in an unstable and ephemeral environment", and that "forty-four percent of the sites available on the Internet in 1998 were no longer in existence a year later, and the average life of a Web site is now [in 2002] only 44 days".23 This scenario of endemic

22 See Richard Wiggins, "Digital Preservation: Paradox & Promise", LibraryJournal.com (15 April 2001), http://www.libraryjournal.com/article/CA106209.html.


amnesia is painted with the same brush as some of the ones we encountered in the previous chapter. But the answers to the problem differ greatly. It will be recalled that one of the chief critics, Stewart Brand, holds the view that we are moving towards a future structurally incapable of keeping faith with its past, and that digital technologies as they are currently deployed are an integral part of the problem. In response, he asks that we preserve less, not more, and that we trade quantity in exchange for time, by taking a longer view. Some of his techniques, such as adopting the habit of placing an extra zero before the year in dates (as in 02005), are pure Zen; but in spearheading the Long Now Foundation and the 10,000 Years Library, he has helped to construct a strong, alternative way of thinking about the value of information and the means of its preservation. Compare one of the Foundation's cornerstone projects, the Rosetta Disk, with the Internet Archive, and the gap that separates the two underlying views of what constitutes cultural memory, and what is the proper role of technology in its preservation, will be dramatically exposed. The aim of the Disk is ambitious too (to build a public archive of all documented human languages) and is also meant as a response to the danger of amnesia, based on the estimate that "fifty to ninety per cent of [these languages] are predicted to disappear in the next century, with little or no significant documentation".24 Access to the information, gathered and compiled by academic specialists and native speakers around the globe, takes place on the Web, but the archive itself is backed up on micro-etched physical disks, in analogue form, so as to require no further encoding and no hardware, except for an optical microscope. Each disk is three inches in diameter and capable of storing the whole 1,000-language corpus. But the most important thing about the disk is that it embodies an effort to remember important information based on an assessment of present and future value. Any
23 Preserving Our Digital Heritage: Plan for the National Digital Information Infrastructure and Preservation Program (Washington: The Library of Congress Press, 2002), p. vii.
24 http://www.rosettaproject.org/live/concept


decision to devote resources to such a highly selective act of memory is in itself exclusionary, for deciding what ought to be remembered implies that other things will likely be forgotten. With only so much space in his ark, Noah was told to preserve biodiversity at the expense of making space for more of a kind, including his own kind. Believing that resources are limited, the 10,000 years librarians choose to preserve cultural diversity and the ability to decode whatever texts will survive. The Internet Archive, by contrast, strives to remember everything,25 based on the belief that we cannot know today what future generations will find valuable. This is of course a very respectable view, one that has informed practices in librarianship and bibliography for quite some time. Put simply, what is regarded as ephemeral or unimportant today may well not be so tomorrow, and in the absence of comprehensive preservation efforts, the texts that fail to carry sufficient commodity value will be destroyed, as the loss of the great majority of artifacts from the early age of cinema (already alluded to in the previous chapter) makes painfully clear. When it is coupled with the exhilarating prospect of actually commanding unlimited storage resources, however, what this view also does is to defer in perpetuity the task of assigning value to information. In other words, of discriminating.26 Totalising archival projects such as Kahle's, whose eventual plan is to expand into the preservation of books and audiovisual

25 It is possible, as I already mentioned, to opt out of the Internet Archive's sweeping data gathering. From a technical point of view it is quite simple, a matter of copying a file called robots.txt into the top level of one's website; the file is recognised not only by the Internet Archive, but also by all major Web directories such as Google as a request that the information contained therein is not to be collected and stored elsewhere. This is a request with which these organizations choose to comply. However this, to paraphrase Fuchs, is akin to asking "Are you sure you do not want to archive this?" and shows again a bias towards memorisation that is becoming more and more prevalent in the practices of the Web. Notably, by adding the option of accessing its cached copies of pages that may have since changed or disappeared, Google has also built into its catalogue (whose primary aim is still to allow the information on the Web to be located and accessed) an important and stubbornly resilient archival dimension.
26 Willem Velthoven makes a similar point in the Editorial of the Storage Mania issue of Mediamatic, cited above.


content as well, differ from their historical predecessors precisely in that there is a growing perception that they can be achieved. Put another way, the claim is that there is enough space on the ark to fit the whole world, and Noah will not have to make any hard choices. Come the Flood, the ark and the world will become one and the same.

Perec and the Confines of the Archive


Georges Perec, the author of that somewhat eluded quotation at the beginning of The Art of Forgetting, died in 1982, at the age of 46, too early to experience the advent of the personal computer and the birth of the Internet. One is tempted, in reading his work, to try to imagine what he would have made, if anything, of these electronic media once they started to permeate Western societies to the extent that they have. Would he have his own blog, take part in newsgroups, create Web installations, write hypertextual fictions? He was, after all, a member of Oulipo, the Ouvroir de littérature potentielle, a literary movement deeply engaged in reflexive thinking about the interaction between computers, mathematics and the arts that counted among its leading figures Italo Calvino and Raymond Queneau. Calvino, who had written in an essay entitled "Cybernetics and Ghosts" about the application of combinatorial mathematics to literature, a set of ideas which he later developed in novel form in If on a Winter's Night a Traveller; and Queneau, author of Hundred Thousand Billion Poems, a book which is also a machine, an algorithm for generating verse. Perec, whose Life, A User's Manual is probably the single most important text left to us by Oulipo, was the prophet of ephemera, a lover of puzzles, and an obsessive maker of lists and catalogues. He kept a diary of all the foods and beverages that he consumed in 1974, made a list of the thirty-seven things he wished to do before he died, wrote an essay on the art and manner of arranging one's books, wondered aloud about why the letters of

the alphabet are ordered the way they are. He championed the infra-ordinary, a category that he held to be in direct opposition to the skewed principle of newsworthiness that makes it appear as if the one and only destiny of motor-cars is to drive into plane trees.27 His collection entitled Species of Spaces ends with an index of some of the words used in this work (a take on the indices of subjects and names to be found at the end of most scholarly works, as well as a remediation of the text search facility of electronic texts and a precursor of the Web search engine) which tells us for instance that the word cherry is to be found on pages 58 and 89, the word conversation on page 90.28 Underlying all this is Perec's fixation with memory, materiality, representation and the medium. One of the central figures in Life, A User's Manual is the wealthy misanthrope Bartlebooth, whose grand life project consists in travelling to five hundred ports around the world, painting a watercolour of each, gluing the paper on wooden puzzles fashioned by an artisan, and then, back in Paris, first solving the puzzles, then lifting the paper from the wood and treating it until the cuts that separate the pieces have disappeared. Each watercolour is then sent back to the original port, there to be dipped in seawater until the colours have dissolved and the original blank sheet of paper is restored. Finally, the sheets are sent back to Paris and to Bartlebooth. Each watercolour is a souvenir, an object to remember by, and yet its ultimate destiny is to be erased on the site where the original memory was impressed upon it, following a path that goes from memory to amnesia and takes place entirely on the outside, on a highly coded and meticulously deployed medium. This puzzling project (pun intended) is a parodic mirror of Perec's own, which consists in freezing a Parisian building in time, a few minutes before eight pm on 23 June 1975, and moving from room to room based on the principle of the knight's move in the game of chess on a 10x10

27 Georges Perec, "Approaches to What?", in Species of Spaces and Other Pieces, translated by John Sturrock (Harmondsworth: Penguin, 1997), p. 209.

28 Georges Perec, "Species of Spaces", in Species of Spaces and Other Pieces, pp. 93-95.


grid that represents a cross-section of the building. While the stories that are narrated as the novel pauses in each space (including the stairwell) are essentially about people, the building firmly establishes itself as the novels overarching presence and is written about in ways that often undercut the familiar literary convention of focussing on human beings as opposed to, say, the contents of a handymans wardrobe, down to the last nail which is of course exactly what Perec revels in doing. There is an element of earnest playfulness in Perecs work that is at times reminiscent of Joyce. And indeed, not unlike Ulysses or Finnegans Wake, Life masks under its conceits and elaborate constraints an attempt to achieve nothing less than the total novel, a microcosm in words for the reader to inhabit, and in this way anticipates the aspirations of the universal digital archive. Joyces horizon in Ulysses was the city of Dublin, on another June day of seventy odd years before; his principal self-imposed constraint, that the trajectories of the novels protagonists should follow those of Homers Odyssey. Perec works on a smaller location, the building, and a shorter timeframe, an instant frozen in time; but also in much greater telescopic detail. He attempts to narrate the building whole, its inhabitants, its history, as in one of those impossible maps on a scale of one to one imagined by Borges and Lewis Carroll. That an omniscient narrator should be granted the power of total recall is neither surprising nor unusual, of course, but the difference here lies in the way in which this power is exercised, and the challenge it poses to the reader. This challenge had already come partly to the fore in Perecs shorter pieces. How does one go about reading, for instance, his Attempt at an inventory of the Liquid and Solid Foodstuffs Ingurgitated by Me in the Course of the Year Nineteen Hundred and Seventy Four? The gist of the piece is all in the title, and the text that follows is exactly what it purports to be: a long list of foods and beverages, perfectly intelligible and yet quite difficult to read in the habitual, linear fashion that print has accustomed us


to. There is nothing to capture the interest, nothing to drive the reader forward, no expectation of a resolution or dénouement. If the same idea had occurred to Borges, no doubt he would have attributed it to one of his characters, telling us that such and such kept a diary of the things he ate and drank in the year such and such, without bothering to delve into the perplexing detail of such a list. Not so Perec who, if he is to be believed, actually kept such a diary.29 But experimental literature of the kind offered by Action Poétique, the magazine that originally published this piece, in 1976, already had a long tradition of challenging conventional writing and reading practices – one only has to think of Queneau's own hundred thousand billion poems which, if actually compiled, would take some two hundred million years to read. The real dissonance comes, however, when the meticulous parading of the infra-ordinary is transplanted into Perec's grand novel, where it cuts across other familiar conventions that, by contrast, are not abandoned or in any way challenged. There is nothing unusual, for instance, in the fact that Life contains a series of mysteries that the reader is expected to solve, presented chiefly in the self-reflexive form of the puzzle. We know that in a literary mystery, as in the art of solving puzzles, being able to pay attention to small and seemingly insignificant details is paramount to the solver's chances of success. And yet in Life such hyper-attentive reading is systematically overloaded, and frustrated, by the glut of detail thrown at it. We saw how Memento introduced a reconfiguration of the viewing practices which are conventionally held to be appropriate to solving a mystery in the medium of cinema. Something similar is at work in Life through the deployment of perplexing lists and catalogues, the stubborn fixation on details that may be part of the setting, but not of what is commonly held to be the story, loosely defined as what makes up the relevant as opposed to the irrelevant in a particular narrative, but also the memorable as opposed to

29 "The Work of Memory", interview by Frank Venaille, in Species of Spaces and Other Pieces, p. 130.


the forgettable (a point I shall return to shortly). Consider the catalogue of home decorating and fitting appliances with which the otherwise desperately lonely Madame Moreau keeps prosperous company in her apartment. It begins thus:
WALLPAPERING KIT. Includes 6 folding yardstick; scissors; roller; hammer; 6 metal rule; electricians screwdriver; trimmers; knife; brush; pliers; paint knife; handle; all in portable plastic case, lgth. 2, wdth 4, hght 4. Weight 6 lbs. Fully guaranteed 1 yr.30

and occupies four whole pages, meticulously laid out in small type. The insertion of the catalogue at this point is not entirely irrelevant to the characterisation of Madame Moreau, but poses an obvious and immediate problem to the reader of novels, used in the main to move in a straight line from the first page to the last, at a more or less even pace, taking in each word as it comes up, examining the illustrations, if there are any, and finally reaching the conclusion. Life, by contrast, is meant to be represented in space as a two-dimensional map, not as a straight line, and can be read in multiple orders just like a database; but it also defies the expectation that some of its parts, like Madame Moreau's catalogue, have to be read at all. It is at these junctures, through the incorporation of such perplexing subtexts, that the author-function morphs into a subroutine (IF TEXT, THEN PRINT) and the novel appears on the verge of becoming coextensive with its fictional universe. The reader is left at such times with the choice of whether to take up that relinquished task of assigning or denying relevance to the various parts of the whole, at the cost of rushing through or skipping whole pages, or else succumb to the anxiety that some key detail may be hidden in unlikely places, and always read on (IF TEXT, THEN READ). By placing one of the apartment block's dwellers, Serge Valène, in charge of painting a picture of the building in cross-section, room by room, space by space – in other words, doing with a brush and palette what he himself is

30 Georges Perec, Life, A User's Manual, translated by David Bellos (London: Vintage, 2003), p. 70.


doing with words – Perec complicates things further by inviting a forceful analogy between the act of reading Life and that of looking at a painting. From this viewpoint, reading every single passage of the novel would be comparable to trying to pore over a canvas in all of its details, a task which is not only difficult to approach conceptually so that it can be put into practice – what is the smallest unit of visual detail? – but can also divert from that of taking in and appreciating (understanding, making sense of) the picture as a whole. In another novel he published while he was crafting Life, entitled W, or the Memory of Childhood, Perec practises a quasi-Proustian anamnesis through the highly disciplined and painstaking routine of focussing on pictures and objects in order to restore memories from his childhood. In Life he constructs instead a hypertrophic archive, a system of records that does not allow memories to fade in the first place, thus leaving no scope for forgetfulness, let alone anamnesis. The result is a memory palace, but quite unlike the ones built and maintained by Renaissance mnemonists: those were dynamic semiotic structures, tools for managing and manipulating knowledge; this is a far more unwieldy edifice, an attempt to exhaust a (however fictional) human and architectural space and all the information encoded therein. If Perec truly felt that memory is a malady for which forgetting is the cure,31 then Life would appear to be a most virulent manifestation of a disease that he was readily able to diagnose but against which he was himself far from immune. In fact, he had previously characterised the keeping of his dietary diary as a totally compulsive move occurring in a period of his life where he had a veritable phobia about forgetting,32 and which also happened to fall within the ten-year gestation

31 If, indeed, for the phrase actually comes from Perec's discussion in "Think/Classify" of a computer programme written by Paul Braffort for generating aphorisms. It would be interesting to know whether the terms fed into the computer, "remembering" and "forgetting", were in fact suggested by Perec, but the straightforward attribution of the phrase to him in The Art of Forgetting is a little dubious in any case. See Perec, "Think/Classify", in Species of Spaces and Other Pieces, p. 203.


period of Life. When such a compulsion prevails, as it does in Life, archival practices substitute themselves for experience and the archive expands to coincide with the real. *** Jacques Derrida has another name for the compulsion that gripped Perec: le mal darchive, a phrase that his translator, Eric Prenowitz, rendered as archive fever. The English phrase occupies much the same semantic area as storage mania, and is apposite enough for our discussion but, as Herman Rapaport argues, the French original evokes also the suggestive possibility that there may be malice in the archive, as well as a kind of seasickness, or nausea33, and these are subterranean connotations also to be found lurking in the pages of Life. For Derrida, the archive is defined as existing on the outside, and will never be either memory or anamnesis as spontaneous, alive and internal experience. On the contrary: the archive takes place at the place of originary and structural breakdown of said memory34. The claim, in line with the position put forward in the Phaedrus with regard to writing, is that the act of creating a record alienates the subject from the experience that the record refers to. Consequently, Derrida contends, the archive is neither mnemic nor anamnesic, but rather hypomnesic, that is to say mnemotechnical supplement or representative, auxiliary or memorandum (p. 11). The nemesis of the archive is the death drive of Freudian psychoanalysis, which functions also as a drive to forget, to erase the archive and even its own traces. But, Derrida argues, and this is a crucial point, archive fever and the death drive cannot be prised apart:
32 "The Work of Memory", Species of Spaces and Other Pieces, p. 130.

33 Herman Rapaport, "Archive Trauma", Diacritics 28.4 (1998), pp. 68-81.

34 Jacques Derrida, Archive Fever: A Freudian Impression (Chicago & London: The University of Chicago Press, 1995), p. 11.


There would indeed be no archive desire without the radical finitude, without the possibility of a forgetfulness which does not limit itself to repression. Above all, and this is the most serious, beyond or within this simple limit called finiteness or finitude, there is no archive fever without the threat of this death drive, this aggression and destruction drive. (p. 19)

In Perec's work this interaction of opposites is constantly played out. This is most obvious in Bartlebooth's scheme, which consists in the painstaking construction of a vast archive that is meant from the outset to be eventually dismantled in an equally precise and deliberate fashion, until no trace whatsoever of the project is left; but it is also inscribed in the many lists and catalogues that constellate Perec's writings. Cradled in orality, poetry for centuries had no choice but to strive to be memorable in order to survive beyond the fleeting present of a single recital; and I would argue that this aspiration to say or write memorable things, worthy of being recited, read, copied and quoted, has largely survived the invention of writing and the successive technologies for encoding and storing texts, all the way up to the present. Literature is still, by and large, the domain of remarkable writing and memorable meanings. Perec's lists, by contrast, are spectacularly unmemorable, the items seemingly chosen on the basis of how effectively and promptly they are likely, so to speak, to succeed in failing to be remembered, and thus come to closely resemble the blank pages of Bartlebooth's former watercolours, in spite of the fact that they never cease to bear text. Thus the death drive in Perec's archive does not take place solely through erasure, as in the Derridean scenario, but also by accumulating textual records that privilege information over meaning to such an extent that they cannot be brought back to living memory. In keeping with the lugubrious metaphor, one could say that the archive has become their tomb. It is my contention that forgetfulness and erasure creep into the seemingly boundless expanse of the great digital archive precisely because here too archive fever

and the death drive have come to coincide, in the name of storing more and more information.

Saving the Present


Another word for the archive-as-tomb is museumisation (or musealisation, or the even more suggestive museification), the process by which objects are turned into artifacts, translated into the language of heritage. As the very many critics of the institution of the museum have variously observed, this translation effects profound changes at the level of signification concerning the objects that enter the museal sphere. Ossification is a popular word among these critics, and one that adapts well both to the memory palaces of Perec and to the Internet snapshots taken by Alexa. Both of these aspire to be the real, to obscure their own representational and archival nature; Perec's novel does so by eliding, or appearing to elide, the arbitrariness of the narrative voice, with the aim of telling the whole story (an aim injected with hefty doses of irony and self-mockery, but this does not strictly matter here); Alexa's snapshots do so by purporting to recreate the Internet whole, in spite of the fact that its whole edifice has been shifted and made to fit a different environment altogether. The Internet is a dynamic textual environment, a pulsating electronic uber-text that flickers and changes by the microsecond, tied into a nexus of relations, created by means of the hyperlinks, which is also highly dynamic. To download it means to halt this flux, and to sever the circulatory system of links that sustains it.35 As a result, the Internet as it is stored and displayed by Alexa cuts the figure of an embalmed animal in a natural history museum, frozen at a moment in time, looking alive and yet utterly dead. The curators of the Internet Archive installation at the Library of Congress, it will be recalled, chose to have it flash ten thousand Web pages on a wall of screens, at the rate of two per second, as if to jolt it into life. But what this conspicuously random, chaotic

35 For a similar critique of Kahle's project, see Giulio Blasi's introduction to The Future of Memory (Turnhout: Brepols, 2002), p. 14.


cycling actually achieves is to underscore the point that the Internet as it is conceived by its archivists is something profoundly other, almost a parody of the Internet it purports to preserve in that it cannot be searched, read, written into or interacted with in the same ways as its counterpart. As Andreas Huyssen has remarked, this kind of museal sensibility has expanded well beyond the confines of the institutions that gave birth to it, and now occupies a significant portion of everyday lived experience. He explains:
If you think of the historicizing restoration of old urban centers, whole museum villages and landscapes, the boom of flea markets, retro fashions, and nostalgia waves, the obsessive self-musealization per video recorder, memoir writing and confessional literature, and if you add to that the electronic totalization of the world on data banks, then the museum can indeed no longer be described as a single institution with stable and well-drawn boundaries. The museum in this broad amorphous sense has become a key paradigm of contemporary cultural activities.36

While phrases like obsessive self-musealization are a little disparaging, Huyssen stops well short of claiming that this phenomenon is wholly neurotic, a manifestation of a social or cultural dysfunction. In an uncompromising and frankly apocalyptic piece of criticism, Jorinde Seidel characterises it by contrast as the product of a terminal culture that has turned us all into supernumeraries or undertakers in charge of preparing ourselves and our world as heritage37. In a rather apposite passage, Seidel claims that, under the effect of the universal imperatives of care, preservation and protection, things become monuments, nature becomes reserve, knowledge becomes information and data, people and forms of life become models or examples (my emphasis), until finally nothing is allowed to be deleted or forgotten. Again, the death drive is seen at work in the

36 Andreas Huyssen, Twilight Memories: Marking Time in a Culture of Amnesia (New York and London: Routledge, 1995), p. 14.

37 Jorinde Seidel, "Operation Re-store World", Mediamatic 8.1 (1994), http://www.mediamatic.net/article-8359-en.html.


production of an archive rather than its erasure, and the reversal is made possible by morphing knowledge into information. This is a gambit that enables us to turn the complex task of saving (as in preserving) our cultural heritage, into the straightforward action of saving (as in storing into digital memory) a data file. If one accepts the claim that digital storage space is larger than the amount of information produced by humanity and its machines combined, then these new museal spaces will appear to be enormously vast, and hence capable of sustaining the kind of universal collection that institutional museums have largely abandoned. One such collection of the pre-digital era can still be viewed at the American Museum of Natural History in New York, which embarked from the late nineteenth century on a project aimed at gathering, sorting and storing specimens of the totality of the fauna, flora and geological formations in the world38. It was a project that fitted in seamlessly with the positivist ethos in which it had been produced, but the contemporary museum, whether one chooses to call it postmodern or not, is a rather different beast. It aims to select and re-enact. It has become experiential, interactive. Travelling in a contrary direction, the digital domain allows us then to reframe not only the dream of the universal library whose genealogy was an object of study for Chartier, but also that of the universal collection or museum of the modern era. As Huyssens passage suggests, digital technologies are not alone in promoting and channelling this phenomenon, but certainly play a dominant role, in large part due to their ability to bring different media together a key to the complete memory experience and also, but no less importantly, to lower the cost of the enterprise. Opting for a more upbeat characterisation of this phenomenon, we could say that the museum has been democratized,

38 For a vigorous critique of this project and the ideology that underpinned it, see Donna Haraway's account in "Teddy Bear Patriarchy: Taxidermy in the Garden of Eden", in Donna Haraway, Primate Visions: Gender, Race and Nature in the World of Modern Science (New York and London: Routledge, 1989), pp. 26-58.


it has gone peer-to-peer. In the first world at least, curatorship of a kind is now open to the masses, and the universal museum has expanded to include the individual personal sphere of many citizens.

Documented Lives

Internet Collapses Under Sheer Weight Of Baby Pictures

SAN FRANCISCO – Many web users were trapped without service Monday, when a large section of the Internet collapsed under the weight of the millions of baby pictures posted online. "Some personal web pages contain literally hundreds of adorable infant photos," MCI senior vice-president Vinton Cerffe said. "Add to that the number of precious pumpkins on photo-sharing sites like Ophoto.com, and anyone can see it was a recipe for disaster. The Internet simply was not designed to support so much parental pride." Cerffe said he expects regular web-traffic flow to resume once the nation's larger Internet providers are reinforced with stronger cuteness-bearing servers.
The Onion, 28 July 2004

Biographies are written all the time about people who are perceived to be important, and generally have to work backwards, through anamnesis, in order to reconstruct past events. But when you know right away that someone is going to be important – aristocracy by birth being a likely pointer, at certain times and in certain countries – then the data gathering for the eventual biography can and often does begin from day one.

Fig. 10 Winston Churchill as a toddler, in 1876

By the time Winston Churchill was born, in 1874, his father, heir to the 7th

Duke of Marlborough, had embarked on a career that would culminate in the position of Chancellor of the Exchequer. Naturally, some expectation of greatness was invested in his progeny, and it was perhaps with this in mind that the methodical archiving of the life of his son Winston began in earnest from his very childhood. I am not referring of course merely to the certificates required by law regarding his registration as a citizen, but to a vast array of documentary evidence on his activities and progress at school, as well as all of his correspondence and naturally the photos and portraits in which he appeared. By the time his son Randolph and historian Martin Gilbert set out to write the definitive version of his biography, this archive had expanded to truly monumental proportions. The Official Churchill Biography, listed in the Guinness Book of World Records as the largest ever written, is approaching completion and will eventually comprise eight main volumes and 23 companion volumes, which consist of facsimiles of a substantive portion of the archival records and thousands of photographs. To give an idea of the size, the three-part companion to Volume 5 alone is over four thousand pages long. One might well ponder what it would be like to be constantly followed by this archival shadow, this alter ego made of information something I shall return to in the fourth chapter. The more straightforward point I want to make here is that records like Churchills are no longer the exclusive prerogative of few families of very high standing, but rather something that most people in the West can seek to emulate, and in many respects greatly surpass. Michel Foucault traced the beginnings of a similar movement at the level of the institutions of eighteenth century Europe, and called it a lowering of the threshold of description, which meant that henceforth it would no longer solely be the privileged few to be described in detail, but every individual39. He argued that this compulsory objectification and subjection, deployed with heavy reliance on the technology of writing and

39 Michel Foucault, Discipline and Punish: The Birth of the Prison (London: Penguin Books, 1997), pp. 77-78.


the accumulation of documents, constituted a means of control and a method of domination in that it enabled the drawing up of Norms. Whether this can also be said of archival projects of the self-inflicted kind, or those not carried out by corporations or the state, can be a little more contentious. For instance, how should we characterise the proliferation of Web directories and sites devoted to baby pictures satirised in the passage from The Onion at the top of this section? Is Cuteness the norm at work in this domain? Is a form of control sought and obtained? Or is this, rather less menacingly, a means of cutting across temporality, both by committing memories that are regarded as precious, fragile and fleeting such as the images of very early childhood to a medium where they might not fade as readily, and by sharing these memories, so that they will be remembered by others too? Whereas the shift that Foucault describes occurred within a period not characterised by substantive technological advances, following the shift to the digital and the enormous popularity of social networks of self-description and interaction such as MySpace, one could infer that the lowering of the threshold of description to ones own level is a long-held dream, reaching back to the earliest prehistoric petroglyphs, and that new technologies have made it progressively more affordable and hence attainable. In times past, a lock of hair could unite lovers across distances; but when my grandfather left his home village to complete his tailors apprenticeship in Milan, in the early 1920s, he took a photograph of my
grandmother with him instead.

Fig. 11 My grandmother Ermes Magnoni at age 16, in 1922

Not literally a piece of her, however

steeped in the symbolic conventions of accepted social practice, but an image, a description of her. And arguably he did so because he could: the technology and practices that enabled the Marlboroughs to take little Winston's early photographic portraits had come within the reach of a very modest farming community in the north of Italy. It would be largely pointless in my view to try to gauge to what extent the set of habits that fall within the scope of what Huyssen calls obsessive self-musealisation is the product of technological advances, or rather constitutes the driving force behind the demand for products that directs and sustains these advances. A typical way of framing the question would be this: are digital cameras so sophisticated and affordable because so many people are keen to use them, or is the extensive use of the technology driven by the affordability and usability of the cameras themselves? This is chicken-and-egg stuff, and the answer would have to be that neither is wholly true, both are partially true. What cannot be put in serious doubt, however, is that the modalities of description and self-description have changed remarkably in conjunction with the expansion of personal computing. Not only has the threshold been lowered, but the intensity and granularity of the description have increased severalfold. As a result – and I would like the Onion piece and an image of the acreage occupied by Churchill's biography to be borne in mind at this point – the amount of information produced by this activity has also increased in staggering fashion. Recording devices such as the LifeLog, already touched upon in the introduction, promise to increase the order of magnitude even further, broadening the definition of what kind of experience is or can become meaningful – a question that Perec's life work tried to grapple with as well. In biography too, as in the digital museum and library, the answer is veering ever closer towards everything, and this again is consistent with the breadth of the current definitions of information and with the straight equation between information and value held to be true by so many.


In The Road Ahead, a pamphlet aimed at touting the bright new horizons of the digital age, Bill Gates shows how easily in this type of discourse reasons and objectives, causes and effects can be made to merge:
[T]he highway will also make it possible for an individual to keep track of his or her own whereabouts--to lead what we might call a documented life. Your wallet PC will be able to keep audio, time, location, and eventually even video records of everything that happens to you. It will be able to record every word you say and every word said to you, as well as body temperature, blood pressure, barometric pressure, and a variety of other data about you and your surroundings. It will be able to track your interactions with the highway--all of the commands you issue, the messages you send, and the people you call or who call you. The resulting record will be the ultimate diary and autobiography, if you want one.40

The passage contains an implicit syllogism we encountered before in this chapter: we can, therefore we should. Barometric pressure is seldom consciously experienced and is arguably quite a marginal feature of everyday life, except perhaps in certain extremes of weather. But it is easy enough to measure and to add the record. It is just another number, another stream of data. Smells, by contrast, are a salient type of experience, one that can come to dominate all other sensory inputs at times, as Patrick Suskinds novel Perfume makes so vividly clear; but they are also very hard to apprehend except via ones nose (or to describe in words except through the exceptional prose of Suskind). There are currently no recording devices that are capable of simulating the sense of smell in the same way that a camera simulates eyesight, or a microphone hearing. Therefore smell is not part of the ultimate diary and autobiography and by implication is shut out of the realm of experience. Barometric pressure is in, odours are out, and so are the rich worlds of tactile experience and taste, along with their respective reservoirs of recollection accounted for, besides Prousts monumentally obvious example, in important works such as David E. Suttons excellent Remembrance of Repasts: An Anthropology of Food and Memory.
40 Bill Gates, The Road Ahead (New York: Viking, 1996), p. 303.


But even allowing for these egregious omissions, the prospect of a digitally documented life raised by Gates staggers in its eagerness to include so much, and begs a question which is central to the present discussion: what are we to do, both as individuals and as a society, with all this information, now that most of us citizens of the first world can each leave a trail of documents far larger than Churchill's? We could agree, as I suggested earlier, to defer all questions that relate to value, reasoning that even the measurements of barometric pressure of every moment of one's life may become a form of knowledge in another time or place, in ways that we cannot predict, just as the original author or authors of the Rosetta Stone could not possibly have divined the significance it would acquire several centuries later. But surely this kind of self-documentation is not produced solely for the (dubious, at any rate) benefit of posterity, but rather in the hope that it will retain a living connection with the archival subject. The resulting record would need therefore to be accessible in ways that allow us to locate and retrieve the salient details, the meaningful passages. But how? As the elision of smell, touch and taste from computer simulations forcefully reminds us, the world is analogue, not digital. Once the multimedia record of the sights, sounds and atmospheric conditions of a minute of one's life has been converted into 1s and 0s, the machine becomes oblivious to the human meaning of most of the data, to the extent that it has no idea of what we were saying or doing at the time, and in the company of whom. To a computer, a recorded sentence is a wave pattern, not a sequence of words. If we wished to replay a joke we heard from a friend some time last month, we would not be able to locate it by entering the punch line, or a couple of keywords. Most likely, at least until computer scientists have solved key problems in visual and aural machine recognition, all that the user could do in such an event would be to rewind the whole recording at a speed still slow enough to enable them to visually recognise some of the features of the scene they need to isolate. Which leads to the other key


variable: time. A recorded life that has no glosses, no captions, no clear markers that separate the significant moments from the insignificant, would take a lifetime to watch. The ultimate diary and autobiography envisaged by Gates would lead then to a paradox: that of an information vault that nobody could open.41

***

One could retort that such a diary might still be of some use. Gates himself, revealing perhaps a little more than he should, seems to think that it would be good news for those who might one day need an alibi. But maybe better access to the vault could be obtained simply by refining the indexing and searching techniques. If it were possible to generate a sort of parallel log, a list of brief descriptions of the main events of each day, then certain segments corresponding to key dates and incidents would still be relatively easy to locate, in spite of the seeming lack of points of reference that could help to unravel the text as a whole. These tags (or metadata) could be incorporated in the recording, thus simplifying the task of navigating through it; a bit like in a DVD, where one can jump to a given scene or sequence thanks to certain pre-determined markers. It would also then be possible to fast-forward through a documented life, cut through the routine and go straight to the moments that really matter. Omar Naim's The Final Cut (Canada/Germany, 2004) offers a glimpse of what the process might look like. The premise of the film is that some time in the 1950s a chip was

41 Ten years on from The Road Ahead, several experiments of the nature envisaged by Gates are well underway. Besides the already mentioned LifeLog, with its as yet unclear military applications, one of the standouts is the work of Microsoft researcher Gordon Bell, which goes by the name of the software designed to support it, MyLifeBits, and that could well be seen – given where its funding comes from – as the one most directly connected to Gates' original idea. As of May 29th, 2006, when Mr Bell spoke at Victoria University of Wellington, all the technical shortcomings I have discussed – particularly the fact that none of the recorded audio and video can be indexed or searched – are still unresolved, a fact that seriously undercuts the wonderfully pithy opening slide in Mr Bell's presentation in which he proclaims, "I am data". For more information on MyLifeBits and its extensive coverage in the media, see the project's website at http://research.microsoft.com/barc/mediapresence/MyLifeBits.aspx


invented that could be implanted into someones brain at birth and record their visual and auditory input, strictly from their own POV, 24 hours a day throughout their lives. The sole purpose of this digital archive, which cannot be accessed before the death of the carrier, is apparently to provide the material for a film to be created by a professional editor (or cutter) and shown at the persons funeral as a cinematic eulogy42. The cutters job consists then not only in isolating the most meaningful sequences in the midst of a lifetimes worth of footage, but more specifically those that enable mourners to construct a positive overall picture of the deceased. The emphasis is therefore on the selection of information. Unfortunately the films setting remains stuck on ludicrous pretty much throughout, and is rather more perplexing than illuminating or insightful, but it offers nonetheless an attempt to flesh out the imagining of Gates, make its contours a little less vague. Of interest to the present discussion is the moment when the cutter (Alan Hakman, played in the film by Robin Williams) uploads a memory chip to his console and sets out to approach its contents for the first time. A straightforward viewing of the lifelong film is of course out of the question, and for obvious enough reasons. What the

42 In passing, it is interesting to note that the output of the chip displayed by the cutter's machine is already remediated as film. The visual input, captured at the level of the cortex, is framed in a perfect rectangle with uniform cinematic focus and none of the constant saccading of the pupils and narrow areas of sharp focus that characterise human vision. The simplification fits in neatly with the conventions of the genre, of course, and one would be well advised not to get too hung up on such details; but it is also consistent with the merging of recording apparatus and the human senses which characterises the discourse and practices of description and self-description through digital means. In order for certain claims to be staked regarding this project, the medium has to become as transparent as possible, at the cost of conveniently forgetting that there are substantial differences between viewing a scene with one's eyes, in three dimensions, as opposed to on a two-dimensional screen, as well as glossing over the complete elision of the senses of smell, taste and touch. The latter in particular serves well the peculiar sanitisation of so many digital imaginings, which excise most bodily sensations and fluids, or relegate them to obscure areas outside of representation. To name but one of these areas of experience, the digital does not do smells very well, as I argued earlier in this chapter, hence the elision is in a way connaturated with the medium (as well as shared by others, such as photography or cinematography, of course). But there is a deeper undercurrent here, a more radical departure from materiality and the body. One only needs to think of cybersex, that is to say sex without bodies, and how central it is to the digital imaginary. Some of these issues will be revisited in the next chapter.


console does, then, is to sort the 544,628 hours' worth of life-files while they are being uploaded, generating in the process a somewhat spurious list of categories that begins with "childhood, sleep, puberty, eating, awkward phase, romantic life, temptation, personal hygiene, religion, tragedy, wedding, masturbation, fears, athletics, growth spurt, university, violence, school, courtship, career". On the basis of this highly selective screening carried out by the machine, Hakman can proceed to delete some categories altogether, and skim the others in search of the few sequences that will be included in the feature-length film of the dead man's life. The machine's ability to assign such sophisticated semantic value to information enables Hakman to access an otherwise substantively unreadable text (recall Lesk's observation that much information gathered by computers is in effect readable only by other computers). While this process is ultimately aimed at facilitating the search for the moments of significance, it appears to do so through a process of elimination, that is to say with a bias towards deletion. It tags "sleep" and "masturbation", for instance, so that Hakman can press the DELETE button and move on. The very name of the console, the "guillotine", while it originates from a long-serving tape splicer used in film editing, is nonetheless suggestive of the kind of radical and wilful forgetfulness that is required in order to isolate meaning in the midst of such expanses of information.

Fig. 12 The wood-carved DELETE button in Hakman's portable guillotine.


The Glut of Information


The problem with this set up is that currently the guillotine is the science-fictional part, whereas the ability to equip oneself with all sorts of electronic sensors and recording devices is the mundane reality. Contrast the guillotines capacity to quickly and efficiently (if a little idiosyncratically) make sense of a massive stream of content to the spectacular failure of tools such as ContentAudit to find pornography and violence in a mere hard drives worth of files, and the arguments of those who lament the existence of a glut of information will come into sharper focus. The key question posed by these critics is whether the technological and more broadly cultural tools which enable us to make such selections are actually keeping up with the increases in the sheer amount of information that the grand digital archive projects we have encountered in the course of this chapter library, museum and biography demand that we gather and preserve. In the afterword to The Death of Cinema, Paolo Cherchi-Usai outlines the scale of the problem by trying to match the content generated by a subsection of contemporary media, the one that deals in moving images, with humanitys current capacity to make meaningful use of it:
It is estimated that about one and a half billion viewing hours of moving images were produced in the year 01999,43 twice the number made just the decade before. If the rate of growth continues, three billion viewing hours of moving images will be made in 02006, and six billion in 02011. By the year 02025 there will be some one hundred billion hours of these images to be seen. In 01895, the ratio was just above forty minutes, and most of it is now preserved. The meaning is clear. One and a half billion hours is already well beyond the capacity of any single human: it translates into more than 171,000 viewing years of moving pictures in a calendar year.44
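The arithmetic behind that closing figure is easily checked (the working is mine, not Cherchi-Usai's, and assumes uninterrupted viewing through a 365-day year):

\[ \frac{1.5 \times 10^{9}\ \text{viewing hours}}{24 \times 365\ \text{hours per year}} \approx 1.7 \times 10^{5}\ \text{years,} \]

that is, roughly 171,000 years of around-the-clock watching compressed into the output of a single calendar year.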

43 The five-digit format to indicate the year is borrowed from Stewart Brand's The Clock of the Long Now (see page 113 of this work).

44 Cherchi-Usai, The Death of Cinema, p. 111.


It could be opined that measuring the sum of the visual content produced worldwide against the attention span of a single human being is something of a distortion, for the full breadth of our cultural production is not meant to be consumed by a single individual; that we might prefer to say that a mere 171,000 people sitting in front of a screen around the clock for a year (or three times as many, if we chose to make them work in more humane eight-hour shifts) would manage to watch down to the last moving image produced in 1999. And that, at any rate, the maths in itself is unconvincing. The number of books circulating in 1895, to take the year of the birth of cinema as an example, in all likelihood also far exceeded the capacity of a single human being to read them, but this did not lead, nor has it yet, to what we might be inclined to refer to as the death of literature. And yet I remain convinced that issues such as those raised by Cherchi-Usai are not so easy to dismiss. Whether or not the current shifts in the media landscape represent a quantum leap in comparison to the past, as Cherchi-Usai would have it, is very much open to debate. While most of us would agree that the bombardment of content we currently experience through our computers, televisions, game consoles, cinemas, radios and printed media is unprecedented in absolute terms, not all commentators are inclined to regard this as a radically new kind of phenomenon, nor to conclude that our cultural means of processing the information are likely to be overwhelmed. Among these are not only techno-enthusiasts such as Negroponte and Gates, but also critics who engage in a much more committed and multi-layered type of critique. Such is the case of Manuel Castells, author of the monumental and ambitious study The Information Age: Economy, Society and Culture. Asked in an interview to comment on the capacity of our culture to withstand the exponential increase in the flow of information, Castells replied:


The glut-of-information idea is simply a primitive, misleading, cheap shot of neo-Luddites. There can never be enough information. We ignore so many important things. And on a planet largely illiterate, and ignorant (including widespread ignorance in a large segment of the American people for instance, who knows what DNA does?), to speak of information glut is simply an insult to intelligence. The issue is the relevance of information for each one of us, how to find it, how to process it, how to understand it. For this we need more information technology, not less. We need much better browsers, we need more sophisticated design of web sites, we need user-friendly, mobile devices. We need a quantum leap in information/communication technologies, information storage/retrieval systems, and education systems.45

At the same time as he claims that the problem is one of relevance, in this passage Castells implies to the contrary that all information is intrinsically valuable (read: potentially valuable to someone) by stating that there can never be enough of it. But if he truly allowed the notion that there might be such a thing as irrelevant information to be entertained, he would have to concede, at least from a theoretical standpoint, that a steep increase in the overall amount of information would render it more difficult to identify and process what is relevant. In this respect, Castells framing of the problem is weak: people who ignore what DNA is and does do not necessarily need more information (for the information about the nature of DNA is already there); they need to be better informed. They need an education, strictly in the etymological sense of being led out, not merely of ignorance, but also of a paradoxical state of too much information in which the search for relevance is frustrated. At what stage, then, if ever, does the increase of information cease to correspond to an increase in knowledge, and reverse the flow? The image chosen by Cherchi-Usai for the cover of The Death of Cinema in which the frame is hit by the full force of an obliterating explosion suggests not a

45 John Gerstner, "The other side of cyberspace: Interview with professor Manuel Castells", Communication World 16.4 (March 1999), p. 11.


gradual progress but the reaching of a breaking point, following which we can imagine, post-apocalyptically, a new state of quiet in which both information and knowledge are reduced to zero. The death of cinema but the same could be said of any medium under the accelerating force of digital technology will mean primarily the loss of the artifacts it used to be made up of, that is to say, of pieces of information which have been processed to the point of achieving cultural significance.46 Whether or not a breaking point will be reached, whether or not the glut will drown out the information exchange altogether, depends in part on the outcome of a technological struggle between the instruments of production and storage of information on the one hand, and hypermedia catalogues, search engines, intelligent indices on the other. As Castells implies, there is no cultural way out of the problem: only more (read: better, more sophisticated) information technology will do. To compound the problem, the struggle is a fundamentally lopsided one: for quantitative growth in the production/storage devices needs to be matched by qualitative advances in the sorting/processing devices. As things stand, memory technologies are capable of accommodating more and more content at the current rate of growth; searching and processing devices, on the other hand, need to constantly evolve, to find new and more intelligent ways of establishing connections in the midst of the ever-increasing content.
Fig. 13 - The cover of Paolo Cherchi-Usai's The Death of Cinema

46 In his presentation during the Film and History Conference in Wellington in late 2000, Cherchi-Usai stated his provocative conviction that Orson Welles' Citizen Kane, that is to say the paradigm of the cinematic masterpiece and hence a prime candidate for any effort of film preservation, will not survive beyond the next three hundred years.


Taken to its paradoxical extreme, the information age re-enactment of the dream of the universal archive leads to the Borgesian Library of Babel: an impenetrable labyrinth of artifacts that have ceased to be cultural, in that by having exhausted the combinatorial possibilities of text production they will have succeeded in formulating every possible meaning, thus rendering meaning itself meaningless.

The Library of Babel in Cyberspace


The library imagined by Borges consists of an indefinite and perhaps infinite number of hexagonal galleries47 containing every possible book. That is to say, every combination of the twenty-odd orthographic symbols, and hence
everything which can be expressed, in all languages. Everything is there: the minute history of the future, the autobiographies of the archangels, the faithful catalogue of the Library, thousands and thousands of false catalogues, a demonstration of the fallacy of these catalogues, a demonstration of the fallacy of the true catalogue, the Gnostic gospel of Basilides, the commentary on this gospel, the commentary on the commentary of this gospel, the veridical account of your death, a version of each book in all languages, the interpolations of every book in all books. (pp. 75-76)
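The scale of this "everything" can be made concrete with a back-of-the-envelope calculation (mine, not Borges') based on the specifications the story itself supplies: twenty-five orthographic symbols, and books of 410 pages, each page of 40 lines of some 80 characters. The number of distinct books is then strictly finite, but of the order of

\[ 25^{\,410 \times 40 \times 80} = 25^{\,1{,}312{,}000} \approx 2 \times 10^{1{,}834{,}097}, \]

a quantity that no conceivable storage infrastructure, digital or otherwise, could begin to accommodate: the Library's impossibility lies in its combinatorics before it lies in its architecture.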

The combinatorial nature of the collection means that in practical terms the library cannot be accessed in the sense that we are familiar with that is to say: selectively, in search of specific content. The catalogue is only one of the books, and to have a chance of stumbling upon it would take several lifetimes; besides, if one were lucky enough to find a catalogue, there would be no way of knowing whether it is the faithful one; and even in the event that the catalogue happened to be the faithful one, its listings would be so

47 Jorge Luis Borges, "The Library of Babel", in Fictions (London: John Calder, 1965), p. 72.


numerous that it would prove to be as impossible to navigate as the library itself48. To find a particular book, then, the inhabitant of the library is condemned to browse, wandering perpetually from one hexagon to the next, with no hope of success. Not only the task of finding a particular book, but also that of discovering a book that makes any sense at all within ones cultural framework is a virtual impossibility. This is nicely illustrated by the narrators remark that a book very much consulted in his zone is a mere labyrinth of letters, but on the next-to-the-last page, one may read O Time your pyramids (p. 74). This is not to say that the books are meaningless in absolute terms: on the contrary, looking at culture also from a combinatorial point of view, the narrator is able to conclude that every book must have a meaning in a given time or place, past, present or future49:
I cannot combine certain letters, as dhcmrlchtdj, which the divine Library has not already foreseen in combination, and which in one of its secret languages does not encompass some terrible meaning. No one can articulate a syllable which is not full of tenderness and fear, and which is not, in one of those languages, the powerful name of some god. To speak is to fall into tautologies. (p. 79)

Borges' story remains one of the richest and most resonant allegories of the relationship between humanity and the written word, one so often quoted in

48 It appears reasonably obvious that, within the limitations of book length and orthographic symbols offered by Borges, the Library catalogue could not possibly be a book similar in kind to the others, for it would be so vast as to approximate in size the rest of the collection. A more fitting conceptualisation is the one attributed to Letizia Alvarez de Toledo in the footnote at the end of the story regarding a substitute for the whole library: "a single volume of ordinary format, printed in nine or ten type body, and consisting of an infinite number of infinitely thin pages. [...] This silky vade mecum would scarcely be handy: each apparent leaf of the book would divide into other analogous leaves. The inconceivable central leaf would have no reverse." (p. 80)

49 Or in fact perhaps different meanings in different times and places: "An n number of possible languages makes use of the same vocabulary; in some of them, the symbol library admits of the correct definition ubiquitous and everlasting system of hexagonal galleries, but library is bread or pyramid or anything else, and the seven words which define it possess another value. You who read me, are you sure you understand my language?" (p. 79)


the context of discussions about the nature of the library and of modern culture as to have become something of a mantra. And yet, even as media historians mourn the passing of the age of print, it still demands a prominent place in the discourse surrounding the nature of the contemporary, hypermediated cultural exchange. Written in 1941, four years before the paper on the memex in which Vannevar Bush gave us the first glimpses of what we now call hypertext, the Library of Babel has been found to be a powerful thinking tool to make sense of the universal digital archive and of its most likely current instantiation, the World Wide Web.

Fig. 14 A computer artist's impression of the Library of Babel in cyberspace

Under the pseudonym of the Librarian of Babel, journalist Mike Holderness has written in the online magazine Ariadne a series of incisive and ironic columns on the role of the World Wide Web within the Library of Babel, an institution whose mission statement is to make accessible to all the totality of human knowledge and of which Borges' story was, in his words, "a melancholy brochure".
At the time of Borges brochure, our Mission Statement would have seemed impossibly ambitious. The Library had, it is true, an acquisition programme which, we believe uniquely, guaranteed absolutely that it would contain all human knowledge, and much else besides. But that


brochure (probably the finest produced by any library) could not but reflect the misery of the then librarian, faced with the impossibility of the task of cataloguing. We were immensely encouraged, however, by the emergence from the mid-1940s onwards of proposals for competing institutions. (I refer, of course, to the extraordinary prescience of Vannevar Bush and Ted Nelson.) And, from 1989, it seemed that the arrival of the World-Wide Web and then of its search engines offered us salvation. We were no longer, it seemed, fated forever to footle in fallible catalogues.
50

As I discussed in chapter one, the technology that underlies the World Wide Web enables the acquisition and dissemination of texts from all over the world while maintaining the characteristics of a single, uniquely accessible collection, thanks to the very high degree of interoperability of the software in charge of displaying its holdings. The analogy with the Library of Babel is powerful because of the sheer number of texts delivered by the Web; its lack of a traditional logical or hierarchical structure; its almost organic mutability; and, most importantly, an underlying tension: on the one hand, the Web appears to contain an indefinite (although not infinite) number of texts, approximating the sum of human knowledge; on the other, knowing that a given text may be contained in it does not mean that we will ever be able to find it51. For one thing, Web page addresses, like the letters on the spines of the books at the Library do not indicate or prefigure what the pages will say (p. 73); and, more importantly, the Webs content, by its own mutable, unstructured nature, escapes many conventional forms of cataloguing.

50 Mike Holderness, "The Librarian of Babel: The key to the stacks", Ariadne 9 (1997), http://www.ariadne.ac.uk/issue9/babel/.

51 As Borges writes: "When it was proclaimed that the Library comprised all books, the first impression was one of extravagant joy. [...] The uncommon hope was followed, naturally enough, by deep depression. The certainty that some shelf in some hexagon contained precious books and that these books were inaccessible seemed almost intolerable." (Borges, "The Library of Babel", pp. 76-77)


Search engines, on the other hand, provide ways of cutting through the large amounts of unfiltered, non-hierarchical information, thereby increasing our chances of isolating what is most relevant to us. The expression "search engine" has come to cover a lot of different types of Web tools. These include submission-based directories such as Yahoo, archives which strive to gather and make available to keyword-based searches the totality of online content, such as Google, and so-called intelligent agents that track user behaviour and search autonomously for possibly relevant data, such as Letizia.52 By enabling the user to establish links between previously non-connected locations on the web, search engines enhance hypermediacy and vastly improve our ability to find meaningful constellations in the Barthesian galaxy of meaning that is the World Wide Web. Useful as they are, however, search engines are far from infallible, and for several reasons. As Holderness argues, albeit a little pessimistically, "[t]he more resources have been directly invested in producing a document, the less likely it is to be accessible through a Web search".53 This is due to the fact that, in part owing to the economic model under which the Web operates, many content providers who deliver their material on-line for free still choose, as we have seen, to protect their pages from web-crawlers, preferring that their sites be searched internally. But far more crippling are the limitations that search engines have with regard to semantics and the growth of on-line information. As discussed under the heading of the stickiness of the database, at this stage machine searches can for the most part only scan for content, not meaning, and have a strong bias towards textual and numerical information. This means that they are capable, for instance, of finding every Web page available to them containing the word "hat", but not every available picture containing a hat, unless the hat is in the picture's title or description.54
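A minimal sketch may help to make this textual bias concrete. The fragment below (in Python, with invented example documents; it stands in for no particular engine's implementation) builds the kind of inverted index on which keyword search ultimately rests: a page or a captioned image that contains the word "hat" becomes retrievable, while an uncaptioned photograph of a hat never enters the index at all.

from collections import defaultdict

# Invented examples: two text pages and two image records, one of which lacks a caption.
documents = {
    "page-1": "A short history of the bowler hat",
    "page-2": "Trout fishing in upstate New York",
    "img-1": "Portrait of a man wearing a straw hat",   # caption supplied by a human
    "img-2": "",   # a picture of a hat with no caption: invisible to the index
}

def build_index(docs):
    """Map each word to the set of documents whose text contains it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

index = build_index(documents)
print(sorted(index["hat"]))   # ['img-1', 'page-1'] -- img-2 is never returned

Whatever sophistication is layered on top of such an index, the search never sees the pixels, only the words that happen to accompany them – which is the sense in which machine searches scan for content rather than meaning.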

52 Letizia owes its name to Letizia Álvarez de Toledo, from "The Library of Babel". For the relevant quotation see note 48 on page 140. The Letizia Home Page was formerly at http://lieber.www.media.mit.edu/people/lieber/Lieberary/Letizia/Letizia.html.

53 Holderness, "The Librarian of Babel".

available to them containing the word "hat", but not every available picture containing a hat, unless the hat is in the picture's title or description.54 To state this as a limitation may seem unfair: after all, it is not as if we can currently search print books for pictures of hats (or for the word "hat", for that matter) without browsing them ourselves. But the uniquely horizontal, non-hierarchical organisation of information on the Web is predicated on a very high degree of searchability; after all, how else could we hope to orient ourselves in an environment in which the sites of the largest institutions, say the Library of Congress, are accessed by means of a four-number IP address, and are in this regard just as visible as the most slapdash personal webpage? As computers get more powerful and bandwidth grows larger and larger, the need for searchability demands that the increase in multimedia content be matched by the development of tools that can sort through it. Which takes us back to the issue of how much information the cultural, as well as technological, interface between users and the network can withstand.

In 1996 Umberto Eco wrote in his regular column for the weekly magazine L'Espresso about an Internet search he had done on the platypus, which had yielded what he regarded as a surprisingly large number of results: approximately 3,000 web pages. Besides a number of red herrings (the home page of a girl nicknamed Platypus, a restaurant by that name, etc.) he found a few reputable and well-compiled sources and many amusing tributes to "this most post-modern of animals",55 enough material to fill his standard 800-odd word column. If he were to repeat this search today, Eco

54 Among the attempts to address the textual bias of search engines is QBIC, or Query By Image Content, software developed by IBM that enables the user to search for pictures by selecting colours from a palette or sketching shapes on a canvas. The software may one day, as Gates suggests in The Road Ahead, make it possible to find a famous painting by describing its features to a computer, much as we can find a poem by entering one of its lines into a search engine. At this stage, however, its rate of success is still very limited. For an example of the interface in action see the website of the Hermitage Museum in St. Petersburg at http://www.hermitagemuseum.org/fcgibin/db2www/qbicSearch.mac/qbic?selLang=English.

55 Umberto Eco, "Chiamatelo platipo od ornitorinco, fatto sta che è molto popolare", L'Espresso, 8 August 1996 (The translation is mine).

would find approximately 6,500,000 pages56 (a 2,166-fold increase), and no doubt many more humorous, useful and well-crafted contributions to platypus lore. But how would he approach the task of sampling the material? After all, search tools have become more sophisticated, but they are hardly over two thousand times better than they were in 1996 (it is an example of the issue of quantitative versus qualitative advances I talked about earlier). Adding keywords, refining one's search, looking for pages where people might have entered the results of their searches on the subject at hand, are useful strategies but hardly a guarantee of success. Two studies conducted in 2002 by industry operators Mondosoft and Albert placed the failure rate of Web searches between 40 and 74 per cent, adding that only 1 in 20 visitors will scroll to the second page of search results.57 This latter finding would seem to be the most damning, since, due to the inability of engines to display results in a genuine order of relevance,58 it indicates that most users are content to pick from a random fraction of the pages returned, and not because on average they feel that they have found there what they were looking for.59 Google recognises this to the point of placing, alongside its standard "Search" button, another button labelled "I'm Feeling Lucky" which takes the user to the first web page returned for a given query. The company

56 Search executed with Google in October 2006.

57 The results are summarised in Chris Sherman, "Why Search Engines Fail", Search Day 344, August 29, 2002 (http://searchenginewatch.com/searchday/02/sd0829-searchfailure.html).

58 What industry leader Google does is to place at the top the pages that are most linked to by other pages, with the presumption that this means that they have been found to be especially useful by the user community. What the page-ranking algorithm does is to raise the score of a page if the pages that link to it use the same anchor text. This makes it relatively easy to artificially raise the rank of a website by creating a number of ad hoc pages containing the link and the anchor text. This is called a "Google bomb". Many commercial organisations adopt this practice to make their sites more visible, but the most notorious Google bomb was of a satirical nature. For a few days in late 2003, if one searched Google for the expression "weapons of mass destruction", the top result was a mock-up of the standard "404 page not found" notice, explaining that weapons of mass destruction could not be found. (A toy sketch of this kind of scoring is given at the end of this section.)

59 Since these parameters are subjective, I shall take the participants in the survey at their word when they say that a search has failed, although a discussion of what might constitute a successful search on the Web would raise some interesting issues.

states that "[a]n I'm Feeling Lucky search means less time searching for web pages and more time looking at them",60 but the explanation sounds awfully like an admission that the task of sorting through on-line content is getting out of hand. Some fifteen years into the history of the World Wide Web, the experience of the tension we alluded to above, between knowing that the information exists and realising how elusive it remains, seems already to accompany as many as one in two of our encounters with the Library of Babel in cyberspace, which the search engines would appear to illuminate, like the lamps in Borges' story, with a light that is at the same time "insufficient and incessant" (p. 72). Meanwhile, and this is an estimate made as far back as 2000, content increases at the rate of 7.3 million pages per day in the surface web alone.61 Not all of it, of course, is new in the strict sense of the word: Web content has a way of replicating, migrating onto new pages in slightly different form due to copying errors, omissions or intentional editing. At the same time, as authoring tools develop ways of automating complex functions and content selection comes to rely more heavily on the work of software agents (especially for extracting and compiling information from existing databases), the process of content accumulation is becoming partly mechanised and therefore considerably faster. The mechanical production of content, coupled with the astonishing (by the standards of earlier technologies) pace of its increase, gives to the Web a semblance of the combinatorial qualities of the Library of Babel, making it possible, at least

60 http://www.google.co.nz/search?q=&ie=ISO-8859-1&hl=en&btnI=I%27m+Feeling+Lucky

61 There are two groups of Web content. One, which we would call the "surface Web", is what everybody knows as the Web, a group that consists of static, publicly available web pages, and which is a relatively small portion of the entire Web. Another group is called the "deep Web", and it consists of specialized Web-accessible databases and dynamic web sites, which are not widely known by average surfers, even though the information available on the deep Web is 400 to 550 times larger than the information on the surface. How Much Information?, edited by Peter Lyman, Hal R. Varian et al. (Research Project, School of Information Management and Systems of the University of California, Berkeley, 2000), http://www.sims.berkeley.edu/research/projects/how-much-info/internet.html.

conceptually, to think that at the current rate of expansion it will some day indeed hold something close to the totality of human knowledge. But that will also be the day when our search engines cease to be of any use: for no matter how narrow the scope, no matter how many the keywords, the system will always return a near-infinite number of results. Then we shall have no choice but to press the "I'm Feeling Lucky" button.
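To make the scoring mechanism described in note 58 a little more concrete, here is a toy sketch in Python. It is emphatically not Google's actual algorithm: the page names, the link data and the weightings are invented for illustration, and the only point is to show why a handful of ad hoc pages sharing the same anchor text can push an otherwise obscure page to the top of the results for a given query.

# A toy link-and-anchor-text scorer (an illustration, not Google's algorithm).
# Pages gain a point for every inbound link, plus a larger bonus when the
# anchor text of the link matches the query; "Google bombs" exploit the bonus.
from collections import defaultdict

def rank(pages, links, query):
    # links is a list of (source, target, anchor_text) tuples
    score = defaultdict(float)
    for source, target, anchor in links:
        score[target] += 1.0                 # any inbound link counts
        if query.lower() in anchor.lower():
            score[target] += 2.0             # a matching anchor counts for more
    return sorted(pages, key=lambda page: score[page], reverse=True)

# Eight ad hoc pages, all linking to a parody page with the same anchor text,
# outweigh the single link pointing at the official page.
pages = ["official-report.example", "parody-404.example"]
links = [("blog%d.example" % i, "parody-404.example", "weapons of mass destruction")
         for i in range(8)]
links.append(("news.example", "official-report.example", "weapons of mass destruction report"))
print(rank(pages, links, "weapons of mass destruction"))
# -> ['parody-404.example', 'official-report.example']

Real ranking systems weigh many more signals than this, but the basic vulnerability that the sketch isolates is the one the "weapons of mass destruction" prank exploited.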

Information and Metaphor


This, I suggested earlier, is a paradoxical extreme, and one might very well want to temper it with the contrary, everyday experience of how exhilarating it can still feel to find with ease on the Web information that, in times not too remote, would have required exhausting legwork and possibly quite some expense to obtain. But extremes too can be instructive, especially when they provide the means to critique some common but largely untested assumptions. Cherchi-Usai's provocative work can be read as the cry of a film conservation expert who does not share in the evangelical fervour of those who tout the digital as the ultimate solution to the key problem in his profession: how best to preserve moving images. And the glut-of-information discourse, more generally, can be read as the welcome means to expose some of the contradictions in the contrary and also extreme view that information is an absolute value. "Data mining" is a concise metaphorical way of investing information with the resource value of precious ore, and those who retort that information can be a pollutant (as David Shenk does in the title of his Data Smog) operate also through metaphor. So does Borges when he characterises the memory of Funes as a garbage heap; and so do William Gibson, Robert Longo and Keanu Reeves when they represent the state of information overload in the brain of Johnny Mnemonic as a paralysing migraine.

What about the glut of information, then? The implicit metaphorical link between food and information is very valuable from the point of view of those who wish to critique the notion that we should have information at all costs, in that food promotes health up to a point, but then becomes detrimental to it. California Institute of Technology professor Roy Williams picked up on this when he came up with a measurement of "virtual weight", and so did Martin Larsson, general manager of Toshiba's European storage device division, when he declared that
Britain has become a nation of information hoarders with a ferocious appetite for data. As storage capabilities increase and the features and functionalities of mobile devices expand to support movie files and entire libraries of multi-media content, we will all become virtually obese.62

One of the characters in DeLillo's White Noise, it will be recalled, found a job teaching people how to breathe and how to eat, skills that evidently had been or were on the verge of being lost. But this is not merely a satirical punt, as the alarming and worsening statistics regarding obesity in the first world sharply remind us. In other words, the tie between the glut of information and the glut of food is something more than an accident of metaphor. Marion Nestle, professor in the department of Nutrition, Food Studies and Public Health at New York University, as well as editor of the US Surgeon General's 1988 Report on Nutrition and Health, exhaustively documents in her book Food Politics the history of the US food industry's efforts to foster public confusion about nutrition, in spite of the fact that dietary recommendations for the prevention of chronic disease have hardly varied for the past half-century.63 These recommendations are almost disarmingly simple: eat a varied diet with an emphasis on plant foods (fruit, vegetables
62 Both quotations are from the BBC Online news article "Britons growing digitally obese" (December 9, 2004), http://news.bbc.co.uk/2/hi/technology/4079417.stm.

63 Marion Nestle, Food Politics: How the Food Industry Influences Nutrition and Health (Berkeley, Los Angeles and London: University of California Press, 2002), p. 29.

and grains) and avoid excessive intake from any one food group, particularly the ones that are high in fat. What could be easier to remember than that? While in the period following the Second World War these guidelines meant that the still predominantly malnourished citizens of the United States should eat more, a piece of advice enthusiastically supported by agriculture and food interests, now the public health imperative is to demand without reservation that they eat less. According to our best science, this message truly constitutes, in spite of the pronouncements of the likes of Castells, enough information. Any more than that, and the message starts to lose its efficacy. Hence, according to Nestle, the strategy of the US food industry has consisted not only in intensely lobbying the government not to be quite so direct in telling people to reduce their food intake, but also in adding to the advice, helping promote the virtues of so many alternative diet regimes that it has become very difficult, by all measurable standards, for people to remember what the basic advice was and still is. The obesity epidemic, then, could be ascribed at least partly to an archival malfunction, a failure to contend with the volume of information circulating in the public media.

This shift from information to embodiment runs counter to a potent movement extensively documented and critiqued, among others, by Katherine Hayles in How We Became Posthuman and by Erik Davis in TechGnosis: a movement away from embodiment and towards a radical informational transcendentalism which seeks to subvert materiality altogether. Here too information ceases to be a metaphor, but only in order to become the true and sole substance of the real. Hayles characterises the history of this movement as that of a narrative whereby information "lost its body"; Davis conflates its core tenet in the compound noun "mythinformation". Both are fascinated and more than a little horrified by its chief dream-come-nightmare: that we might one day cease to inhabit human bodies and exist only as minds in the abstract world of computer science.

Information, memory and metaphor meet in this scenario under the tutelage of technology, and of computers in particular. The views expressed in this chapter, at times already radical, concerning the power of the digital to record, archive and circulate the information that constitutes individual and collective memory, are taken to an extreme that is as outrageous as it is compelling: namely, that the digital may be the most natural habitat of information, therefore of memory, therefore of identity, culture and humanity.

Chapter Three
The New Home of Mind

Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind.
John Perry Barlow, "A Declaration of the Independence of Cyberspace" (1996)

You know for sure that a philosophical speculation has achieved pop status when it rates a mention in an episode of Friends. "So, I just finished this fascinating book", says the character played by David Schwimmer in "The One Where Phoebe Runs" (1999). "By the year 2030, there'll be computers that can carry out the same amount of functions as an actual human brain. So theoretically you could download your thoughts and memories into this computer and live forever as a machine." Although voiced here by a quintessential geek, this idea of transgression of the ultimate biological limit, of digital transcendence, has embedded itself deeply in current discussions about the informational society, and plays a significant role in the fleshing out of the posthuman. It is an idea that breathes new life into the Cartesian cogito, allowing staunch scientific materialists to reframe old beliefs concerning the afterlife and the soul in light of one of the dominant theories of mind; and it does so not only by effacing human embodiment, but also by concealing the body of the machine.

The book that so fascinates Schwimmer in the Friends episode is almost certainly Hans Moravec's Mind Children (1988). In this much-critiqued work Moravec, the head of the robotics programme at Carnegie Mellon University, predicts that by 2030 computers will indeed be capable of as much complexity as the human brain, and that it will therefore be possible to map and transfer the mind onto them. The idea, as we shall see, was not entirely new at the time, nor did Moravec provide any specific technical insight or research direction in order to help shape the work that might, in time, make it a reality. But it was, it seems, an idea whose time had come to reach the wider public. Moravec's status as "world-renowned roboticist"1 made him in this respect a very suitable proponent, and he packaged the idea skilfully through a mix of the standard rhetorical strategies of scientific popularisation and fictionalised reports that articulated very vividly the various stages of his project. In one such report, Moravec ventured to imagine how the procedure of what he terms "transmigration" might take place, and described an operation performed by a robot surgeon in which the brain of a human being was scanned, measured, simulated, excavated, and copied onto a robot's brain; during the procedure the patient lost consciousness only momentarily, and reawakened at the very moment when the body on the operating table "[went] into spasms and [died]".2 The passage is not just a playful digression, a slip into science-fiction in order to give a concrete example of otherwise hard-to-grasp concepts, but an absolutely crucial step in the construction of Moravec's argument. In order to talk of a transmigration, rather than, say, a simulation or a copy, Moravec needs to present the reader with a procedure so invasive as to destroy the
1 An appellation chosen for Moravec in the introduction of "Pigs in Cyberspace", a paper originally delivered by Moravec at the 1992 Library and Information Technology Association conference and published in Thinking Robots, An Aware Internet, and Cyberpunk Librarians, edited by R. Bruce Miller and Milton T. Wolf (Chicago: Library and Information Technology Association, 1992).

2 Hans Moravec, Mind Children: The Future of Robot and Human Intelligence (Cambridge and London: Harvard University Press, 1988), pp. 111-112.

originary (biological) habitat of the mind in order to create the new (digital) one. Not having any basis in fact to prefer such an option to a refinement of the nowadays-common forms of neurological scanning, such as Magnetic Resonance Imaging, he switches temporarily to a rhetorical framework, the fictional account, which allows him to make that kind of arbitrary authorial decision. In this way, when he slips again into his scientist's coat, Moravec is able to argue that the new mind is not only equivalent or identical to the old one, but in every regard the same mind; whereas if the procedure had been a simulation or a copy, the reader would have been faced with the simultaneous existence of two or more minds, one encoded in the awkward biochemical messiness of the human brain, the others in the slick design of perfectly engineered microchips. If this coexistence were possible, which one would then be the real "I", and where would the identity that the project is trying to preserve actually reside? By the time Moravec comes to answer this question, the major premises of the mind uploading project and of digital transcendence are firmly laid. The eventual answer consists of a defence of the "pattern-identity" versus "body-identity" position, or, to put it as Katherine Hayles does in her critique of Mind Children, the privileging of "informational pattern over material instantiation",3 which Hayles regards as a crucial tenet of the posthuman view. Bodies, be they human or machinic, are in this view interchangeable vessels, whereas the self resides wholly inside the mind in the form of a nexus of informational patterns. These patterns are physically encoded in the brain but can in principle, and soon will in practice, be understood and abstracted, scanned and copied, stored and uploaded more or less at will. This latest formulation of dualism, which in Hayles' study is shown to have colonised contemporary techno-scientific discourse in the wake of cybernetics, is of course strongly reminiscent of its Cartesian forerunner, which in turn did much to bridge the gap between earlier philosophical and

3 Hayles, How We Became Posthuman, p. 2.

religious discourses and the conceptualisation of mind in the age of science. Descartes' res cogitans, unlike the spirit, is not a life principle, but specifically a thinking thing, an informational device if you will. Being defined by its non-materiality, just as the res extensa is defined by its inability to formulate thoughts, it provides a powerful conceptual lens through which the human whole can be split and the mind regarded in pure isolation from the body. One of the chief problems for Descartes, given his empiricism, was how to reconcile this definition grounded on mutual exclusion with the observation that minds and bodies function together, without his having to resort to religious explanations located outside of science, in the realm of the unknowable. Descartes' positing of an ad hoc interface, the epiphysis or pineal gland, only circumvented the issue, for it merely named the very thing that his theory refuted (a res cogitans which is also extensa), and this explains at least in part why Cartesian dualism is in such a state of apparent disrepute. And yet this is the exact same conundrum that Moravec's robot-surgeon operation runs into: if consciousness can be computationally modelled, expressed in a form that makes the medium utterly redundant, how does it interact with the medium, through what kind of interface and connections? And how can these connections be severed and recreated, if the mind is to travel from one body to another? We know that this cannot be a straightforward revival of Cartesian dualism, because philosophers of mind have moved on, and the disavowals abound. Erik Davis, on his way to discussing the stubborn resilience of the cogito in our cybercultural musings, has quipped that "[o]f all the lumbering giants of the Western philosophical tradition, none now resembles a punching bag as much as René Descartes",4 who gets it from all sides. Including, peculiarly,

4 Erik Davis, "Synthetic Meditations: Cogito in The Matrix", in Prefiguring Cyberculture: An Intellectual History, edited by Darren Tofts, Annemarie Jonson and Alessio Cavallaro (Cambridge: MIT Press, 2003), p. 12.

some of the proponents of cognitive science. But, as Davis also notes, the cogito still exerts a considerable intellectual magnetism. In Consciousness Explained, for instance, Daniel Dennett develops a comprehensive empirical theory of mind in direct and explicit opposition to Descartes. And yet he too ultimately finds himself drawn, albeit only in principle, to the notion of digital transcendence. "If what you are", he writes,
is that organization of information that has structured your body's control system (or, to put it in its more usual provocative form, if what you are is the program that runs on your brain's computer), then you could in principle survive the death of your body as intact as a program can survive the destruction of the computer on which it was created and first run.5

But is this not ultimately the same as saying, with Descartes, that the I is "a substance whose whole essence or nature consists only in thinking, and which, that it may exist, has need of no place, nor is dependent on any material thing"?6 Another fierce anti-dualist, Steven Pinker, claims that the computational theory of mind espoused by the likes of himself and Dennett is "one of the great ideas in intellectual history", precisely because it solves one of the puzzles that make up the mind-body problem: "how to connect the ethereal world of meaning and intention, the stuff of our mental lives, with a physical hunk of matter like the brain."7 How to reconcile, then, this claim for anti-dualism with the visions of digital transcendence that the computational model has spawned? I believe, with Hayles, that one has to acknowledge the extraordinary force and pervasiveness of the metaphor on which this model rests. The idea that it is possible to download mental content is a lot older than the computer. When Milton wrote that books "are not absolutely dead things, but do contain a

5 Daniel Dennett, Consciousness Explained (London: Penguin Press, 1992), p. 430.

6 René Descartes, A Discourse on Method, edited by A.D. Lindsay (London: J.M. Dent & Sons, 1912), p. 27.

7 Steven Pinker, How the Mind Works (New York: Norton, 1997), p. 24.

potency of life in them to be as active as that soul was whose progeny they are; nay, they do preserve as in a vial the purest efficacy and extraction of that living intellect that bred them",8 he described a process of transmigration whereby the written page is imbued with the soul (we would say the mind) of its author; and this soul, it is implied, remains capable of continued participation in intellectual life, so long as its substrate is permitted to circulate among the readers of successive generations. Other narratives articulate scenarios in which it is the book that exists inside the mind rather than vice-versa. Think for instance of the Book People in Bradbury's Fahrenheit 451, who memorise (and, in so doing, give body to) the contents of the books that are not permitted to circulate in their society. In Too Loud a Solitude, Czech novelist Bohumil Hrabal proposes a third kind of commingling, which emphasises equally the physical and informational aspects of the book/text and body/mind binomials. The protagonist of Hrabal's novel, Hanta, is the operator of a hydraulic press for compacting paper who develops a surrealistically symbiotic relationship with the books he destroys, a relationship that calls into question the boundaries between cultural products and individual experience and sensibility, and that is supremely self-conscious and almost obsessively articulated by the protagonist-narrator right from the novel's outset:
For thirty-five years now I've been in wastepaper, and it's my love story. For thirty-five years I've been compacting wastepaper and books, smearing myself with letters until I've come to look like my encyclopaedias - and a good three tons of them I've compacted over the years. I am a jug filled with water both magic and plain; I have only to lean over and a stream of beautiful thoughts flows out of me. [...] Such wisdom as I have has come to me unwittingly, and I look on my brain as a mass of hydraulically compacted thoughts, a bale of ideas, and my head as a smooth, shiny Aladdin's lamp. How much more beautiful it must have been in the days when the only place a thought could make its mark was the human brain and anybody wanting to squelch ideas had to compact human heads, but even that wouldn't have helped, because real

8 John Milton, Areopagitica (Westminster: A. Constable and Co., 1903), p. 35.

thoughts come from outside and travel with us like the noodle soup we take to work; in other words, inquisitors burn books in vain.9

What is immediately striking about these non-digital-age imaginings is the extent to which they are grounded in rhetoric. They argue that books and minds may, in important ways, occupy the same space, but do not venture to suggest that a literal transmigration will ever take place. Even Hanta, who is otherwise comfortable with his deep and sensual involvement with the printed page, fears as catastrophic the prospect that he might one day literally become one with the books with which he has cluttered his home over the years, and that always threaten to crush him in his sleep as they weigh massively on rickety and precarious shelves. Why, then, this rhetorical gap? Why is it that the robotics expert Moravec can resort to bold, factual denotation in order to put forward a scenario which is largely analogous to what the exponents of book culture could only craft into elaborate tropes? A possible explanation is that the two metaphors are of a different order. A book is an inert object, good for storing information but not for processing it: one of Plato's objections against writing was precisely that a book cannot sustain an argument. Computers, on the other hand, can play with symbols and produce new and often unexpected outputs from the inputs that are fed into them. That machinic computation and human reasoning have much in common is a position that has gained a steady ascendancy in the course of the last few decades, and that has taken hold in the public discourse thanks not only to works of fiction and popular scientific books but also to much-advertised and carefully staged events such as the chess tournaments between grandmaster Garry Kasparov and IBM's Deep Blue. Metonymy, rather than metaphor, is the key trope here: for if being as good a chess

9 Bohumil Hrabal, Too Loud a Solitude, translated by Michael Henry Heim (San Diego, New York and London: Harcourt Brace Jovanovich, 1990), pp. 1-2.

player can stand for being an intelligent person, then proving that a machine can compete at chess with the best human player will make it intelligent, too. In this regard, the three games won by Deep Blue, first in 1996 and then in the tournament that the machine won in May 1997, are less important than the fact that the tournaments were played at all, in front of an audience and with all the trappings of a world championship amongst human players. The games' official commentators appeared deliberate in their efforts to describe the game play of the two opponents as being driven by human-like strategising, as opposed to computer-like problem solving. In summing up the first game of the 1997 series, lead commentator Ashley said, for instance:
Yesterday's game was highly dramatic. It was the kind of game that everybody wanted to see. Exciting, bold play. Kasparov at first was very timid with his play. He wanted to be cautious. But Deep Blue took it to him, wanted to play very exciting chess, very unconventional chess in the early stages, but later it got wild and woolly and Kasparov was able to manage the complications and deal with the computer on its own ground.10

But arguably, and using roughly the same number of words, he could just as easily have said:
The game went like this: 1.Nf3 d5 2.g3 Bg4 3.b3 Nd7 4.Bb2 e6 5.Bg2 Ngf6 6.O-O c6 7.d3 Bd6 8.Nbd2 O-O 9.h3 Bh5 10.e3 h6 11.Qe1 Qa5 12.a3 Bc7 13.Nh4 g5 14.Nhf3 e5 15.e4 Rfe8 16.Nh2 Qb6 17.Qc1 a5 18.Re1 Bd6 19.Ndf1 dxe4 20.dxe4 Bc5 21.Ne3 Rad8 22.Nhf1 g4 23.hxg4 Nxg4 24.f3 Nxe3 25.Nxe3 Be7 26.Kh1 Bg5 27.Re2 a4 28.b4 f5 29.exf5 e4 30.f4 Bxe2 31.fxg5 Ne5 32.g6 Bf3 33.Bc3 Qb5 34.Qf1 Qxf1+ 35.Rxf1 h5 36.Kg1 Kf8 37.Bh3 b5 38.Kf2 Kg7 39.g4 Kh6 40.Rg1 hxg4 41.Bxg4 Bxg4 42.Nxg4+ Nxg4+ 43.Rxg4 Rd5 44.f6 Rd1 45.g7 White wins.

Listening to the first account, one is likely to draw the conclusion that success at playing chess requires such qualities as intuition, application, creativity and the desire to win, as opposed to perfecting the best algorithms
10 Archived coverage of the tournament is available on the IBM website at http://www.research.ibm.com/deepblue/home/html/b.shtml.

to play a tightly structured and regulated game on an 8x8 grid, and having the ability to compute a large number of different outcomes several moves down the track. In other words, whether Deep Blue could be said to be showing one set of qualities or the other largely depends on the rhetorical stance of the record. That the former stance is so tempting and compelling helps explain why events like this are powerful perception-shifters, and why those left arguing that computers will never be capable of human-like reasoning (such as John Searle and Roger Penrose) do so from such a rear-guard position. Which leads to a second, more crucial difference, one of cultural import: currently the computational metaphor has very little competition as a means of accessing from within the scientific project the mysteries of the human mind. In the concluding section of Consciousness Explained, Dennett writes that "[t]he concepts of computer science provide the crutches of imagination we need if we are to stumble across the terra incognita between our phenomenology as we know it by introspection and our brains as science reveals them to us."11 And further:
I haven't replaced a metaphorical theory, the Cartesian Theatre, with a nonmetaphorical (literal, scientific) theory. All I have done, really, is to replace one family of metaphors and images with another, trading in the Theater, the Witness, the Central Meaner, the Figment, for Software, Virtual Machines, Multiple Drafts, a Pandemonium of Homunculi. It's just a war of metaphors, you say--but metaphors are not just metaphors; metaphors are the tools of thought. (p. 455)

At the same time as he recognises the power and even the necessity of metaphor, Dennett betrays here an excess of confidence in the ability of the scientist (or the empirically-minded philosopher, to be more precise) to master this tool, to control the symbolic. For if metaphors are the tools of thought, how does one rule over a metaphor, prevent it from doing its own

11 Dennett, Consciousness Explained, p. 433.

thinking? In the brief history of the idea of digital transcendence we notice a progression that appears every bit as inexorable as the one towards superintelligent robots that Moravec foresees. Once the mind has been conceptualised as a network of computational modules for processing the information received through the body, the metaphor cascades into a chain of equivalences not only between mind and body but also between brain and motherboard, body and chassis, memory as in the contents of the mind and memory as in the contents of a computer's hard drive, and so forth. Ultimately this cascade causes the collapse of the metaphorical structure and brings about a free interplay of substitutions of each tenor with its vehicle. If the mind is software, then by necessity it has to be transferable, if only in principle, onto a computer (recall Schwimmer's summary of Moravec in Friends: one day computers will carry out the same amount of functions as a human brain, therefore it will be possible to download the mind onto them). The rest of the story pretty much writes itself, causing the self-avowed anti-dualists to scramble in search of the pineal gland among scanning devices, neural implants, bio-ports and so forth. This interplay of referents can be observed almost without fail whenever projects having to do with digital transcendence are covered in the media, sometimes offering a glimpse of the credit that the idea enjoys in board meetings and marketing departments. Here is, for instance, a report following the first news circulated by British Telecom about its 40 million pound effort for the development of the astonishingly monikered "Soul Catcher" chip:
UK -- The Liverpool Echo reports that British Telecom (BT) is working on the Soul-Catcher--a microchip device small enough to be implanted in the optic nerve and capable of capturing a complete record of every thought and sensation experienced during an individual's lifetime. Dr. Chris Winter, head of BT's artificial life team, declared that the Soul Catcher chip promised immortality in the truest sense. Winter suggested that police could use implanted chips to relive an attack, rape or murder from the victim's viewpoint to help catch the criminal

responsible. Each individual's visual record would be transmitted to central computers for storage. Within 30 years, BT predicts, Big Brother could be watching -- from inside your own head.12

Here we have a product that by tapping into the optical nerve claims not only to be able to record all thoughts and sensations, and to turn into the most sensational instrument of self-surveillance ever imagined, but also, perplexingly, to grant immortality. In a longer (and more sceptically handled) piece in The Ecologist, Dr. Winter is on record explaining that the end of death will be achieved by combining the information recorded by the Soul Catcher with a record of the person's genes, so as to recreate a person "physically, emotionally and spiritually".13 Another scientist at BT, Ian Pearson, goes on to explain that the proposal to digitise existence is based on solid calculation of how much data the brain copes with over a lifetime, which has yielded the figure of 10 terabytes of data, equivalent to the storage capacity of 7,142,857,142,860,000 floppy disks (p. 15). The task of unpacking this passage makes one's head spin, so many are the elisions, the short-circuits that lead from each premise to its conclusion, following much the same sort of syllogistic logic we encountered at various points in the last chapter. To wit: the chip will be small enough to fit along the optic nerve, therefore it will interface with the brain; the brain copes with data, therefore the mind is data; we think we can measure this data, therefore we can decode, understand the data. And if we can decode it, then we can store it forever. This last point, the heralding of immortality "in the truest sense", is undermined by a rather unhappy comparison, for who would feel safe in the knowledge that his or her brain is stored in 7,142,857,142,860,000 floppy disks? It may seem a trivial point: surely Drs. Winter and Pearson do not mean to use floppy disks except to reinforce the
12 Unsigned article, "Strange Days", Earth Island Journal 11.4 (Fall 1996), p. 3. The piece is presented in its entirety.

13 Quoted in Zac Goldsmith, "And Some Good News", The Ecologist 29 (Jan-Feb 1999), p. 15.

digital equation and impress us with the sheer amount of data, and the actual storage system they have in mind involves super-safe latest-generation hard drives, in multiple locations, frequently maintained and upgraded. But this scant, lackadaisical treatment of the question of material storage is highly instrumental to the rhetorical aims of the speaker, for it allows technological salvation to stretch into eternity, in a quasi-religious apotheosis. As if the new body, or series of bodies, that the mind would inhabit truly existed some place else, in the heavens, invulnerable to physical decay, not to mention hardware and software obsolescence and the host of other problems that, as we know all too well, threaten the survival of digital information in the most technologically advanced of societies and in the most ideal of conditions.

Recording Humans
At the same time, to ask how much information there is inside a human brain serves the same not-so-hidden purpose as Lesk's measurement of how much information there is in the world. That is to say, it establishes that the information inside the brain and the information in the world are of the same kind as the information stored inside floppy disks and hard drives, hence it can be copied or shifted there; and, more importantly, it draws this conclusion without ever exposing to scrutiny or critique the computational metaphor on which such spurious calculations are based. The numbers, too, play a role. Notice the shift from the round figure of 10 terabytes, of necessity an approximation, to the much more precise and concrete 7,142,857,142,860,000 square pieces of plastic and metal which it comprises, and consider how attributes such as concreteness, numerability, and having physical dimensions and weight imply existence in the real world, not as metaphor.

Metaphor. Douwe Draaisma reminds us that the early members of the Royal Society had a deep-seated aversion to this figure of speech, which they regarded as anathema to scientific discourse. At the same time, one of the Society's most illustrious founders, Robert Hooke, can be seen dabbling in calculations that attempt to quantify memory and ideas to make them fit into the confines of exact scientific description. The passage, originally from a lecture on memory and first published in 1705 in a collection of Hooke's posthumous works, is worth quoting at length for its rhetorical proximity to the work of Lesk and the makers of the Soul Catcher chip:
A Hundred Years contain 36525 days, and 36525 Days contain 876600 Hours, and 876600 Hours contain 3155760000 Seconds. Now one with another, when the Soul is intent and acting, there may be 3600 formed within the compass of an Hour, and so one in a Second of Time. So that if the Soul could through the whole Course of 100 Years be continually so intent, and so acting and forming these Ideas, and inserting them into his Repository or Organ of Memory, there might be 3155760000 Ideas. But by reason of Sleep interposed, one third Part of the number will be taken off, the Soul then for the most part ceasing to form Ideas, or when it does, they are only imperfect and lost. So that there will remain but 2103840000, or to take a round Sum, but 21 hundred Millions. Now if we examine this remaining two thirds of Time or Moments, and therein consider what part of the time remaining is lost in Infancy, Old Age, Sickness and Inadvertency, we may well reckon that two thirds of these remaining Moments are lost, and no Ideas at all formed in them; and so instead of 21 hundred, there will remain but the number of 7 hundred Millions. And if we again consider how small a part of these are industriously and carefully stored up, we may very well agree, that not above a seventh Part of these are stored up: and so one hundred Millions may be a sufficient Number to be supposed for all the Ideas that may have been treasured up in the Organ of Memory through the whole Course of a Mans Life, though of a hundred Years continuance; and consequently one Year with another may be supposed to add to this store about one Million of ideas14.

14 Quoted in Douwe Draaisma, Metaphors of Memory (Cambridge: Cambridge University Press, 2000), pp. 58-59.
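Hooke's chain of figures can be retraced step by step. The short sketch below (in Python, purely as a convenience for checking the arithmetic; the variable names are mine, the figures are all Hooke's) reproduces the calculation, including the rounding he allows himself along the way.

# Retracing Hooke's arithmetic, using his own figures and his own rounding.
days    = 365.25 * 100            # "A Hundred Years contain 36525 days"
hours   = days * 24               # 876,600 hours
seconds = hours * 3600            # 3,155,760,000 seconds
ideas_max    = seconds            # one idea formed per second of "intent" thought
ideas_waking = ideas_max * 2 / 3  # a third lost to sleep: 2,103,840,000 ("21 hundred Millions")
ideas_kept_exact = ideas_waking / 3   # two thirds of the waking moments also lost: 701,280,000,
ideas_kept       = 700_000_000        # which Hooke rounds down to "7 hundred Millions"
ideas_stored = ideas_kept / 7     # only a seventh "industriously and carefully stored up"
per_year     = ideas_stored / 100 # roughly one million ideas committed to memory each year
print(int(seconds), int(ideas_waking), int(ideas_kept_exact), int(ideas_stored), int(per_year))
# 3155760000 2103840000 701280000 100000000 1000000

Laid out in this way, the mixture of exact-sounding totals and generous rounding on which the following paragraphs comment becomes easy to see.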

It is quite possible to abstract ideas from the people who entertain them, of course. The expression of an idea through speech achieves such a separation. But enumerating ideas in this way, making them countable to this extent, achieves a more radical reconfiguration, one that invites us to transfer thought itself, by means of science and technology, into an environment quite unlike that of human thought or conversation. The notion of memory is central to this reconfiguration. It is interesting that Hooke should use the noun "idea" to designate the contents of memory, but even more interesting to witness how, three centuries down the track, memory, mind and identity have become interchangeable terms in the rhetoric of digital transcendence, thanks in part to such efforts of enumeration. For the claim that the sum of the information stored in the mind equals so many bytes is in itself an act of uploading said mind into computer memory. For this project to work, it is necessary to think of memory as the faithful means of recording the real by an observer who is spatially located, but is not a subject in any other respect. This is in essence the work of devices such as the Soul Catcher, the LifeLog and Gates' digital diary: they position themselves on the body, or penetrate it in order to graft themselves onto the nervous system, so as to capture the same stream of data as a person's senses and produce a record which is claimed to be truly referential and coextensive with the mind's. Think again of the chess game between Kasparov and Deep Blue, and the string of text "1.Nf3 d5 2.g3 Bg4 3.b3 Nd7 4.Bb2 e6 5.Bg2 Ngf6 6.O-O c6" which constitutes a type of account (and a faithful one, to be sure) of its opening stages. This memory could quite conceivably have been generated by a computer, and is transferable within the parameters of the digital transcendence project. It is easy, too, to think of it as the contents of a computer file that, each time it is opened, can be expected to contain the exact same information in the exact same order. But the set of moves does not exhaust the experience of the chess game any more than listing the knight's moves in Life, A User's Manual can be said to

exhaust Perec's novel; nor does human recall, by most accounts, produce at different times always identical memories of any given event. What is truly being referred to here is computer memory, in the same way that intelligence, in the commentary on the Deep Blue vs. Garry Kasparov contests, came to be equated with chess-playing ability.

The Instrumentality of Science-Fiction

Fig. 15 Jeff Bridges is simultaneously scanned and dematerialised as he is prepared for entering the world of Tron (Steven Lisberger, 1982).

Digital transcendence comprises two movements, then. In the first, computer simulation replaces older forms of representation and stakes a powerful claim: to be able not only to reproduce the human mind, but also to replay it dynamically, over time. In the second movement, the recorded mind, already separated from its first body of flesh, is elevated above the very notion of materiality; it becomes an algorithm, drawing strength and the promise of eternal existence from the pure laws of mathematics. Both movements require the obfuscation of many variables, a sort of structural blindness towards the many competing realities that threaten them. Turning to how various versions of digital transcendence have been articulated in science-fiction proper, as they have been since long before Moravec and the


Soul Catcher project,15 can help to restore these layers of complexity that are so starkly absent in much expert commentary. The instrumentality of science-fiction in constructing and promoting digital transcendence, but also in articulating its sceptical critique, is a telling symptom of the power of the symbolic. I have already argued that, at key junctures of Mind Children, Moravec chooses to become a science-fiction author (a transition self-consciously signposted by the use of italics) in order to steer his speculations down paths of least resistance and, therefore, easier and greater acceptance. But any attempt to trace the origins of this idea will have to pay tribute to the many science-fiction authors who helped give birth to it and nurtured it for decades, notably through the works of Asimov on sentient robots and especially Philip Kindred Dick on cyborgs and the manipulation of memory and the psyche, finally tackling it head-on, in the same terms proposed by Moravec, from the early nineteen-eighties onwards. Throughout this process, science-fiction has acted as a resonance chamber for the intuitions that led down the path of digital transcendence, such as Norbert Wiener's idea that it would be conceptually possible for a human being to be sent over a telegraph line,16 an early pronouncement on the translatability of human beings into information flows which would be famously overshadowed in the collective imaginary by Star Trek's teleportation deck; and that in turn has acted as a source of constant inspiration and insight for AI researchers, philosophers of mind and scientists of various denominations. This is a state of affairs nicely illustrated by the 2001 edition of the 1981 novella True Names, by Vernor Vinge (himself a mathematician), touted on the cover as "the

15 The earliest literal example of digital afterlife I was able to find is the 1975 Japanese anime television series Kotetsu Jeeg, in which professor Shiba, having cause to fear for his life due to having made some powerful supernatural enemies, duplicates his consciousness onto a computer at his anti-atomic base, to be activated after his death. This new version of the professor remains a character throughout the show.

16 Quoted in Visions of Technology: A Century of Vital Debate About Machines, Systems and the Human World, edited by Richard Rhodes (New York: Simon & Schuster, 1999), p. 239.

groundbreaking work that invented cyberspace",17 and published alongside a series of non-fictional essays and an afterword by Marvin Minsky, one of the key proponents of the mind-as-information hypothesis. This feedback loop is of course characteristic of the distinctive place that science-fiction occupies in the intersection between science, society and culture, and as such is not peculiar to the notion of digital transcendence.18 Nor should it come as a surprise that authors engaging in fully-fledged fictions, as opposed to the elaborate quasi-fictional speculation of the likes of Moravec, should be more comfortable in dealing with metaphors, and therefore more willing to delve into them and be led, as the case may be, down less than comfortable paths. In what is one of the precursors of cyberspace fiction, John Varley's short story of 1976 entitled "Overdrawn at the Memory Bank", the author imagines a cube capable of storing the memory and sensory patterns of an individual so that they can be combined with those of another, be it a human or an animal. This allows computer programmer Fingal to spend part of his modest salary on a brief holiday inside the body of a Kenyan lioness, sharing with the animal its sensory apparatus and even finding himself beginning to think and behave as he supposes a lioness would. Meanwhile, back at the lab, a technical glitch prevents Fingal's mind from re-entering his body, trapping it for six hours (which subjectively appear to him to last several months) inside the cube, with no body, human or animal, to connect to. His mind's response is to create a fantasy world inside the bio-electronic machinery. As the company official who is trying to reunite him with his

17 Vernor Vinge et al., True Names: And the Opening of the Cyberspace Frontier (New York: Tor Books, 2001), front cover (emphasis in the original).

18 For an in-depth analysis of the complex interplay between science-fiction and science proper in the development of another field of inquiry, see Colin Milburn, "Nanotechnology in the Age of Posthuman Engineering: Science Fiction as Science", Configurations 10 (2002), pp. 261-295.

body explains through a variety of manifestations (including appearances through books and broadcasts within Fingal's fantasy),
[l]ife in a computer is not the sort of thing you could just jump into and hope to retain the world-picture compatibility so necessary for sane functioning in this complex society. This has been tried, so take our word for it. [...] Since you can't just become aware in the baffling, on-and-off world that passes for reality in a data system, your mind, in cooperation with an analogizing program I've given the computer, interprets things in ways that seem safe and comfortable to it. The world you see around you is a figment of your imagination. Of course, it looks real to you because it comes from the same part of the mind that you normally use to interpret reality. If we wanted to get philosophical about it, we could probably argue all day about what constitutes reality and why the one you are perceiving now is any less real than the one you are used to.19

Even though a transfer from medium to medium occurs, Fingal experiences nothing like the seamless awakening of the resurrected in Moravec's story. The coupling of Fingal's mind with the circuitry is the result of a complex mediation orchestrated by the mind's innate ability to make sense of its surroundings, and by the machine's analogizing software. As this cohabitation progresses, the story makes clear that, while the memory cube is a useful temporary vessel for transferring the mind from one body to another, it is not a viable long-term habitat. If the mind is not transferred into a body, any body, insanity will eventually set in. Consciousness inside the machine, one would conclude, is an aberration that soon strips the mind of its ability to function outside of it. But the feelings of intense paranoia experienced by Fingal inside the memory cube could equally be put down to the lack of social interaction. While more recently imagined virtual worlds such as the ones that have shaped cyberspace fiction also involve for the most part self-deception on
19 John Varley, "Overdrawn at the Memory Bank", in Visions of Wonder: The Science Fiction Research Association Anthology, edited by David G. Hartwell and Milton T. Wolf (New York: Tor Books, 1996), p. 499. The story was originally published in Galaxy magazine in 1976.

the part of the mind, they differ by one crucial ingredient: connectivity. The hallucination, as in Gibson's seminal definition of cyberspace, has become consensual; the memory cube has given way to the memory network, a place where several minds can meet and construct their realities collectively. By adding this social dimension, cyberspace fiction reframes in much more forceful terms the question of whether minds that exist inside machines can be said to carry out a meaningful existence. The first text to open up this dimension in a fully-fledged manner is the already mentioned novella True Names. Vinge's imagined world hinges, firstly, on the existence of a vast digital communications network similar to the Internet, and, secondly, on a technology which enables individual minds to interface with this network directly, via neural connections (the equivalent du jour of the pineal gland), and travel through the pathways in the form of data. Here too, as in Varley's story, reality inside the machine is an elaborate simulation generated by the mind in order to be able to continue functioning in this most alien of environments, and appears to obey laws that, while very different from those that govern the material world, are similarly consistent. While Fingal's experience of the memory cube was mostly hallucinatory, save for the input provided by the cube's software, Vinge's proto-cyberspace, which is reminiscent of early textual MUDs,20 is

20 MUDs (multi-user domains or multi-user dungeons) are text-based online environments in which each user types instructions into a parser in order to manipulate a virtual counterpart and interact with other users and their surroundings, chiefly for the purpose of role-play gaming. MUDs are generally thematic, and often depict fantasy or science-fiction worlds. While they already existed in 1979, when Vinge wrote the first draft of True Names, he does not appear to have known about them, according to his introduction to the 2001 edition of the novella. Here is an account by Richard Bartle of some of the earliest examples of MUD:

The very first MUD was written by Roy Trubshaw in MACRO-10 (the machine code for DECsystem-10's). Date-wise, it was Autumn 1978. The game was originally little more than a series of inter-connected locations where you could move and chat. I don't think it was called MUD at that stage, but I'd have to ask Roy to be sure. Roy rewrote it almost immediately, and the next version, also in MACRO-10, was much more sophisticated. This one was definitely called MUD (I still have a printout of it). The database (i.e. the rooms, objects, commands etc.) was defined in a separate file, but it could also be added to during play. However, the result was that people added new rooms that were completely out of keeping with the rest of the environment, and, worse, added new commands that removed


an empty space furnished collectively by its users, and retains therefore certain elements of stability and coherence, including an overt fantasy theme. It is still a dangerous place, however, for the mind can be hurt or even annihilated by other minds or programmes. The novella uses this premise to explore the transformations undergone by the subject as it negotiates its double life, inside and outside cyberspace. The foremost issue touched upon is that of identity: what happens to the self inside cyberspace? Through what kind of intrinsic predispositions and social interactions does it develop? How do communities form in this new space, according to what idea of the individual and of relations? These themes are underscored by the efforts of the hacker protagonists to hide from each other and from law enforcers the point of origin of their connections, or their "true names", as well as by the difficulties in telling the avatars of real people apart from personality simulators. Finally, Vinge tackles the issue of whether the computer might one day become the primary substratum of human existence, a site to which to transfer the mind after the demise of its flesh-and-blood carrier. In a reversal of the Frankenstein story, this development is made possible by the endeavours of an artificial intelligence, known as the Mailman, to build a sophisticated simulator of mind patterns; once the Mailman is vanquished, the simulator is appropriated by a human, Erythrina, who uses it to gradually transfer herself online, in preparation for her impending physical death: "My kernel is out there in the System. Every time I'm there, I transfer a little more of myself. The kernel is growing into a true Erythrina, who is also truly me. [...] When this body dies, I will still be",21 she explains. Vinge leaves the notion hanging there at the end of his novella, without delving into its many ramifications. What does the use of the pronoun "I"

21 Vinge, True Names, p. 330.


imply, in Erythrina's talk? The process of transfer should more properly be called a copy, since it does not, unlike in the operation described by Moravec, take anything away from Erythrina's offline self while it is being carried out; so at the very least there would be two "I"s, one inside and one outside the system, operating quite independently of one another, until the demise of the one outside or of both. Still, Vinge has made an important point: it is possible to conceive an existence inside the machine that is just as meaningful as, if not more so than, the one outside of it.

***

Three years after the publication of True Names, the theme was to be famously explored, and given a darker and altogether more problematic treatment, by William Gibson in Neuromancer. Here we encounter the Dixie Flatline, the recording of the mind of a data cowboy (read: hacker) held in the vault of a powerful corporation, and stolen and activated in order to break through the defences of another corporation. Dixie's unhappy afterlife inside cyberspace takes the form of a succession of short bursts of all-too-lucid awareness without sensation. The construct's numerous pleas to be deleted emphasise the unbearable lack of feeling in its new, disembodied existence:
"How you doing, Dixie?"
"I'm dead, Case [...]."
"How's it feel?"
"It doesn't."
"Bother you?"
"What bothers me is, nothin' does."
"How's that?"
"Had me this buddy in the Russian camp, Siberia, his thumb was frostbit. Medics came by and they cut it off. Month later he's tossin' all night. Elroy, I said, what's eatin' you? Goddam thumb's itchin', he says. So I told him, scratch it. McCoy, he says, it's the other goddam thumb."
When the construct laughed, it came through as something else, not laughter, but a stab of cold down Case's spine.


"Do me a favor, boy."
"What's that, Dix?"
"This scam of yours, when it's over, you erase this goddam thing."22

In another of many parallels with Vinge's novella, it is an artificial intelligence, Neuromancer, which creates a further set of preconditions for the continuation of intellectual life inside a machine. Case describes Neuromancer as "something like a giant ROM construct, for recording personality, only it's full RAM. The constructs think they're there, like it's real, but it just goes on forever" (p. 296). Because the recorded personalities inside Neuromancer are not aware of being digital ghosts trapped by a Cartesian demon, they can go on perceiving themselves as being made of flesh and blood and inhabiting their old world. Still, following Descartes one step further, one could say that they think, therefore they are. Except that the recording skips, like a broken record, for the afterlife of the constructs is built from a finite archive of memories from their previous lives, and traps the dwellers in a perplexing limbo.

Gibson's bleak vision of a life inside the machine that is no life at all does not wholly negate the possibility of digital transcendence. Here too computers can create functional copies of the mind capable of original thought and sophisticated behaviour, and provide a substratum for the human mind to exist not only metaphorically, but also literally, as cowboys like Case demonstrate when they routinely jack into the network and flip in and out of cyberspace (the pineal gland at work, again). And yet the existence both of the part-time cowboys and of those who have made the full transition is radically de-socialised and either caught in a never-ending loop, or made unbearable by the lack of connection with the body. In all cases, the technology that enables the transmigration onto the machine has worked as advertised, and yet proves woefully inadequate to account for what we might still be inclined to call human life.

22 William Gibson, Neuromancer (London: Harper Collins, 1995), p. 130.


Just as strikingly, grotesque parodies of human life often emerge also from the narratives that lend support, sometimes unabashedly, to the project. Such is the case of Charles Platt's Silicon Man (1991), a novel informed by discussions with Vinge and directly inspired by the writings of Moravec. The robot-surgeon operation is performed here on an unwilling subject, a police agent who has the misfortune to stumble upon the secret, unauthorised experiments of the technology's pioneers. This assassination is followed by an afterlife which begins with a prolonged, scarcely endurable psychological torture, as the naked, impotent, terrorised mind of the deceased, reduced to almost pure knowledge of the loss of former life, body, wife and young son, is written into an experimental computer simulation, its senses gradually restored, its consciousness flicked on and off at the whim of the masters of this new universe. In light of these harrowing beginnings, the later enthusiastic conversion of the character to the benefits of his digital immortality (a literal case of being born again) is decidedly jarring, as is the reunion with his wife and son turned into software constructs in the novel's unlikely happy ending. "Barring accidents, we're immortal,"23 claims the creator of the digital heavens in a wonderful and probably unwitting oxymoron which reveals how deeply flawed and self-defeating this fantasy can be, when it is imagined through and through even by its most enthusiastic followers.

23 Charles Platt, The Silicon Man (San Francisco: Wired Books, 1997), p. 323.


The Forever Network

Cyberspace is the place where a telephone conversation appears to occur. Not inside your actual phone, the plastic device on your desk. Not inside the other person's phone, in some other city. The place between the phones. The indefinite place out there, where the two of you, human beings, actually meet and communicate. Bruce Sterling, The Hacker Crackdown

Compared to the imaginings of Varley, Vinge, Gibson and Platt, cyberspace as it is currently deployed is a much less demarcated and more permeable place of uncertain coordinates. It is a place that exists, as Sterling suggests, somewhere in the space between two telephones, along the connection between several computers. It is a non-Cartesian plane in which physical distance between different material sources of information can be compressed or expanded according to how one chooses to connect the dots. If we consider the subset of cyberspace known as the World Wide Web, the geographical location of these points of origin (the computers on which the information resides) is an incidental element of the experience of surfing the Web, and trying to reconstruct the wanderings of one's digital trail throughout the physical world, by means for example of so-called traceroute applications, has for most users little more than a curiosity value24. To complicate things further, and undercut more deeply the notion of the cohesiveness of the subject in relation to the cyberspace experience, the units of information sent across the Web, such as text files, pictures or emails, break into packets and spread in various directions before piecing themselves back together upon reaching their destination.
24 Some traceroute applications visually depict on actual maps of the world the thread spun by the user while travelling through the web. See for instance www.visualware.com. Tracing the movements of others, and discovering the point of origin of their connection, as Vinge brilliantly predicted, is a matter of some importance for law-enforcement agencies and intelligence services.
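To make the principle of packet switching just described a little more concrete, here is a minimal illustrative sketch in Python (a toy example of my own devising; it stands in for, and greatly simplifies, the actual Internet protocols): a message is cut into numbered packets, the packets may arrive in any order after travelling by different routes, and the sequence numbers allow the receiver to piece the original back together.

import random

def split_into_packets(message, size=8):
    # cut the message into (sequence number, payload) pairs
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    # order the packets by sequence number and join the payloads back together
    return "".join(payload for _, payload in sorted(packets))

message = "The place between the phones, where the two of you actually meet."
packets = split_into_packets(message)
random.shuffle(packets)  # the packets travel by different routes and arrive out of order
assert reassemble(packets) == message

However scattered the packets' paths, the pattern is recomposed at the destination; it is this indifference to route and location that sustains the illusion discussed below.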


Which is not to say that the physical objects and systems on which the information is stored and exchanged actually disappear, but that they are concealed at such depths beneath the level where the interaction occurs that it becomes quite easy to forget about them altogether and, through habit, eventually come to believe that digital information only exists in the form of patterns, because after all this is how we come to experience it. Which is why it is so natural to regard cyberspace as the ultimate post- or transhuman environment.

This illusion of immateriality perhaps helps to explain why notions of permanence exert such a powerful fascination in discourses around the World Wide Web. The proliferation of online memorials bears testimony to this. Commercial providers such as forevernetwork.com and onlinememorial.com offer a wide range of services to those who wish to honour and remember their loved ones online. Submissions may include life stories, video and sound recordings, even open forums in which family and friends can share their memories of the deceased. Extraordinarily, however, these organisations give little or no guarantee of how long they are committed to maintaining the memorials online once the fee (which is payable upfront) has been received, and make no mention of the systems in place to preserve the information. In the case of now defunct organizations such as onlinememorial.com, the irony is rather sharp. "Memorials will stay on the site for as long as the site runs which will be well into the future. OnlineMemorial.com has sufficient funds to keep the web servers running for many years to come," wrote the site's owners in 200325. Navigating to the web address two years later one discovers that the online-memorial.com domain is now for sale26. The site too was likely to be immortal, it would appear, barring such accidents as the failure to turn a profit.

The chiefly non-commercial organisations that strive in various ways to preserve the content of the Web, by contrast, are all-too-conscious of the

25 http://www.online-memorial.com/information/FAQ.asp, as at July 8, 2003.
26 The same URL now.


effects of neglect and the lack of forward-looking plans for the preservation of digital information repositories. A fitting example is AfterLife.org, by no means the most prominent of these organisations, but one that recognises explicitly the need to care for websites whose authors have died. Writes David Blatner, founder of the initiative:
I don't know how many elderly people or people with terminal illnesses are currently building Web pages, but I can only assume that the number is increasing and that within the next few years the "passing on" question will become one of significance. There are many important and emotional issues at stake here, as people's personal Web sites become repositories and reflections of who they are, and what they feel is important to share with others. I believe that an international not-for-profit organization is called for. One to which people can bequest their web sites with the knowledge that the site will be available indefinitely to their loved ones and other interested parties.27

While the use of that most optimistic of adverbs, "indefinitely", betrays a continuing belief in the dream of otherworldly permanence offered by the Web, this is inscribed in a contingent negotiation with the limitations of the medium, both in cultural and in economic, material terms, when it comes to ensuring the longevity of its content.

The Anti-Moravec

How would resurrectees support themselves? It's difficult enough to save for retirement. Meel Seesholtz

All of these issues are canvassed by Australian novelist Greg Egan in Permutation City, a novel of great virtuosity that follows Mind Children
27 David Blatner, "Things to Do on the Web When You're Dead", http://www.afterlife.org/info.html, as at March 9, 2005.


every step of the way and in so doing restores one by one, as if in the act of peeling an onion backwards, the realities that the project of digital transcendence has to exclude in order to assert its own conditions of possibility28. Initially the novel appears to take on board the chief premises of the project as we have encountered them so far: that it is possible to copy the human brain to very high degrees of complexity, run the mind on a computer and use the technology as an instrument to cheat biological death. As in all of Egan's major works, however, these premises, the technoscientific engine of the story, fail to produce a straightforward, tame outcome in society, but rather enter into a series of complex negotiations and reciprocal interactions with it, which offer the platform for a series of thought experiments on an ever larger scale. Firstly, Egan emphatically rejects Moravec's suggestion that the "I" waking up inside the machine is the very same "I" who went to sleep on the operating table, lay beside the memory cube, jacked into cyberspace, etcetera. As we have seen, this level of literality of the cyberspace metaphor requires an orifice, a physical interface between the brain and the silicon chip, and Egan does not allow for it. Nonetheless the Copies (with a capital c) are complex and sentient beings, certainly not lacking in psychological depth and ability to feel emotions, so much so that upon being brought to life most of them proceed in short order to commit suicide, struck by the unbearable realisation that they represent the misguided dream of immortality of an alien life form29. As well as maintaining an obstinate

28 As Katherine Hayles documents by quoting from some of his interviews, Egan is in fact a self-avowed believer that the mind-uploading project will ultimately be achieved. Like her, however, I am struck by how Egan's remarkable thinking through of these ideas and the much greater depth of his reflections tease out and question the intellectual path that leads to them, the various conceptions of what constitutes humanity that need to be cast aside in order for the underlying imaginings to take shape, and how problematic the ensuing reconfiguration is. That is the sense in which I characterise him as the anti-Moravec. For Hayles's discussion of the Quarantine - Distress - Permutation City trilogy, see My Mother Was a Computer, pp. 214-240.
29 In David Brin's Kiln People, the copies (or Dittos, a standard term in the works that explore these themes, which likely originated with Platt) are designed to have a short lifespan, sometimes of a single day, and are characterised by contrast by an intense will to live.


otherness, the Copies require forms of sustenance that highlight their dependence on human-based infrastructures and undercut the hopeful notion of an indefinite future. To be scanned requires money, Egan begins by observing, and computer systems to run and sustain each Copy; even in the case of wealthy individuals, whose estates can afford to lobby governments to recognise the status of Copies as "disabled people, no more, no less, really just a kind of radical amputee"30, "forever" lasts for only as long as nobody pulls the plug, a decision tied at the very least to economics and politics31. At the same time as these variables are called into play, the subjective experience of existing inside the machine foregrounds a number of physical realities, actual and imagined. In order to reinforce the illusion of being alive in the familiar sense of the word, for instance, some Copies require that the simulation retain every vestige of corporeal life, including urine and faeces production. Not so the Copy of Paul Durham, one of the novel's protagonists, whose bodily wastes would be magicked out of existence long before reaching bladder or bowel. "Ignored out of existence; passively annihilated. All that it took to destroy something, here, was to fail to keep track of it" (p. 9). While rejecting this vestigial corporality, Paul-the-Copy remains keenly aware of the most immediate physical reality on which his survival depends, that is to say, the hardware:
A cubic metre of silent, motionless optical crystal, configured as a cluster of over a billion individual processors, one of a few hundred identical

30 Greg Egan, Permutation City (London: Millennium, 1994), p. 34.

31 This is a problem perhaps underestimated by those who dabble in cryogenics. It is possible to envisage that future scientists may wish to revive Walt Disney (if his corpse really is frozen, that is), but then whom else? In Flesh and Machines, a book that in many respects contrasts the thinking on the future of robotics and humankind put forward by Moravec, MIT roboticist Rodney Brooks reveals that cryogenics is indeed very popular among the researchers in his field, but warns his colleagues by adding "I am a little worried about all my friends who are going to have themselves frozen. They may not turn out to be very important or worthwhile to future generations. They may not return even if technology makes it possible." Rodney Brooks, Flesh and Machines (New York: Pantheon Books, 2001), p. 208.


units in a basement vault somewhere on the planet. Paul didn't even know what city he was in; the scan had been made in Sydney, but the model's implementation would have been contracted out by the local node to the lowest bidder at the time. (p. 9)

To emphasise the fracture between the two competing realities, the inside and the outside whose demarcation has to be transcended, Egan explains that technical limitations at the future time when the novel is initially set are such that although Copies are capable of simulating mental activity, they do so at a fraction of the pace, making interactions with living human beings painfully impractical32. Again, the implications spill out from the psychological sphere into that of harsh material, and especially economic, realities, as in the case of the Copy of David Hawthorne:
At a slowdown of thirty, the lowest Bunker-to-real-time factor the income from his investments could provide, communication was difficult, and productive work was impossible. Even if he'd started burning up his capital to buy the exclusive use of a processor cluster, the time-rate difference would still have rendered him unemployable. Copies whose trust funds controlled massive shareholdings, deceased company directors who sat on the unofficial boards which met twice a year and made three or four leisurely decisions, could live with the time-dilated economics of slowdown. Hawthorne had died before achieving the necessary financial critical mass, let alone the kind of director-emeritus status where he could be paid for nothing but his name on the company letterhead. (p. 112)

32 In another sign of Egan's direct engagement with Mind Children, this is a counterpoint to the greater speed enjoyed by the new mind following the operation by the robot-surgeon: "Your new abilities will dictate changes in your personality. Many of the changes will result from your own deliberate tinkerings with your own program. Having turned up your speed control a thousandfold, you notice that you now have hours (subjectively speaking) to respond to situations that previously required instant reactions. You have time, during the fall of a dropped object, to research the advantages and disadvantages of trying to catch it, perhaps to solve its differential equations of motion. You will have time to read and ponder an entire on-line etiquette book when you find yourself in an awkward social situation. Faced with a broken machine, you will have time, before touching it, to learn its theory of operation and to consider, in detail, the various things that may be wrong with it. In general, you will have time to undertake what would today count as major research efforts to solve trivial everyday problems." Moravec, Mind Children, p. 112.


It is more or less at this point that, having exposed some of the more immediate problems that implementing a large-scale project of digital transcendence would involve, the novel turns itself inside out and chronicles first the theorisation and then the creation of an artificial universe which "runs without hardware" (p. 170, emphasis in the original). In this gambit, which matches Moravec's speculations in the second appendix of Mind Children, Egan asks us to imagine that the virtual worlds that Copies inhabit could, once formed, keep finding themselves in the form of patterns of dust scattered through the universe even once the hardware that computes them is shut down. The theory plays on packet switching, the technology referred to above that allows data sent across the Internet to split into packets along more or less random routes and be recombined upon reaching their destination; and in so doing radicalises the pre-eminence of pattern-identity over body-identity at the heart of digital transcendence. The result, Permutation City, is a virtual reality that ceases to be virtual, an imaginary world that becomes real precisely because it has been imagined in enough detail.

But the tale contains one final twist. The city had been equipped by its creators with a simulation-within-the-simulation, a world created by a piece of artificial life software whose original function was to provide an island of randomness within an otherwise wholly premeditated universe (thus making the new universe less preordained and, therefore, more plausible). After millennia of computation and evolution, this simulation generates a race of insect-like creatures that begin to formulate theories about themselves and their world, and when the inhabitants of Permutation City decide to reveal themselves to them as their creators, the insects reject the explanation and come up with a theory that, although untrue, is ultimately more elegant. Faced with a better explanation of its existence, the computational principle that enables both Permutation City and the world of the insects to exist starts favouring the latter, and gradually ceases to compute the city, which begins to un-exist. Thus contingency reenters the universe inhabited by the saved ones, exposing them once again


to material conditions of existence, and therefore to the inevitability of their own death.

Restoring the Body of the Machine

It has always been integral to capitalist organization that science-fiction functions as a factor of production, relating it intimately to panic-phenomena. Y2K takes things to a new level, as a disaster that comes from the future, scheduled by accident, and thus precisely anticipated in time. If it proves effectively ineradicable it is because it is trickling back, from the self-confirming inevitability of its occurrence. Something is about to happen, and we know exactly when. Melanie Newton, Y2paniK

Hey, Dave. I've got ten years of service experience and an irreplaceable amount of time and effort has gone into making me what I am. HAL9000 in 2001: A Space Odyssey

The narrative which more than any other in recent years has undercut that of digital transcendence, calling attention to the risks involved in placing too much trust in the enduring archival power of computers, is without doubt that of the Millennium Bug. As will be recalled, the problem was rooted in the widespread practice among computer programmers, until well into the 1980s, of assigning only two digits for the year in date codings. Originally motivated by a need to save on what were at the time highly expensive memory resources, and later lingering on by force of habit in spite of the starkly diminishing returns, the two-digit shortcut, it was feared, would cause systems containing the shortened dates to fail to register the beginning of the year 2000 and effectively reset the 20th century, with not only metaphysical but also very practical consequences for the worldwide command, control and communications infrastructure.
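The mechanics of the shortcut can be illustrated with a minimal sketch in Python (offered purely by way of example; it is not drawn from any actual legacy system): with only the last two digits of the year stored, 2000 reads as 00 and appears to precede 1999, so that the simplest elapsed-time arithmetic suddenly turns negative.

def years_elapsed(start_yy, end_yy):
    # naive arithmetic on two-digit years, as in the shortened date codings described above
    return end_yy - start_yy

print(years_elapsed(95, 99))  # a record opened in 1995 and checked in 1999: 4, as expected
print(years_elapsed(95, 0))   # the same record checked in 2000 ('00'): -95, the kind of bad data the forecasts worried about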

The most pessimistic commentators predicted that bad data originating from non-compliant machines would ripple through the network and re-corrupt the compliant ones, effecting a paralysis of the infrastructure on a global scale. A few defective computers would suffice to spread this contagious amnesia throughout the whole network, in a scenario that returns us to the darkest forebodings examined in the first chapter.

In the end, depending on how one looks at it, either the Bug turned out to be an enormously expensive hoax, or an actual problem solved quite well (albeit at an enormous expense). Those who believe that the case for Y2K was grossly exaggerated have made perhaps the more convincing argument, pointing out that the Bug failed to strike globally, in spite of the vastly different levels of preparedness of governments, businesses and organisations in different parts of the world. The long build-up to the changeover of January 1st, 2000 had been dominated by arguments and counterarguments, in and out of specialised publications, in which the actual magnitude of the threat was fiercely debated. That the dispute raged not only among laypeople and the mass media, but also among industry experts, is exemplary of how increasingly difficult it has become for those ostensibly in charge of running the digital infrastructure to control its workings and predict its behaviour.

But the scare did something more than simply undercut fantasies of control over the technology. It articulated a set of technological conditions capable of triggering the self-destruction of an immensely complex communications system; it suggested that the digital age could undo itself at a symbolically charged point in time. Hardware and software crashes, and the resulting loss of data, are a common enough experience for the average computer user, but their effects are generally local. Even computer viruses, which spread not only malicious codes but also powerful narratives of impermanence, have failed so far to suggest the possibility of such a systemic breakdown. Y2K introduced the idea of an OFF switch hidden inside the system, ready to flick itself and


reverse the direction of technological progress, returning humanity to a predigital era. Also drawing on a potent religious theme, the millenarian apocalypse, Y2K offers a nicely symmetrical counterpoint to the dream of digital transcendence. Paradoxically, however, both the dream and the nightmare betray the same set of attitudes and expectations concerning technology; and these are predicated in turn on the idea that information can flow outside of bodies, in a perfect vacuum which is at once friction-free (hence everlasting) and terrifyingly hard to penetrate and control. What has been lost is not just the concept of human embodiment, as Hayles suggests, but also that of machinic embodiment, that is to say, of the staggeringly complex and concrete realities that are implicated in the informational society.

The point is made to great dramatic effect towards the end of Stanley Kubrick's 2001: A Space Odyssey, in the scene where Dave Bowman disables HAL9000, the computer on board the spaceship Discovery which had turned against the crew. Up to that point, HAL had established itself as a creature of pure intent, not existing in space but through the deployment of its all-embracing control over the spaceship, given its access to both command systems and information (the latter achieved by means of a myriad of sensors, including the cameras framed metonymically by the cinematographer's lens to give a necessary visual correlative to HAL's ever-present conversation). By contrast, when Bowman finally gains access to the control room, and proceeds to extract one-by-one the blocks which together constitute the computer's brain and mind, HAL is revealed as unquestionably embodied, and just as susceptible to physical pain and fear of death as the members of the Discovery crew, a point underscored by its childish pleas not to be terminated. The transformation follows the moment when HAL's material core has been located, and is a direct product of the


act of forcing information back into a point in space and time, of restoring contingency as a dimension of technology. In other words, both the extremes of the everlasting digital heavens and of apocalyptic erasure are predicated on the same act of forgetfulness. One needs to conceive of digital technologies, from a deeply dualist perspective, as operating quite independently from the cultural, social, financial and physical realities in which they are embedded, in order to imagine an environment in which information flows flawlessly, without attrition or noise, ultimately without entropy. It is only in such an idealised space that it would be possible for a disembodied mind to exist forever, and for a coding error to spread instantly throughout the system, corrupting all else. Causes and effects, in the world which digital technologies inhabit, have to travel through far more concrete bodies, etch themselves onto far more tangible surfaces, navigate far more complex discursive environments.

Undocumented Persons

At the same time as it is important to temper these imaginings with richer, less reductionist views that do not entail the suppression of embodiment or the social, the power of digital transcendence has to be recognised, its appeal interrogated. That the more unabashed materialists should find in it the perfect language in which to think up their own immortality shorn of the trappings of traditional religions is extraordinary, to be sure; but what interests me more about this vision is that it is symptomatic of the extent to which some have come to conceptualise the future of humanity as being inextricably linked to that of computing. The digital will save us; we shall exist for as long as computers are able to remember. At the root of this belief, as well as the obfuscation of realities that I have endeavoured to describe, lies a willingness increasingly to divest ourselves of, and invest digital machines with, the means to construct and preserve identity and

memory. The singularity posited by Kurzweil, Moravec and Vinge, the moment of effective transition from one state to the next, from being made of flesh to being made of bits, is in this respect wholly redundant. We already talk of how we became posthuman (Hayles), and of being digital (Negroponte). The culture is already steeped in this transition, which we wear on our collective body at the same time as we wonder what it would be like not to need a body at all. But, once again, who are we? Hakim Bey reminded us in the last chapter that the myth of information can hypnotize the wired haves into thinking that the real economy of material production no longer exists, that food and clothes and everything else that we need or enjoy can be magicked into existence, to paraphrase Egan, by waving a credit card and initiating a dance of electronic symbols. Digital transcendence fits precisely into this politics, and its excluded realities also define the kind of subject who will or will not partake of the immortality to come.

Think for instance of the currency trader, who participates in a most egregious example of virtual reality mediated by computing. Even more than the institution of money itself, currency trading exists at a far remove from the real economy of manufacturing and farming and services provided by people, to people, as well as from the various physical events (say, a drought in Australia) which also affect the value of the money issued by this or that country. Over time, macroeconomic realities still matter. But in the day to day, speculation, the timing of buying and selling, the intensity of the exchange itself are the main determinants of value, of gain and loss. To the extent that it is so virtual, currency trading too could conceivably transcend, to the point where the industry might choose to drop all remaining pretences and start trading Monopoly money directly, which is only marginally less real than the virtual peso or the digital drachma anyhow. Steeped in this dense world of codes and symbols, the currency trader has already made half the transition. Suppose that his or her decisions at work,


the timing and nature of each transaction s/he has made, were analysed and modelled by a piece of software in order to create an expert system that behaved in much the same way, so that the work could continue when the trader is having a day off; how could we then tell, from the other side of the screen, when the trader is actually at work33? Imagine then that his or her taste in songs, books, television shows and movies were similarly analysed with great predictive accuracy, initiating automatic online purchases that again would be difficult to distinguish from those made by the trader in person34. Until finally, one day, the currency trader dies, but for a while nobody thinks of telling the expert software, which keeps showing up to work and making online purchases and compiling lists of favourites. A digital ghost.

As in the case of the digitally documented life, which is after all the prologue to digital transcendence, we find that certain people are better suited for the transition than others. "I am data," claims Microsoft researcher Gordon Bell of MyLifeBits fame35, and no matter how frustrated we might be at how thin his underlying argument is, there is a significant sense in which it is also compellingly true. The high-powered consultant, who advises the rich and powerful and sits through corporate meetings in which minutes never fail to be taken; who regularly consumes in-flight

33 This is hardly science-fiction, of course: expert systems and various types of intelligent information trackers have been used in the industry for quite some time, and constitute a latent but persistent threat of obsolescence for many a professional. The Financial Times recently reported for instance that "Goldman Sachs has become the first bank to create a hedge fund replication tool in a move that could lead to a shake-up of the $US1300 billion ($1652.5 billion) hedge fund industry." Steve Johnson, "Goldman sets up hedge fund clone", The Financial Times of December 5, 2006, https://registration.ft.com/registration/barrier?referer=http://www.google.com/search?q=%22Goldman%20Sachs%20has%20become%20the%20first%22&location=http%3A//www.ft.com/cms/s/5b8331c0-82fc-11db-a38a-0000779e2340.html.
34 Digital video recorders such as the TiVo already do this, recording by default the programmes that are close in topic or genre to those explicitly requested by the user. Amazon.com's suggestions operate in a similar way, by matching previous purchases with the purchases of other customers so as to produce titles that are deemed to be more likely to interest the user.
35 See note 41 on page 132.
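By way of illustration of the matching described in note 34, a deliberately simple sketch in Python (the data and names are hypothetical, and commercial recommendation systems are of course far more elaborate): suggest to a user whatever was bought by the customer whose purchase history overlaps most with their own.

purchases = {
    "trader": {"Neuromancer", "True Names", "Mind Children"},
    "alice": {"Neuromancer", "Mind Children", "Permutation City"},
    "bob": {"Kiln People", "The Silicon Man"},
}

def recommend(user):
    own = purchases[user]
    # find the other customer whose purchases overlap most with this user's
    best = max((u for u in purchases if u != user),
               key=lambda u: len(purchases[u] & own))
    # suggest whatever that customer bought which this user has not
    return purchases[best] - own

print(recommend("trader"))  # {'Permutation City'}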


entertainment and receives daily emails in the hundreds: this is the kind of person who has an output, whose words are always recorded, who is data, in the same way that Nicholas Negroponte is digital. Fair enough.

But let us consider for a moment another character introduced in the last chapter, my grandmother. A seamstress, barely literate, terrific cook, she liked to laugh but pretended not to have a sense of humour. I suspect she appeared in no more than ten photographs in her lifetime. What kind of database or algorithm will store and replicate her? To make good garments, you still need human seamstresses and tailors; and frankly I am not holding my breath for the day when a robot might be able to cook as well as she did. Hers was a world of gestures, of atavistic spiritualism, of sociality totally unmediated by email communication and barely by the telephone, for that matter. It would be hard to argue that she was data, harder still to meaningfully record HerLifeBits and quite impossible to imagine her digital afterlife as it is currently packaged. This failure to describe on the part of the digital is effectively a reversal of the lowering of the threshold of description described by Foucault and discussed in the last chapter, and points to the plight of those whom the French call sans-papiers, the persons without papers who inhabit society illegally and who are a recurring presence in the late work of Derrida36. Except the failure to describe applies here not only to the outsider but to scores of newly or soon-to-be disenfranchised persons (effectively legal aliens) who used to be integral to society but now find themselves unable to fit the template of the regime of information.

The dedication of Katherine Hayles's recent study My Mother Was a Computer reads "To my mother, who was not a computer"37. For my part, I

36 See especially "Paper Machines and the Undocumented Person" and "Paper or Me, You Know (New Speculations on a Luxury of the Poor)" in Jacques Derrida, Paper Machine, translated by Rachel Bowlby (Stanford: Stanford University Press, 2005), pp. 1-4 and 41-66 respectively.


would have to say that my grandmother is one of the reasons why this is not a study of cyberculture or of digital memory of the kind whose strong implication or outright argument is that a shift is occurring, or has already occurred, in which the analogue, or the actual, is the past which we are collectively shedding and the digital, or the virtual, the future we are bound to embrace. I resist the suggestion that my grandmother might be a person of the past, of a time before a transition that is to come for all of us; that the wired citizens of cyberspace, among whom I should count myself, are the vanguard of a migration that represents the future of humankind.

Earlier in this chapter we were introduced to Dr. Ian Pearson of British Telecom. By no means the most sophisticated or subtle thinker when it comes to imagining the strife and the traumas that such a migration might entail, he reckons that rich people can expect to be able to have their minds downloaded onto machines by 2050, while the poor will probably have to wait until 2075 or 2080, when it's routine38. Everybody will willingly take their place on the ark; it is a way of thinking that we are by now familiar with. But it is more troubling to find a similar implication in the work of far more astute theorists who deal with the complex task of exploring new modes of electronically mediated textuality and subjectivity. To an extent, this implication is produced by the very act of theorising about (discussing) such important vectors of the culture. Pierre Lévy explicitly acknowledges this in the epilogue of Becoming Virtual, and most extraordinarily just as he is about to claim, à la Barlow, that the virtual is humanity's new home:
I love that which is fragile, evanescent, unique, carnal. I appreciate singular and irreplaceable beings and locations, an atmosphere that is forever associated with a situation or a moment. I am convinced that a major element of our morality consists simply in acceptance of being in the world, in not fleeing it, in being there for others and for oneself. But since the subject of this book is virtualization, I have written about
37 Hayles, My Mother Was a Computer, unpaginated.
38 David Smith, "2050 - and Immortality is within our grasp", The Observer (May 22, 2005), http://observer.guardian.co.uk/uk_news/story/0,6903,1489635,00.html.


virtualization. This does not imply that I have forgotten the other aspects of being, and I would ask the reader not to neglect them either. For the actual is so precious that we must, and at once, attempt to recognize and acclimate the virtualization that destabilizes it. I am convinced that the suffering that arises from submitting to virtualization without understanding it is one of the major causes of the madness and the violence of our time.39

Digital transcendence is predicated on the wilful forgetting of those "other aspects of being", and much important recent cultural work has been aimed at making this elision visible (Hayles) or conversely showing how seemingly vestigial forms of knowledge creation have infiltrated the digital and are busy subverting its technoscientific ethos (Erik Davis). But Lévy's point has a somewhat broader applicability, and is not widely acknowledged. Writing about virtualisation, it is difficult to account for the actual; writing about the digital, it is difficult to account for the analogue. Theories that are concerned with the intervention of new media in society are structurally inclined to (over)emphasise change, most often at the expense of the areas of resistance or inability to change. And indeed we find that conceptualisations and imaginings that are much more sophisticated than those of the proponents of digital transcendence operate similarly traumatic elisions, sometimes in the process of arguing for epochal shifts that entail using the past tense to refer not only to certain modes of understanding of the real but also to certain subjects and social configurations that are not quite dead yet. Take the transition from the mode of production to the mode of information as theorised by Mark Poster, already mentioned in the introduction. It is one thing to claim that a category such as the ownership of the means of production may not have the same meaning or breadth of applicability it once had; it is quite another to write that "[a]rtisans and assembly-line workers, whatever their differences,

39 Pierre Lévy, Becoming Virtual: Reality in the Digital Age, translated by Robert Bononno (New York and London: Plenum Trade, 1998), p. 183.
39

189

had an active, hands-on relation to the materials used in production"40. No doubt Poster need not be reminded that artisans and factory workers (not to mention farmers) account for a stubbornly large portion of the workforce, even in the Western countries he selectively set out to discuss; but he has a problem: these people do not fit the shift he is attempting to describe, and yet he cannot claim, unlike Lévy, that he is not writing about them, since the aim of his book is to account for social context and not a subset or splinter of the social. His solution lies in the use of the past tense, authorised perhaps by a statistical extrapolation: there are in fact, in the West at least, far fewer artisans and factory workers and farmers than there once were; so it is perhaps not too far-fetched to assume that one day there will cease to be any. Automated industry, craft and agriculture will make all of those professions redundant, and anybody who is employed will sit in front of a computer screen. Does it matter so much that this is not yet so?

The lively debate concerning the extent of the digital divide, and whether it is in fact deepening or on its way to being resolved, makes much the same basic assumption: that we live in an information age, and therefore justice demands that everybody have access to information, so they can all participate fully in society. Social participation becomes equated then not only to having access to computers, but to actually using them. Down this path, the same transition is bound to occur as it did in digital transcendence, and the social will be perforce conceptualised as something that takes place online. But do not worry: if you have forgotten your login or username, they will be emailed to you.

It is not hard to see how the need to repress aspects of everyday life that are not consistent with one's successful entry into the information age would generate at the very least a diffuse and vague state of anxiety even among those who are more or less on board the digital. Nobody can transition as smoothly as all that. It is not possible to subscribe to cyberculture, navigate
40 Poster, The Mode of Information, p. 128. The emphasis is mine.


electronic textuality and engage in telepresence without once stopping to go to the bathroom, or to fix a sandwich, and it is at such times that one is reminded of the extent to which the transition is metaphoric. Similarly, digital transcendence prefigures the most wholesale of changes, but its imagery betrays how deeply troubled and anxious the project really is. The agonising body that goes into spasms and dies at the same time as the consciousness that used to inhabit it wakes up on the screen in Moravec's scenario, the meek scientists who kidnap and murder in cold blood the detective who has caught a whiff of their bold design in Platt's clumsy novelization are symptoms of this darker reckoning with the realities and subjects that have been shut out, shed, repressed, forgotten in order to make way for a new order of being. In the narratives of anxiety that are the subject of this dissertation, that which is not digital haunts future-oriented imaginings as a ghost, an interference, complicating the work of theory and the task of describing new social configurations that do not go in a single direction, but are traversed by contrary forces. The interplay of these forces oversees our last set of memory anxieties, which is the subject of the next chapter.


Chapter Four
The Transmission of Memory

I have to believe in a world outside my own mind. Leonard (Guy Pearce) in the movie Memento

From Being Digital to Being Postmodern


Regarding the axis of anxiety I have endeavoured to describe thus far, a first conclusion has been drawn: that the seemingly opposite and equally apocalyptic visions of amnesia and hypermnesia, of a crisis of memory and of total recall, endlessly projected by consumer culture, stand in a complex relation which is reciprocal and inclusive as opposed to antithetical and exclusive. This perspective echoes Andreas Huyssen's call to "think memory and amnesia together rather than simply to oppose them"1, a call he answered by inscribing this bipolar conception of memory inside a larger phenomenon, the crisis of the structures of temporality in the transition from modernity to postmodernity. As a positive value, memory in Huyssen's view
represents the attempt to slow down information processing, to resist the dissolution of time in the synchronicity of the archive, to recover a mode
1 Huyssen, Twilight Memories, p. 7.


of contemplation outside the universe of simulation and fast-speed information and cable networks, to claim some anchoring space in a world of puzzling and often threatening heterogeneity, non-synchronicity, and information overload. (p. 7)

This position echoes that of Stewart Brand, the advocate of informationally sustainable knowledge growth whose ideas we encountered in chapters one and two. An appeal to memory framed in these terms, however, implies that the speed of thought of cybernetworks and the scrambling of time sequences which Huyssen takes to be characteristic of postmodernity are inherently negative and destructive forces, to which memory thus defined would be the remedy2. But it also sets up a direct opposition and mutual exclusivity between memory as a faculty rooted in modernity, and amnesia as a malaise generated by postmodernity. And indeed, by referring in the title of his major work on this subject to a twilight of memory, Huyssen suggests that the night of forgetting is where Western culture is heading, adding to our collection of anxious imaginings. Quite a different challenge, and yet consistent with Huyssen's own view that memory, while it may be an anthropological given, is so deeply rooted in external forms of representation as to be transformed in each successive epoch along with changes in the nature and character of the prevailing forms of mediation, would be to attempt to define not the kind of memory that fights against postmodernity, but rather the kind of memory that operates within it. I will attempt to take up or at least better formulate this challenge at the end of this final chapter. But first I have to account for a shift in my analysis, which thus far has concerned itself with mapping an imaginary of

2 Patrick Bazin attributes a similar position (in defence of slowness against the threat of amnesia brought on by dematerialization of the mediums and speed of inscription) to the many fellow French intellectuals who fear, with George Steiner, that the humanities may be reaching their twilight because of the speed at which cultural memory is now forced to operate. In his critique of this position, Bazin argues that the new knowledge space promotes in fact the recontextualisation of heritage objects, thereby increasing their memorial value, and that its architecture enables this memory work to exist as part of an exchange economy. See Patrick Bazin, "Reconfigured Memory", in Giulio Blasi (editor), The Future of Memory (Turnhout: Brepols, 2002), p. 21.


anxiety in opposition to conceptions of memory in the digital largely produced by techno-scientific discourse. This chapter will pull postmodernism into the discussion, both as a form of cultural understanding and critique of this articulation which has to account too for certain biases, and as an anxiogenic discourse in its own right. The essays that make up Twilight Memories were written, as Huyssen explains, in the years following the wide-ranging debate about postmodernism of the early to mid-1980s (p. ii). It may be that in the meantime postmodernism, as a critical term, has lost some of its applicability and relevance, and that we have moved, particularly for the subset of concerns involving memory, representation and technology, into a phase that requires a recalibration of the critical tools used for its analysis and description. Darren Tofts proposes, aligning himself in turn with Jim Collins, that the culture is so heterogeneous as to pull out of the orbit of any of the prevailing narratives (post-Enlightenment, postmodernism, Information Age) commonly used to define it, and that we live in an age without a Zeitgeist3. Others, such as Susannah Radstone, are critical of crude mappings of epochal shifts, which regard memory as an a-historical constant instead of taking into account that its conceptualisation shifts over time. She explains further that, "while it is tempting to map the contemporary cultural foregrounding of memory onto a postmodern overturning of modernity's (blind) faith in futurity, progress, reason and objectivity, [...] such a move would paint too monochrome a picture of modernity and would establish an exaggerated opposition between modernity and the present."4 To these caveats I shall add another, namely, that terms such as the postmodern and the posthuman are double-edged, in that they can be useful focal points around which to organise one's observations and critique of cultural phenomena, but can also lead to self-

3 Quoted in Tofts, Memory Trade, p. 18.
4 Susannah Radstone, "Working with Memory: an Introduction", in Memory and Methodology, ed. Susannah Radstone (Oxford and New York: Berg, 2000), p. 3.


fulfilling analyses that cannot but produce outcomes consistent with their distinctive way of segmenting and historicizing culture. Accordingly, I have attempted to employ them as loose threads, rather than precisely mapped pathways. I am not interested, for instance, in arguing whether the posthuman, a term that I inherit more or less wholesale from Katherine Hayles, starts with Plato or Vannevar Bush or Philip K. Dick, nor whether it has in fact a beginning. The title of Hayles's book, How We Became Posthuman, proposes that in certain respects it makes sense to claim that we live in such an age, but leaves its point of origin in the indefinite past. I like that. I like it because it offers a genealogy and a critical lens through which to observe the interaction between humanity and technology at any given point in time, while still allowing one to distinguish usefully between a posthuman present, or imminent future, and a human past, however mythical, whose traits are up for contestation. Within the confines of this discussion, the posthuman helps me to conceptualise how memories interact with the body, how materiality shapes and is shaped in turn by the information flow. In the case of the postmodern, my main debt is to Lyotard's characterisation of it as a condition characterised by an "incredulity towards metanarratives"5, of which postmodernism would be the conscious, express articulation6.

From such a perspective, one would be compelled to ask: does it even make sense to still talk of memory? Memory used to be defined as the recollection of past events or impressions, and while it was always allowed that it would to some extent distort, misrepresent or in some cases even make up part of the picture, at least the past and the real did exist, and memory was supposed to lead back to them, whatever its shortcomings. For

5 Jean-François Lyotard, The Postmodern Condition: A Report on Knowledge, translated by Geoff Bennington (Minneapolis: University of Minnesota Press, 1984), p. xxiv.
6 To clarify what I mean, Lyotard would be a philosopher of the postmodern when he describes the prevailing features of the epoch, and a postmodernist philosopher when he explicitly aligns his own thinking with those features, as he does particularly in The Differend.


Plato in fact memory and truth were one and the same, and philosophy was a form of anamnesis, of unforgetting. By the time of Cicero rhetoricians were practicing their ars memoriae in an explicit recognition that memory had to be carefully trained and disciplined in order to serve as a useful tool. Later still, Giordano Bruno used this practical toolset to take another shot at the Platonic dream, using the art of memory to reach back into the secrets of the cosmos as they were imprinted in the mind. Erik Davis reminds us that Gnosticism is alive and well and feeling quite comfortable in the folds of the new technologies of memory and communication. For the Gnostics, as for the rational positivists, the past does exist and memory is one of the chief tools for its retrieval.

But postmodernism rejects the view that the past can be known in the same way that the physical structure of a crystal can be known, proposing rather that it is a place of contestation in which various epistemologies do battle. It is possible to regard this as an empowering, emancipating move: if no single view of the past can be imposed by those in power, it will be so much more difficult to reinstate totalitarianism in any form, and the groups traditionally left out of the writing of history will be able to bring their own pasts into the knowledge fray. But if the past is infinitely negotiable, then the memory that purports to refer to it must be also. Therefore we would have to speak of it in the plural, as in the case of our various histories and pasts and truths. Not one memory, but as many as there are people, and as wavering and inconstant as opinions and moods are.

What this constant negotiation also foregrounds is that memory is not solely nor even primarily a faculty of the individual (as the novel, Freudian analysis and cognitive science variously encourage us to believe) but of society. The central point of Maurice Halbwachs's seminal study on collective memory is precisely that it makes no sense to regard the memory of the individual in isolation from the group that he or she belongs to. And yet this tendency is as pronounced today as it was when Halbwachs was


making his observations, in the 1930s7. The controlled experiment in the laboratory, which tests its subjects through the use of shapes, numbers and so-called "nonsense syllables", goes to great and quite deliberate lengths in order to manufacture its object of study, the memory of the individual, but its careful constraints underscore how artificial this category is, and just how difficult it is to shut language and society out. In truth there is no memory without the social and cultural means of its transmission.

In the concluding paragraph of his book, Huyssen writes that postmodern, post-Auschwitz culture is fraught with a fundamental ambiguity, a destructive dynamics of memory and forgetting. This dynamics has been the overarching subject of my dissertation thus far, but I want to examine it now using as a starting point the adjacency of the qualifiers chosen by Huyssen to describe our age, "postmodern" and "post-Auschwitz"; an adjacency that bears strong implications regarding the social dimension of memory and the social imperative of remembering.

Memoricide
Postmodernism contends that one can speak with certainty about, say, certain physical properties of objects, but not about any event which enters the sphere of the social, that is to say which involves human interaction and communication mediated by language. Historical truth, to name the most obvious and important place of contestation, should therefore never be spelt with a capital tee or used in the singular, but rather seen as contingent on the nexus of power relations and epistemological tools prevalent at the time and place of its production. This is not to say that other worldviews do not now or did not in the past ever allow for scepticism concerning the validity of absolute pronouncements on the historical past, but rather that in

7 La Mémoire collective was published posthumously in 1954.


postmodernity this scepticism, Lyotard's incredulity, is so deeply ingrained as to have become a defining feature of the culture. For those affected by the postmodern condition, claims concerning the past are just that, claims, and any discussion about their relative merits entails the critical work of tracing, observing and describing their interplay. This approach is especially challenging in the arena of politics, where the shaping of ideas and programmes aimed at affecting the present has such a profound correlation to conflicting versions of the past that are increasingly difficult to evaluate and share. More and more often, umbrella terms such as "emancipatory politics" are used to encompass a great variety of projects, and their corresponding worldviews, which come together in focal points such as the regular meetings of the WTO or the G8, but which, aside from these geographical collisions, often appear to have little in common. Labour and market relations are no longer the principal coordinates used to map the past and the present. For many, the history of the last two centuries is shot through with different themes, such as the rape of the environment or the struggle for gay rights. This leads to the telling of very many different (hi)stories, which are traced and narrated according to the analytical and rhetorical framework peculiarly suited to each of them. What differs in these movements is not only what different people choose to care about, but the set of memories they bring to the task, each of them vying for a centrality around which the relevant political projects should be organised. Given the lack of overarching metanarratives and of common protocols for validating these sets of memories and assigning them relative value, the political body is now more fragmented than ever before. Its crisis is the crisis of History with a capital aitch.

Again, this is not a radically new phenomenon: political theories have always looked to interpret the past, if only in order to describe the present situation, and do so in implicit or explicit competition with other theories.


What postmodernism does is shine a light on the assumptions that make it possible to formulate statements of fact about the social, highlighting the power relations and language formations that enable certain subjects to speak with authority about certain topics while at the same time preventing other subjects from doing the same. In the sense in which Lyotard uses it, postmodernism itself plays a meta-role: it is a theory of how theories are constructed, the knowledge of how knowledge operates. So much for postmodern. But what about post-Auschwitz? From a white Western perspective, if there is a single memory that needs to be emphatically carried out of the twentieth century, if there exists a single trauma whose mark is so deep and lacerating that it defines the culture that has produced it, and challenges it to avoid its recurrence, it is the Holocaust. Keeping its memory alive is therefore seen nigh universally as a moral imperative, so much so as to justify judging an epistemology on whether or not it allows us to do so. This explains why the Holocaust, its memory, its knowability, is one of the fundamental problems that postmodernist philosophers, themselves in the main the product of white Western societies, have had to grapple with and fit into their conception of how knowledge is constructed. Lyotard for one makes it the central concern of one of his major works, The Differend. Here he makes the claim that
with Auschwitz, something new has happened in history [...] which is that the facts, the testimonies which bore the traces of here's and now's, the documents which indicated the sense or senses of the facts, and the names, finally the possibility of various kinds of phrases whose conjunction makes reality, all this has been destroyed as much as possible.8

This is an unusual formulation of what is unique about the Holocaust, in that it focuses not so much on the crime itself as on the destruction of the means to know about it. It is also more than a little problematic in a book that

8 Jean-François Lyotard, The Differend: Phrases in Dispute, translated by Georges Van Den Abbeele (Manchester: Manchester University Press, 1988), p. 57.


opened, following a series of preambles, with the following hypothetical question:


You are informed that human beings endowed with language were placed in a situation such that none of them is now able to tell about it. Most of them disappeared then, and the survivors rarely speak about it. When they do speak about it, their testimony bears only upon a minute part of this situation. How can you know that the situation itself existed? That it is not the fruit of your informant's imagination? (p. 1)

Delete the last four words in the first quotation, and the content of these two passages will be identical. The "what if" proposed by Lyotard is really about Auschwitz, then, and asks us to answer a very uncomfortable question: if there was no material evidence, how could we believe? The complete annihilation of the Jews and of other groups and populations under Nazi rule, along with every trace of the genocidal machine set in place to do so, was in fact the aim of the architects of the Holocaust; an aim that they may well have been able to carry out, were it not for Germany's military defeat. Had they completely succeeded, there would indeed have been no survivors, no blueprints, no documents, no mass graves, no death camps turned into museums, no gas-saturated bricks. No body of evidence, nor bodies proper, alive or dead. Just an enormous hole in the population of Europe, the mystery of a crime with an obvious perpetrator, but no hard evidence nor meaningful testimony. Had the Nazis succeeded, the enormity of the human tragedy would be thoroughly matched by the gap in our knowledge, a gap worthy of Lyotard's efforts to rescue truth and memory from such depths. But succeed they did not, not completely. Those four words, "as much as possible", faintly recognise this gap. Millions were killed, but some survived. Not all the corpses were burned. Not all the evidence could be destroyed or erased. As a result, we are in fact able to prove, under the broadest parameters of historical reconstruction, that the Holocaust took place. This is thanks to the survivors themselves, and our ability to verify their testimonies by examining not only other testimonies and the statements of intention on the part of the Nazis, but also the

considerable extant mass of documentary and physical evidence. Naomi Mandel has in fact called the Holocaust "the most thoroughly documented atrocity in human history",9 a pronouncement borne out by the many minute reconstructions of the inner workings of the Nazi death machinery, down to the chilling banality of bureaucratic routines10. But even these broadest of parameters are open to question, if one believes that what is ultimately observable are things like viruses and nebulae11, but that there is a quite insurmountable gap between what Jameson calls "the meaningless materiality of the body and nature" and "the meaning-endowment of history and of the social"12. Hence the uncomfortable feeling that Lyotard is pushing the Holocaust, too, toward the margins of what can be observed, that is to say known with cognitive certainty. In the opening page of The Differend, after the second passage reproduced above, he goes on to quote none other than Robert Faurisson, a former professor of Literature at the University of Lyon and a leading figure in Holocaust denial, claiming: "I have analysed thousands of documents. I have tirelessly pursued specialists and historians with my questions. I have tried in vain to find a single former deportee capable of proving to me that he had really seen, with his own eyes, a gas chamber"13. In establishing Faurisson as his antagonist, and accepting however provisionally his challenge to seek to reconstruct the Holocaust solely through conversation and argument, only to come face to face with the obvious (and odious) paradox that a victim of a gas chamber could not testify to his or her experiences, Lyotard engages in a dangerous

9 Naomi Mandel, "Rethinking After Auschwitz: Against a Rhetoric of the Unspeakable in Holocaust Writing", boundary 2 (2001), p. 205.
10 For one such account, see Deborah Dwork and Robert Jan Van Pelt, Auschwitz: 1270 to the Present (New York: Norton, 1996).

11 The telling choice of exemplary items, from the infinitely small to the infinitely large and distant, is Lyotard's own in The Differend, pp. 4-5.

12 Fredric Jameson, "Postmodernism, or The Cultural Logic of Late Capitalism", New Left Review 146 (July/August 1984), p. 59.

13 Robert Faurisson quoted in Lyotard, The Differend, p. 3.


game indeed. But it could also be that what he is really doing is rising to a different challenge, that of dismantling the dangerous contiguity that had developed between postmodernism and Holocaust denial. "It is important to observe that the deniers don't work in a vacuum," observes Deborah Lipstadt in her important study of this phenomenon, before launching into a description of the intellectual terrain that favoured its global intensification in the ten to fifteen years leading up to Lyotard's penning of The Differend. The deniers, she explains, are "plying their trade at a time when much of history seems to be up for grabs and attacks on the Western rationalist tradition have become commonplace. This tendency can be traced, at least in part, to intellectual currents that began to emerge in the late 1960s. Various scholars began to argue that texts had no fixed meaning. The reader's interpretation, not the author's intention, determined meaning".14 Noam Chomsky, probably the most well-known intellectual to have argued publicly that Holocaust deniers should be listened to and argued against in the name of intellectual freedom, in most respects stands well outside the intellectual orbit of postmodernism; but Lipstadt's point is well taken, even as she moves on to suggest that postmodernist practices are in fact dishonestly co-opted by the deniers, rather than being integral to them15. In noting that Faurisson used to describe himself as a practitioner of "the criticism of texts and documents, investigation of meaning and countermeaning, of the true and the false", she underscores how ironic this definition is, given that the practice of deniers consistently involves "the

14 Deborah Lipstadt, Denying the Holocaust: The Growing Assault on Truth and Memory (London: Penguin, 1994), pp. 17-18.

15 Pierre Vidal-Naquet is less generous in a critique that does not mention postmodernism specifically, but gestures at a "contemporary culture" (the inverted commas are his) that is not hard to characterise as postmodern, when he talks, in the context of the work of the deniers, about "inexistentialism", that is to say the move whereby social, political, intellectual, cultural and biological realities which were assumed to be well established are suddenly declared nonexistent. These include sexual relations, woman, domination, oppression, submission, history, the real, the individual, nature, the state, the proletariat, ideology, politics, madness and trees. Pierre Vidal-Naquet, Assassins of Memory: Essays on the Denial of the Holocaust, translated by Jeffrey Mehlman (New York: Columbia University Press, 1992), p. 3.


creation of facts where none exist and the dismissal as false of any information inconsistent with their preconceived conclusions",16 in a sort of warped and self-serving mock-deconstructivist approach. Lipstadt refuses in fact explicitly to use the term "revisionist", which the deniers claim for themselves and most of their critics use, precisely because it gives denial an apparent inclusiveness, and the character of a practice (the continual revision of a body of knowledge) which has not ceased to be held as an intellectual ideal in the transition to postmodernity. But even if one posited the entirely reasonable proposition that Holocaust denial has latched on parasitically to the body of postmodernism in order to restore legitimacy to a totalitarian political project that is in fact radically incompatible with it, the contiguity is worthy of being explored, for it gets to the heart of what many commentators, Huyssen amongst them, regard as the crisis of memory in postmodernity. To reiterate the point made earlier, the postmodern privileges a set of epistemological attitudes, some of them of older coinage, that call into question the veracity and permanence of our remembered past(s), and that are explicitly critical of attempts to establish absolute facts and truths about our historical past(s). Insofar as it has managed to preserve its prerogatives and its role, the discipline of history, which is quite independent from social memory at the same time as it contributes to it, will conceivably be able to defend its conclusion that the constellation of events that is referred to by convention with the name of Holocaust (as well as World War II, the battle of Hastings or the Conquista) did in fact occur, even in the absence of a lineal social memory of those events, for as long as it will be able to examine and cross-examine recorded testimony, documents and other forms of evidence, which in the examples at hand are extraordinarily abundant17.

16 Lipstadt, Denying the Holocaust, p. 9. A similar position is attributed to historian Arthur Danto by Christopher Kent in "Historiography and Postmodernism", Canadian Journal of History 34.3 (December 1999), pp. 385-431.


But the public discourse at large, where social memory is transmitted and contested, is far more porous, not tied to fixed protocols of enquiry and argument, and less able to develop antibodies against the kind of paranoid reading practices on which the work of Holocaust deniers is grounded. Paranoia allows one, for instance, to contend that a certain individual or group is responsible for a crime, and only thereafter to proceed to selectively collect and connect the evidence that proves their culpability as well as the crime itself. In the realm of fiction our friend Leonard Shelby from Memento is guilty of this, but so is the character of William of Baskerville in Umberto Eco's novel The Name of the Rose, which is an exploration of precisely these practices18. A paranoid mode of reading compels one to see only the connections that lead to a conclusion which is also the premise, according to the principles of abduction (as opposed to deduction) and at the expense of every alternative explanation encountered along one's path of reasoning. Contrary to the common view that in deconstruction tout se tient and anything goes (which is hardly the case anyway), paranoid practices allow in fact for one interpretation and one only, dictated at the outset and relentlessly pursued under the appearance of seeking a plurality of meanings. It is by appealing in bad faith to a pluralistic and contingent view of history (which they attribute, among others, to Foucault) that the deniers have asked that their position be allowed to be debated on an equal footing with those for whom the Holocaust needs to be studied and discussed, but who steadfastly agree that it did take place19.

17 This characterisation of the relationship between social memory and historical reconstruction is developed by Paul Connerton in How Societies Remember (Cambridge: Cambridge University Press, 1989), pp. 13-15.

18 See note 34 on page 83.

19 Recent and much-publicised attempts to introduce the study of intelligent design alongside evolution in schools in the United States also generally appeal to the need for fairness in the study of competing theories, in spite of the rather obvious fact that intelligent design is not a theory, but a belief. The obfuscation of this difference is achieved rhetorically by the use of expressions such as "pro-evolution propaganda", used most recently (and closer to home) by Renton MacLachlan in a letter to the editor published in The Dominion Post of September 9, 2005, p. B4.


The ultimate aim of the deniers' strategy, however, is not to put forward a contrary, dialectical position but to annihilate the counterargument, exposing it as a hoax and erasing the memory of the Holocaust along with that of its victims. Such an aim is truly a murder of memory, as suggested by the title of Pierre Vidal-Naquet's study of Holocaust denial, Assassins of Memory, or a "memoricide", to employ the term used by El País journalist Juan Goytisolo to describe the shelling with incendiary grenades of the national library at Sarajevo by Karadžić's forces in 1992, which was designed to obliterate the cultural patrimony of the Bosnian Muslims20.

Fig. 16 - The National Library of Sarajevo in 2004

The accusation could of course equally be levelled at the Nazis themselves, for the Holocaust was in a sense retroactive: it aimed firstly to erase every living trace of the Jewish presence in Europe, and then to rewrite the record

20 See Juan Goytisolo, "El memoricidio de Sarajevo", El País (9 October 2004), www.elpais.es, article search "El memoricidio de Sarajevo". In his novel State of Siege, Goytisolo writes that the smoke rising from the ruins of the library was as thick as that rising from the chimneys of the death camps. Juan Goytisolo, State of Siege, translated by H. Lane (San Francisco: City Lights Books, 2002 [first Spanish edition 1995]), p. 82.

The historical antecedents of such acts of course abound. Bruce Sterling, in his discussion of what he calls "mnemonicide", mentions the burning of the Mayan libraries and of the knotted strings of the Incas. See Bruce Sterling, "The Birth and Death of Memory", in Giulio Blasi (editor), The Future of Memory (Turnhout: Brepols, 2002), p. 89.


of that presence via Nazi anthropology and history21. What is most striking in the case of the deniers, however, is that they are able to pursue this objective of erasure without the might and resources of a totalitarian regime and its war machine, exploiting a radically different set of cultural conditions from the one in which the Nazis seized power and thrived. And yet the wound they seek to inflict is also profound. "Denial of an individual's or a group's persecution, degradation, and suffering is the ultimate cruelty," writes Lipstadt, "on some level worse than the persecution itself" (pp. 27-28).

Informatics, the Archive and the Holocaust

The dawn of the information age began at the sunset of human decency.
Edwin Black, IBM and the Holocaust

Forgetting extermination is part of extermination, because it is also the extermination of memory, of history, of the social, etc.
Jean Baudrillard, "Holocaust"

To destroy at once a people and a culture is an attack on memory at the same time as an exercise in mass murder; both a memoricide and a homicide. Having created the political conditions for this project, the National Socialist regime needed to set up an archive of such extent as to serve its obsessive aim to rid the people of Germany and of the countries under its control of their Jewish population, as well as of Gypsies, Slavs and other minorities. In order to round up and exterminate them, Germany needed first of all to create these groups, that is to say, to define what being Jewish or Gypsy or Slavic meant before the law, and compile the roll of
21 Gretchen Schafft explains that the collection of data on the populations slated for extermination carried out with increasing urgency by German anthropologists in the late 1930s and early 1940s served precisely this purpose, and hardly constituted an attempt to preserve the memory of those cultures and their presence. Gretchen E. Schafft, From Racism to Genocide: Anthropology in the Third Reich (Urbana and Chicago: University of Illinois Press, 2004).


their members. This process began in haste in 1933, in the first weeks of the Third Reich, when Hitler put Friedrich Burgdörfer, director of the Reich Statistical Office and of the Race Political Office, in charge of a census of all Germans. To limit this brief analysis to the case of the Jews22, the Nazi doctrine of racial genetics dictated that belonging to this group was not a form of living religious or cultural affiliation, but strictly a matter of ancestral bloodline. As Edwin Black documents in his extraordinary book IBM and the Holocaust, the building of the archive of racial Jews was a monumental task carried out thanks to the integral merging of the discipline of statistics with Nazi philosophy, and with the support of the information machinery provided by Dehomag, a German subsidiary of IBM. Dehomag custom-designed the punch card machines in use throughout the Reich to keep tabs on several categories of citizens, the Jews paramount among them, and ran the whole operation as a contractor. Its then director, Willy Heidinger, compared the running of the census to the work of a physician, in that it entailed dissecting, cell by cell, the German cultural body23. The results of this examination, carried out by an army of interviewers with the aid of storm troopers and SS officers, were tabulated at a pace made frantic by the urgency of the Reich's need to know:
A continuous "Speed Punching" operation ran two shifts, and three when needed. Each shift spanned 7.5 hours with 60 minutes allotted for "fresh air breaks" and a company-provided meal. Day and night, Dehomag staffers entered the details on 41 million Prussians at a rate of 150 cards per hour. Allowing for holidays and a statistical prediction of absenteeism, yet ever obsessed with its four-month deadline, Dehomag decreed a quota of 450,000 cards per day for its workforce. Free coffee was provided to keep people awake. A gymnast was brought in to

22 As Pierre Vidal-Naquet notes, the deniers have been preoccupied exclusively with denying the existence of the Jewish victims of the Holocaust. See Vidal-Naquet, Assassins of Memory, p. xxiii.

23 In Edwin Black, IBM and the Holocaust: the strategic alliance between Nazi Germany and America's most powerful corporation (London: Little Brown, 2001), p. 50.


demonstrate graceful aerobics and other techniques to relieve fatigue. Company officials bragged that the 41 million processed cards, if stacked, would tower two and a half times higher than the Zugspitze, Germany's 10,000-foot mountain peak. Dehomag intended to reach the summit on time. (pp. 57-58)

Statistics and advanced information processing were so integral to the building of this archive that it becomes difficult to conceive of the racial policies of Hitler's regime without them. Friedrich Zahn, publisher of the German archive of statistics magazine, claimed at the time that German statistics "has not only become the registering witness . . . but also the creative co-conspirator of the great events of time" (sic) (p. 49). The Nazi census of Jewry, echoing Derrida's pronouncement on the archival death wish we encountered in chapter two, contained the seed of its own destruction, in that it was built in order to be erased. Although open-air killings and deaths directly caused by enforced poverty were common among the Jewish population under German rule before the outbreak of the war, it was from 1940 that the Nazi apparatus proceeded systematically to destroy the archive along with the people it identified and helped to apprehend. This second part of the project, carried out in the extermination camps of Auschwitz, Treblinka, Majdanek, Chelmno, Sobibor and Belzec, demanded that no documentary trace be left, so that future censuses would not even need to reserve hole number 8 in the Dehomag card, the one indicating Jewish descent24. But the obsessive effort to eradicate, to erase memory and erase from memory, was evident in many aspects of the operation of the camps. In the account of his imprisonment at Monowitz-Auschwitz, from 1943 until the liberation of the camp by the Soviet army, Primo Levi is especially keen to explore this memoricidal dimension, seen in the radical fracture between past and present effected by the transition into what he elsewhere calls the
24 Hole 3 signified homosexual, hole 9 anti-social, hole 12 Gypsy. See Black, IBM and the Holocaust, p. 21.


"concentrationary universe", a phrase common among commentators and historians that points in itself to the radical nature of the fracture. It is not so much that the memories of the prisoners were lost, explains Levi, but rather that they became inoperative, so profound was the disconnect between life before and after entering the camp. Thus the induction into the camps, for those who were not sent to the gas chambers at once upon disembarking their convoy, started in highly symbolic fashion with the confiscation of all remaining personal effects, the humiliating practices of the haircut and the delousing, and the replacement of the prisoner's name with a number, which effected the literal translation from human being to statistical datum25.
He is Null Achtzehn. He is not called anything except that, Zero Eighteen, the last three figures of his entry number; as if everyone was aware that only a man is worthy of a name, and Null Achtzehn is no longer a man. I think that even he has forgotten his name, certainly he acts as if this was so.26

Alongside this primal loss was a constellation of others, including the prisoners' language and habits of mind and body. This took place again from the very outset of the imprisonment:
Nothing belongs to us any more; they have taken away our clothes, our shoes, even our hair; if we speak, they will not listen to us, and if they listen, they will not understand. They will even take away our name: and if we want to keep it, we will have to find ourselves the strength to do so,

25 By contrast, in Slaughterhouse Five, a novel partly set in the days surrounding the firebombing of Dresden, Kurt Vonnegut describes the experience of a group of American infantry being recorded on the POW camp's roll by German officers as a re-entry into the world of the living: "Everybody was legally alive now. Before they got their names and numbers in that book, they were missing in action and probably dead." The obvious difference is that in this case the archive was a safeguard against maltreatment and death, the exact opposite of the archive of the death camps. Kurt Vonnegut, Slaughterhouse Five (New York: Dell, 1968), p. 91.

26 Primo Levi, If This Is a Man and The Truce, translated by Stuart Woolf (London: Everyman Publishers, 2000 [first Italian edition 1946]), p. 46.


to manage somehow so that behind the name something of us, of us as we were, still remains. We know that we will have difficulty in being understood, and this is as it should be. But consider what value, what meaning is enclosed even in the smallest of our daily habits, in the hundred possessions which even the poorest beggar owns: a handkerchief, an old letter, the photo of a cherished person. These things are part of us, almost like limbs of our body; nor is it conceivable that we can be deprived of them in our world, for we immediately find others to substitute the old ones, other objects which are ours in their personification and evocation of our memories. Imagine now a man who is deprived of everyone he loves, and at the same time of his house, his habits, his clothes, in short, of everything he possesses: he will be a hollow man, reduced to suffering and needs, forgetful of dignity and restraint, for he who loses all often easily loses himself. (pp. 26-27)

To make matters complete, a routine of infinite and senseless actions was created and enforced, producing a new and profoundly other mnemonic space of ritual and habit:
[E]very morning one had to make the "bed" perfectly flat and smooth; smear one's muddy and repellent wooden shoes with the appropriate machine grease; scrape the mud stains off one's clothes (paint, grease and rust-stains were, however, permitted); in the evening one had to undergo the control for lice and the control of washing one's feet; on Saturday, have one's beard and hair shaved, mend or have mended one's rags; on Sunday, undergo the general control for skin diseases and the control of buttons on one's jacket, which had to be five. (p. 35)

These enforced rites replaced what Paul Connerton calls the "mnemonics of the body"27, which have not only an individual, but a crucial social dimension. All these memoricidal actions, coupled of course with the brutality of the conditions and of the labour, trapped the prisoners in a perpetual present which not even the prospect of selection seemed able to

27 Connerton, How Societies Remember, p. 74.


delimit. "Such will be our life. Every day, according to the established rhythm, Ausrücken and Einrücken, go out and come in; work, sleep and eat; fall ill, get better or die." For how long, you would soon learn not to ask. The future, outside of the immediate and urgent horizon of the day at hand ("how much one will eat [...] if it will snow, if there will be coal to unload", pp. 37-38)28, simply ceased to have meaning. What renders memory inoperative in this setting, then, is that the past has ceased to have meaning, and there is no future to work towards by applying one's experiences to the present. Levi's account constantly returns to this disconnect, the ways in which it operated and how painful it was to be reminded of it. He notes for instance how he and a group of other Italians had arranged to meet at a regular time and place, in the early weeks of imprisonment, but that the practice was soon abandoned, for it forced them to remember and to think (p. 39). And in his first sojourn in the camp's infirmary, where the routine of labour and daily tasks was suspended and replaced by a sudden limbo of inaction, he remarks on his amazement at finding the memories of his previous life painfully intact (p. 61). In Levi's case, this trajectory of memory loss is followed by one of reappropriation after he is shifted to a group of prisoners in charge of a chemical laboratory in the camp's annexed synthetic rubber factory. This change in fortunes he owes to his training as a chemist, which he also finds surprisingly intact in memory when he is called upon to demonstrate it in front of an Aryan scientist, during an examination surrealistically reminiscent of the many he had passed at the University of Turin. From this moment on, thanks to the relative improvement of his living conditions (the work was also hard and dangerous, but at least he spent time indoors and had more opportunities to steal supplies and sell them for food), Levi's temporal horizon begins to widen somewhat, although not as far as to include conceiving of a time after the camp, and of the return to the life he

28 Later (p. 159) Levi remarks that in camp slang the concept "never" was expressed by the phrase Morgen früh, "tomorrow morning".


had known before his capture. But he has now enough strength not only to reduce his chances of being selected, but also to rebel against memoricide. We witness this in one of the book's most extraordinary and touching chapters, in which he tells of his feverish attempt to recite to a fellow prisoner, in a mix of Italian and impromptu French, the Canto of Ulysses from Dante's Divine Comedy. It is at this time too that he begins to write a clandestine memoir, on pieces of paper he immediately destroys (for if found on his person these would lead to the charge of espionage and certain death) and which survive in the book in the form of the sentence "I write what I would never dare tell anyone"29. In a more reflective and less testimonial later work, The Drowned and the Saved, Levi characterises the National Socialist regime as having carried out an "Orwellian falsification of memory, falsification of reality, negation of reality",30 and it is difficult not to spot an echo of the doomed diary kept by Winston Smith in Orwell's Nineteen Eighty-Four in those scribbled notes of Levi's, which were no sooner written than made to disappear. If Zygmunt Bauman is correct in his claim that the Holocaust is a product of the age of modernity, as opposed to an aberrant act of unreason which deviated from it31, then Orwell's post-war work can be read as one of the sharpest and best-articulated explorations of the memoricidal function of totalitarianism. More to the point of the present discussion, however, comparing the practices described in Orwell's novel to the current modes of production and transmission, but also wilful erasure, of memory shows to what extent these have been retooled in the transition to the postmodern. Orwell's depiction of total information and mind control is still resonant, of course, but reading

29 The circumstance is explained in the post scriptum of the Everyman edition, p. 456.

30 Primo Levi, The Drowned and the Saved, translated by Raymond Rosenthal (New York: Summit Books, 1988), p. 31.

31 See Zygmunt Bauman, Modernity and the Holocaust (Ithaca: Cornell University Press, 1989).


Nineteen Eighty-Four from the vantage point of the present, one is struck by how cumbersome and somewhat contradictory the methods of the rulers of Oceania are, and how divergent from the ways of propaganda and manipulation in the current mediascape. Take the workings of the Department of Records, where Smith is employed. Here the party members of yesteryear are deleted from pictures; here garbled versions of poems which have become ideologically offensive are produced for use in anthologies; here, crucially, every pronouncement of Big Brother and the Party is retroactively made to fit the current reality.
In this way every prediction made by the Party could be shown by documentary evidence to have been correct, nor was any item of news, or any expression of opinion, which conflicted with the needs of the moment, ever allowed to remain on record. All history was a palimpsest, scraped clean and reinscribed exactly as often as was necessary. In no case would it have been possible, once the deed was done, to prove that any falsification had taken place.32

Even more ponderous are the efforts of the Ministry of Truth, with its three thousand rooms, whose primary job is not to reconstruct the past but to supply the citizens of Oceania with "newspapers, films, textbooks, telescreen programmes, plays, novels, with every conceivable kind of information, instruction, or entertainment, from a statue to a slogan, from a lyric poem to a biological treatise, and from a child's spelling-book to a Newspeak dictionary" (p. 193). To further ensure the purity of the public record, the citizens of Oceania are forbidden from autonomously producing any documents, especially journals or diaries. But even an organisation of this scope could not conceivably stem the circulation of subterranean, independent truths were it not for the rulers' ability actually to control the parameters of experience and interpretation. This is not achieved solely through the intimidation and constant

32 George Orwell, Nineteen Eighty-Four (Oxford: Clarendon Press, 1984 [date of original publication 1949]), pp. 190-191.


surveillance famously provided by the incessant gaze of Big Brother and of a myriad of two-way screens, but also through the development of a language, Newspeak, accurately engineered so that "a thought diverging from the principles of Ingsoc [...] should be literally unthinkable, at least so far as thought is dependent on words" (appendix, p. 417), and of the cultural practice of "reality control", or doublethink. As in Harold Pinter's Mountain Language memoricide is carried out by forbidding a people to use its language, so in Nineteen Eighty-Four a new dictionary becomes the means to change the way experience is processed and truth is constructed:
To know and not to know, to be conscious of complete truthfulness while telling carefully constructed lies, to hold simultaneously two opinions which cancelled out, knowing them to be contradictory and believing in both of them, to use logic against logic, to repudiate morality while laying claim to it, to believe that democracy was impossible and that the Party was the guardian of democracy, to forget whatever it was necessary to forget, then to draw it back into memory again at the moment when it was needed, and then promptly to forget it again: and above all, to apply the same process to the process itself. That was the ultimate subtlety: consciously to induce unconsciousness, and then, once again, to become unconscious of the act of hypnosis you had just performed. Even to understand the word doublethink involved the use of doublethink. (p. 186)

Memoricide on this scale requires nothing less than a totalitarian reach, inclusive of the power to define and sanction the correct use of language. But it, too, fails. Winston Smith is proof of this: an unremarkable man, a product of Oceania's education system, conversant in Newspeak, a man and tool of the bureaucracy and yet also a hater of the Party. In order to convert him, the enforcers of the Party doctrine have to summarily arrest him and torture him for days on end. They have literally to hardwire the loyalty to the Party into him, one electric shock at a time. Now this may still reflect in some ways how totalitarian societies operate today, and the account was certainly resonant at the time: making people un-exist, for instance, was a favourite technique of Stalin's, who routinely arranged for his former allies to be deleted from official photographs. I shall return later

to the example below, showing the deletion of secret police boss Nikolai Yezhov, following his fall in 1939 and subsequent execution, from a photograph in which he could previously be seen strolling by the Volga alongside Stalin33.

Fig. 17 - The same photograph of Joseph Stalin with and without the chief of the secret police Nikolai Yezhov

Insofar as photography contributes to our record of the past, this is a literal act of memoricide. But notice how this technique requires the ability to gather and destroy the old texts to be replaced with the new (falsified) ones, leaving the regime powerless to explain away the extant contradictory ones which are bound from time to time to escape its reach. This is why the rulers of Oceania ultimately need Newspeak to short-circuit logic and reason, so as to neutralise the effect of a comparison of two such pictures side by side. The idea of banning certain thoughts by manufacturing a language that does not allow them to be formulated is seductive, but then when Orwell goes into details, explaining for instance that the word "free" in Newspeak could only be used for such statements as "This dog is free from lice", but not in phrases such as "politically free" or "intellectually free", he asks us to accept a very limited and limiting notion of how language operates. Newspeak is not a qualitatively new language, simply one with a more restricted vocabulary (as the soon-to-be unperson Syme explains to Winston, writing the Newspeak dictionary involves "the destruction of words", cutting the

33 See Robert Conquest, "Inside Stalin's Darkroom", The Los Angeles Times Book Review (January 4, 1998).


language down to the bone). But nothing in its structure would prevent a group of dissenters from assigning a new meaning, or re-assigning an old one, to a word or cluster of words. A language whose syntax prevented the formulation of criticism, now that would be a guarantee against political opposition. But Newspeak is not it. Let us take a step back and consider the following passage, describing a typical task of Winston's at the Records Department:
[I]t appeared from the Times of the seventeenth of March that Big Brother, in his speech of the previous day, had predicted that the South Indian front would remain quiet but that a Eurasian offensive would shortly be launched in North Africa. As it happened, the Eurasian Higher Command had launched its offensive in South India and left North Africa alone. It was therefore necessary to rewrite a paragraph of Big Brother's speech, in such a way as to make him predict the thing that had actually happened. (p. 189)

Following the novel's logic, one might ask: why bother to modify the record of Big Brother's speech, when the Party could just as easily report that an offensive had in fact taken place in North Africa? For a regime that has total control over the media, it would seem easier to operate in that way than to maintain an obsessive grip on the production and circulation of texts, constantly reviewed to ensure their internal logical and factual coherence with respect to the conduct of the administration, and to make them withstand the scrutiny and criticism of whom, exactly? Even if there were any historians or scholars in Oceania, which seems unlikely and of which the novel gives no indication, they would not be able to practise except with the authorisation of the Party and for the production of officially sanctioned truths. Surely nobody else would have access to the records. I am not making these observations in order to highlight internal contradictions in the novel, but rather to show how ultimately outdated the operation of Orwell's memoricidal machine is, and how it, as well as Orwell's critique of it, still depends on the belief in a direct and univocal

correspondence between language and things, the record and the real. Although one could point at the very least to the work of Nietzsche and Freud, if not Plato, for a critical perspective on the foundations of this belief, it is in the postmodern that its remaining vestiges have been blown apart. Could it be, then, that it is the syntax of the postmodern that, under the appearance of promoting freedom of interpretation and highlighting the historical situatedness of our epistemologies, has finally prevented us from formulating meaningful criticism, speaking truth to power, challenging the status quo? It would be tempting to dismiss the question outright as an exercise in reactionary thinking, and to argue that the upholding of what Peter Novick calls the "noble dream" of objectivity34 is a far from innocent exercise that serves to defend the entitlements of the institutions traditionally in charge of the production of knowledge and truth. As James Kloppenberg writes in his review of Novick's study, reason itself is historical, and the insight should be welcomed as the good news of postmodernism35. But, as he goes on to observe, in reality this does not occur, and the spectre of radical historicism causes many historians, to the contrary, to retreat to a fortress of unexamined assumptions concerning the nature of the knowledge they can provide (p. 1011). The stakes for those who choose to go down the path of incredulity are simply too high, and ultimately involve questioning whether in our imminent future there will be any knowledge, any memory to preserve and transmit. If we turn to the public discourse at large, the negative connotations that have been piled upon the term "relativism" indicate that this is a resonant anxiety that should not only be critiqued but also probed, interrogated in order to understand how collective memory is seen to operate in the present.

34 Cf. Peter Novick, That Noble Dream: The "Objectivity Question" and the American Historical Profession (Cambridge: Cambridge University Press, 1988).

35 James T. Kloppenberg, "Objectivity and Historicism: A Century of American Historical Writing", The American Historical Review 94.4 (October 1989), p. 1011.


In the first instance, it would appear that the democratic societies of the West are still capable of circulating remarkable distortions and outright lies, but without the need to brutalise their whole populations and starve them of information. On the contrary, in the lead-up to the second Gulf War we saw how easily the intelligence agencies of the world's richest, most technologically advanced nations (whose citizens ought to be the most media-savvy) were able to gather a body of evidence in order to produce a factual claim that had to be established in advance to support the case for war. This fits quite neatly the definition of paranoid reading proposed earlier, but then of course, as Gilles Deleuze and Félix Guattari convincingly show, paranoia is the organising principle of modern intelligence collection and analysis. According to Deleuze and Guattari, the agencies justify their existence according to three propositions which, together, neutralise all present and future attempts to make them accountable according to traditional criteria of rational inquiry:
The first [proposition] is that in the secret world it may be impossible to distinguish success from failure. A timely warning of attack allows the intended victim to prepare. This causes the aggressor to change its mind; the warning then appears to have been wrong. The second proposition is that failure can be due to incorrect analysis of the agency's accurate information [...]. The third proposition is that the agency could have offered timely warning had it not been starved of funds.36

In the example at hand, the factual claim was that Iraq was in possession of a substantial arsenal of weapons of mass destruction, and that (minor corollary) these weapons would be found once the country had been invaded and its regime subjugated. As it started to become clear that the weapons would not be found, the occupiers did not bother to fabricate hard evidence to the contrary, nor did their leaders attempt to falsify the record of their case for war; instead, they loaded the argument with more claims, some factual (the weapons were hidden, or moved to other countries), others of

36 Gilles Deleuze and Félix Guattari, A Thousand Plateaus: Capitalism and Schizophrenia, translated by Brian Massumi (Minneapolis: University of Minnesota Press, 1987), p. 406.


intent (our aim was always to effect regime change), others still questioning whether the falsehood of the original claim necessarily implied dishonesty on the part of the leadership (we acted on the basis of what we believed to be true).

This type of loading should not be confused with the kind we encountered in chapter two, culminating in the discussion of White Noise, whereby an excess of information caused a dulling of memory and of one's critical faculties. That one was the unintended by-product of a media-saturated society; this one is a calculated exploitation of that known effect in a very specific and important area of public conversation, pertaining to the accountability of government in critical acts of decision-making. Such processes are not memoricidal per se, although often in Western democracies forms of temporary censorship are enforced or the institution of the state secret is employed in order to seal a record and defer its public recollection until a future moment in time. But widely circulated texts and public pronouncements cannot be confiscated and rewritten; the existence of media independent from government and the proliferation of personal archives do not allow for it. The strategy then is to exploit the plurality of points of view, infiltrate the very permeable means of communication and transmission of shared knowledge, in such a way as to promote ambiguity concerning the past, or pasts, that are being contested.

In the WMD example, the claims post facto serve more or less the same function as screen memories in Freudian theory, in that they differ from the original claim in a few substantive respects, but are otherwise similar enough (also rhetorically and contextually) to successfully interfere with the act of recall. Therefore, although the White House was never successful in establishing a link between Al Qaeda and Saddam Hussein, it spoke of the two together on so many occasions that, as a number of surveys have shown, the majority of Americans believe that the case for their complicity was in


fact made37. This strategy of letting the public fill in the blanks is also effective in fudging the record (that is to say, blurring the memory) of whether certain claims were actually made or not by the Government. And in case the situation was perceived as not being commensurate with the total abolition of public discourse in Orwell's novel, I would like to point out that the sentence "Oceania was at war with Eurasia: therefore Oceania had always been at war with Eurasia" (p. 185) could easily be applied to the American position in the Middle East, if one replaced Oceania with the United States and Eurasia with either Saddam Hussein or Osama bin Laden. Again, it is not the case that those former allegiances are denied, nor their record destroyed or falsified, but rather that the relevant disputes are now framed in such profoundly different terms as to have severed the past they belong to from the present of political debate and action. They are memories that nobody really knows what to do with.

***

The cultural vectors and biases generally attributed to postmodernity are, I believe, relevant to any investigation of the contemporary intersection of memory, representation and society. If anything, the postmodern insistence on questioning the authenticity of experience is becoming more and more entrenched and radicalised as society becomes more widely and deeply computerised38. Debord's famous pronouncement that "all that was once directly lived has become mere representation"39 is easily applied not only to the spectacle of the media, but also to the lived, at the same time as they are virtual, communities of cyberspace. Nowadays one would be more likely to
37 A Harris poll taken in February 2005 estimated this number at 64% of the adult population. Various organisations have been keeping track of this bellwether of public support for the conflict since the months leading up to it. See the Harris Interactive website at http://www.harrisinteractive.com/harris_poll/index.asp?PID=544 (as at November 1st, 2005).

38 I shall discuss other implications of this convergence in the final part of the chapter.

39 Guy Debord, The Society of the Spectacle, translated by Donald Nicholson-Smith (New York: Zone Books, 1994 [first French edition 1967]), p. 12.


object that there is nothing mere about representation, but at any rate that pejorative connotation is nowhere to be found in Debord's original phrase ("Tout ce qui était directement vécu s'est éloigné dans une représentation"40). So we can quote him out of context and pretend that he might have foreseen new possibilities at the same time as he was deploring the fragmentation and recombination of the real into a "pseudo-world apart" (p. 12), that which we would nowadays call the virtual. Except that Debord's pseudo-world was an object of detached contemplation, whereas the virtual, as has been argued by very many commentators and is evidenced by some of the cyberspace narratives discussed in the last chapter, can be more meaningfully inhabited.

In the early 1950s, Ray Bradbury imagined the then-neonate television as an immersive medium to be projected against the four walls of a specially rigged room in the house. The spectator, standing in the middle of this architectural/electronic space, the parlour, would be assigned a character in the televised play being performed and the actors on screen would pause to allow him or her to deliver their lines. The medium would achieve in this way a pretence of interactivity and participation, giving to the spectator a powerful illusion of being there, on the screen. This is something that television does all the time, of course, sometimes in only slightly less literal ways: the great popularity of the quiz in the early stages of both radio and television can be in part explained by its ability to generate identification between the spectator and the quiz participant. But Bradbury's imagining is very apt at demonstrating the full mobilising power of Debord's world apart, at the same time as its ability to devalue and push to the margins existence on this side of the screen. The novel is Fahrenheit 451, and its protagonist couple, Montag and Mildred (she an avid consumer of televised plays, he a fireman in charge of burning books), have forgotten how they first met only ten years earlier. As is consistent with the scenario I discussed in chapter two, the erasure of
40 Guy Debord, La société du spectacle (Paris: Gallimard, 1992), p. 15.


memory comes not only in the form of destructive streams of fire but also through the airwaves, in the form of packaged entertainment and the alienating sheen of electronic media. But while the novel can be read superficially as a lament for the demise of print, it does not idealise the old medium nor demonise the new one(s), but rather highlights the role of the society in which texts are produced and human interactions occur. As Faber explains to Montag:
It's not books you need, it's some of the things that once were in books. The same things could be in the parlour families today. The same infinite detail and awareness could be projected through the radios and televisors, but are not. No, no, it's not books at all you're looking for! Take it where you can find it, in old phonograph records, old motion pictures, and in old friends; look for it in nature and look for it in yourself. Books were only one type of receptacle where we stored a lot of things we were afraid we might forget. There is nothing magical in them at all. The magic is only in what books say, how they stitched the patches of the universe together into one garment for us.41

Bradbury understood the effect of media in the same way as Harold Innis did, that is to say, as a bias as opposed to a deterministic inevitability, and articulated the immediacy, the blasting power of the four-wall televisor as a potential threat to memory only insofar as the then-present cultural conditions allowed things to unfold that way. Critics such as Debord, or the Neil Postman of Amusing Ourselves to Death, argue that they have done so, and both ultimately blame what Debord, in his 1988 Comments on the Society of the Spectacle, calls "the owners of the world"42. To reiterate an earlier point, this memoricide differs from the amnesia caused by informational overload precisely in that it is a calculated effect, a malicious use of the media as weapons of mind control on a grand scale. But notice the distinctly postmodern shift that occurred between Orwell and Bradbury

41 Ray Bradbury, Fahrenheit 451 (New York: Ballantine Books, 1953), pp. 82-83.

42 Guy Debord, Comments on the Society of the Spectacle, translated by Malcolm Imrie (London: Verso, 1998), p. 6.


in the space of less than a decade. In Fahrenheit 451, the citizens are no longer crushed by excessive work, hunger or physical pain, but by entertainment. The construction of a true world apart is precisely what the rulers of Oceania stopped short of aiming for. Absorbed by their obsession to continually revise the archive of public writings so as to match the real, they never ceased to believe in the referentiality of language, and did not understand that true doublethink would entail not the reduction, but the multiplication of meanings. Worlds apart create virtual doubles and sets of memories not based on direct lived experience, but equally powerful. If we set this consideration against Freud's concept of screen memory, we find that an accident of language (Freud of course would contend that there is no such thing) has given us a new metaphor: the electronic screen of television, cinema and the computer becomes the instrument of concealment, the key to a successful repression and substitution of memory. Meanwhile the underlying psychic distress, left unacknowledged and unexplored, is allowed to fester under the surface of a stupor-like happiness, only to surface in explosive acts such as Mildred's attempted suicide after a normal and seemingly happy day spent in the company of the parlour family at the beginning of Fahrenheit 451. Bradbury understood too the link that Debord makes explicit between the society of the spectacle and the imperatives of market capitalism, and in one of his sharpest satirical moments has a character remark that
Christ is one of the 'family' now. I often wonder if God recognizes His own son the way we've dressed him up, or is it dressed him down? He's a regular peppermint stick now, all sugar-crystal and saccharine when he isn't making veiled references to certain commercial products that every worshipper absolutely needs. (p. 81)

Ersatz sociality projected on the television screen, the regimentation of the public into consumers, the marginalisation of the media and modes of communication more biased towards reflection and criticism: this is the

blueprint of the society of the spectacle critiqued by Debord and vividly fleshed out by Bradbury. But as we step further into the postmodern tangle, we find that this conception too falls short of accounting for the prevailing cultural conditions, in that it assumes on the one hand the primacy of lived experience, and on the other that the two worlds can successfully be kept apart. As Sherry Turkle documents in her landmark study Life on the Screen, the actual and quite conscious everyday experience of many citizens of high-tech societies is that the virtual and the real continually bleed into each other, mirroring the scrambling of the binary pairs of inside and outside, natural and artificial, human and machinic which has accompanied my discussion thus far. But at the same time, as indeed was the case with those seemingly outdated and yet persistent divides, we are constantly reminded of the resilience of the real. With good reason, reviewers of Turkle's book never fail to quote the quip of one of her informants that "RL [Real Life] is just one more window, and it's usually not my best one".43 This brilliant line not only supports the case of those who claim that the virtual is a place of legitimate and meaningful socialisation, and who more and more often come to privilege life on the screen in terms that are far richer than the reductionist vision of the likes of Moravec or Negroponte, but it also shows that that one window of bodies of flesh and sharp corners, of nourishment and sex and sleep, has not quite been transcended yet and, however begrudgingly, we find ourselves still compelled to call it by its old name, the real.

The Great Moon Hoax


However uncertain the timing and extent of the transition from modernity to postmodernity, a convincing case can be made with respect to the transmission of memory that there is a progression, a troubled shift from one cultural logic to the other, and that many texts and phenomena can be

43 Sherry Turkle, Life on the Screen (London: Weidenfeld & Nicholson, 1996), p. 13.

224

plotted along this trajectory. Thus, in the years immediately following the end of the war, neo-Nazi proselytisers used to argue that the case for the Holocaust had been overstated by the enemies of Germany, but that at any rate the mass murder of the Jews had been a justified act of defence against an ethnic group of known spies and traitors; and in time their strategy shifted from justification to the denial of that original sin altogether, and to the exposing of the Holocaust 'hoax'. Hoax, alongside paranoia, is another keyword of this transformation, and although it predates the postmodern, its cultural status has greatly intensified in the course of the last three decades. Beginning on 25 August 1835, the New York daily the Sun published a series of six articles, purportedly reprinted from the Edinburgh Journal of Science, reporting the extraordinary discoveries of life on the moon made by British astronomer John Herschel from his observatory at the Cape of Good Hope. The Edinburgh Journal of Science did not exist, but Herschel was an actual astronomer, and would later learn of the extravagant discoveries, including unicorns and a hominoid race of man-bats, attributed to him by the Sun. The pattern of referring to a secondary source and to actual institutions and people that could potentially be traced gives to this pioneering hoax, enormously popular at the time and nowadays credited with starting the fortunes of the newspaper, a character consistent with the narratives that circulate today on the World Wide Web and in the looser network of so-called urban legends, both extraordinarily fertile grounds for material of this kind. But at the same time as events and records are manufactured, as in the case of real hoaxes, real events are called into question and painted as hoaxes. How do we know which is which? That the answer does not come quite so readily to us is due to the postmodern tangle I am endeavouring to describe. In the case of the Holocaust, even to debate the proposition that it might be a hoax requires the rejection of all traditional standards of historical proof and enquiry. But other events, more tightly wrapped in the representational media, are open to greater contestation. They occupy a somewhat greyer area. Such is the case of the Moon landing of the Apollo 11 spacecraft in 1969. The website of the American Patriot Friends Network features in its page on the subject the well-known photograph of Buzz Aldrin's footprint on the lunar surface, above a caption that reads 'Did man really set foot on the moon?'44

Fig. 18 - The footprint of astronaut Buzz Aldrin on the Moon

44 The American Patriot Friends Network, Was the Apollo Moon Landing Fake? http://www.apfn.org/apfn/moon.htm.

The truly remarkable thing about this photograph is that it looks so unremarkable. The footprint resembles a million others and could have been impressed upon any similarly rugged and arid terrain on Earth. It becomes special only if one accepts the truth statement offered by the photograph when accompanied by one of its common captions (such as Aldrin's own 'Solo footprint on the Moon'). Change the caption, as the APFN did, and the photograph actually invites disbelief. This is the supreme irony underlying the Great Moon Hoax debate: the documentary evidence of the presence of Armstrong and his crew on the Moon could easily have been forged; in fact, at times it even looks forged. The waving flag in many of the photos and videos taken during the mission is a classic example: how, many sceptics ask, could a flag wave in the absence of wind? The explanation put forward by one of the more comprehensive archives of the anti-hoax camp, the Internet Moon Hoax Page, states that
NASA appreciated that there would be no wind on the moon, so any normal flag would just hang limply and unattractively down the pole. To make things look better they added a bar that stood out at 90 degrees from the pole. The flag was really hanging from this, rather than from the pole. The bar was also not quite the full width of the flag, so that it was slightly furled to give a 'wave look' to it. 45

45 Moon Hoax website. http://www.redzero.demon.co.uk/moonhoax/.

If one accepts this explanation, it was to make the flag look more realistic, that is to say more as a flag would look in a film, where the right wind conditions are always made available, that NASA manipulated this image, and in so doing unwittingly served the cause of its future critics. This is a delicious Baudrillardian twist: since we seem unable to apprehend the real except in an ever more highly mediated form, in order to document a real event we feel we have to stage it, to produce it as memory in the hyperreal. There is an added touch of irony here, in that the mission was inscribed in a propaganda race against the Soviet Union, and in that regard it is easy to argue that it was, in fact, a spectacle. Thus the careful choreography, the flag flapping in the wind, and the filming of Armstrong's first step onto the Moon's surface through an automatic video camera which landed before any of the astronauts, another detail that the sceptics would seize upon in order to argue that the mission never took place. What we are talking about, clearly, are two quite separate ontological levels. On the one hand, there is the question of whether a hunk of tin was sent into space with three people on board and landed on the Moon on such and such a date; on the other, there are questions related to the meaning of the event, and in particular whether the landing was a staged spectacle or not. It is quite possible to answer in the affirmative to both sets of questions, but the former is governed entirely by a binary logic: either the mission, in that very limited sense, did take place, or it did not. And how do we know that it did? All the empirical evidence is in the possession of the scientific-military complex that organised the mission, which also took full charge of its reporting. But the rock specimens are no conclusive proof, for it cannot be demonstrated that they come from the Moon except by going up again and kicking some more around, and that the filming could have been made on a sound stage on Earth is something which simply cannot be disproved. Conversely, there is no way to conclude empirically that the mission took place,46 and countering each of the sceptics' proofs will not do either. For both positions, in a sense, are preconceived. One assumes that NASA lied, the other does not, and everything else follows. In the days before postmodernity, Occam's razor would have been quite sufficient to slash through the sceptics' argument, and perhaps it still is, insofar as conceivably only very few crackpots actually believe that the Moon landing was a hoax, while the rest of us quite enjoy hearing the argument. The 1978 film Capricorn One, in which the failure of a mission to Mars moves NASA to simulate it on Earth, only to find itself having to murderously silence those within the organisation who refuse to participate in the deception, shows both how implausible and how entertaining the scenario is. Entertaining, too, because it is fun to entertain the idea; and besides, to believe that nobody ever went to the Moon, or conversely that there are aliens hidden at Area 51, seems a lunacy innocuous enough: lending an ear to such fantasies does not taint us.

46 I mean currently. It may very well be possible in the future to build a telescope powerful enough to see the flag and the remains of the lander from Earth.

Of Recovered, False, Post- and Prosthetic Memories


What makes reading The Differend so profoundly uncomfortable, on the other hand, is the feeling one gets that Lyotard believes the Holocaust too to have occurred in outer space, or in a laboratory environment where it makes no sense to talk of proof or evidence, for reality is manufactured ex nihilo and governed by those in control. Even though his ultimate aim is to argue that a philosophy truly rooted in postmodernism is the exact antithesis of the logic of National Socialism, it is distressing that Lyotard should be willing to grant Faurisson the status of antagonist and concede to the deniers that part of the argument, allowing them to propose that we shall never know, for sure, what happened in the extermination camps, or if the gas chambers actually existed, for testimony and counter-testimony are all that is left once what could be observed has been removed. Framed in these terms, the Holocaust becomes an emotional trauma whose uncertain nature can only be investigated by means of personal recollection, as opposed to social memory and historical enquiry. But the value of personal recollection in the reconstruction of actual, lived experience has come under sustained attack too, and is, if not quite discredited, certainly much less of a given than it has long been, by tradition, in the judicial forums of most cultures. As late as 1981, United States Justice William Brennan was able to quip that there is almost nothing more convincing than a live human being who takes the stand, points a finger at the defendant, and says 'That's the one!'47 Since then a number of researchers, the most widely quoted of whom is perhaps Elizabeth Loftus, have tried to give empirical substance to the proposition that remembering is a form of imaginative reconstruction, this time with the express aim of questioning whether the privileged status of eyewitness testimony in the judicial setting is actually justified by the degree of accuracy of the findings thus obtained.

47 In Roger B. Handberg, Expert testimony on eyewitness identification: a new pair of glasses for the jury, American Criminal Law Review 32.4 (Summer 1995), p. 1013.

This argument was helped along and made altogether more urgent by a new trend in testimony which emerged in the 1980s, when the practitioners of so-called recovered memory therapy helped a number of patients to uncover repressed memories of past sexual abuse, often at the alleged hands of family members. The first convictions sparked a concerted and extremely organised backlash, led by organisations in the United States and Britain but also in New Zealand, which resulted in many of the verdicts being overturned. Rather than questioning the good faith of the accusers, the critics sought to question the technique which had enabled the memories to be recovered (or, as they would have it, fabricated and implanted). As part of this campaign, London psychotherapist Janet Boakes declared in a 1995 article in The Lancet the existence of a 'false memory syndrome' and launched an attack on the unqualified therapists whom she deemed responsible for its outbreak. Boakes was unwavering in her characterisation of where the truths and untruths of the matter lay (the emphases are mine):
False memory syndrome describes the "recovery" of vivid memories of events which did not take place. Affected adult patients accuse their parents of childhood sexual abuse which had been "forgotten" until triggered during therapy, and which those accused say never happened. There is no doubting the sincerity and conviction of these patients, nor the equal vehemence with which parents deny the allegations. Large numbers of accused families have formed themselves into societies to proclaim their innocence and to combat the spread of iatrogenic memories.48

48 Janet Boakes, False memory syndrome, in The Lancet 346.8982 (Oct 21, 1995), p. 1048.

While Boakes's criticism of those largely untested methods was valid, her equally untested certainty that recovered memories simply had to be false violated the same criteria of empirical methodology and professional objectivity that she claimed to uphold. Giving a name to the problem, calling it a syndrome, was a key element of her rhetorical strategy, and much of the ensuing battle has hinged on the presumptive acceptance or rejection of the term. Seeking to restore some balance to the debate, professor of psychiatry Judith Lewis Herman found fault instead with the media campaign which had brought the recovered memory/false memory question to the attention of the public:
These stories were constructed like nesting boxes from a set of three questionable propositions: first, that false claims of sexual abuse are common and increasing; second, that claims based on delayed recall are especially likely to be spurious; and third, that fictitious memories of abuse have been inculcated wholesale in a gullible populace by quack psychotherapists, self-help support groups, and religious fundamentalists. Since none of these points can be documented empirically, reporters relied heavily on anecdote, speculation, and the opinions of a small group of professional experts. The overall effect of these stories was to favour the position of those accused of sexual abuse, allowing them to claim the support of educated opinion, while relegating their accusers to the realm of mass hysteria.49

49 Judith Herman, Presuming to Know the Truth, in Nieman Reports 48.1 (Spring 1994), p. 44. For a similar analysis, see Mike Stanton, U-Turn on Memory Lane, Columbia Journalism Review (July/August 1997), http://www.cjr.org/year/97/4/memory.asp.

This perspectival shift highlights that the problem of the status of recovered or false memories lies in the failed convergence of personal, scientific, journalistic and judicial truth, in an exemplary case of what has become an endemic and radicalised situation in postmodernity. The Holocaust offers again some of the most virulent symptoms of this systemic clash of epistemologies, as is evidenced by the work of the deniers, whose cause was helped by the publicity surrounding a few fraudulent memoirs, most famously Bruno Grosjean's Fragments, which purported to be a chronicle of the author's experiences in Majdanek and Birkenau, and novelist Jerzy Kosinski's The Painted Bird. As well as assuming a false identity, claiming to be a survivor by the name of Binjamin Wilkomirski, Grosjean revealed that his memories of that period had been recovered through therapy. It is no surprise, then, that some of the arguments used to combat recovered memory therapy, notably those of Elizabeth Loftus, were co-opted by Holocaust deniers in order to cast wholesale doubt on the testimony of any survivor. But new and sympathetic theoretical approaches to remembering the Holocaust, such as the influential notion of postmemory, have also produced some ambiguity regarding the status of direct testimony. In her 1997 book Family Frames, Marianne Hirsch defines postmemory as a powerful and very particular form of memory pertaining, in its initial formulation, to children of Holocaust survivors, but applicable to other second-generation memories of cultural or collective traumatic events and experiences.50 While Hirsch later retooled the concept to allow for a broader and less problematic definition to go alongside the original one, this positing of a second-order recollection raises a number of issues. By prefixing a paper on the subject with the famous passage in which Susan Sontag described her life as being divided into the time before and after she saw her first photographs of a concentration camp, at the age of twelve, Hirsch herself underscores how tenuous the privileging of filial identification is: by her own admission, Sontag knew little of the Holocaust when she saw those pictures, and her family, while Jewish, lived in the United States at the time. Should we disbelieve her, then, when she writes: 'When I looked at those photographs, something broke. Some limit had been reached, and not only that of horror; I felt irrevocably grieved, wounded, but a part of my feelings started to tighten; something went dead, something is still crying'51? What does it mean to say that the children of survivors 'remember', albeit in inverted commas, whereas, it is implied, others do not? And if the appeal to memory is reserved to the children of survivors, what of the children of those who did not survive, or who did, but, as is quite common, chose not to speak to their children or indeed anyone else of their trauma? And finally, but perhaps more importantly, will the cultural, collective act of

50 Marianne Hirsch, Family Frames: Photography, Narrative, and Postmemory (Cambridge: Harvard University Press, 1997), p. 22.
51 Susan Sontag, On Photography (New York: Anchor Doubleday, 1989), pp. 19-20.

remembering the Holocaust cease altogether when the last of the children of the survivors dies, or will the following generation engage in post-postmemory? As I mentioned, Hirsch's later work on postmemory defuses some of these questions by leaving the door open to a weaker interpretation of the term, whereby familial ties only facilitate, as opposed to determine, identification on the part of the children of trauma victims, and strong bonds of other kinds can make access to postmemory more broadly available. Thus postmemory could come to designate an intersubjective transgenerational space of remembrance, linked specifically to cultural or collective trauma,52 a category so broad as to lack meaningful theoretical applicability. All it tells us is that feelings of personal involvement are an important element in the transmission of collective memory, but I would suggest we did not need to look any further than Sontag's quotation to figure that out. Ruth Franklin, an especially scathing critic of those who practise or promote what has come to be called second-generation Holocaust testimony, has likened the phenomenon to a grotesque form of identity theft, citing one author's flaunting of the use of his own father's tattoo number as a code for his ATM card.53 While the judgment is certainly harsh, there is in the notion of postmemory an attempt to authorise a transgenerational identity transfer aimed at prolonging a direct, privileged and therefore truthful link with lived experience. In this regard it is telling that the term was coined so late in the piece, at a time when this second generation had reached its middle age. I would argue that what made this theoretical move so timely and urgent was perhaps another factor, namely the imminence of the collectively dreaded time when the last of the survivors would die. This prospect
52 Marianne Hirsch, Surviving Images: Holocaust Photographs and the Work of Postmemory, The Yale Journal of Criticism 14.1 (2001), p. 10.
53 Ruth Franklin, Identity Theft: True Memory, False Memory, and the Holocaust, New Republic (31 March 2004), http://www.tnr.com/doc.mhtml?pt=KvE0N9Qg2Zl/U3JSXyXLSx==

sparked in the 1990s a number of documentary projects, the most notorious and well-funded of which is the Shoah Visual History Foundation Archive associated with the name of Steven Spielberg, which has collected to date nearly 52,000 testimonies.54 But postmemory defers the time when living memory of the Holocaust will simply cease to be available, and keeps alive a generation of people able to answer new questions. At the same time, Hirsch has woven this theoretical move, which seeks to restore a privileged and authentic place of experience, together with another, which highlights the importance and describes the role of the photographic image in the transmission of memory. The result is a complex mix of, on the one hand, the formative narratives imparted by eyewitnesses to their children, with an emphasis on the continuity of transmission, and, on the other, photographic fragments that are discontinuous in time and place, but together fill a space in recollection that words alone could not occupy. The effect, paradoxically, is to further confuse us as to what bearing witness even means, who is authorised to fulfil this role, and what are the social and technological instruments that enable them to do so. Alison Landsberg's notion of prosthetic memory, briefly touched upon in the introduction, is another important recent theoretical construct that seeks to broaden the availability of direct experience. As Landsberg explains,
this new form of memory [...] emerges at the interface between a person and a historical narrative about the past, at an experiential site such as a movie theater or museum. In this moment of contact, an experience occurs through which the person sutures himself or herself into a larger history. In the process that I am describing, the person does not simply apprehend a historical narrative but takes on a more personal, deeply felt memory of a past event through which he or she did not live. The resulting prosthetic memory has the ability to shape that person's subjectivity and politics.55

54 See the website of the USC Shoah Foundation Institute at http://www.vhf.org/.
55 Alison Landsberg, Prosthetic Memory: The Transformation of American Remembrance in the Age of Mass Culture (New York: Columbia University Press, 2004), p. 2.

The emphasis again is on the ability of cultural technologies to synthesise the depth of feeling traditionally reserved for lived experience, and to produce memories that maintain a sensuous component and will thereafter be worn on the body (p. 20) by the audience. The main point of difference from Hirsch's theory resides in the fact that prosthetic memories produced by mass-mediated representation do not presuppose the need for familial kinship and are able therefore to span much greater temporal and subjective distances, transcending race, class and gender (p. 21). In her analysis, Landsberg explicitly departs from the respective positions of Baudrillard and Jameson, whom she regards as unwittingly longing for some earlier moment when, for Baudrillard, there was a real or, for Jameson, people experienced history in an authentic way (p. 33). In the continuum that links the modern to the postmodern, she argues, both the real and the authentic have always been limit cases, ideal states, always dependent on a contingent negotiation regulated by the prevalent cultural conditions. Hasty pronouncements on the death of the real, too, are in Landsberg's view belied by the popular obsession with authentic experience, as evidenced by the re-enactments of historical events such as the American Civil War or the boom of the experiential museum. Landsberg's theory hinges on a highly sophisticated understanding of the role of representations in the transmission of cultural information, and is both cogently argued and lucid in its assessment of the positive role that these acts of mass mediation can play in the production and articulation of individual subjectivity and political consciousness. In a reversal of Debord's position, she contends that prosthetic memory does not alienate us from reality but on the contrary can suture us into the past, and teach ethical thinking by fostering empathy (p. 149). But what is strikingly lacking from her analysis, in that it is only mentioned but never grappled with, is precisely that this positive outcome is in no way determined by the nature of representation, and that the same media that can foster greater understanding and empathy for the plight of other peoples can just as easily make us blind, or dull, or forgetful of them too. Thus her in-depth study of Roots (1977), a committed and highly successful television series adapted from Alex Haley's novel, which is credited with raising in its worldwide audience the awareness of and the emotional stakes in the suffering caused by the slave trade, is matched by a single, passing mention of D.W. Griffith's The Birth of a Nation (1915). Yet this enormously successful film served for years as the principal tool of propaganda of the Ku Klux Klan and is seen by many as having been instrumental to its revival and the birth of the second Klan, which would grow to include as many as four to five million members in the 1920s. From within Landsberg's theoretical framework, Griffith's revisionist narrative of an American South rid of its black and mulatto population by the heroic Klansmen has to be regarded as just about the single most powerful prosthetic memory in the history of the cinema, and there can be no denying its role in the formation of both the individual subjectivity and collective political consciousness of its domestic audience. But this most exemplary of cases falls curiously outside the scope of her enquiry, as does the possibility that mass-mediated representations may graft onto the viewing public bad history along with the good,56 and foster intolerance (think The Passion of the Christ) by exactly the same means as tolerance. Addressing this problem requires a reliance on traditional canons of historical reconstruction (as opposed to the real and the authentic, which she blames Baudrillard and Jameson for not being able to shake off) as the implicit, invisible and yet essential condition for the production of prosthetic memories. Any questions as to what ensures that representations remain faithful (neutral) to the past, and how this fidelity could ever be tested, are simply not addressed. Instead, Landsberg proposes that our allegiance to this or that memory, and by implication to this or that version of the past, comes down to a matter of product choice in a commodity-saturated capitalist society. But the notion of a marketplace of memory, too, is

problematic, for it implies a set of uncomfortable correlations, chiefly between fashion or aesthetics and historical truth. Could it be that the version of the past with the best production values and the most creative marketing department is the one that finally prevails? The only other likely motivator of consumer behaviour would have to be one's political consciousness (as in her appeal to progressively minded persons (p. 155) to recognise and make use of the power of prosthetic memory), which would in turn make the empathy produced by prosthetic memory conditional on an ethical predisposition of unclear origin, but by necessity not itself produced with the aid of mass-mediated representations. I do not wish to knock Landsberg's work too heavily, for I think it is in many ways timely and important. What the bind she gets into demonstrates, however, is how difficult it is to be optimistic about the value and effects of representation and technology while speaking in the language and grappling with the imagery of eroded certainties that characterises the postmodern. The syntax of the age is great for articulating difference and offering a limitless procession of caveats, which explains in my view why the rhetoric of techno-enthusiasts is so consistently infused with the utopian outlook of high modernism, which casts a hopeful look at the future rather than a worried glance at the past. Landsberg's self-avowedly utopian project, by contrast, is articulated from a position that is inherently ambivalent, as is indeed sharply revealed by some of the inevitable but prickly associations that her chosen metaphor evokes. Talk of a prosthesis conjures the image of a traumatic amputation, and of an anxious, supine patient at the mercy of a medical machinery beyond his or her comprehension and control. Then, in the aftermath, there is the likelihood that the new limb will not function as well as its predecessor, but mimic it clumsily. The fine example of Donna Haraway notwithstanding, it is difficult to be optimistic about the viability of this patched-up political body, far easier to despair. And despair is, predictably, the default setting of the narratives of memory of postmodernity.

56 The only judgment she allows herself is that the film was far from a neutral depiction (p. 55).


A New Breed of Cartesian Demons

So you want to have gone to Mars. Very good. Philip Kindred Dick, We Can Remember It For You Wholesale

Yes, The Conscience of Zeno is an autobiography. It just isn't mine. Italo Svevo, author of The Conscience of Zeno

You have to begin to lose your memory, if only in bits and pieces, to realize that memory is what makes our lives. Our memory is our coherence, our reason, our feeling, even our action. Without it, we are nothing. Luis Buñuel, My Last Sigh

We remember events that may never have occurred. We can so easily forget even our own name and who we are. This is the other axis of anxiety, orthogonal to the one stretching between amnesia and hypermnesia, that traverses the contemporary imaginary of mediated memory. It is the dark side of the prosthetic metaphor. It is the true memory of a false past, a memory so powerful that we cannot help being inhabited by it. A good argument to this effect could be sustained simply by pointing to the rediscovery by Hollywood of the work of Philip Kindred Dick, following his death in 1982 and the release, later that year, of Ridley Scott's Blade Runner. Stanley Kubrick had proved with 2001: A Space Odyssey that science fiction in the cinema could be an important vehicle of imagery and ideas, and Blade Runner arguably owes its success more to its atmospheres, which brilliantly set the stage for what fin de siècle angst would look like, than to Dick's existential questions, which the film somewhat eludes. But those too would be articulated in a sequence that includes Total Recall (Paul Verhoeven, 1990), Blade Runner: The Director's Cut (Ridley Scott, 1992), Minority Report (Steven Spielberg, 2002), Impostor (Gary Fleder, 2002) and Paycheck (John Woo, 2003). Every feature-length cinema adaptation of a Dick short story or novel to date, with the exception of Christian Duguay's Screamers (1995), has hinged thematically on the unsettled, uncertain nature of memory and experience, and on the questioning of the relationship between memory and identity. While these concerns are at the forefront of Dick's opus, by the time Hollywood digested them they were ready to go live on a much greater scale, and scholarly critical attention has not been lacking, particularly to Total Recall and both versions of Blade Runner. David Harvey for instance has cited Blade Runner as an exemplary text that holds up to us, as in a mirror, many of the essential features of the condition of postmodernity,57 whereas Alison Landsberg's first foray into her theory of prosthetic memory was based on a strong reading of Blade Runner and Total Recall.58 What these narratives share is a critique of the relationship between memory and identity, commonly understood in terms that, I would argue, do not greatly differ from those expressed by Luis Buñuel in the passage from his autobiography quoted at the beginning of this section, and that could be condensed even further into the dictum: our memories are who we are. Dick likes to fracture and recombine this equation in as many ways as there are combinations of a body, a machine and one or more sets of recollections, and the end result is always a broken equilibrium in which the two variables in the equation, memory and identity, cannot even be expressed in the same language of symbols. The perplexing personalities displayed in his novel Do Androids Dream of Electric Sheep? (the inspiration for Blade Runner) offer as good an example as any from his work of this unsolvable equation. On the one hand, we have

57 David Harvey, The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change (Oxford: Basil Blackwell, 1989), p. 323.
58 Landsberg, Prosthetic Memory, in Cyberspace/Cyberbodies/Cyberpunk, pp. 175-189.

a caste of androids furnished with synthetic memories and totally indistinguishable from humans except for their inability to feel the inordinate degree of compassion and empathy towards animals which humanity has come to develop; on this basis, which any reader cognizant of the treatment of animals in our society will not fail to find rather ironic, the androids who stray from their assigned tasks are hunted down and destroyed. On the other hand, we have a society of human misfits, deemed genetically unworthy of inhabiting any of the new colony worlds and addicted to machines that regulate their moods and feed them happiness on command. What constitutes authentic experience, and how this translates first of all into memory and then into a coherent, stable identity worthy of the right to life and self-determination, are problems that simply cannot be solved within the parameters of the novel, which, as is the hallmark of Dick's major works, moves progressively away from any resolution or closure. Thus the protagonist, Rick Deckard, a bounty hunter in charge of retiring the androids, no sooner comes to question what it is that distinguishes a life made of actual experiences from a simulated one that subjectively feels just as real, than he is captured by androids and taken to a parallel pseudo-police headquarters where the tables are turned and the accusation of being a deviant (first in the sexual sense, then in the android-who-passes-himself-as-human one) is levelled against him, forcing him to entertain the possibility that his memories may be false implants. I believe this kind of descent into solipsism, so frequent in Dick's fiction, to have been an important source of his appeal to filmmakers, who of late have been rather preoccupied with the ability of computer-mediated virtuality to incarnate the Cartesian demon (think The Matrix). But in Dick this solipsism is produced primarily not by the alienating spectacle of representation but by the disruption of the social, which is brilliantly captured and amplified in Scott's adaptation. Dick's Deckard has no friends and is trapped in a marriage with an emotionally distant woman who refuses to switch on the happiness-on-tap offered by the mood organ; Scott's Deckard has no friends, lives alone in a house with many photos which project his sociality into a disconnected past (but which could also be the accessories of a manufactured set of memories, as is strongly suggested in the Director's Cut), and falls in love with one of the androids he is meant to retire. What Do Androids and Blade Runner keenly portray is that not knowing who we are, which is the primal anxiety produced by the crisis of memory, is a state best reached when there is nobody around to remind us. Recall how Leonard Shelby lived his awful day-to-day, another instance of the perpetual present, in the most anonymous of suburban motels, without family and friends, unable to produce a past except in loneliness, solipsistically. The same template is found repeated in many contemporary narratives which deal with the erasure, the replacement or the dysfunctional transmission of memory. Kathryn Bigelow's Strange Days (1995) offers perhaps the starkest cinematic image of this disruption, with a Los Angeles in a state of perpetual rioting forming the backdrop for the illicit trade in synthetic memories in which the protagonist engages. There is nothing benign or empowering in this Debordian spectacle, a heightened form of pornography that desensitises instead of fostering empathy and, if accidentally amplified, can literally fry the brain. The memoricidal nature of the media is also the subject of Oliver Stone's typically didactic Natural Born Killers (1994), whose protagonists appear to have no past, but construct their identities narcissistically as they see themselves reflected in the myriad present instants produced by television. The raging violence of Stone's offering is matched by Paul Verhoeven in both the suburban Detroit of RoboCop (1987) and the Martian colony of Total Recall, whereas incommunicability, loss of intimacy and the inability to process emotional pain in a middle-class urban environment are at the root of the disconnect in Eternal Sunshine of the Spotless Mind (Michel Gondry, 2004) and The Forgotten (Joseph Ruben, 2004). Finally, the expansion of market capitalism until it has engorged every form of social and political

interaction, leaving corporations as the sole entities in charge, oversees the erosion of the social sphere in Paycheck and Johnny Mnemonic (Robert Longo, 1995). In their darkest hours, the people-islands of these films recall Samuel Beckett's Krapp, whose loneliness reverberated in the maniacal production of autobiographical recordings. But technology is not always implicated literally in the exploration of the unsettled nature of memory which these films variously engage in. Most notably in the case of The Forgotten and of Dark City (Alex Proyas, 1998), the memory erasers are an alien force, whose powers are magical even when mediated by earthly tools (such as the syringe used for inoculating memory in Dark City). But even when the technological apparatus is subsumed or invisible, we have to appreciate the shift between the laborious, mechanical drudgery of the old Orwellian system, which manipulated texts, images and recordings on a massive scale,59 and the swift manipulations which characterise these more contemporary imaginings, matched by those made possible by the digital. Thus, in The Forgotten, whole diaries are blanked out and little boys are magicked out of pictures instantly, without so much as the need to fire up Photoshop, let alone take a trip to Stalin's darkroom. But the suddenness of these transformations also heightens the shock to the system, pressing the question of how in the world the disappeared image could possibly ever have been there, or, conversely, how one that was made to appear could possibly never have been there. Just as a paragraph of type (or a day in one's life, as in the example from Omar Naim's The Final Cut we encountered in chapter two) can simply vanish at the touch of the DELETE button, and another just as quickly be pasted in its place, disrupting the

59 Terry Gilliam's brilliant and at times irreverent cinematic rewriting of Orwell's novel, Brazil (1985), inscribes the tragic fate of its protagonist precisely in the gap between the will to total control over the public and private pasts, and the impossibility of operating a machinery of such might. Thus the name of an enemy of the state, plumber-cum-saboteur Archibald Tuttle, is misprinted as Buttle in the arrest warrant when a fly falls into the printing machine, and the wrong person is apprehended, tortured and finally terminated.


sense of a temporal sequence in the production of the text, so the case of the two pictures of Stalin walking along the river Volga, with or without Nikolai Yezhov, becomes infinitely more puzzling if recontextualised in the age of the digital. Which picture came first? Was Yezhov added in, or cancelled out? Was Stalin ever in fact there? And, perhaps more pressingly: is Bert truly evil?

Figure 19 - Example of Bert is Evil photomontage

Deconstructing Lilies
Concurrent with this shift to postmodernity is the technological transition from analogue to digital recording technologies, which may indeed constitute, to repeat a point made elsewhere, a quantitative difference well on its way to becoming qualitative60. It appears to be so for William Mitchell, author of The Reconfigured Eye, who characterises what he calls the post-photographic era as being dominated by a growing tension between a technology which offers the promise of more truthful representation, and the realisation of its potential for ever more sophisticated

60 Referring to Moore's Law, which accurately predicted in 1965 that the number of components on a chip would double every year, Stewart Brand notes that [a]ccording to a rule of thumb among engineers, any tenfold quantitative change is a qualitative change, a fundamentally new situation rather than a simple extrapolation. Stewart Brand, The Clock of the Long Now: Time and Responsibility (New York: Basic Books, 1999), p. 14.

and covert modes of deception. Thus the aptly named photographic falsifier (an expression coined by Mitchell) holds up not a mirror to the world but a looking glass through which the observing subject is slyly invited to step, like Alice, into a place where things are different--where facts seem indistinguishable from falsehoods and fictions and where immanent paradox continually threatens to undermine established certainties.61 Photographic falsification, of course, is as old as the inception of the technology, and was widely practised by the end of the nineteenth century, particularly in the field of the paranormal. In a notorious incident, made all the more significant by the identity of its victim, Arthur Conan Doyle was taken in by a crude photographic fake devised by two young girls, who managed to convince him that they had captured the image of fairies (in reality pictures they had cut out from a book) buzzing over a child's head. That the father of the detective par excellence could be so easily deceived shows how misguided the faith in the indexical nature of photography has always been, since long before the digital, except perhaps in the sense meant by Roland Barthes when he defined the photographic referent as 'not the optionally real thing to which an image or a sign refers but the necessarily real thing which has been placed before the lens, without which there would be no photograph'.62 The cardboard cut-outs were after all actually dangled in front of the camera, so on that level the photograph shown to Conan Doyle was not deceitful. But even this vestige of referentiality is definitively swept away with the shift to the digital. Watching wildlife photographer Joseph Holmes at work on one of his prints, Kenneth Brower describes just how radical the process of decomposition and reconstitution of the image is as it passes through the electronic medium:

61 William J. Mitchell, The Reconfigured Eye: Visual Truth in the Post-Photographic Era (Cambridge: MIT Press, 1994), p. 191.
62 Barthes, Camera Lucida, p. 76.

Holmes proceeded to deconstruct his lilies. With light touches of stylus on pad he instructed the computer to open an image with lots of edit layers and then to peel away layers. As he removed each layer, a kind of shiver traveled through the image. It was ominous, somehow, as if a gust of ill wind had blown across Reelfoot Lake. The lilies would darken or lighten imperceptibly. The edges of the pads would shift and refocus in subtle, indescribable ways. These tiny metamorphoses caused me to consider the great mystery of how the world--its shapes and colors--looks to others. The computer, it seemed, was trying to solve the mystery--to educate the solipsist in me. This was like gazing at Reelfoot Lake through a succession of different corneas.63

This dense passage highlights the full extent of the postmodern tangle, and can generate any number of questions, including whether there is in fact any manipulation going on. Could it be rather that this procedure is simply aimed at making the camera see in a way that approximates that of the human eye? And ultimately, are not these electronic interventions analogous to the ones that occur in the translation of images into electrical currents along the optic nerve?64 But if perception itself is representational, then did it ever make sense to talk of an authentic experience?

***

Philip Dick played the solipsism game ad nauseam, setting the stage for the cyberpunk trope of nested realities, some or possibly all of which are in fact digitally crafted illusions. The cinema, powerful machine for creating illusions that it is, felt perhaps that its turf had been unduly invaded, and started its own new wave of questioning the tangle, more often than not subjecting another medium, the computer, to bizarre Debordian critiques. In the most stunning case to date of the pot vs. the kettle, The Matrix blamed

63 Kenneth Brower, Photography in the age of falsification, Atlantic Monthly 281.5 (May 1998), p. 99.
64 One could equally propose that the referentiality of the photographic image has actually improved since we can produce images in colour and no longer only in black and white.

digital machines for feeding people's minds with an illusory world apart while at the same time their inert bodies, plugged in side by side to enormous batteries, are farmed for bio-energy. The end result, again, is a total breakdown of the social, for there can be no collective agency, no meaningful human space of interaction in the arbitrary pattern of collisions of self-deluded digital avatars which occurs in the Matrix. Except the computer is not only another medium, of course, but also an integral part of cinema's own latest weaponry of illusion, and nowhere is this truer than in The Matrix itself, a film mostly shot not on location but rather on digital sets generated from the urban landscape of Sydney, Australia, which is meant in the diegesis to represent an (American) simulated metropolis perceived by its inhabitants as being real. This dizzying game of Chinese boxes poses quite some problem in the positioning of the spectator and the orienting of the critique, and invites a question: could it be that the cinema is in fact the malevolent machine that implants memories into its passive spectators, who are asked for money and then made to sit silently side by side in large, dark rooms where they are not supposed to talk to each other nor communicate with the outside world ('Please switch off your mobile phones')? Is this not a technology that exploits, a technology that disconnects? There is some truth in Landsberg's claim that some of these cinematic narratives offer positive outcomes, and point to uses of memory technologies for a creative reinvention of the subject. This would seem to be the overt moral of Total Recall, whose protagonist chooses to believe that the implanted memories he has been shown are false, precisely because they give him a better ethical option than restoring his old, true (or at any rate truer) identity. He is then able to cross over from enemy of the oppressed minority to leader of a political insurrection. But this type of outcome is as much a rarity in film as Landsberg's stance is in the criticism of mass-mediated representations and the museum. Far more common is the longing for a prelapsarian time when the truth was the truth, the past was the past, and the bounded subject was not threatened by new and perplexing technologies of memory which are inevitably also technologies of identity and consciousness. The character of Commander Data in the Star Trek: The Next Generation television and movie series is in this regard as much of an everyman as Memento's Leonard Shelby. For Shelby remembered nothing, but he knew, or thought he knew, who he was; whereas Data remembers everything, except who he is. That the mainstream, heavily industrialised cinema of today should only be able to give form to this anxiety, and seldom or never to the creative possibility, should come as no surprise. For it is itself a largely sealed-off world apart ruled by a handful of large corporations, much like the future fantasised about by so many science-fiction authors. And a technology that connects, and that gives options in the construction of personal and social identity, is bad for marketing, bad for hauling in the opening weekend crowds, bad for business.

Do You See What I See?

And when they ask us what we're doing, you can say, We're remembering. Ray Bradbury, Fahrenheit 451

I want now to return to the question suggested to me at the beginning of this chapter by the work of Andreas Huyssen: is it still possible, in our digital, postmodern world, to think of memory as a functioning faculty whose contents can be meaningfully shared, except by retreating (impossibly) to older media and older cultural configurations? As I hope to have shown, the question can be rephrased as follows: how do we know that the Holocaust occurred? To which my answer would be: we know because the failed convergence of instruments of knowledge that polarises the recovered memory dispute and lies at the heart of Lyotard's notion of the differend is not the only outcome, nor the most exemplary or frequent, that our culture can expect. And for as long as we can read or listen to the testimonies, examine the sites, study the documents, pass on the stories, look at the photographs, touch the remains, it does not really matter that each individual piece of physical evidence or memoir or picture may be forged, as some of them are. The weight of evidence is simply too great, and none is heavier than the sum of the words spoken by the people who knew, and who saw. We know enough. It is all in the pronoun. We know. I, personally, do not. There is very little, if anything, that I know, which is after all how Descartes' argument began. But so long as I am unable to connect to others, I shall never know very much at all. It is when I am alone, too, that the electronic trail becomes frightening. I have allowed myself to be translated into data. My credit rating, my account numbers, my credit card PIN have become my identity, and all of a sudden they can be stolen. Identity theft: the most overreaching, and anxiogenic, metaphor of them all. The cogito allows me to say that I am, but not who I am; and, since I am alone, there is nobody who will vouch for me. I am not suggesting that technology in general, and memory technologies in particular, are in no instance alienating, or able to intensify the disconnect; merely that applying technology to the single user, a more or less ideal consumer (as consumer culture is structurally inclined to do), or to the self-fulfilling image of a decentred digital subject, can so easily lead to limit cases where anxiety is the only rational feeling. This is just as true of amnesia (what I forget, others may still remember) as it is of total recall. Remember Cherchi-Usai's numbers: it would take a single human more than 171,000 years to watch all the images humanity produced in 1999. But could we not also say that it would take a single carpenter as many years to hammer in every single nail manufactured in 1999? Arguably, this does not tell us much about whether we are in fact churning out too many nails, or if we are using them rightly. We would have to find ways of interrogating the global community of carpenters in order to be able to tell if there is in fact a glut. The great computer networks, too, are cause for anxiety if we think of them as connecting data-hungry machines, which they certainly do, machines that then probe and tap into one's electronic self. As the Dogg says, 'Prick yourself and you don't bleed, you download':65 the prospect of genetic data banks accessible by insurance companies, employers and law enforcement, to name but one possible development of the technology, is undeniably quite frightening. But what the networks of computers also connect, at the most basic level, are people, allowing communal knowledge and memory to be exchanged in ways which we have only just begun to understand and exploit. The project of the global Internet archive, as I argued in chapter two, misses the point because it freezes this communal space, which is highly dynamic. That is in fact one of the few things we know about it with some certainty: that it moves, that it grows. The World Wide Web is a site of collective memory, but this memory does not reside in the sum of its pages at any given point in time. It is in the nature of the space, in the patterns, in the exchange: what we could describe, borrowing Darren Tofts's suggestive phrase, as the memory trade. In order to function, much like a human brain, the Web needs to be (a)live. In Collective Intelligence, Pierre Lévy argues that the space of knowledge which we have come to call cyberspace can offer refuge against the alienation produced by market capitalism, and become the catalyst for new, creative forms of collective intelligence. '[T]he major architectural project of the twenty-first century', he writes, 'will be to imagine, build, and enhance an interactive and ever changing cyberspace.'66 The metaphor will

65 lury.gibson, Blood Data (London: Transworld, 2002), p. 298. The data detective known as the Dogg, whose investigations take place entirely online, is the protagonist of the novels of Adam Lury and Simon Gibson.

be familiar: its genealogy leads back to the memory palaces of antiquity and the Renaissance. Except this time there need not be a single owner-occupier: we can all share in the riches, so long as we are able to connect. Erik Davis has noted that Saint Augustine's description of the caverns of his personal memory, 'this inner place, which is as yet no place', leads not only to the realms of artificial memory but also to the evanescent grids of cyberspace,67 and it does not seem an undue leap to amend Lévy's project, from collective intelligence to collective memory, for the two, as we know, are so closely related. Personally, I am rather more inclined to see in the space(s) of community, rather than in the specific instantiation that is cyberspace, the true and more inclusive alternative to the homogenous space of commodity, and not only because getting onto cyberspace still requires a certain literacy, a cultural predisposition, and the disbursement of a fee, but also because it is possible, and indeed should be encouraged, to think of cyberspace independently of the technology that sustains it. Not because computers are bad, but because they are not the point. That is what Bradbury suggests to us in the closing chapter of Fahrenheit 451, when Montag is accepted into a wandering crowd of self-taught mnemonists, each in charge of remembering a different book ('I'm Plato's Republic', p. 151), each a node of a communal space of memory. That is Bradbury's frankly utopian blueprint, the way out from the society of the spectacle, and it is curious but also fitting that it should be so reminiscent of Plato's platform, which placed knowledge in human memory and conversation as opposed to the medium which yet would enable him to make his thoughts known to us. 'We are book burners, too' (p. 152), says Granger to Montag, explaining that burning books after having read them was the only way that they could be preserved without endangering their clandestine owners. And perhaps digitisation too is what allows texts,

66 Pierre Lévy, Collective Intelligence: Mankind's Emerging World in Cyberspace, translated by Robert Bononno (Cambridge, Massachusetts: Perseus Books, 1997), p. 10.
67 Davis, TechGnosis, p. 198.

stripped of their commodity value, their materiality reconfigured and partly compromised, to enter the communal memory space, as knowledge, and be part of the conversation that keeps memory alive. This is by no means the only utopian view of cyberspace: arguably the one that is far more commonly touted, especially out of the cradle of California, is the construction of the perfect marketplace, a vision that is consistent with that of the digital afterlife, in which we will be consumers of bandwidth forever. This other utopia, which regards cyberspace as a means of sustaining community, and thus of containing the entropic dissipation of knowledge, is in some ways even antithetical to that one, chiefly in that it is positively hostile to monetary gain and it looks to preserve the past instead of making way for the future. Both can be seen informing a variety of current practices, from the extraordinary ability of Google to connect every word or image or phrase to a piece of advertising, on the one hand, to the innumerable collective bodies of knowledge, from Usenet groups to the Wikipedia, on the other. From the point of view of collective memory work, Wikipedia is as good a place as any other to start looking for glimpses of the architecture of Lévy's utopian cyberspace. This distributed, democratised encyclopaedia of the people, by the people, for the people is itself a case study of how knowledge can be compiled, contested and distributed collaboratively with the mediation of a peer-to-peer network such as the World Wide Web. There is a hierarchy in Wikipedia, in that its founder, Jimmy Wales, has the ultimate power over the content and the rights of individual editors to participate in the project; it is just that he chooses not to wield it very often. But setting aside the interesting issue of the ownership of this vast body of shared knowledge, the fact remains that it is being shared, right now, and can as of this moment be written and edited by anyone who is able to go online.68

68 Nupedia, the original project that spawned Wikipedia, was an online encyclopaedia written by experts, and there has been throughout the history of the project much talk about whether and how to give a special status to the work of editors who are experts in their field. The latest is Citizendium.

And yet, in spite of the fact that it is produced by amateurs worldwide, Wikipedia, as is often remarked, is surprisingly accurate and impervious to wilfully introduced errors or pranks. As Roy Rosenzweig extensively documents, as far as the discipline of history is concerned the encyclopaedia would appear to be more factually accurate than at least one of its chief commercially produced (and commercially distributed) rivals, Encarta.69 Still, as he also notes, getting your dates and names right is one thing, but traditionally an encyclopaedia also needs to be objective and neutral, two tenets of the Wikipedia philosophy which editors often find hard to put into practice, especially when it comes to the always contentious interpretation of past history. The sometimes furious edit wars in which many of the entries get caught are often regarded as the chief problem faced by Wikipedia, and many a collaborator, including one of the project's initiators, Larry Sanger, has been discouraged from taking further part in it as a result. But I would like to propose that these wars and the ensuing debates are in fact one of the most interesting aspects of Wikipedia, for they make it a place for contesting, as opposed to simply presenting, cultural and historical memory, as well as stressing once again the increasing untenability of terms such as neutrality and objectivity. The version history of each debated entry documents this process and remains accessible alongside the entry itself, offering an extraordinarily interesting account of how competing versions of the past can do battle and sometimes find a common ground. When they do not, as is most often the case, it is perhaps lamentable that only one entry should exist on the surface of Wikipedia, and the subject of whether and how to allow entries to branch off to better reflect the lack of consensus that underlies them is also the cause of much debate among Wikipedians. But even as it currently stands, Wikipedia does a great job of illustrating, in actual social practice, how to produce a body of knowledge which makes

69. Roy Rosenzweig, Can History Be Open Source? Wikipedia and the Future of the Past, The Journal of American History 93.1 (Jun 2006), pp. 117-146.
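It may be worth noting in passing just how concretely accessible this record of contestation is. The brief sketch below is offered purely as an editorial illustration rather than as part of the argument: it queries the public MediaWiki API for the most recent revisions of an entry. The choice of article, of the Python language and of the particular parameters shown are assumptions of the example, not claims about how Wikipedia must be consulted.

    # A minimal, illustrative sketch: listing the recent revision history of a
    # Wikipedia entry through the public MediaWiki API. The article title below
    # is an arbitrary example; any contested entry would serve equally well.
    import json
    import urllib.parse
    import urllib.request

    def recent_revisions(title, limit=20):
        # Ask the API for the last `limit` revisions: who edited, when, and why.
        params = urllib.parse.urlencode({
            "action": "query",
            "prop": "revisions",
            "titles": title,
            "rvprop": "timestamp|user|comment",
            "rvlimit": limit,
            "format": "json",
        })
        request = urllib.request.Request(
            "https://en.wikipedia.org/w/api.php?" + params,
            # Wikimedia asks automated clients to identify themselves.
            headers={"User-Agent": "memory-thesis-example/0.1"},
        )
        with urllib.request.urlopen(request) as response:
            data = json.load(response)
        # The response nests pages by numeric id; take the single page returned.
        page = next(iter(data["query"]["pages"].values()))
        return page.get("revisions", [])

    for revision in recent_revisions("Historical revisionism"):
        print(revision["timestamp"], revision.get("user", "?"), "-", revision.get("comment", ""))

Each revision returned in this way carries a timestamp, an editor and an edit summary: the raw material, that is, of the kind of contested memory work described above.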

The irreducible plurality of knowledge positions is also the ultimate subject of The Differend. Here Lyotard is not interested in articulating how conflicts between differently grounded arguments can be resolved, but aims to show on the contrary that phrases constituted according to a certain set of rules, or regimen, can never be translated into another regimen, making it impossible for different genres of discourse not only to pass judgment on one another, but also to reach a meaningful synthesis. Put so uncompromisingly, this conception is anxiogenic too, for it is a deeply uncomfortable idea that memory, history and the past should be forever splintered and fractured, and that such statements as "this happened" should make sense only according to the rules of one's own discourse. And should you think that you are not personally uncomfortable with this, Lyotard challenges you to think of the Holocaust as one such place of permanent contestation, as a memory that is open to being endlessly doubted and questioned and about which a consensus will never be reached. This anxiety born of the postmodern is of a different, more radical nature than the ones I probed earlier, in that it precedes the moment of inscription; taken to its extreme, it says that you can never ultimately know, therefore making it quite irrelevant to ask whether or not you might be able to remember, either by yourself or with the help of a machine. It also makes it meaningless, once and for all, to seek to draw the boundary between mind and language, biology and technology, instinct and culture.

I have not returned thus far to the question of the undocumented persons with which I concluded the last chapter. It is easy to argue for a convergence between the digital and a postmodern sensibility, proposing that the recombinant nature of the information that circulates electronically, in contrast with the stubborn fixity of the printed word, makes the new media


an ideal environment in which to enact the crisis of metanarratives, and that in turn postmodernism, understood this time as a philosophy, is ideally suited to critique and understand the social transformations produced by these same media. As early as 1979, at the outset of The Postmodern Condition, Lyotard characterised our age as being inextricably postindustrial and postmodern, and expressly set out to discuss knowledge in "[c]omputerised societies".70 Eleven years later, in The Mode of Information, Mark Poster found that each further step taken by Western societies into the postindustrial age made the theories of the leading figures of poststructuralism appear more compelling, the linguistic turn more apposite; and that, reciprocally, this new environment provided the theories with a social context that brought them into much sharper focus.71 But this feedback mechanism is also problematic, for if it is true that the new language formations ushered in (or else privileged, biased toward) by the digital are so well captured and described by postmodernism, then it stands to reason that, in order to remain relevant, postmodernism and the other discourses that address technoculture need the digital to attain what Mark Dery suggestively called escape velocity and cause a cascade of social transformations consistent with their descriptions/predictions. This feedback defuses in part the capacity for critique of these discourses and makes them act at times like resonance chambers that amplify certain aspects of the culture; those same aspects that, because of their bias, cause whole categories of persons to remain undocumented. The imaginary of anxiety that I have endeavoured to describe could then be interpreted as a reaction to the destabilising effects on subjectivity and

70. Lyotard, The Postmodern Condition, p. 3.

71. "I hope to demonstrate that new forms of language wrapping are imposing significant changes in the social field, and that poststructuralist theory offers a uniquely appropriate strategy of interpretation in relation to these new phenomena. Conversely I hope to show the ways in which electronically mediated communication becomes a social context for poststructuralist theory; that turning a poststructuralist gaze upon the social phenomena introduces a feedback effect on that theory, compelling a recalibration of its interpretive habits in important ways." Poster, The Mode of Information, p. 11.

memory set in motion both by technoscientific discourses and by their postmodern critical shadow, and hence also to the marginalisation of those who are unable or unwilling to learn to speak in the new languages and philosophies of the electronic age, and who conversely are not spoken by them. In this way, this complex and layered imaginary would also act as a form of resistance against the most wanton acts of theory, promoting instead the efforts that painstakingly historicise and deconstruct the current mediascape, tracing its archaeology and describing its many sites of tension. The forced loss or trade of one's past and identity, the bloating of memory to the point of incapacitation, dreams of technologically mediated immortality that clash against their own radical impossibility and turn sour: all these would appear to be symptoms of a memory impoverished and imperilled; but how much of this imaginary is fuelled by technophobic conservatism; how much by a lack of imagination regarding future possibilities; how much by an inability or reluctance to adapt; and finally how much by the urgent yearning for a space and the means to reclaim at least in part the memory agency relinquished to computers and to come up with meaningful alternatives to the dominant narratives of reconfiguration, is of course impossible to say. An anxiety is by definition ill-suited to form a conscious, explicitly motivated critique. All the same, these imaginings strike a number of nerves and pose a number of problems, as I hope to have successfully shown; problems that, as I have also argued, seem all the more insurmountable if faced alone. Being disconnected (and I want you to think again of the life of Leonard Shelby at this point) is bound to be one of the principal fears of any society in which computer networks are increasingly prominent and pervasive. The salespersons of the digital are the first to stoke this fear by implying that you had better get on board if you value your place in society and wish to remain employed. By pressing the concerns of the isolated, of those who have been left out, the narratives I have been looking at implicitly point to the solution, that is to


say, a more inclusive effort to reconnect (Bruno Latour would say reassemble) the social, in a world that is significantly inhabited but hardly exhausted by the digital. In very many respects, memory is alive and well, and as comfortable dwelling in the perplexing rhizomatic folds of technoculture as it once was, and still remains, in the more familiar print-based and oral forms of transmission. Alongside one another and in constant communication with one another, these memory sites create in fact a redundancy, a multiplication of voices and recollections that it is very tempting to be optimistic about. But this optimism too needs to be alert to the shifting of power relations, the muting of marginalised subjects, the loss that comes whenever a new form of mediation intervenes on its predecessors (we simply cannot remember what it was like to be an oral society). The notion of contested knowledge and, by extension, of contested memory has the potential to ground a more broadly aware reconnection with the social, understood as the place of contestation, of the reaching of troubled and precarious equilibria, of unstable new configurations that are bound not to resemble exactly the landscapes of futurology or of postmodern and posthuman theory, precisely because the stubborn inertia of our old ways of thinking and means of remembering will perforce be accounted for, if only in terms of disturbance or interference, and never completely subsumed.


Bibliography

All Web references are current at the time of printing (December 15, 2006) unless otherwise stated. Films and television shows are listed separately.

A beginner's guide to Vmyths. http://vmyths.com/resource.cfm?id=56&page=1.
American Patriot Friends Network, The. Was the Apollo Moon Landing Fake? http://www.apfn.org/apfn/moon.htm.
Aronowitz, Stanley (editor). Technoscience and Cyberculture. New York: Routledge, 1996.
Augé, Marc. Oblivion. Translated by Marjolijn de Jager. Minneapolis: University of Minnesota Press, 2004.
Baker, Warren. Cross Browser Compatibility. webpronews.com, 17 January 2006. http://www.webpronews.com/expertarticles/expertarticles/wpn-6220060117CrossBrowserCompatibility.html.
Barlow, John Perry. Declaration of Independence for Cyberspace. http://www.nootrope.net/barlow.html.
Barthes, Roland. Camera Lucida: Reflections on Photography. Translated by Richard Howard. London: Jonathan Cape, 1982.
Bartle, Richard. Early MUD History. http://www.mud.co.uk/richard/mudhist.htm.
Bateson, Gregory. Steps to an Ecology of Mind. New York: Ballantine Books, 1980.


Baudrillard, Jean. Holocaust. In Simulacra and Simulation, translated by Sheila Faria Glaser. Ann Arbor: The University of Michigan Press, 1994, pp. 49-52.
———. The Precession of Simulacra. In Simulacra and Simulation, translated by Sheila Faria Glaser. Ann Arbor: The University of Michigan Press, 1994, pp. 1-42.
Bauman, Zygmunt. Modernity and the Holocaust. Ithaca: Cornell University Press, 1989.
———. Postmodernity and Its Discontents. New York: New York University Press, 1997.
Bazin, Patrick. Reconfigured Memory. In The Future of Memory, edited by Giulio Blasi. Turnhout: Brepols, 2002, pp. 21-34.
Beckett, Samuel. Krapp's Last Tape, and Other Dramatic Pieces. New York: Grove Press, 1960.
Bell, David F. Infinite Archives. SubStance 33.3, 2004, pp. 148-161.
Berger, Harry Jr. Phaedrus and the Politics of Inscription. In Plato and Postmodernism, edited by Steven Shankman. Glenside: Aldine Press, 1994, pp. 76-114.
Bey, Hakim. The Information War. C-Theory.net, 1996. http://www.ctheory.net/text_file.asp?pick=64.
Blasi, Giulio. Introduction to The Future of Memory, edited by Giulio Blasi. Turnhout: Brepols, 2002, pp. 7-20.
——— (editor). The Future of Memory. Turnhout: Brepols, 2002.
Blatner, David. Things to Do on the Web When You're Dead. http://www.afterlife.org/info.html, as at March 9, 2005.
Bolter, Jay David and Richard Grusin. Remediation: Understanding New Media. Cambridge: MIT Press, 2000.
Bolter, Jay David. Writing Space: The Computer, Hypertext and the History of Writing. Hillsdale: Lawrence Erlbaum Associates, 1991.
Borges, Jorge Luis. Borges at Eighty, conversations edited by Willis Barnstone. Bloomington: Indiana University Press, 1982.
———. Funes, the Memorious. In Fictions, translated by Anthony Kerrigan. London: John Calder, 1965, pp. 99-106.
———. The Library of Babel. In Fictions, translated by Anthony Kerrigan. London: John Calder, 1965, pp. 72-80.


———. Ramón Lull's Thinking Machine. In The Total Library: Non-Fiction 1922-1986, translated by Esther Allen, Suzanne Jill Levine and Eliot Weinberger. Harmondsworth: Penguin, 1999, pp. 155-159.
———. The Total Library. In The Total Library: Non-Fiction 1922-1986, translated by Esther Allen, Suzanne Jill Levine and Eliot Weinberger. Harmondsworth: Penguin, 1999, pp. 214-216.
Boutin, Paul. The Archivist. Slate, April 7, 2005. http://www.slate.com/id/2116329/.
Bradbury, Ray. Fahrenheit 451. New York: Ballantine Books, 1953.
Brand, Stewart. Written on the Wind. Civilization, Oct/Nov 1998, pp. 70-72.
———. The Clock of the Long Now: Time and Responsibility. New York: Basic Books, 1999.
Brin, David. Kiln People: A Future Thriller. London: Orbit, 2002.
———. The Postman. London: Bantam, 1987.
Britons growing digitally obese. BBC News Online, December 9, 2004. http://news.bbc.co.uk/2/hi/technology/4079417.stm.
Brooks, Rodney. Flesh and Machines. New York: Pantheon Books, 2001.
Brower, Kenneth. Photography in the age of falsification. Atlantic Monthly 281.5, May 1998, pp. 92-111.
Brown, John and Paul Duguid. The Social Life of Information. Boston: Harvard Business School and New York: McGraw-Hill, 2000.
Bujold, Lois McMaster. Memory. New York: Baen Books, 1996.
Bukatman, Scott. Terminal Identity: The Virtual Subject in Postmodern Science Fiction. Durham: Duke University Press, 1993.
Buñuel, Luis. My Last Sigh. Translated by Abigail Israel. New York: Knopf, 1983.
Bush, Vannevar. As We May Think. Atlantic Monthly, July 1945, pp. 101-108, reprinted electronically at http://sloan.stanford.edu/mousesite/Secondary/Bush.html.
Calvino, Italo. Cybernetics and Ghosts. In The Uses of Literature: Essays, translated by Patrick Creagh. San Diego: Harcourt, 1982, pp. 3-27.
———. If on a Winter's Night a Traveler. Translated by William Weaver. London: Picador, 1982.

Castells, Manuel. The End of the Millennium. The Information Age: Economy, Society and Culture, Vol. III. Cambridge and Oxford: Blackwell, 1998.
———. The Internet Galaxy: Reflections on the Internet, Business and Society. Oxford: Oxford University Press, 2001.
———. The Power of Identity. The Information Age: Economy, Society and Culture, Vol. II. Cambridge and Oxford: Blackwell, 1997.
———. The Rise of the Network Society. The Information Age: Economy, Society and Culture, Vol. I. Cambridge and Oxford: Blackwell, 1996.
Chartier, Roger. The Order of Books. Cambridge: Polity Press, 1994.
Cherchi-Usai, Paolo. The Death of Cinema: History, Cultural Memory and the Digital Dark Age. London: BFI, 2001.
———. Keynote address at the 10th Film & History Conference, Biennial Conference of the History and Film Association of Australia and New Zealand. The Film Centre, Wellington, 30 November 2000.
Connerton, Paul. How Societies Remember. Cambridge: Cambridge University Press, 1989.
Conquest, Robert. Inside Stalin's Darkroom. The Los Angeles Times Book Review, January 4, 1998.
Consenstein, Peter. Memory and Oulipian Constraints. Postmodern Culture 6.1, September 1995.
Corns, Thomas S. The Early Modern Search Engine: Indices, Title Pages, Marginalia and Contents. In The Renaissance Computer: Knowledge Technology in the First Age of Print, edited by Neil Rhodes and Jonathan Sawday. London and New York: Routledge, 2000, pp. 95-105.
Davis, Erik. Synthetic Meditations: Cogito in The Matrix. In Prefiguring Cyberculture: An Intellectual History, edited by Darren Tofts, Annemarie Jonson and Alessio Cavallaro. Cambridge: MIT Press, 2003, pp. 12-27.
———. TechGnosis: Myth, Magic, and Mysticism in the Age of Information. London: Serpent's Tail, 1999.
Day, Michael. Collecting and Preserving the World Wide Web: A Feasibility Study Undertaken for the JISC and Wellcome Trust. Version 1.0, 25 February 2003. www.jisc.ac.uk/uploaded_documents/archiving_feasibility.pdf.

Day, Ronald E. The Modern Invention of Information: Discourse, History, and Power. Carbondale and Edwardsville: Southern Illinois University Press, 2001.
de Magalhães, João Pedro. Homo Sapiens Cyber? http://jp.senescence.info/thoughts/cybernetics.html.
Dead Media Project. www.deadmedia.org.
Debord, Guy. Comments on the Society of the Spectacle. Translated by Malcolm Imrie. London: Verso, 1998.
———. La société du spectacle. Paris: Gallimard, 1992.
———. The Society of the Spectacle. Translated by Donald Nicholson-Smith. New York: Zone Books, 1994.
Deibert, Ronald J. Parchment, Printing and Hypermedia: Communication in World Order Transformation. New York: Columbia University Press, 1997.
Deleuze, Gilles and Félix Guattari. A Thousand Plateaus: Capitalism and Schizophrenia. Translated by Brian Massumi. Minneapolis: University of Minnesota Press, 1987.
DeLillo, Don. White Noise. London: Picador, 1999.
Dennett, Daniel. Consciousness Explained. London: Penguin Press, 1992.
Derbyshire, David. Chip is 400th the size of grain of salt. The Daily Telegraph Online. http://www.telegraph.co.uk/news/main.jhtml?xml=/news/2003/02/15/wsci315.xml.
Derrida, Jacques. Paper Machines and the Undocumented Person. In Paper Machine, translated by Rachel Bowlby. Stanford: Stanford University Press, 2005, pp. 1-4.
———. Paper or Me, You Know (New Speculations on a Luxury of the Poor). In Paper Machine, translated by Rachel Bowlby. Stanford: Stanford University Press, 2005, pp. 41-66.
———. Plato's Pharmacy. Translated by Barbara Johnson. In Dissemination. London: Athlone, 1981, pp. 61-172.
———. Archive Fever: A Freudian Impression. Chicago and London: The University of Chicago Press, 1995.
———. Paper Machine. Translated by Rachel Bowlby. Stanford: Stanford University Press, 2005.


Dery, Mark. Escape Velocity: Cyberculture at the End of the Century. New York: Grove Press, 1996.
———. Memories of the Future: Excavating the Jet Age at the TWA Terminal. In Prefiguring Cyberculture: An Intellectual History, edited by Darren Tofts, Annemarie Jonson and Alessio Cavallaro. Cambridge: MIT Press, 2003, pp. 294-303.
Descartes, René. A Discourse on Method. Edited by A.D. Lindsay. London: J.M. Dent & Sons, 1912.
———. Philosophical Writings. Translated by Norman Kemp Smith. London: Macmillan, 1952.
Dick, Philip Kindred. We Can Remember It for You Wholesale. In We Can Remember It for You Wholesale. New York: Citadel Press, 1987, pp. 35-52.
———. Do Androids Dream of Electric Sheep? New York: Ballantine Books, 1982.
———. Ubik. New York: Vintage Books, 1991.
Donald, Merlin. Origins of the Modern Mind. Cambridge: Harvard University Press, 1991.
Draaisma, Douwe. Metaphors of Memory. Cambridge: Cambridge University Press, 2000.
Dunant, Sarah and Roy Porter (editors). The Age of Anxiety. London: Virago, 1996.
Dwork, Deborah and Robert Jan Van Pelt. Auschwitz 1270 to the Present. New York: Norton, 1996.
Eco, Umberto. Chiamatelo platipo od ornitorinco, fatto sta che è molto popolare. L'Espresso, 8 August 1996.
———. Foucault's Pendulum. Translated by William Weaver. San Diego: Harcourt Brace Jovanovich, 1989.
———. The Name of the Rose. Translated by William Weaver. London: Picador, 1984.
Egan, Greg. Distress. New York: HarperPrism, 1998.
———. Permutation City. London: Millennium, 1994.
———. Axiomatic. London: Millennium, 1995.


Ernst, Wolfgang. Arsenals of Memory: The archi(ve)texture of the Museum. Mediamatic 8.1 (1994). http://www.mediamatic.net/article-5884-en.html.
Feafunnoll, A.C. Human Contact Spreads PC Viruses. PCMag.com, April 1st, 2005. http://www.pcmag.com/article2/0,1759,1781208,00.asp.
Featherstone, Mike and Roger Burrows (editors). Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment. London: Sage, 1995.
Feldman, Alan. Political Terror and the Technologies of Memory: Excuse, Sacrifice, Commodification, and Actuarial Moralities. Radical History Review 85, Winter 2003, pp. 58-73.
Figwit Lives! Page. http://www.figwitlives.net/.
Flaubert, Gustave. Bouvard and Pécuchet. Translated by D.F. Hennigan. London: H.S. Nichols, 1986.
Forty, Adrian and Susanne Küchler (editors). The Art of Forgetting: Materializing Culture. Oxford: Berg, 1999.
Foucault, Michel. Discipline and Punish: The Birth of the Prison. London: Penguin Books, 1997.
———. The Order of Things: An Archaeology of the Human Sciences. London: Routledge, 2001.
Franklin, Ruth. Identity Theft: True Memory, False Memory, And The Holocaust. New Republic, 31 March 2004. http://www.tnr.com/doc.mhtml?pt=KvE0N9Qg2Zl/U3JSXyXLSx==.
Franzen, Jonathan. The Corrections. New York: Picador, 2002.
Freud, Sigmund. A Note Upon the Mystic Writing Pad. Translated from the German under the general editorship of James Strachey. In The Standard Edition of the Complete Psychological Works of Sigmund Freud, Volume XIX (1923-1925): The Ego and the Id and Other Works. London: Hogarth Press, 1952, pp. 225-232.
Fuchs, Mathias. Are you sure you want to do this? Mediamatic 8.1 (1994). http://www.mediamatic.net/article-5887-en.html.
Gates, Bill. The Road Ahead. New York: Viking, 1996.
Geary, Patrick J. Phantoms of Remembrance: Memory and Oblivion at the End of the First Millennium. Princeton: Princeton University Press, 1994.
Gere, Charlie. Digital Culture. London: Reaktion Books, 2002.

Gerstner, John. The other side of cyberspace. Interview with professor Manuel Castells. Communication World 16.4 (March 1999), pp. 11-16.
Gibson, William. Johnny Mnemonic. In Burning Chrome. London: Harper Collins, 2000, pp. 1-21.
———. Neuromancer. London: Harper Collins, 1995.
———. The Winter Market. In Burning Chrome. London: Harper Collins, 2000, pp. 124-150.
Goldsmith, Zac. And Some Good News. The Ecologist 29 (Jan-Feb 1999), p. 15.
Golfera, Gianni. L'arte della memoria di Giordano Bruno. Milan: Anima Edizioni, 2005.
Goytisolo, Juan. El memoricidio de Sarajevo. El País, 9 October 2004. www.elpais.es, article search "El memoricidio de Sarajevo".
———. State of Siege. Translated by H. Lane. San Francisco: City Light Books, 2002.
Schafft, Gretchen E. From Racism to Genocide: Anthropology in the Third Reich. Urbana and Chicago: University of Illinois Press, 2004.
Hakken, David. Cyborgs @ Cyberspace? An Ethnographer Looks to the Future. New York and London: Routledge, 2001.
Halbwachs, Maurice. The Collective Memory. Translated by Francis J. Ditter, Jr. and Vida Yazdi Ditter. New York: Harper & Row, 1980.
Hall, Gary. Culture in Bits: The Monstrous Future of Theory. New York: Continuum, 2002.
Halperin, David M. Plato and the Erotics of Narrativity. In Plato and Postmodernism, edited by Steven Shankman. Glenside: Aldine Press, 1994, pp. 43-75.
Handberg, Roger B. Expert testimony on eyewitness identification: a new pair of glasses for the jury. American Criminal Law Review 32.4, Summer 1995, pp. 1013-1064.
Haraway, Donna. A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century. In Simians, Cyborgs and Women: The Reinvention of Nature. New York: Routledge, 1991, pp. 149-181.
———. Modest_Witness@Second_Millennium.FemaleMan_Meets_OncoMouse. New York: Routledge, 1997.

———. Teddy Bear Patriarchy: Taxidermy in the Garden of Eden. In Primate Visions: Gender, Race and Nature in the World of Modern Science. New York and London: Routledge, 1989, pp. 26-58.
Harvey, David. The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change. Oxford: Basil Blackwell, 1989.
Havelock, Eric. Preface to Plato. Oxford: Basil Blackwell, 1963.
Hayles, N. Katherine. Boundary Disputes. In Virtual Realities and Their Discontents, edited by Robert Markley. Baltimore and London: Johns Hopkins University Press, 1996, pp. 11-38.
———. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics. Chicago and London: The University of Chicago Press, 1999.
———. My Mother Was a Computer: Digital Subjectivity and Literary Texts. Chicago: The University of Chicago Press, 2005.
———. Writing Machines. Cambridge: MIT Press, 2002.
Herman, Andrew and Thomas Swiss. The World Wide Web and Contemporary Cultural Theory. New York: Routledge, 2000.
Herman, Judith. Presuming to Know the Truth. Nieman Reports 48.1 (Spring 1994), pp. 43-45.
Hesiod. Theogony, Works and Days. Translated by M.L. West. Oxford: Oxford University Press, 1999.
Hirsch, Marianne. Family Frames: Photography, Narrative, and Postmemory. Cambridge: Harvard University Press, 1997.
———. Surviving Images: Holocaust Photographs and the Work of Postmemory. The Yale Journal of Criticism 14.1 (2001), pp. 5-36.
Holderness, Mike. The Librarian of Babel: The Key to the Stacks. Ariadne 9 (1997). http://www.ariadne.ac.uk/issue9/babel/.
Hrabal, Bohumil. Too Loud a Solitude. Translated by Michael Henry Heim. San Diego, New York and London: Harcourt Brace Jovanovich, 1990.
Huxor, Avon. Xanadu: The Conversation of the Digital Text. Mediamatic 8.1 (1994). http://www.mediamatic.net/article-8992-en.html.
Huyssen, Andreas. Present Pasts: Media, Politics, Amnesia. Public Culture 12.1 (2000), pp. 21-38.


———. Twilight Memories: Marking Time in a Culture of Amnesia. New York and London: Routledge, 1995.
Innis, Harold Adams. The Bias of Communication. Toronto: University of Toronto Press, 1951.
Inside the Wayback Machine. .net 112, July 2003. http://www.maxpc.co.uk/features/default.asp?pagetypeid=2&articleid=18463&subsectionid=736&subsubsectionid=609.
Internet Collapses Under Sheer Weight Of Baby Pictures. The Onion, 28 July 2004. www.theonion.com/content/node/32891?issue=4228&special=2004.
Ivic, Christopher and Grant Williams (editors). Forgetting in Early Modern Literature and Culture: Lethe's Legacies. London and New York: Routledge, 2004.
Jameson, Fredric. Postmodernism, or The Cultural Logic of Late Capitalism. New Left Review 146, July/August 1984, pp. 53-92.
Janes, Andrew. Library captures digital heritage. The New Zealand Herald, 29 March 2005.
Boakes, Janet. False memory syndrome. The Lancet 346.8982, 21 October 1995, pp. 1048-1049.
Johnson, George. In the Palaces of Memory: How We Build the Worlds Inside Our Heads. New York: Alfred A. Knopf, 1991.
Johnson, Steve. Goldman sets up hedge fund clone. The Financial Times, December 5, 2006. https://registration.ft.com/registration/barrier?referer=http://www.google.com/search?q=%22Goldman%20Sachs%20has%20become%20the%20first%22&location=http%3A//www.ft.com/cms/s/5b8331c0-82fc-11db-a38a-0000779e2340.html.
Kafka, Franz. In the Penal Colony. Translated by Willa and Edwin Muir. In The Metamorphosis, In the Penal Colony, and Other Stories. New York: Schocken, 1995, pp. 191-229.
Kasparov vs. Deep Blue: The Rematch. IBM website. http://www.research.ibm.com/deepblue/home/html/b.shtml.
Kent, Christopher. Historiography and Postmodernism. Canadian Journal of History 34.3, December 1999, pp. 385-431.
Kittler, Wolf. The Dioskuroi: Masters of the Information Channel. Configurations 10, 2002, pp. 111-127.


Kitzmann, Andreas. Pioneer Spirits and the Lure of Technology: Vannevar Bush's Desk, Theodor Nelson's World. Configurations 9, 2001, pp. 441-459.
Klein, Andy. Everything you wanted to know about Memento. Salon.com, 28 June 2001. http://dir.salon.com/ent/movies/feature/2001/06/28/memento_analysis/index.html?pn=1.
Kloppenberg, James T. Objectivity and Historicism: A Century of American Historical Writing. The American Historical Review 94.4, October 1989, pp. 1011-1030.
Knowledge Lost in Information. Report of the NSF Workshop on Research Directions for Digital Libraries. www.sis.pitt.edu/~dlwkshop/report.pdf.
Kosinski, Jerzy. The Painted Bird. London: Bantam, 1982.
Kurzweil, Ray. The Singularity Is Near: When Humans Transcend Biology. New York: Viking, 2005.
Kurzweil, Ray, Vernor Vinge and Hans Moravec. Singularity Math Trialogue. KurzweilAI website. http://www.kurzweilai.net/meme/frame.html?main=/articles/art0151.html?
Landsberg, Alison. Prosthetic Memory: Total Recall and Blade Runner. In Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, edited by Mike Featherstone and Roger Burrows. London: Sage, 1995, pp. 175-189.
———. Prosthetic Memory: The Transformation of American Remembrance in the Age of Mass Culture. New York: Columbia University Press, 2004.
Latour, Bruno. Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford: Clarendon, 2005.
Lesk, Michael. How Much Information is There in the World? http://www.lesk.com/mlesk/ksg97/ksg.html.
Levi, Primo. If This Is a Man; and The Truce. Translated by Stuart Woolf. London: Everyman Publishers, 2000.
———. The Drowned and the Saved. Translated by Raymond Rosenthal. New York: Summit Books, 1988.
Lévy, Pierre. Becoming Virtual: Reality in the Digital Age. Translated by Robert Bononno. New York and London: Plenum Trade, 1998.


———. Collective Intelligence: Mankind's Emerging World in Cyberspace. Translated by Robert Bononno. Cambridge, Massachusetts: Perseus Books, 1997.
———. Cyberculture. Translated by Robert Bononno. Minneapolis: University of Minnesota Press, 2001.
Lipstadt, Deborah. Denying the Holocaust: The Growing Assault on Truth and Memory. London: Penguin, 1994.
Locke, Chris. Digital Memory and the Problem of Forgetting. In Memory and Methodology, edited by Susannah Radstone. Oxford and New York: Berg, 2000, pp. 25-36.
Loftus, Elizabeth F. Eyewitness Testimony. Cambridge: Harvard University Press, 1996.
Lovink, Geert. We no longer collect the Carrier but the Information. Interview with Tjebbe van Tijen. Mediamatic 8.1 (1994). http://www.mediamatic.net/article-8989-en.html.
Luria, Alexander. The Mind of a Mnemonist: A Little Book about a Vast Memory. Translated by Lynn Solotaroff. Harmondsworth: Penguin, 1975.
lury.gibson. Blood Data. London: Transworld, 2002.
———. Dangerous Data. London: Bantam, 2001.
Lyman, Peter, Hal R. Varian et al. (editors). How Much Information? Research Project, School of Information Management and Systems of the University of California, Berkeley, 2000. http://www.sims.berkeley.edu/research/projects/how-much-info/internet.html.
Lyotard, Jean-François. The Differend: Phrases in Dispute. Translated by Georges Van Den Abbeele. Manchester: Manchester University Press, 1988.
———. The Postmodern Condition: A Report on Knowledge. Translated by Geoff Bennington. Minneapolis: University of Minnesota Press, 1984.
MacLachlan, Renton. Letter to the editor of The Dominion Post, September 9, 2005.
Maechler, Stefan. The Wilkomirski Affair: A Study in Biographical Truth. New York: Schocken, 2001.
Mandel, Naomi. Rethinking After Auschwitz: Against a Rhetoric of the Unspeakable in Holocaust Writing. boundary 2 (2001), pp. 203-228.

Manoff, Marlene. The Symbolic Meaning of Libraries in a Digital Age. Portal: Libraries and the Academy 1.4, 2001, pp. 371-381.
Marcus, Leah S. The Silence of the Archive and the Noise of Cyberspace. In The Renaissance Computer: Knowledge Technology in the First Age of Print, edited by Neil Rhodes and Jonathan Sawday. London and New York: Routledge, 2000, pp. 18-28.
Markley, Robert (editor). Virtual Realities and Their Discontents. Baltimore and London: Johns Hopkins University Press, 1996.
McKenzie, D.F. Bibliography and the Sociology of Texts. Cambridge: Cambridge University Press, 1986.
McLuhan, Marshall. Understanding Media. London and New York: Routledge, 1964.
Memento website. www.otnemem.com.
Milburn, Colin. Nanotechnology in the Age of Posthuman Engineering: Science Fiction as Science. Configurations 10 (2002), pp. 261-295.
Miller, R. Bruce and Milton T. Wolf (editors). Thinking Robots, An Aware Internet, and Cyberpunk Librarians. Chicago: Library and Information Technology Association, 1992.
Milton, John. Areopagitica. Westminster: A. Constable and Co., 1903.
Mitchell, William J. The Reconfigured Eye: Visual Truth in the Post-Photographic Era. Cambridge: MIT Press, 1994.
Moon Hoax Page. http://www.redzero.demon.co.uk/moonhoax/.
Mooney, Ted. Easy Travel to Other Planets. New York: Farrar, Straus & Giroux, 1981.
Moore, Jeffrey. The Memory Artists. Toronto: Viking, 2004.
Moravec, Hans. Mind Children: The Future of Robot and Human Intelligence. Cambridge and London: Harvard University Press, 1988.
———. Pigs in Cyberspace. In Thinking Robots, An Aware Internet, and Cyberpunk Librarians, edited by R. Bruce Miller and Milton T. Wolf. Chicago: Library and Information Technology Association, 1992, pp. 13-22.
Moulthrop, Stuart. Error 404: Doubting the Web. In The World Wide Web and Contemporary Cultural Theory, edited by Andrew Herman and Thomas Swiss. New York: Routledge, 2000, pp. 259-275.


Mulgan, Geoff. High Tech and High Angst. In The Age of Anxiety, edited by Sarah Dunant and Roy Porter. London: Virago, 1996, pp. 1-19.
Museum of Web Art. http://www.mowa.org.
National Library to lead electronic harvesting. Media release of the National Library of New Zealand, 26 September 2003. http://www.natlib.govt.nz/bin/media/pr?item=1064531843.
Negroponte, Nicholas. Being Digital. New York: Knopf, 1995.
Nelson, Theodor Holm. Literary Machines: The report on, and of, Project Xanadu concerning word processing, electronic publishing, hypertext, thinkertoys, tomorrow's intellectual revolution, and certain other topics including knowledge, education and freedom. Sausalito: Mindful Press, 1992.
Nestle, Marion. Food Politics: How the Food Industry Influences Nutrition and Health. Berkeley, Los Angeles and London: University of California Press, 2002.
Newton, Melanie. Y2paniK. In Digital Hyperstition. Cybernetic Culture Research Unit website. http://www.ccru.net/digithype/Y2Panik.htm.
Nolan, Jonathan. Memento Mori. http://www.esquire.com/features/articles/2001/001323_mfr_memento_9.html.
Noon, Jeff. Automated Alice. London: Black Swan, 2000.
———. Crawl Town. In Pixel Juice. London: Anchor, 1998, pp. 293-307.
———. Falling Out of Cars. London: Doubleday, 2002.
———. Metaphorazine. In Pixel Juice. London: Anchor, 1998, pp. 47-49.
———. Nymphomation. London: Doubleday, 1997.
———. Pixel Juice. London: Anchor, 1998.
———. Pollen. New York: Crown Publishers, 1996.
———. Vurt. New York: Crown Publishers, 1993.
Nora, Pierre. Between Memory and History: Les Lieux de Mémoire. Representations 26, pp. 7-25.
Norris, Pippa. Digital Divide: Civic Engagement, Information Poverty, and the Internet Worldwide. Cambridge and New York: Cambridge University Press, 2001.

Novick, Peter. That Noble Dream: The "Objectivity Question" and the American Historical Profession. Cambridge: Cambridge University Press, 1988.
Ong, Walter J. Orality and Literacy: The Technologizing of the Word. London: Methuen, 1982.
Orwell, George. Nineteen Eighty-Four. Oxford: Clarendon Press, 1984.
Palmer, Christopher. Philip K. Dick: Exhilaration and Terror of the Postmodern. Liverpool: Liverpool University Press, 2003.
Penrose, Roger. The Emperor's New Mind: Concerning Computers, Minds, and the Laws of Physics. Oxford: Oxford University Press, 2002.
Perec, Georges. Approaches to What? In Species of Spaces and Other Pieces, translated by John Sturrock. Harmondsworth: Penguin, 1997, pp. 209-211.
———. Attempt at an Inventory of the Liquid and Solid Foodstuffs Ingurgitated by Me in the Course of the Year Nineteen Hundred and Seventy-Four. In Species of Spaces and Other Pieces, translated by John Sturrock. Harmondsworth: Penguin, 1997, pp. 244-250.
———. Life, A User's Manual. Translated by David Bellos. London: Vintage, 2003.
———. Species of Spaces. In Species of Spaces and Other Pieces, translated by John Sturrock. Harmondsworth: Penguin, 1997, pp. 1-95.
———. Species of Spaces and Other Pieces. Translated by John Sturrock. Harmondsworth: Penguin, 1997.
———. Think/Classify. In Species of Spaces and Other Pieces, translated by John Sturrock. Harmondsworth: Penguin, 1997, pp. 188-206.
———. W, or, The Memory of Childhood. Translated by David Bellos. Boston: D.R. Godine, 1988.
Pinker, Steven. How the Mind Works. New York: Norton, 1997.
———. The Language Instinct: The New Science of Language and Mind. Harmondsworth: Penguin, 1995.
Pinter, Harold. Mountain Language. London: Faber and Faber, 1988.
Plato. Meno. Translated by W.K.C. Guthrie. In The Collected Dialogues of Plato. Princeton: Princeton University Press, 1961, pp. 353-384.


———. Phaedrus. Translated by C.J. Rowe. Warminster: Aris & Phillips, 1984.
———. The Seventh Letter. Translated by J. Harward. In The Dialogues of Plato. Chicago: Encyclopaedia Britannica, 1952.
———. Theaetetus. Translated by F.M. Cornford. In The Collected Dialogues of Plato. Princeton: Princeton University Press, 1961, pp. 845-919.
Platt, Charles. The Silicon Man. San Francisco: Wired Books, 1997.
Porush, David. Hacking the Brainstem: Postmodern Metaphysics and Stephenson's Snow Crash. In Virtual Realities and Their Discontents, edited by Robert Markley. Baltimore and London: Johns Hopkins University Press, 1996, pp. 107-142.
Poster, Mark. The Mode of Information. Cambridge: Polity Press, 1990.
———. The Second Media Age. Cambridge: Polity Press, 1995.
Postman, Neil. Amusing Ourselves to Death: Public Discourse in the Age of Show Business. London: Methuen, 1986.
———. Technopoly: The Surrender of Culture to Technology. New York: Alfred A. Knopf, 1992.
Preserving Our Digital Heritage: Plan for the National Digital Information Infrastructure and Preservation Program. Washington: The Library of Congress Press, 2002.
Pynchon, Thomas. The Crying of Lot 49. New York: Bantam, 1967.
Queneau, Raymond. Cent mille milliards de poèmes. Paris: Gallimard, 1961.
Rabinovitz, Lauren and Abraham Geil (editors). Memory Bytes: History, Technology, and Digital Culture. Durham: Duke University Press, 2004.
Radstone, Susannah (editor). Memory and Methodology. Oxford and New York: Berg, 2000.
———. Screening Trauma: Forrest Gump, Film and Memory. In Memory and Methodology, edited by Susannah Radstone. Oxford and New York: Berg, 2000, pp. 79-105.
———. Working with Memory: An Introduction. In Memory and Methodology, edited by Susannah Radstone. Oxford and New York: Berg, 2000, pp. 1-22.


Rhodes, Neil and Jonathan Sawday (editors). The Renaissance Computer: Knowledge Technology in the First Age of Print. London and New York: Routledge, 2000.
Rhodes, Richard (editor). Visions of Technology: A Century of Vital Debate About Machines, Systems and the Human World. New York: Simon & Schuster, 1999.
Richards, Thomas. The Imperial Archive: Knowledge and the Fantasy of Empire. London: Verso, 1993.
Roche Lecours, André and Yves Joanette. Linguistic and Other Psychological Aspects of Paroxysmal Aphasia. Brain and Language 10 (1980), pp. 1-23.
Ropiequet, Suzanne (editor). CD-ROM, the New Papyrus: The State of the Art. Redmond: Microsoft, 1986.
Rothenberg, Jeff. Ensuring the Longevity of Digital Information. www.clir.org/pubs/archives/ensuring.pdf.
Rosenzweig, Roy. Can History Be Open Source? Wikipedia and the Future of the Past. The Journal of American History 93.1, June 2006, pp. 117-146.
Searle, John R. Mind: A Brief Introduction. Oxford and New York: Oxford University Press, 2004.
Seesholtz, Mel. Exotechnology: Human Efforts to Evolve Beyond Human Being. In Thinking Robots, An Aware Internet, and Cyberpunk Librarians, edited by R. Bruce Miller and Milton T. Wolf. Chicago: Library and Information Technology Association, 1992, pp. 59-70.
Seidel, Jorinde. Operation Re-store World. Mediamatic 8.1 (1994). http://www.mediamatic.net/article-8359-en.html.
Seidman, Steven. Contested Knowledge: Social Theory in the Postmodern Era. Cambridge: Blackwell, 1994.
Shankman, Steven (editor). Plato and Postmodernism. Glenside: Aldine Press, 1994.
———. Plato and Postmodernism. In Plato and Postmodernism, edited by Steven Shankman. Glenside: Aldine Press, 1994, pp. 3-28.
Shannon, Claude E. and Warren Weaver. The Mathematical Theory of Communication. Urbana: University of Illinois Press, 1949.
Shenk, David. The Forgetting. Alzheimer's: Portrait of an Epidemic. New York: Doubleday, 2001.


———. Data Smog: Surviving the Information Glut. San Francisco: HarperEdge, 1997.
Sherman, Chris. Why Search Engines Fail. Search Day 344, August 29, 2002. http://searchenginewatch.com/searchday/02/sd0829-searchfailure.html.
Shermer, Michael and Alex Grobman. Denying History: Who Says the Holocaust Never Happened and Why Do They Say It? Berkeley: University of California Press, 2000.
Slaughter, Richard A. From Future Shock to Social Foresight: Recontextualising Cyberculture. In Prefiguring Cyberculture: An Intellectual History, edited by Darren Tofts, Annemarie Jonson and Alessio Cavallaro. Cambridge: MIT Press, 2003, pp. 264-277.
Smith, David. 2050 - and Immortality is within our grasp. The Observer, May 22, 2005. http://observer.guardian.co.uk/uk_news/story/0,6903,1489635,00.html.
Sniffen, Michael J. Pentagon tool records every breath. Australian IT. http://australianit.news.com.au/articles/0,7204,6536889%5e15841%5e%5enbv%5e,00.html.
Sobchack, Vivian. Nostalgia for a Digital Object: Regrets on the Quickening of QuickTime. In Memory Bytes: History, Technology, and Digital Culture, edited by Lauren Rabinovitz and Abraham Geil. Durham: Duke University Press, 2004, pp. 305-330.
Sofoulis, Zoë. Cyberquake: Haraway's Manifesto. In Prefiguring Cyberculture: An Intellectual History, edited by Darren Tofts, Annemarie Jonson and Alessio Cavallaro. Cambridge: MIT Press, 2003, pp. 84-103.
Sontag, Susan. On Photography. New York: Anchor Doubleday, 1989.
Spence, Jonathan D. The Memory Palace of Matteo Ricci. New York: Viking Penguin, 1984.
St. Petersburg Hermitage Museum Website. http://www.hermitagemuseum.org/fcgi-bin/db2www/qbicSearch.mac/qbic?selLang=English.
Stanton, Mike. U-Turn on Memory Lane. Columbia Journalism Review, July/August 1997. http://www.cjr.org/year/97/4/memory.asp.
Nichols, Stephen G. and Abby Smith (editors). The Evidence in Hand: Report of the Task Force on the Artifact in Library Collections. Washington: Council on Library and Information Resources, 2001.
Stephenson, Neal. Snow Crash. London: ROC, 1993.

Sterling, Bruce. The Birth and Death of Memory. In The Future of Memory, edited by Giulio Blasi. Turnhout: Brepols, 2002, pp. 83-90.
———. Free As Air, Free As Water, Free As Knowledge. In Thinking Robots, An Aware Internet, and Cyberpunk Librarians, edited by R. Bruce Miller and Milton T. Wolf. Chicago: Library and Information Technology Association, 1992, pp. 23-32.
———. The Hacker Crackdown: Law and Disorder on the Electronic Frontier. London: Viking Publishing, 1992.
Stier, Oren Baruch. Committed to Memory: Cultural Mediations of the Holocaust. Boston: University of Massachusetts Press, 2003.
Strange Days. Earth Island Journal 11.4 (Fall 1996), p. 3.
Sutton, David E. Remembrance of Repasts: An Anthropology of Food and Memory. Oxford: Berg, 2001.
The Rosetta Project. http://www.rosettaproject.org.
Tofts, Darren and Murray McKeich. Memory Trade: A Prehistory of Cyberculture. North Ryde: Interface, 1998.
Tofts, Darren, Annemarie Jonson and Alessio Cavallaro (editors). Prefiguring Cyberculture: An Intellectual History. Cambridge: MIT Press, 2003.
Toop, David. Haunted Weather: Music, Silence and Memory. London: Serpent's Tail, 2004.
Turkle, Sherry. Life on the Screen. London: Weidenfeld & Nicolson, 1996.
Ulmer, Gregory L. Reality Tables: Virtual Furniture. In Prefiguring Cyberculture: An Intellectual History, edited by Darren Tofts, Annemarie Jonson and Alessio Cavallaro. Cambridge: MIT Press, 2003, pp. 110-129.
USC Shoah Foundation Institute website. http://www.vhf.org/.
Varley, John. Overdrawn at the Memory Bank. In Visions of Wonder: The Science Fiction Research Association Anthology, edited by David G. Hartwell and Milton T. Wolf. New York: Tor Books, 1996, pp. 490-523.
Velthoven, Willem. Editorial. Mediamatic 8.1 (1994). http://www.mediamatic.net/set-8421-en.html.
Venaille, Frank. The Work of Memory. Interview with Georges Perec. In Georges Perec, Species of Spaces and Other Pieces, translated by John Sturrock. Harmondsworth: Penguin, 1997, pp. 127-133.


Vidal-Naquet, Pierre. Assassins of Memory: Essays on the Denial of the Holocaust. Translated by Jeffrey Mehlman. New York: Columbia University Press, 1992.
Vinge, Vernor et al. True Names: And the Opening of the Cyberspace Frontier. New York: Tor Books, 2001.
Virtual Tour of the Louvre Museum. http://www.louvre.or.jp/louvre/QTVR/anglais/.
Vonnegut, Kurt. Slaughterhouse-Five. New York: Dell, 1968.
Walker, W. Richard and Douglas J. Herrmann (editors). Cognitive Technology: Essays on the Transformation of Thought and Society. Jefferson: McFarland & Co, 2005.
Wiggins, Richard. Digital Preservation: Paradox & Promise. LibraryJournal.com, 15 April 2001. http://www.libraryjournal.com/article/CA106209.html.
Williams et al. US Patent number 6,754,472: Method and apparatus for transmitting power and data using the human body. http://patft.uspto.gov.
Williams, Walter Jon. Voice of the Whirlwind. New York: T. Doherty Associates, 1987.
Wolfe, Gene. Soldier of the Mist. London: Futura Publications, 1987.
Wouterloot, Lex. Celestial Data for the End of Time. Mediamatic 8.1 (1994). http://www.mediamatic.net/article-5674-en.html.
Yates, Frances. The Art of Memory. Harmondsworth: Penguin, 1969.
Yoshimoto, Banana. Amrita. Translated by Russell F. Wasden. New York: Grove Press, 1997.

Filmography

2001: A Space Odyssey (Stanley Kubrick, 1968)
Blade Runner (Ridley Scott, 1982)
Blade Runner: The Director's Cut (Ridley Scott, 1991)
Brazil (Terry Gilliam, 1985)
Cypher (Vincenzo Natali, 2002)

Dark City (Alex Proyas, 1998)
Eternal Sunshine of the Spotless Mind (Michel Gondry, 2004)
eXistenZ (David Cronenberg, 1999)
Impostor (Gary Fleder, 2002)
Johnny Mnemonic (Robert Longo, 1995)
Last Year at Marienbad (L'Année dernière à Marienbad, Alain Resnais, 1961)
Memento (Christopher Nolan, 2000)
Minority Report (Steven Spielberg, 2002)
Natural Born Killers (Oliver Stone, 1994)
Paycheck (John Woo, 2003)
RoboCop (Paul Verhoeven, 1987)
Screamers (Christian Duguay, 1995)
Strange Days (Kathryn Bigelow, 1995)
That Obscure Object of Desire (Cet obscur objet du désir, Luis Buñuel, 1977)
The Birth of a Nation (D.W. Griffith, 1915)
The Final Cut (Omar Naim, 2004)
The Forgotten (Joseph Ruben, 2004)
The Matrix (Larry and Andy Wachowski, 1999)
The Matrix Reloaded (Larry and Andy Wachowski, 2003)
The Passion of the Christ (Mel Gibson, 2004)
Total Recall (Paul Verhoeven, 1990)
Tron (Steven Lisberger, 1982)
Vertigo (Alfred Hitchcock, 1958)
Wintersleepers (Winterschläfer, Tom Tykwer, 1997)

Television Shows


ER, The Peace of Wild Things (1999), directed by Richard Thorpe.
Friends, The One Where Phoebe Runs (1999), directed by Gary Halvorson.


