Network Culture
Politics for the Information Age

Tiziana Terranova

Pluto Press
LONDON • ANN ARBOR, MI


First published 2004 by Pluto Press
345 Archway Road, London N6 5AA
and 839 Greene Street, Ann Arbor, MI 48106
www.plutobooks.com

Copyright © Tiziana Terranova 2004

The right of Tiziana Terranova to be identified as the author of this work has been asserted by her in accordance with the Copyright, Designs and Patents Act 1988.

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library

ISBN 0 7453 1749 9 hardback
ISBN 0 7453 1748 0 paperback

Library of Congress Cataloging in Publication Data
Terranova, Tiziana, 1967–
Network culture : politics for the information age / Tiziana Terranova.
p. cm.
ISBN 0–7453–1749–9 (hb) –– ISBN 0–7453–1748–0 (pb)
1. Information society. 2. Information technology––Social aspects. 3. Information technology––Political aspects. I. Title.
HM851.T47 2004
303.48'33––dc22
2004004513

10 9 8 7 6 5 4 3 2 1

Designed and produced for Pluto Press by Chase Publishing Services, Fortescue, Sidmouth, EX10 9QG, England
Typeset from disk by Stanford DTP Services, Northampton, England
Printed and bound in the European Union by Antony Rowe Ltd, Chippenham and Eastbourne, England


Contents

Acknowledgements
Introduction
1. Three Propositions on Informational Cultures
   The Meaning of Information
   Information and Noise
   The Limits of Possibility
   Nonlinearity and Representation
2. Network Dynamics
   Network Time
   Of Grids and Networks
   The Paradox of Movement
   A Tendency to Differ
   Fringe Intelligence
   Afterthought
3. Free Labour
   The Digital Economy
   Knowledge Class and Immaterial Labour
   Collective Minds
   Ephemeral Commodities and Free Labour
   The Net and the Set
4. Soft Control
   Biological Computing
   From Organisms to Multitudes
   Searching a Problem Space
   Global Computation
   Social Emergence
   Hacking the Multitude
   The Unhappy Gene
   Coda on Soft Control
5. Communication Biopower
   The New Superpower
   The Masses' Envelopment
   An Intolerant World
   Networked Multitudes
Notes
Bibliography
Index


Acknowledgements

Many people, groups and institutions played a fundamental part in the writing of this book. The University of Berkeley's Engineering and Science Libraries (for their open doors policy of access and consultation); Danan Sudindranath, Mr Gary Pickens and Carol Wright for technical intelligence; the friends and former colleagues at the Department of Cultural and Innovation Studies, University of East London (and in particular Ash Sharma, Paul Gormley, and Jeremy Gilbert); the Signs of the Times collective; Martha Michailidou, Nicholas Thoburn and David Whittle (for reading and/or discussing parts of this work); Nick Couldry and the OUR MEDIA network; the ESRC, for funding part of this project; the Department of Sociology at the University of Essex (for granting two study leaves that allowed the writing of this book); Andrew Ross at the American Studies Centre, NYU, for his friendship and enduring support; Patricia Clough and the students at the CUNY Graduate Center for enthusiastic support and challenging feedback; the collective intelligences of nettime, Rekombinant, Syndicate and e-laser (for everything); Marco d'Eramo for his kind interest and bibliographical gems; the staff and students at the Center for Cultural Studies, Goldsmiths' College (Scott Lash and Axel Roch and for extra special support Sebastian Olma and Sandra Prienekenf); Riccardo and Simonetta; Betti Marenko from InSectCorp; the Mute collective (special thanks to Pauline van Mourik Broeckmann, Ben Seymour, Josephine Berry Slater, and Jamie King); my mentors and friends at the Istituto Universitario Orientale in Naples (especially Lidia Curti, Ian Chambers, Silvana Carotenuto and Anna Maria Morelli); the friends of the Hermeneia project at the Universitat Oberta de Barcelona (thanks to Laura Borràs Castanyer, Joan Elies Adell and Raffaele Pinto for some wonderful workshops); to Anne Beech and Judy Nash at Pluto Press, for their patience and for trusting their authors; Kode 9 from hyperdub.com and Daddy G from uncoded.net (for the bass lines and visions); and of course my family and friends back in Sicily, who provided free sunshine, blue sea, good food and warm company during the last stages in the writing of this book – the hot, hot summer of 2003.

Finally, special thanks to Luciana Parisi – for being a friend and a sister throughout our turbulent and co-symbiotic becomings.


To the Memory of
Vito Terranova
(1935–1992)


Introduction

This is a book about (among other things) information and entropy, cybernetics and thermodynamics, mailing lists and talk shows, the electronic Ummah and chaos theory, web rings and web logs, mobile robots, cellular automata and the New Economy, open-source programming and reality TV, masses and multitudes, communication management and information warfare, networked political movements, open architecture, image flows and the interplay of affects and meanings in the constitution of the common. It is a book, that is, about a cultural formation, a network culture, that seems to be characterized by an unprecedented abundance of informational output and by an acceleration of informational dynamics.

In this sense, this is a book about information overload in network societies and about how we might start to think our way through it. Because of this abundance and acceleration, the sheer overload that constitutes contemporary global culture, it was necessary to assemble and reinvent a method that was able to take in this bewildering variation without being overwhelmed by it. This method has privileged processes over structure and nonlinear processes over linear ones – and in doing so it has widely borrowed from physics and biology, computing and cybernetics but also from philosophy, and cultural and sociological thinking (from Baudrillard to Lucretius, from Deleuze and Guattari to Stuart Hall and Manuel Castells, from Michel Serres to Henri Bergson and Antonio Negri). Above all, however, this book is an attempt to give a name to, and further our understanding of, a global culture as it unfolds across a multiplicity of communication channels but within a single informational milieu.

To think of something like a 'network culture' at all, to dare to give one name to the heterogeneous assemblage that is contemporary global culture, is to try to think simultaneously the singular and the multiple, the common and the unique. When seen close up and in detail, contemporary culture (at all scales from the local to the global) appears as a kaleidoscope of differences and bewildering heterogeneity – each one of which would deserve individual and specific reflection. However, rather than presenting themselves to us as distinct fragments, each with its own identity and structure, they appear to us as a meshwork of overlapping cultural formations, of hybrid reinventions, cross-pollinations and singular variations. It is increasingly difficult to think of cultural formations as distinct entities because of our awareness of the increasing interconnectedness of our communication systems. It is not a matter of speculating about a future where 'our fridge will talk to our car and remind it to buy the milk on its way'. It is about an interconnection that is not necessarily technological. It is a tendency of informational flows to spill over from whatever network they are circulating in and hence to escape the narrowness of the channel and to open up to a larger milieu. What we used to call 'media messages' no longer flow from a sender to a receiver but spread and interact, mix and mutate within a singular (and yet differentiated) informational plane. Information bounces from channel to channel and from medium to medium; it changes form as it is decoded and recoded by local dynamics; it disappears or it propagates; it amplifies or inhibits the emergence of commonalities and antagonisms. Every cultural production or formation, any production of meaning, that is, is increasingly inseparable from the wider informational processes that determine the spread of images and words, sounds and affects across a hyperconnected planet.

Does that mean, as Paul Virilio has recently suggested following a prediction by Albert Einstein, that an unbearable catastrophe has struck the planet – that we are the victims, today, as we speak, of an informational explosion, a bomb as destructive as the atomic bomb? 1 Information is often described as a corrosive, even destructive and malicious entity threatening us with the final annihilation of space–time and the materiality of embodiment. Echoing a widespread feeling, Virilio suggests that we see information as a force able to subordinate all the different local durations to the over-determination of a single time and a single space that is also emptied of all real human interactions. From this perspective, contemporary culture is the site of a devastation wreaked by the deafening white noise of information, with its 'pollution of the distances and time stretches that hitherto allowed one to live in one place and to have a relationship with other people via face-to-face contact, and not through mediation in the form of teleconferencing or on-line shopping.' 2 As will become clear in the book, I do not believe that such informational dynamics simply expresses the coming hegemony of the 'immaterial' over the material. On the contrary, I believe that if there is an acceleration of history and an annihilation of distances within an informational milieu, it is a creative destruction, that is a productive movement that releases (rather than simply inhibits) social potentials for transformation. In this sense, a network culture is inseparable both from a kind of network physics (that is physical processes of differentiation and convergence, emergence and capture, openness and closure, and coding and overcoding) and a network politics (implying the existence of an active engagement with the dynamics of information flows).

The first chapter is a lengthy engagement with information theory with the stated intent of understanding more about this mysterious physical entity as it has come to pervade the language and practices of contemporary culture. I will start by freeing up the concept of information from two prejudices that have actually hindered our understanding of informational dynamics: the idea that information is 'the content of a communication'; and the notion that information is 'immaterial'. This interpretation of information theory (and in particular of Claude Shannon's 1948 paper on 'The Mathematical Theory of Communication') will play up those aspects of information that correspond to or explain the informational dynamics of contemporary culture (and hence the field of cultural politics). I have thus put forward a series of propositions linking information theory to something that we call 'the cultural politics of information' and have tried to understand how such a shift has transformed and affected the cultural politics of representation (both linguistic and political). The relationship between physical concepts such as entropy and negentropy, noise and signal, micro and macro, nonlinearity and indeterminacy determines the production of a 'materialistic' theory of information that could help us to make better sense of the 'chaos of communication' in which we live.

The second chapter discusses the architecture of networks, and more specifically the architecture of the Internet. In this case, the Internet is taken as a technical diagram able to support the development of an informational space that is driven by the biophysical tendencies of open systems (such as the tendency towards divergences, incompatibilities, and rising entropic levels of randomness and disorganization). Here I take the Internet to be not simply a specific medium but a kind of active implementation of a design technique able to deal with the openness of systems. The design of the Internet (and its technical protocols) prefigured the constitution of a neo-imperial electronic space, whose main feature is an openness which is also a constitutive tendency to expansion. The chapter explores how the informational dynamics instantiated by the design philosophy of the Internet is actualized in a series of topological figures and cultural experimentation in phenomena such as blogging, mailing lists, and web rings.

Chapter three is an investigation into the question of the 'digital economy' or the 'New Economy' (as it has become more commonly known). In particular, the chapter looks at the phenomenon of 'free labour' – that is the tendency of users to become actively involved in the production of content and software for the Internet. The difficulties inherent in the relationship between such forms of volunteer and unpaid technocultural production and our understanding of contemporary capitalism will be a central focus of the chapter. In order to understand this relation I will draw on the Marxist notion of 'real subsumption' of society under capitalism. In particular, I will follow the Autonomist Marxist suggestion that the extension of production to the totality of a social system (the 'social factory' thesis) is related to the emergence of a 'general intellect' and 'mass intellectuality' pointing to capital's incapacity to absorb the creative powers of labour that it has effectively unleashed.

Chapter four is devoted to the problem of 'control' in chaotic and self-organizing systems – a leitmotif of early literature on the Internet and a field of intense controversy between the human and natural sciences. Recent developments in biological computation (such as research on artificial life, neural networks and mobile robots) imply the production of a kind of 'technical diagram' of control that takes as its content the autonomous productive capacities of a large number of interacting variables. Such a diagram entails the interconnection of the many; the decentralization of command; the modulation of local rules and global behavior; and a kind of 'unnatural selection' in the form of predesigned aims and objectives that operate to capture the powers of emergence through the reconstitution of individuality. The chapter suggests that the dynamics of flows – once understood in terms of nonlinear relations between a larger number of simple bodies – is far from constituting a utopian state of pre-Oedipal bliss but has become the field of operation of a new mode of cybernetic control (or soft control).

Finally chapter five looks at the implications of such distributed and internetworked informational milieus for our understanding of the political dimension of communication. Is it still possible to talk of the media as a 'public sphere' in an age of mass propaganda, media oligopoly and information warfare? Is the world splitting between an educated and internetworked public opinion and a passive and manipulated mass of TV junkies? The chapter suggests that a reappropriation of the properties of the 'mass' (or the implications of 'forming a mass') can help us to untangle the semantic properties of communication (the meaningful statements that it transmits) from its intensive, affective ones. If the mass is a field for the propagation of affects, it does not exclude but includes and envelops within itself the segmentation of specialized audiences and their further microsegmentation over the Internet. This common milieu, interconnected by the flow of images and affects, is the site for the emergence of new political modes of engagement (such as Internet-organized global movements against neoliberal economic policies and the Iraqi war). The chapter concludes by proposing such network culture as a site of the political constitution of the common through the biopower of communication.

Throughout this book, I have tried to find a way to map these transformations, not simply as technologies but also as concepts, techniques and milieus. These are concepts that have opened up a specific perception and comprehension of physical and social processes; techniques that have drawn on such concepts to develop a better control and organization of such processes; and milieus that have dynamically complicated the smooth operationality of such techniques. In no case have I noticed a linear relation of cause and effect between technologies and social change, or, for that matter, between concepts, techniques, processes and milieus.

From another perspective, I also have to warn the reader that I have willingly overemphasized the dimension of communication and information over other aspects of social and cultural change. In no way should this be taken as an indication of an alleged obsolescence of other aspects of contemporary culture or politics. This overemphasis works in this book as a kind of methodological device to temporarily isolate a specific type of process for the purposes of analysis. In particular, the exceptional dynamism of such informational milieus might lead one to overlook the persistence of stratifications and structures across the domains observed. On the other hand, it is this dynamic character that has drawn my attention to the subject and kept my interest throughout this project and I cannot but hope my readers will feel the same.


1Three Propositions onInformational CulturesTHE MEANING OF INFORMATIONIs there an informational quality that defines twenty-first centuryculture – a quality that makes such culture unique, that gives it, soto speak, its most characteristic and peculiar trait? Such a questionwould appear to be based on two problematic assumptions: in thefirst place that there is something like a culture that defines a century;and, above all, that we do know what such informational quality isabout – that is, that we do know the ‘meaning of information’.If the notion of a culture raises important questions about therelationship between the heterogeneous and the homogeneous, theidea that we do not know what information is might appear as lessof an issue. After all, information has become such a common wordand is used so freely and with such ease that we should have noproblem at all in defining it. We know at least two things aboutinformation: that it is the content of a communication act; and thatthere is something less than material about it – at least judging fromthe ease with which it goes from mouth to ear and ear to mouth. Thisimmateriality of information has been further amplified by technicaldevelopments that have made possible the instant transmittal andmultiple distribution of any type of information at all (images, sound,music, words, software, statistics, projections, etc.). It is this ease ofcopying, it has been argued, that makes of information such a shiftyand yet valuable commodity. We know that information can be soldand bought and that a good deal of the world economy is driven byan emphasis on the informational content of specific commoditiesand we are also aware that information itself can be valuable (whenit is used for example to make a profit in the stock market). 
Weknow that anybody is always potentially an information-source oreven an information-storage device and that science suggests thatinformation constitutes the very basis of our biological existence (inas much as, we are told, we contain information that can be decodedwithin our very cells). In all these cases, information emerges as a6


Three Propositions on Informational Cultures 7content, as some kind of ‘thing’ or ‘object’ but one that possessesabnormal properties (ease of copying and propagation, intangibility,volatility, etc.) that contemporary technological developments haveexacerbated and amplified.These features of the informational commodity have opened allkinds of issues around the question of rights in the digital age – andmore specifically the right to own and copy information. Thus wehave a political struggle around the right to keep medical informationprivate; the right not to have one’s personal correspondence or datamonitored and/or sold; the right to copy and distribute music andvideo over the Internet; the right to make low-cost copies of patentedmedications in cases of national health emergencies (such as the AIDSepidemics in Africa); and the right to profit from information that hasbeen produced at great cost to the producer. In all these cases, however,information is still treated as a content of a communication – a contentto be protected whether in the interests of individuals, institutions,companies or the commonwealth at large. Surely, if there is a politicalstruggle around information at all, then it must be about issues such ascopyright and intellectual property. As far as the rest of contemporaryculture is concerned, surely it must be business as usual – with theusual conglomerates and political parties trying to manipulate mediarepresentations for their own hegemonic purposes.And yet, useful and important as such struggles are, they do notreally address for us the larger problem of the relation between‘culture’ and ‘information’. Information, that is, might be morethan simply the content of a communication. 
We are no longermostly dealing with information that is transmitted from a source toa receiver, but increasingly also with informational dynamics – thatis with the relation between noise and signal, including fluctuationsand microvariations, entropic emergencies and negentropicemergences, positive feedback and chaotic processes. If there is aninformational quality to contemporary culture, then it might be notso much because we exchange more information than before, oreven because we buy, sell or copy informational commodities, butbecause cultural processes are taking on the attributes of information– they are increasingly grasped and conceived in terms of theirinformational dynamics.It is thus important to remember that, as a historical conceptpointing to the definition, measurement, analysis and control ofa mathematical function, information does not coincide with therise of a digital media system. On the contrary, the appearance


8 Network Cultureof information theory parallels the emergence and developmentof modern mass media such as telegraphy, telephony, radio andtelevision. Unlike previous media such as print and writing, modernmedia, in fact, do not use the code of a workaday language, but‘make use of physical processes which are faster than humanperception and are only susceptible of formulation in the code ofmodern mathematics’. 1 We could refer to the informatization ofculture as starting with the analogue function of frequency, that iswith the encoding of sound in the grooves of a gramophone record,where speech phonemes and musical intervals were recognizedfor the first time as complex frequency mixtures open to furthermathematical analysis and manipulation. 2 For Friedrich Kittler, itis also with telegraphy that information, in the form of masslessflows of electromagnetic waves, is abstracted for the first time. Inthis sense, information is not simply the content of a message, or themain form assumed by the commodity in late capitalist economies,but also another name for the increasing visibility and importanceof such ‘massless flows’ as they become the environment withinwhich contemporary culture unfolds. In this sense, we can refer toinformational cultures as involving the explicit constitution of aninformational milieu – a milieu composed of dynamic and shiftingrelations between such ‘massless flows’.And yet, one could suggest that these massless flows are far frombeing immaterial (or at least not in the sense in which the termis used, that is in the sense of something that is not quite of thisworld). An assessment of the informational dynamics of cultureforces us to confront/address the analytical and political categoriesinforming our understanding of cultural politics and its relation tothe informational quality identified above. 
In the English-speakingworld in particular, the last 30 years have seen a predominant focus onanalytical categories such as meaning, identities and representationopening up onto a cultural politics of identity, representation, anddifference. The question of media and communications has thusbeen related mainly to the problem of how a hegemonic consensusemerges out of the articulation of diverse interests; and how culturalstruggle is waged within a representational space, marked by therelationship between self and other, or the identical and the different.The political dimension of culture has thus been conceived mainlyin terms of resistance to dominant meanings; and the set of tacticsopened up have been those related to the field of representation


Three Propositions on Informational Cultures 9and identity/difference (oppositional decodings; alternative media;multiple identities; new modes of representation).The emergence of informational dynamics has thus caught themore militant strands of media and cultural theory as if by surprise.Information is no longer simply the first level of signification, butthe milieu which supports and encloses the production of meaning.There is no meaning, not so much without information, but outsideof an informational milieu that exceeds and undermines the domainof meaning from all sides. Unless we want to resign ourselves to thenotion that culture has been made immaterial and transcendentby an informational deluge, we need to reassess the ways in whichwe understand the relationship between culture, power, andcommunication. What is proposed here is that an engagement withinformation theory is rich in analytical insights into the features ofcontemporary cultural politics where such informational dynamicsare increasingly foregrounded. In particular, it allows us to moveaway from an exclusive focus on meaning and representation as theonly political dimension of culture. In as much as communication isnot simply the site of the reproduction of culture, but also that of anindeterminate production crossing the entirety of the social (fromfactories to offices to homes and leisure spaces), it also constitutes akind of common informational milieu – open to the transformativepotential of the political.Keeping these questions in mind, in this chapter we will focus onthe ‘meaning’ of information. In particular, I will turn to informationtheory (and specifically the early work of Claude E. Shannon andthe cyberneticians) to catch the points where information ceasesto be simply the content of communication and gains, so to speak,a body – that is a materiality in its connection with the world ofphysics, engineering and biology. 
I will thus isolate three definitionsof information as related by Shannon’s 1948 paper: information isdefined by the relation of signal to noise; information is a statisticalmeasure of the uncertainty or entropy of a system; informationimplies a nonlinear and nondeterministic relationship betweenthe microscopic and the macroscopic levels of a physical system.These hypotheses are the basis out of which Shannon built hismathematical definition of information, but they also offer someother interesting considerations or corollaries on informationalcultures. These corollaries suggest that within informational cultures,communication is crucially concerned with the problem of noise andcontact; that the cultural politics of information are not only about


10 Network Cultureprivacy, property and copyright, but also open up the question ofthe virtual, that is the relation between the given and the (allegedly)unlikely; that information flows displace the question of linguisticrepresentation and cultural identity from the centre of culturalstruggle in favour of a problematic of mutations and movementwithin immersive and multidimensional informational topologies.INFORMATION AND NOISEThe fundamental problem of communication is that of reproducingat one point either exactly or approximately a message selected atanother point. 3Proposition I: Information is what stands out from noiseCorollary Ia: Within informational cultures, the struggle over meaningsis subordinated to that over ‘media effects’Corollary Ib: The cultural politics of information involves a return tothe minimum conditions of communication (the relation of signal tonoise and the problem of making contact)As elaborated by the researchers working for telecommunicationcompanies in the first half of the twentieth century, information theoryis fundamentally concerned with the accurate reproduction of an encodedsignal. The reproduction of information is at the heart of the communicationprocess in as much as the latter fundamentally involves the accuratereproduction of a pattern from a sender to a receiver through a channel.If such information is transmitted accurately, that is with minimumdistortion and corruption, then the communication act can be said tohave been successful. If the information is distorted or does not reach itsdestination, then the communication act has been unsuccessful. The newtechniques of communication management are crucially concerned withthe relation between signal and noise with the explicit intent of generatinga ‘media effect’. 
The nature of such media effects, however, needs to bereconsidered within the larger context of an ‘informational milieu’.In order to better understand the implication of introducing aninformational perspective into our evaluation of culture, we need toengage with information theory in more detail. The modern scientificconcept of information has a mixed and hybrid genealogy, at thecrossing of science and engineering, involving a cross-disciplinarydialogue between physics, mathematics, biology and even sociology


Three Propositions on Informational Cultures 11(in its positivistic, ‘social physics’ version). A rigorous mathematicaltheory of information, however, was developed only in the 1940sby members of the cybernetic group and by engineers at AT&T’s BellLaboratories (and in particular by R.V.L. Hartley, Harry Nyquist, andClaude E. Shannon). For Jérôme Segal, the milieu of communicationengineers working in corporate labs in the United States wasparticularly rife for such technical and scientific breakthroughs. Onthe one hand, the US engineers did not share the narrow vocationalfocus that kept their European peers within the social hierarchy of atheory/practice divide. North American telecommunication engineershad been trained in physics departments (for example, on MIT’sElectrical Engineering course) and had thus a good knowledge of themost abstract and complex physics debates. 4On the other hand, US communication engineers were alsoconfronted by complex problems of speed and accuracy in signalstransmission posed by the large telecommunication networks ofthe United States, where signals had to be repeatedly relayed beforereaching their destinations. It would also be hard to underestimatethe importance of the internationalization and interdisciplinarizationof science during and immediately after World War II – a process thatprovided the material circumstances for the constitution of a theoryof information linking physics, statistics and telecommunications andthat prepared the ground for the informatization of life in molecularbiology. The concept of information was part of research taking placewithin the field of ‘communication and control engineering’ – abranch of engineering that depended on a larger theory of messagesinvolving the contribution of linguistics and cryptoanalysis to theunderstanding of communication codes. 
Norbert Wiener went sofar as to argue that the difference between the older field of powerengineering and communication engineering marked a shift from thelong nineteenth century of the industrial society to a new cyberneticage of communication, command, and control. 5Shannon established his reputation as the pivotal point aroundwhich a century of attempts to conceptualize information as aphysical quantity revolved on the basis of a paper on what he called‘the mathematical theory of communication’, published in the BellSystem Technical Journal in 1948. Shannon’s paper advanced a set oftheorems that dealt ‘with the problem of sending messages from oneplace to another quickly, economically, and efficiently’. 6 Shannon’s‘Mathematical Theory of Communication’ was republished by theUniversity of Illinois Press in 1949, together with Warren Weaver’s


less mathematically oriented paper. As a result, the mathematical theory of communication is often referred to as the Shannon–Weaver model – a ground-breaking effort in the field of communication engineering and a necessary reference in all attempts to tackle some of the implications inherent in an informational understanding of communication. The modern concept of information is explicitly subordinated to the technical demands of communication engineering, and more specifically to the problems of the 'line' or 'channel'. Shannon's definition of information is dependent on the problematic of the accurate reproduction of a weak impulse or signal across a range of different media channels (telegraphy, telephony, radio, television, computing). Information is thus described through a mathematical function that could be used to maximize the transmission of a signal through a channel. His logarithmic measure of information is still fundamental to the 'design of every modern communications device – from cellular phones to modems to compact-disc players.' 7

The problem identified by researchers at the Bell Laboratories is well known to all communication engineers and high-fidelity sound enthusiasts. When a signal travels through a channel, it often produces a characteristic background static that is not solved by amplification. In this sense, the signal is always identified in relation to what threatens to corrupt and distort it, that is noise. Communication engineers identified the noise in channels with the discrete character of the electrons carrying the current. Amplification did not correct the disturbance because messages or signals ended up swamped by their own energy. 8 The problem could not be solved simply by increasing or decreasing the amount of energy flowing through a channel, but various types of filters proved to be a partially effective solution.
What was needed, however, was a technique to encode the signal in such a way that it would minimize loss of quality by some kind of error-control instructions. Engineers thus needed a function that would enable them to build systems that could distinguish noise from signal and hence correct the corruption of messages. But in which ways is a signal mathematically distinguishable from noise? This question required a method for identifying information as an entity that could be separated from the meaning that could be made of it. For Claude E. Shannon, messages '[f]requently … have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem'. 9 From the point of view of information theory, 'two messages, one of which is heavily loaded


with meaning and the other of which is pure nonsense, can be exactly equivalent…' 10

In order to frame his concept of information properly, Shannon drew his famous diagram that, beyond becoming indispensable to the technical development of communication and information technologies, was also to have substantial repercussions for the emerging fields of mass communication research and media and cultural studies. Shannon's diagram identified five moments or components in the communication process: an information source, a transmitter, the message, the channel of communications and the receiver. It is a deceptively simple diagram. The information source, or sender, selects a message to be coded into a signal that is then transmitted through a channel to a receiver. Information is the content of communication, in the sense that it is what needs to be transported, with the minimum loss of quality, from the sender to a receiver (as if in an older mode of communication when the latter mainly referred to physical transport). At the same time, this content is not defined by its meaning, but by a mathematical function – a pattern of redundancy and frequency that allows a communication machine to distinguish it from noise. As all information theorists will emphasize, although we can attribute meanings to information, the latter does not coincide with its meaning. An encoded television signal or a piece of software has no meaning in the conventional sense.

Information, that is, is far from simply constituting a kind of degree zero of code, in the sense of a basic, denotative level of meaning. The postmodern lesson on the cultural politics of information is that meaning has evaporated as the main point of reference within the scene of communication. Or, in a different way, information can be understood to involve a larger spectrum than meaning, as F. J.
Crosson suggested when he compared information to phenomenological understandings of perception where 'the meaningful is related to the continuous fulfillment of expectations and is opposed therefore (by Husserl) to heterogeneous discontinuity or (by Merleau-Ponty) to complete homogeneity'. 11 Meaningful experiences thus disappear at each end, as it were, of the information spectrum, both its maximal and minimal points. In terms of its meaninglessness, maximal randomness in the visual field is hardly distinguishable from minimal randomness. Information is thus a term of far greater extension than meaning. 12
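The encode/channel/decode scheme described above can be made concrete with a toy sketch. This is purely illustrative (the three-fold repetition code and the bit-flip channel are my assumptions, not Shannon's actual coding theory): a transmitter adds redundancy to the source's message, noise corrupts some symbols in the channel, and the receiver exploits the pattern of redundancy to distinguish signal from noise.

```python
import random

def encode(bits):
    """Transmitter: add redundancy by repeating each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def transmit(bits, flip_prob=0.1):
    """Channel: noise randomly corrupts (flips) each transmitted bit."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    """Receiver: majority vote over each triple separates signal from noise."""
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]   # the message selected by the source
received = decode(transmit(encode(message)))
print(message == received)           # usually True: redundancy absorbs the noise
```

With `flip_prob` pushed higher, the majority vote starts to fail, which is the problem the passage describes: amplification alone cannot rescue a signal; only error-control built into the code can.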


From an informational perspective, a meaningful perception, one that can be made sense of and articulated, is a statistical compound of the familiar and the unfamiliar. It is both redundant and random; we grasp it because it is partially new and partially familiar. Thus Crosson draws the conclusion that meaning is 'what makes sense, produces no surprises, requires a minimal amount of information to define its shape'. 13 Indeed, Crosson (like Jean Baudrillard 20 years later) will conclude that information and meaning might be inversely proportional: the more information the less meaning. In this sense, the proliferation of information spells the drowning of meaningful experiences in a sea of random noise. In an informational culture the middle zone of meaning is increasingly difficult to construct and maintain, in as much as the noise always implicitly carried by information hedges in from all sides. In this sense, an informational culture marks the point where meaningful experiences are under siege, continuously undermined by a proliferation of signs that have no reference, only statistical patterns of frequency, redundancy and resonance (the obsessive frequency and redundancy of an advertising campaign, the mutually reinforcing resonance of self-help manuals and expert advice, the incessant bombardment of signifying influences). Holding on to the 'message' in order to drown the noise of contrasting information is what allows the stability necessary in order to establish a contact. But in this case, what stops information from being just another name for brainwashing? And does that imply that the scene of communication, the cultural politics of information as such, is exclusively a theatre of manipulation favouring the expertise and concentrated knowledge of a new breed of communication managers?
If information identifies an operationality of communication strategies, spreading out from military technology to civil society at large, isn't it another name for communication as a sophisticated form of mind control?

There is no doubt that the manipulation of affects and signs is an essential part of the politics of communication in informational cultures. What is more difficult to uphold, however, is the behaviourist perspective that identifies the influence of the media with that of a simple command in a drastic simplification of the physical dynamics of communication as such. Information theory thus highlights the minimum conditions for communication, and attributes a secondary importance to the question of the meaning of messages when compared to the basic problem of how to increase the effectiveness of the channel. Information can be anything: a sound, an image, a colour,


and of course also words. The problem of communication is reduced to that of establishing a bridge or a contact between a sender and a receiver. The two extremities of the channel 'are on the same side, tied together by a mutual interest: they battle together against noise'. 14 Resistance to communication is not related to misinterpretation or dissent, but to an inhuman interference (what Michel Serres calls the 'demon of noise'). The scene of communication is reduced to its minimum condition: that of making contact by clearing a channel from the threat of noise.

This conception of communication is well suited to the technical demands of the channel within modern media such as telephony, radio and television, where the integrity of the signal is always potentially undermined by the distortion of noise. At the same time, however, this does not really imply the ultimate influence of a technological determination, but more a return to the minimum condition of communication as such. The minimum condition for communication (in the animal and the machine, as Wiener put it) is contact – a temporary suspension of the multitude of tiny and obscure perceptions out of which information emerges as a kind of fleeting clarity, as if a space had been successfully cleared. It does not matter who the sender or receiver are, whether they are machines, animals, bacteria, genetic sequences, or human organisms. Reason and meaning, dialectics and persuasion, truth and falsehood are all temporarily evacuated from the scene. There is no longer an interlocutor or an audience to address, there is no rhetorical play of ideas, but a kind of bare set, where all communication is reduced to a drive to clear out a channel for transmission between two points separated by space and united only by the channel.
From an informational perspective, communication is neither a rational argument nor an antagonistic experience based on the capacity of a speaker to convince a listener or to impose his perspective. The information flow establishes a contact between sender and receiver by excluding all interference, that is by holding off noise. Interlocutors are not opposed, as in the traditional conception of the dialectical game, but they are assumed to be on the same side. Opposition to the agreement between sender and receiver cannot be subjective, but only objective and external, appearing only in the non-human form of meaningless noise (or the form of an enemy intent on disrupting the communication between two partners in agreement). 'To hold a dialogue is to suppose a third man and to seek to exclude him.' 15

The appearance of a modern informational problematic, then, is related to a conception of communication as an operational


problem dominated by the imperatives of the channel and the code rather than by a concern with exchange of ideas, ethical truth or rhetorical confrontation (a definition that dominates the liberal and enlightened concept of communication). It is not about signs, but about signals.

This technical (rather than simply technological) conception of communication is what for us opposes, for example, the ethics of modern professionals of communication (such as journalists) to today's communication managers (PR agents, advertisers, perception managers, information strategists, directors of communication). While journalists who subscribe to a professional ethics rooted in a liberal modernity, for example, would argue that information must be assessed in terms of its accuracy (or truth value) and relevance (meaningfulness), communication managers seem to have another type of grasp of the informational dimension of contemporary culture – which they reduce to a Manichean battle between signal and noise. The latter, in fact, understand the power of a communication act as determined by the overall dynamics of the informational milieu, where what counts is the preservation of the message/signal through all the different permutations and possible corruptions which such a message/signal is liable to undergo. This is why, for example, this social management of communication favours the short slogan or even the iconic power of the logo. The first condition of a successful communication becomes that of reducing all meaning to information – that is, to a signal that can be successfully replicated across a varied communication milieu with minimum alterations.
Whether it is about the Nike swoosh or war propaganda, what matters is the endurance of the information to be communicated, its power to survive, as information, all possible corruption by noise.

When a television debate is held, for example, between competing politicians in the wake of an election, can we say that such a debate is won or lost on the basis of a dialectical argument involving the interplay of truth and persuasion? Can we say that politicians are really conveying a persuasive content? Or isn't the main problem that of clearing out a channel through a noisy mediascape, of establishing a contact with the audience out there? In this context, the opponent becomes noise and the public becomes a target of communication: not a rational ensemble of free-thinking individuals, endowed with reason, who must be persuaded, but a collective receiver to which a message can be sent only on condition that the channel is kept free of


noise (competing politicians, but also the whole noisy communication environment to which such politicians relate, where, for example, more young people vote for reality TV shows than for general elections). Or in another context: don't the techniques of advertising involve, first of all, an attempt to bypass the noise of a crowded informational milieu by establishing a connection with potential customers? The purpose of communication (the exclusion of noise and the establishment of contact) is simultaneously presupposed, technically produced, and actively reinforced. It is understandable, then, why cultural activism of the No Logo variety should have focused so much on what Mark Dery has called 'culture jamming' – signal distortion, graffiti on advertising posters, hijacking of corporate events, all kinds of attempts at disrupting the smooth efficiency of the communication machine. Or, as Gilles Deleuze suggested, why cultural resistance within control societies might also involve the creation of 'vacuoles' of non-communication able to elude the latter's command. Or why, in conditions of media monopoly, the problem becomes that of undermining the stronghold of such tyrannical contact by opening up as many other venues of communication as possible (from festivals to the Internet, from demos and workshops to screenings, dancing and public performances).

The conditions within which a cultural politics of information unfolds are thus those of a communicational environment that has been technically reduced to its 'fundamental problem', as Shannon put it (or its minimum conditions, as we would say). It is such minimum conditions that must be recreated each time by the techniques of communication: the successful constitution of a contact, the suspension of all competing signals and the filtering out of all possible corruption of the message in transit.
There is nothing inherently technological here, in the modern sense of a Frankenstein monster which has been created by human will but which is now threatening to destroy it. It is not so much a question of technology as of techniques and forms of knowledge that all converge – through a variety of media and channels – on the basic problem of how to clear out a space and establish a successful contact.

Does that mean, then, that journalists and activists who hang on to relevance and truth and meaning have been made redundant by communication managers with a much better grasp of informational dynamics? The problem here is not that of arguing for the obsolescence of meaning and truth in favour of sheer manipulation within an informational milieu. It might be the case, that is, that such managers


and entrepreneurs have themselves misunderstood the informational dimension of communication as such and that their repeated efforts at amplifying the signal in order to drown out the noise might be as counterproductive in a social sense as they would be within the circuit of a stereo system. As ARPANET directors J. C. R. Licklider and Robert W. Taylor will later put it, such theory is based on the unsatisfactory assumption that communication is simply about two people now knowing what only one knew before (the name of a brand; the central message of a political speech). 16 The tactics of amplification, the attempt to control the scene of communication by sheer power, might backfire because they do not take sufficiently into account the powers of feedback or retro-action – increasingly cynical or even angry audience/receivers, or just a kind of social entropy that nonlinearizes the transmission of messages as such.

In this sense, the critique made by Gilbert Simondon of the technical theory of communication (as he called the work of telecoms engineers) opens up an interesting perspective on the dynamics of communication beyond meaning but also beyond the operational demands of the channel. For Simondon, the mathematical theory of communication underestimated information by reducing it to what is transmitted between two distinct and individuated extremities: a sender and a receiver. The relation of communication for Simondon does not take place between two preconstituted individuals (such as a politician and his audience, for example), but between the pre-individual (what within the formed individual resists individuation) and the collective (the dimension within which another type of individuation takes place).
Both the sender and the receiver, the politician or his director of communication and their audience, are in fact immersed within a larger field of interactions that packs within itself a constitutive potential that the mathematical theory of communication does not capture. All communication always involves a metastable milieu, characterized by an incompatibility among different dimensions of a pre-individual and collective being. Information is thus not so much the content of communication as one of its dimensions, and more specifically it indicates the direction of a dynamic transformation. For Simondon, information theorists underestimated the conditions of turbulence and metastability that define information as a kind of active line marking a quantic process of individuation. 17 What this comes down to, in relation to our understanding of the cultural politics of information, is that the act of establishing a contact might not be reducible to a kind of


informational command – where the ultimate target is the control of what audiences think and feel. On the contrary, the informational dimension of communication seems to imply an unfolding process of material constitution that neither the liberal ethics of journalism nor the cynicism of public relations officers really address.

For example, in some ways the informational dimension of communication seems to implicate a production of reality in a way that does not only involve our capacity to signify – that is, to know the world through a system of signs. In as much as information concerns the problem of form, it also poses the question of the organization of perception and the production of bodily habits, which it foregrounds with relation to the emergence of social meanings. Within design and architecture, for example, information is also about the active transformation of bodily habits as this takes place around keyboards and chairs, games, trains and cars, buildings and small objects with which we perform all kinds of daily actions. Information is not about brainwashing as a form of media effect, but it does also involve a level of distracted perception; it thus informs habits and percepts and regulates the speed of a body by plugging it into a field of action. In this sense, the informational dimension of communication is not just about the successful delivery of a coded signal but also about contact and tactility, about architecture and design implying a dynamic modulation of material and social energies. Information works with forms of distracted perception by modulating the organization of a physical environment. 18 This active power of information is everywhere: it is in the interfaces that relay machines to machines and machines to humans; it is in material objects including chairs, cars, keyboards, and musical instruments. It is in bottles and telephones in as much as they lend themselves in a particular way to the action of a hand.
It is not an essence, understood here as a transcendent form, but it indicates the material organization of a possible action that moulds and remoulds the social field. The return of communication to its minimum conditions makes the whole field of culture and society (not simply the media) open to informational redesign and hence to the action of a code. A cultural politics of information, as it lives through and addresses the centrality of information transmission, processing and communication techniques, opens up a heightened awareness of the importance of minute and apparently inconsequential decisions as they are implemented in architecture and design, on television and the Internet, in medical research and news-making, in personal


relationships and working practices. In this sense, the relation of signal to noise with which we have opened our understanding of information, and which dominates the perspective of the social engineering of communication, does not exhaust the informational dimension of culture as such. As we shall see, in fact, this dimension is concerned not only with the successful transmission of messages, but also with the overall constitution of fields of possibilities – of alternatives and potentials that are transforming the problem of representation as cultural theory has come to think of it.

THE LIMITS OF POSSIBILITY

Suppose we have a set of possible events whose probabilities of occurrence are p1, p2, …, pn. These probabilities are known but that is all we know concerning which event will occur. Can we find a measure of how much 'choice' is involved in the selection of the event or of how uncertain we are of the outcome? 19

Proposition II: The transmission of information implies the communication and exclusion of probable alternatives.

Corollary II: Informational cultures challenge the coincidence of the real with the possible.

Information theory understands all codes as operating within statistical constraints that make the succession of symbols more or less likely (which allows the maximization of information transmission through a channel). Another key implication, however, is that all events (from the occurrence of a symbol within a code to all communication acts) can be described as a selection among a set of mutually excluding and more or less probable alternatives. The communication of information thus implies the reduction of material processes to a closed system defined by the relation between the actual selection (the real) and the field of probabilities that it defines (the statistically probable). The relation between the real and the probable, however, also evokes the spectre of the improbable, the fluctuation and hence the virtual.
As such, a cultural politics of information somehow resists the confinement of social change to a closed set of mutually excluding and predetermined alternatives; and deploys an active engagement with the transformative potential of the virtual (that which is beyond measure).

The breakthrough that gave Shannon's paper such relevance outside the circles of telecom engineers was a little parenthesis that he opens up in the middle of his paper on the mathematical theory


of communication, entitled 'Choice, uncertainty, entropy'. As he somewhat self-deprecatorily explained,

This theorem, and the assumptions required for its proof, are in no way necessary for the present theory. It is given chiefly to lend a certain plausibility to some of our later definitions. The real justification of these definitions, however, will reside in their implications. 20

In this section, he identified information as a measure of the probabilities of occurrence of an event (including the choice of one symbol over another within a code) – and hence a single selection among possible states. This measure was provided for him by statistical mechanics – a discipline that tackled thermodynamic processes through the tools of statistics. As we will see, the nineteenth-century physicist Ludwig Boltzmann identified the entropy of a closed system (that is, the tendency of such a system to lose structure while also running out of useful energy) with an uncertainty in our knowledge of the system. For thermodynamics, all irreversible processes involve the interaction of a large number of particles that can be distributed in a variable number of states. Because of the numbers involved, we cannot really know such systems in detail. All we can have is a statistical description that allows us to calculate the probabilities of occurrence of events. Not an absolute and determinate description (as in classical physics), but a probabilistic evaluation of the states in which a system might be.

Shannon's breakthrough was to apply such a statistical understanding of entropy to the theory of messages or information theory. 'Entropy is a probability distribution, assigning various probabilities to a set of possible messages. But entropy is also a measure of what the person receiving the message does not know about it before it arrives.' 21 Shannon applied this statistical understanding of sets of messages to the problem of 'code'.
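Shannon's measure of 'choice', quoted at the opening of this section, can be computed in a few lines. A minimal sketch (the example distributions are illustrative, not Shannon's own):

```python
from math import log2

def entropy(probs):
    """Shannon's H = -sum(p_i * log2(p_i)), in bits: a measure of how much
    'choice' is involved in selecting one event from a set of alternatives."""
    return sum(-p * log2(p) for p in probs if p > 0)

print(entropy([1.0]))        # 0.0: a certain outcome involves no choice
print(entropy([0.5, 0.5]))   # 1.0: one binary decision between equal alternatives
print(entropy([0.9, 0.1]))   # ~0.47: a biased, partly redundant source
```

Note how the measure tracks uncertainty rather than meaning: any two equiprobable messages score the same, however meaningful or nonsensical either may be.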
In particular, he looked at language as a code and observed that codes such as the English language obeyed definite statistical laws that determined the likely frequency of any combination of letters (there is a higher probability that c will be followed by h than by z, for example). The English language was thus defined as a code that involved approximately 50 per cent individual freedom in the choice of symbols and 50 per cent necessity as established by the statistical laws of the code. By


mathematizing the relationship between redundancy and freedom in a code such as English, one could devise some means to encode the message more effectively. The 'choice' of the individual speaker is constrained by the statistical laws of language.

Warren Weaver, among others, drew from such theory a key consequence. Such a definition of information implied that the action of a code on a situation was also a kind of containment of the openness of the situation to a set of mutually excluding alternatives. The code predetermines, and it does so statistically. 'The concept of information applies not to individual messages (as the concept of meaning would), but rather to the situation as a whole … Choices are, at least from the point of view of the communication system, governed by probabilities …' 22 For Norbert Wiener, 'the transmission of information is impossible save as a transmission of alternatives. If only one contingency is to be transmitted, then it may be sent most efficiently and with the least trouble by sending no message at all.' Hence, it is convenient, for transmission and storage purposes, to consider a unit amount of information as 'a single decision between equally probable alternatives'. 23 Each transmitted unit of information is thus a selection (hence Shannon's theory is also sometimes referred to as the selective theory of information).

Let us look at a concrete example of such a selective and statistical conception of information as given by another cybernetician, W. Ross Ashby. 24 A man has committed a crime and been arrested. He does not know whether his partner in crime has been arrested or not. His wife must communicate to him an essential piece of missing information (whether or not his confederate has been caught by the police).
The only communication allowed between her and her husband is a cup of tea (in this case the channel); she will either put sugar in the tea or not, depending on whether the confederate has been caught or not (this is her code); the jailer can also be expected to try to interfere in the communication if he can (he is the noise). There is here a priori information (the crime–jail scenario); a degree of uncertainty in this system (as determined by two probable states: the confederate has either been caught or not); a sender and a receiver (the wife and husband); a channel (cup of tea); a code (presence/absence of sugar); an interference (the jailer).

This situation and its uncertainty can thus be measured on the basis of a statistical distribution of probabilities. In this case, there are two alternative or probable states given to the information source (or wife): the confederate has either been caught or not. The field


constituted by the husband, the wife, the prison, the prison guards, the cup of tea, the confederate, and the police gives rise to a closed set of messages. The confederate might have got away with it or might have been caught; hence the uncertainty of the situation can be expressed through a simple binary code (yes or no), that is one bit (or binary digit). However, the information source or sender (the wife) is limited in the information that she can send by the channel (there is not much information that you can communicate through a cup of tea). Because the only transaction that is allowed between herself and her husband is a cup of tea, the latter is the channel, and the capacity of such a channel will put constraints on the coding of the uncertainty of the situation. For example, they might agree that sweetened tea would be a yes and unsweetened would be a no.

Of course, communication also includes the possibility of a corruption of the message in transit by noise. The jailer might have figured out that the tea can be used as a means of communication and he might interfere by telling the prisoner that he had sweetened it himself. If the wife wanted to make sure that the message got through, she would thus need some way of inserting some redundancy into the code, thus doubling the probability of the information surviving the noise (she might have agreed also to add or not to add milk, for example). Although the code might be different, both cases are equally likely, so the amount of information that needs to be transmitted is ultimately low (a choice between two possibilities expressible by way of a single bit).
What the example illustrates is the principle that, as Warren Weaver put it, 'the unit information indicates that in this situation one has an amount of freedom of choice, in selecting a message, which is convenient to regard as a standard unit amount.' 25 The measure of the information that is produced when the message is chosen from the set is the amount that the woman can communicate to her jailed husband. This amount of information can be reduced to the logarithm of the probabilities of the situation and thus be prepared for communication through a suitable channel. The value of information theory for cybernetics, according to Ross Ashby, lies exactly in its representing a given situation as a set of mutually excluding alternatives. It does not ask what individual answer it can produce, but 'what are all the possible behaviours that it can produce' and how likely one behaviour is when compared to another. The value of information theory is that it deals with such sets of probabilities.
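Ashby's tea scenario reduces neatly to such a set of probabilities. A brief numerical sketch (the state labels and dictionary encoding are illustrative assumptions, not Ashby's notation): the two equiprobable states of the confederate amount to exactly one bit, which is all the sweetened or unsweetened cup needs to carry.

```python
from math import log2

# Two equally probable states of the world, known to the sender
# (the wife) but not to the receiver (the husband).
states = {"caught": 0.5, "not caught": 0.5}

# Uncertainty of the situation, in bits: -sum(p * log2(p)).
uncertainty = sum(-p * log2(p) for p in states.values())
print(uncertainty)  # 1.0: a single binary decision, i.e. one bit

# The code maps each state onto a symbol the channel (the cup of tea)
# can carry; milk added or withheld would redundantly encode the same bit.
code = {"caught": "sugar", "not caught": "no sugar"}
```

The closed set of states is what makes the measure possible: enlarge the set of alternatives, or change their probabilities, and the quantity of information the situation can yield changes with it.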


This example challenges us to think about information in ways that are markedly different from the commonsensical ways in which we have come to regard it – not simply as content but also as a kind of representation of a reality out there. Communication theory explicitly states that information involves the reduction of both a set of messages and a milieu of interaction to their statistical properties. Information thus operates as a form of probabilistic containment and resolution of the instability, uncertainty and virtuality of a process. It is thus implicated in a process by which alternatives are reduced and the uncertainty of processes is prepared for codification by a channel. Uncertainty can be measured and solved by applying a set of constraints to a situation that unfolds into a binary mode of 'either/or'. 'The transmission of information is impossible save as a transmission of alternatives.' 26 Within the mathematical theory of communication, information represents an uncertain and probabilistic milieu by reducing it to sets of alternatives that determine more or less likely sets of possibilities on the basis of a given distribution of probabilities as determined by the relation between channel and code.

What the communication of information implies, then, is not so much a relation between the 'real' and its 'copy' (or its representation), but the reduction of a process to a set of probabilities. It still holds true, that is, that information does not only address the dimension of interpretation or meaning (even though it also carries meaning and it is also subject to interpretation).
But this operation of signification is secondary with relation to a primary operation which is that of the reduction of a situation to a set of more or less probable states and alternatives as constrained by the interplay between a channel and a code; and the reduction of communication to the resolution of such uncertainties through the selection of one of the alternatives from the set (this selection does not necessarily involve a human subject, but can be spontaneously generated). In this sense, for many critics of information and communication theory, the latter are almost exclusively modes of power involved in the reproduction of a system. 27 The communication of information related, for example, to a new deal between a government and a trade union, adds to our knowledge of the situation only in as much as that situation has been reduced to a set of possible outcomes (deal/no deal; strike/negotiations) that can be easily encoded within the medium of the news. The communication of information about a possible war similarly reduces the complexity of a situation to a set of pre-closed


alternatives. Nothing new is really added, only some (im)probable alternatives eliminated (such as other modes of knowledge or methods of analysis).

The transmission of information concerns alternatives formulated on the basis of known probabilities within the constraints set up by the interplay of code and channel or medium. That is why the most effective and concise modality of information transmission today is that of the opinion poll, the survey, risk assessment and all other types of information that can be easily encoded for survival in the meta-medium of an informational milieu. What is the probability that I will develop a fatal disease if I keep smoking? How is the popularity of the government doing today? How many points did the Dow Jones lose today and what are the chances that it will go up? Is it by chance that there is a whole sector of the financial markets, such as that of futures, that is based on a kind of legal gamble on the probable future? Whether it is marketing research, polls-informed public policy, or medical decisions, the transmission of information involves the action of a code and a channel setting the limits within which the problem can be presented and mapping out sets of possible alternatives. The political technology of information societies is crucially concerned with the organization of the field of the probable or the likely. It thus produces a sensibility to social change (and forms of subjectivity) that are informed by the relation between the real and the possible – where the real is what remains while all other competing possibilities are excluded.

Once again what we are presented with here is not simply the effect of a technological organization of communication, but a set of relays between the technical and the social.
The closure of the horizon of radical transformations that is implied in the probabilistic nature of information and the code is not simply the effect of information and communication technologies. On the contrary, it is once again a matter of techniques and impersonal strategies as they distribute themselves on the macroscopic consensus about the ultimate triumph of the existent. A cultural politics of information thus also implies a renewed and intense struggle around the definition of the limits and alternatives that identify the potential for change and transformation. The cultural politics of information, as it unfolds across the distributed networks of communication, often involves a direct questioning of the codes and channels that generate the distribution of probabilities – that is the production of alternatives as such. It is exactly because all information assumes the


constitution of a closed field of possibilities that the cultural politics of information is often centrally directed to constraints and 'lack of choice' as is. A cultural politics of information is crucially concerned with questioning the relationship between the probable, the possible and the real. It involves the opening up of the virtuality of the world by positing not simply different, but radically other codes and channels for expressing and giving expression to an undetermined potential for change.

Even as mediated by the space of statistical probability, in fact, the relationship between the real and the probable that is enacted within the informational dimension of communication does not ontologically exclude the possibility of the extremely improbable (or of the virtual). As Marco d'Eramo has put it, the probability of a system's being in a certain state is not a property of its being. Probabilities do not exclude the possibility of a fluctuation that violates the organized space of the real and the possible.

If we say that water boils at 100 degrees Celsius, we are really saying something else: that, at 100 degrees in a pot, water has a very high chance of boiling, but, at the same time, there is a possibility that at 100 degrees water freezes. It is an infinitesimal possibility (we can calculate it), but it exists. 28
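d'Eramo's point that the improbable is calculable can be illustrated with a standard statistical-mechanics toy case (my example, not his): the probability that all n molecules of a gas are found, at a given instant, in the left half of their container.

```python
from fractions import Fraction

def all_in_left_half(n_molecules):
    """Each molecule is independently in either half with probability
    1/2, so the fluctuation has probability (1/2)**n: calculable,
    vanishingly small, but never zero."""
    return Fraction(1, 2) ** n_molecules

print(all_in_left_half(10))          # 1/1024: rare but observable
print(float(all_in_left_half(100)))  # around 8e-31
```

The probability is never ontologically excluded; it merely falls off so fast with n that for macroscopic collections the fluctuation is, in d'Eramo's sense, infinitesimal.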
Information expresses the determination of probability, but it does not exclude beforehand the occurrence of the extremely unlikely. It is because communication, as a political technique, attempts to enclose an informational milieu around the informational couple 'actual/probable' that it also opens up another space – that of the fluctuations that produce the unpredictable, of the inventions that break the space of possibility, of the choices that are no choices at all but a kind of quantum jump onto another plane.

This is why the cultural politics of information can be said to bypass the relationship between the real and the possible to open up the relation between the real and the virtual – beyond the metaphysics of truth and appearance of the utopian imagination informing the revolutionary ideals of modernity. What lies beyond the possible, in fact, is not a utopian time and space to be realized against the harsh alienation of the present. This improbability that can only be predicted with the benefit of hindsight can be made to correspond to the category of the virtual – as it is formulated in the work of Henri Bergson, Gilles Deleuze and more recently Brian Massumi and


Pierre Levy. 29 The virtualization of a process involves opening up a real understood as devoid of transformative potential to the action of forces that exceed it from all sides. In an informational sense, the virtual appears as the site not only of the improbable, but of the openness of biophysical (but also socio-cultural) processes to the irruption of the unlikely and the inventive.

What lies beyond the possible and the real is thus the openness of the virtual, of the invention and the fluctuation, of what cannot be planned or even thought in advance, of what has no real permanence but only reverberations. Unlike the probable, the virtual can only irrupt and then recede, leaving only traces behind it, but traces that are virtually able to regenerate a reality gangrened by its reduction to a closed set of possibilities. Whether it is about the flash-like appearance and disappearance of the electronic commons (as in the early Internet), or the irruption in a given economic sector of a new technology able to unravel and disrupt its established organization of production (as in the current explosion of file-sharing systems), or whether it is about the virtuality of another world perceived during a mass demonstration or a workshop or a camp, the cultural politics of information involves a stab at the fabric of possibility, an undoing of the coincidence of the real with the given. In this sense, if we can talk about a cultural politics of information at all it is not because of new technologies, but because it is the reduction of the space of communication to a space of limited and hardly effectual alternatives (as in the postmodern sign) that poses the problem of the unlikely and the unthinkable as such. The cultural politics of information is no radical alternative that springs out of a negativity to confront a monolithic social technology of power.
It is rather a positive feedback effect of informational cultures as such.

NONLINEARITY AND REPRESENTATION

From our previous discussion of entropy as a measure of uncertainty it seems reasonable to use the conditional entropy of the message, knowing the received signal, as a measure of this missing information. 30

Proposition III: Information implies a nonlinear relation between the micro and the macro.


Corollary III: Within informational cultures, the centrality of the couple difference/position within a closed dialectics is displaced by that of mutation/movement within open systems.

Because information theory draws its theoretical underpinnings from thermodynamics and statistical mechanics, it understands material processes as implying a nonlinear relation between macrostates (such as averages, but also identities, subjectivities, societies and cultures) and microstates (the multiplicity of particles and interactions that underlie macrostates in as much as they also involve irreversible processes). This has a double consequence for our understanding of the cultural politics of information: on the one hand, it implies a shift away from representation to modulation which emphasizes the power of the mutating and divergent; on the other hand, it locates informational dynamics outside the perspectival and three-dimensional space of modernity and within an immersive, multidimensional and transformative topology.

For Jérôme Segal, we cannot really speak of a unified theory of information until the late 1940s, but we can definitely see how the preliminary labour started in fields such as statistics, physics and telecommunications at least since the 1920s. The question of information was posed first of all in the context of statistics of 'populations'. The question that the statistical theory of information addressed was that of

the scientific reduction of a mass of data to a relatively small number of quantities which must correctly represent this mass, or, in other words, must contain the largest possible part of the totality of relevant information contained in the original data. 31
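The statistical reduction described in this quotation can be sketched in a few lines (an illustrative sketch with invented data, not an example from the text): a mass of individual observations is compressed into a handful of summary quantities.

```python
import statistics

# A "mass of data": individual observations in a population.
observations = [182, 175, 169, 190, 177, 181, 174, 186, 171, 179]

# The statistical reduction: a few quantities meant to retain the
# largest possible part of the relevant information in the data.
summary = {
    "n": len(observations),
    "mean": statistics.mean(observations),
    "stdev": round(statistics.stdev(observations), 2),
}
print(summary)
```

Note that many different lists of observations reduce to the same summary: the macro-description underdetermines the microstates it stands for.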
The mathematical tools through which this reduction was made possible were derived from the field of social physics as inaugurated in the mid nineteenth century by the Belgian astronomer Adolphe Quetelet (the inventor of the average man in society, a compiler of mortality and criminality tables and also the author of a statistical study on the 'propensity to suicide', which later came to provide the foundations of Emile Durkheim's famous sociological study). The modern theory of probability, however, had started as early as the mid seventeenth century, when a long-standing problem in games of dice was subjected to mathematical treatment. 32

The statistical tools of probability theory had found a use in physics as well at least since James Clerk Maxwell started treating


kinetic systems such as gases as 'collections of tiny particles rushing about at varying speeds and colliding with each other … Since it is impossible to establish the exact speed of each particle, Maxwell treated the whole collection of particles statistically.' 33 At the end of the nineteenth century, Ludwig Boltzmann had established that since human beings could not know and should not be interested in the specific behaviour of each individual molecule at a particular moment, they could at least know how vast collections of particles behaved on average. As the system becomes more disorderly and temperature differences are lost, its entropy (the amount of 'energy unavailable for work') increases and even the limited knowledge allowed for by the average disappears: 'when the system is in a high state of entropy then it is improbable that [such parts] will be found in any special arrangement at a particular time'. 34 In a state of high entropy, both the randomness and the uncertainty with regard to the state of a system are at their maxima.

The entropy of a system thus corresponds to an uncertainty in our knowledge of it. Boltzmann's theorem identified a function H which measured the difference between the distribution of probabilities at any given time and those that exist at an equilibrium state of maximum entropy. As entropy increases and the system becomes more disorganized, the value of the function H would decrease and so would our knowledge of the probable state of any particle within the system. Shannon determined that Boltzmann's H-theorem also worked as a way to measure information.

Shannon repeatedly remarked how his theory of information was only concerned with the problem of communication engineering (and specifically the problem of the relation of noise to signal within a channel).
And yet, the mathematical link that it established between information and entropy caused information theory to become the basis for the reunification of knowledge so much yearned for by twentieth-century science. The task of developing and expanding on the consequences of a quantitative definition of information fell to theorists such as Norbert Wiener (who published his bestselling book on cybernetics in the same year that Shannon's book was published and who, on the basis of seniority and prestige, claimed for himself priority over Shannon's work), Warren Weaver (director of the Rockefeller Foundation and author of a key explanatory essay on Shannon's theory) and later physicist Louis Brillouin, the controversial author of several texts on the relationship between information theory and science (such as Science and Information


Theory, in 1956; and Scientific Uncertainty and Information, in 1964). The first symposium on information theory took place in London in 1950, but it was the Macy conferences on cybernetics that really focused the scientific debate around information.

It is difficult to overestimate the resonance that the link between entropy and information had in the mid-twentieth-century scientific environment. Nineteenth-century thermodynamics identified through entropy a principle of irreversibility in physical processes, and more specifically a tendency of life to run out of differences and hence of available energy in its drive towards death. By identifying an equivalence between information and entropy, Shannon's work threw a bridge between the twentieth-century sciences of cybernetics and quantum theory and the nineteenth-century interest in heat engines, energy, irreversibility and death.

The link between information and entropy also referred back to a thinking experiment that had troubled physicists since the mid nineteenth century, Maxwell's Demon. The question posed by Maxwell's Demon was whether it was possible to counteract the tendency of closed systems to run out of energy, whether, that is, it was possible to identify a physical capacity that ran against the stream of entropy. The experiment suggested that a fictional being with perfect knowledge of the state of each individual molecule in a gas could counteract the increase of entropy within a heat engine by sorting out hot from cool molecules. The idea that Maxwell's Demon was nothing other than an abstract informational entity and that information involved an expenditure of energy had already been suggested – and with it the notion that information played a key role in the struggle of living organisms against the entropic tides that threatened them with death.
As stated by quantum physicist Erwin Schrödinger in a key 1945 lecture, 'What is Life?', what needed to be explained for many physicists was not so much the physical tendency to dissipation that made all forms of life mortal, but its opposite. If the universe tended overall towards homogenization, life somehow expressed an upstream movement against the entropic tide. What is life if not negative entropy, a movement that runs against the second law of thermodynamics, whose existence is witnessed by the varieties of forms of life as they exist in the physical world? In asking a question that was to define the discourse of the life sciences for the next 50 years, Schrödinger argued that 'living organisms eat negative entropy' (that is, negentropy). Negentropic forces will thus be allocated a seat in the human organism – that of the macromolecule DNA, an


informational microstructure able to produce living organisms by inducing the chemical reactions leading to the conversion of energy into differentiated types of cells.

This notion that information was somehow related to anti-entropic or negentropic forces is at the basis of the informationalist perspective that identifies information with a kind of form determining the material unfolding of life. Echoes of informationalism are present in all statements that argue that informational genetic sequences determine not only skin and hair colour, but even our very actions and feelings. This interpretation of the relation between information and entropy is not confirmed, however, by most of the current work in genetic or molecular biology, where the DNA macromolecule is understood as a simple inductor within the complex environment of the cell. Rather than expressing a deterministic relation between informational structures such as the DNA and a biophysical phenomenon such as the organism, the informational trend emphasizes the nonlinear relationship between molecular or micro levels of organization and molar or macro layers. 35
Like thermodynamics and statistical mechanics, information theory suggests that a macro-state or a molar formation (such as an average temperature; or an organism; or an 'identity') does not have a linear or deterministic relation to the multiplicity of the microscopic states that define it (the singular particles and their velocities; the microscopic relations that make up an organism; the mutations and variations that underlie all identities).

In its technical and scientific sense, then, information implies a 'representation' of a physical state, but there is no assumed resemblance between the representation and the state that it describes. Within the statistical model proposed by Quetelet's social physics, for example, the 'average' or 'norm' is the representation of a macrostate to which can correspond a variety of microstates. An average might be the same for a number of different possibilities (an average height of 6 feet in a population of 100 people might be realized by many different distributions of possible heights). As a macrostate, the average does not really exist, but it is a kind of social norm, a strange attractor endowed with the function to regulate the social body and stabilize it. It is the centre of gravity to which 'all the phenomena of equilibrium and its movement refers'. 36 Like the mass society that in those same years was increasingly preoccupying conservative and radical critics alike, thermodynamics and statistical mechanics too were concerned with formations such as masses, quantities such


as averages and qualities such as homogeneity and heterogeneity. An average, however, can only adequately describe a low-entropy, highly structured system and its value as a descriptive measure is undermined in systems that are more fluid, hence more random and disorganized (such as the disorganized capitalism described by John Urry and Scott Lash, for example). 37 The state of a flow is always a function of the aggregate behaviour of a microscopical multiplicity, but as chaos theory showed, there is no linear and direct relation between the micro (the particles) and the macro (the overall flow dynamics). It is at the level of the micro, however, that mutations and divergences are engendered and it is therefore in the micro that the potential for change and even radical transformation lies.

This is why both cybernetics and the mathematical theory of communication involved a shift of representational strategies, such as a preference for the use of discrete quantities (such as digital code) over continuous ones. The difference is all in a relationship to microscopic levels of organization that are understood as inherently metastable, characterized by sudden and discontinuous variations that the use of continuous quantities cannot capture with sufficient precision. Norbert Wiener, for example, discussed the problems with the continuous representation of physical states in terms of its intrinsic inadequacy in relation to the microscopic instability of the matter–energy continuum.
For him, machines that represent the object by following and reproducing the variations in intensity of light, texture or sound on a material substrate always end up producing an unbridgeable gap between representation and reality, a gap which can only produce the dreaded interference of noise. For cyberneticians the discrete cut implied by a digital code made up for the approximation inherent in continuous or analogous quantities (which can only capture a static average rather than the instability of the micro). In a passage that could be read in conjunction with Jean Baudrillard's theory of simulation, Wiener describes the problematic relation between a continuous representational technique (in this case a slide rule, but we could also say a map or an 'identity') and the object represented (the territory, or the actual individuals and pre-individual or unstable dimensions that they contain). For Wiener, analogue machines, unlike digital machines, measure rather than count, and are therefore 'greatly limited in their precision', because they operate 'on the basis of analogous connection between the measured quantities and the numerical quantities supposed to represent them'. Wiener points out how digital machines, on the


other hand, offer 'great advantages for the most varied problems of communication and control…', in as much as 'the sharpness of the decision between "yes" and "no" permits them to accumulate information in such a way as to allow us to discriminate very small differences in very large numbers'. 38 If we compare a slide rule, for example, to a digital computer, we can clearly see how the accuracy of the former can only be approximate. The scale on which the marks have to be printed, the accuracy of our eyes, pose some very sharp limits to the precision with which the rule can be read. There is no point in trying to make the slide rule larger, because this increases the problems of accuracy. 39

Any attempt at using continuous quantities to measure physical phenomena is thus doomed by a material impossibility: the nature of our perception (defined phenomenologically as the power of human eyes), which is imprecise; and the rigidity of analogue machines in general (which can only produce averages and identities whilst screening out all micro-variations and mutations as irrelevant exceptions, and hence miss change). These factors combine to make analogue techniques ultimately too limited. Even if the map could become as large as the territory, it would still be too rigid and inaccurate. Thus Wiener suggests that numbers are the best way to capture an intrinsically unstable and unmeasurable matter. Numbers in this case stand for a principle of discontinuity and microvariations which another famous cybernetician, Gregory Bateson, opposed to the continuity of quantities.

Between two and three there is a jump. In the case of quantity, there is no such jump and because the jump is missing in the world of quantity it is impossible for any quantity to be exact. You can have exactly three tomatoes. You can never have exactly three gallons of water. Always quantity is approximate. 40
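Bateson's contrast between counting and measuring has a direct analogue in computing (my illustration, not his): integer counts are exact, while continuous quantities represented in floating point always carry an approximation.

```python
# Counting is exact: between 2 and 3 there is a jump.
tomatoes = 2 + 1
print(tomatoes == 3)  # True: exactly three tomatoes

# Measuring is approximate: a continuous quantity carries rounding
# error, so "three tenths of a gallon" is never exactly 0.3.
gallons = 0.1 + 0.1 + 0.1
print(gallons == 0.3)              # False
print(abs(gallons - 0.3) < 1e-9)   # True only within a tolerance
```

The jump between discrete values is what lets a digital machine accumulate sharp 'yes/no' decisions, exactly the advantage Wiener claims for digital over analogue machines.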
By extending the principle of counting to fractions and infinitesimal numbers, turning numbers into the infinite combinations of zeros and ones, digitization is able to produce exact and yet mobile snapshots of material processes. Such representations, however, are never either complete or exhaustive, because the relationship between the micro and the macro unfolds within a nonlinear mode. The result of this new closeness through numbers is a blurring: the closer you try to get to matter the faster your counting has to become in an attempt to catch up with the imperceptible speed of matter. Information theory


accepts the existence of an 'incomplete determinism, almost an irrationality in the world … a fundamental element of chance'. 41

The crossover of information and communication techniques from scientific speculation to market research, from theoretical physics to cultural politics is too complex to map out here. What is relevant to the current discussion, however, is that the rise of the concept of information has contributed to the development of new techniques for collecting and storing information that have simultaneously attacked and reinforced the macroscopic moulds of identity (the gender, race, class, nationality and sexuality axes). Thus, the cultural politics of information does not address so much the threat of 'disembodiment', or the disappearance of the body, but its microdissection and modulation, as it is split and decomposed into segments of variable and adjustable sizes (race, gender, sexual preferences; but also income, demographics, cultural preferences and interests). It is at this point that we can notice the convergence of the cultural politics of information with digital techniques of decomposition and recombination. For Pierre Levy, 'digitisation is the absolute of montage, montage affecting the tiniest fragments of a message, an indefinite and constantly renewed receptivity to the combination, fusion and replenishment of signs'. 42
It is not only the messages that are fragmented and constantly renewed and recombined, but also the receivers of these messages, in the form of bits of information archived and cross-referenced through a million databases.

The emergence of information as a concept, then, should also be related to the development of a set of techniques, including marketing strategies and techniques of communication management – as they attempt to capture the increasing randomness and volatility of culture. Already in the early 1990s, the marketing literature was describing the shift from new media to the Internet in terms of information-targeting strategies. The New Economy apologists, for example, famously postulated three stages of media power: broadcasting, narrowcasting and pointcasting. 43 The latter corresponded to a digital mode in which messages were not simply directed at groups but tailored to individuals and even sub-individual units (or as Gilles Deleuze called them, 'dividuals', what results from the decomposition of individuals into data clouds subject to automated integration and disintegration). These patterns identified by marketing models correspond to a process whereby the postmodern segmentation of the mass audiences is pursued to the point where it becomes a mobile, multiple and discontinuous microsegmentation. It is not simply


a matter of catering for the youth or for migrants or for wealthy entrepreneurs, but also that of disintegrating, so to speak, such youth/migrants/entrepreneurs into their microstatistical composition – aggregating and disaggregating them on the basis of ever-changing informational flows about their specific tastes and interests.

Information transmission and processing techniques, as exemplified in the technical machine that Lev Manovich considers to be the arch-model of the new media, the database, have helped to discriminate and exploit the smallest differences in tastes, timetables and orientations, bypassing altogether the self-evident, humanistic subject, going from masses to populations of sub-individualized units of information. At the same time, this decomposition has not simply affected the identical, but also the different. Gender, race and sexuality, the mantra of the cultural politics of difference in the 1980s and 1990s, have been reduced to recombinable elements, disassociated from their subjects and recomposed on a plane of modulation – a close sampling of the micromutations of the social, moving to the rhythm of market expansions and contractions.

In this sense, the foregrounding of informational flows across the socius also implies a crisis of representation (both linguistic and political). The statistical modulation of information is highly disruptive in its relation to representation because it undermines the perspectival and three-dimensional space which functions as a support for relations of mirrors and reflections as they engender subjects, identities and selves. 44 In other words, the logic of representation presupposes a homogeneous space where different subjects can recognize each other when they are different and hence also when they are identical.
This applies both at the level of linguistic representation (where I need to know what a man is in order to know what a woman is); but also at the level of political representation (as displayed in the allocation of positions across a political spectrum that is disposed from left to right as if facing an audience/public somehow always located at the centre). The analysis of the play of differences in representation within the self–other dialectics, in fact, has always implied the support of a space where the other is observed as from across a space. It is this empty space organized by a three-dimensional perspective that gives support to the psychic dynamics of identification, but also to the possibility of linguistic representation of the self and others as they are observed across such space. The space presupposed and engendered by an informational perspective expresses a radical challenge to representation – and hence also to


the cultural politics of identity and difference. It is not only that all identities and even differences are reconfigured as macrostates or averages which belie a much more fluid and mutating composition. It is also that the whole configuration of space within which such politics were conceived has undergone a shift of focus.

The divergence between a representational and an informational space is illustrated by recent developments in robotics and artificial intelligence – two fields of research for which the relationship between representation and information is crucial. Rodney Brooks has given us some vivid descriptions of early efforts to build intelligent machines (such as mobile robots) able to navigate effectively through space. Most early efforts in mobile robotics (or mobotics as it is also known) relied on a representational approach to cognition and movement. A robot was provided with sensors (such as cameras) able to scan a space for obstacles and directions. The informational stream collected by the robot was translated into a two- or three-dimensional map that the robot would then use to navigate the environment. This approach was ultimately unsuccessful because the information contained in the environment ultimately exceeded the robot's computational capacity. The robot just could not make a map that was accurate enough – it often missed the relevant factors or picked the wrong ones. 45

Brooks explains how the representational approach that assumed a relation between a three-dimensional space and a 'cognition box' able to provide a two- or three-dimensional map of reality which in its turn gave rise to an action ultimately failed. He sees this failure as an incapacity of such representation to keep up with the complexity and instability of an informational space (representation can only capture the macro-scale, but it misses the abundance of reality and its capacity for dynamic shifts).
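The alternative Brooks proposed to the map-and-plan pipeline can be caricatured in a few lines (a toy sketch; the sensor readings, threshold and actions are invented, and real controllers of this kind layer many such reflexes): instead of building and consulting a representation of the space, the current sensor reading is coupled directly to a motor response.

```python
def reactive_step(distance_ahead_m):
    """Direct sensor-to-motor coupling: no internal map is built or
    consulted, just an immediate reflex to the current reading."""
    return "turn" if distance_ahead_m < 0.5 else "forward"

# A simulated stream of range readings as the robot moves: each one
# is acted on as it arrives, however fast the environment changes.
readings = [2.0, 1.2, 0.4, 0.3, 1.5]
print([reactive_step(r) for r in readings])
# ['forward', 'forward', 'turn', 'turn', 'forward']
```

Because nothing is modelled, nothing can go stale: the environment serves as its own best representation, which is precisely what the map-building robot could not achieve.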
The robot was not immersing itself in the complexity of informational space, allowing its sensory organs to interact directly with the environment, but was, so to speak, keeping its distance from it in order to represent it. This distance was necessary to the completion of a three-dimensional map of the environment that it then used to navigate the space. (The visualization techniques used in this process laid the basis for developments in special effects and simulational training techniques). This operation was unbearably slow and it failed to deal even with a minimum alteration and the different levels of the environment. The mobotics approach, on the other hand, got rid of the cognition box, and instantiated a direct relationship between sensors and motors in


Three Propositions on Informational Cultures 37ways that allowed the robot to interact directly with its environment(substituting the cognition box with a simple memory device). Theloop between sensor and motor organs allowed a more direct anddynamic interaction with the huge informational flows generatedeven by the most simple environment. Space becomes informationalnot so much when it is computed by a machine, but when itpresents an excess of sensory data, a radical indeterminacy in ourknowledge, and a nonlinear temporality involving a multiplicity ofmutating variables and different intersecting levels of observationand interaction. Space, that is, does not really need computers to beinformational even as computers make us aware of the informationaldimension as such. An informational space is inherently immersive,excessive and dynamic: one cannot simply observe it, but becomesalmost unwittingly overpowered by it. It is not so much a threedimensional,perspectival space where subjects carry out actionsand relate to each other, but a field of displacements, mutationsand movements that do not support the actions of a subject, butdecompose it, recompose it and carry it along.An engagement with the technical and scientific genealogy of aconcept such as information, then, can be actively critical withoutdisacknowledging its power to give expression and visibility tosocial and physical processes. We are very aware of the linguisticand social constraints that overdetermine the formation of scientificknowledge, and yet we cannot deny it a dialogic relationship withnatural processes (as Ilya Prigogine and Isabelle Stengers have putit). 
As it informs and doubles into the social, the physical world that emerges out of this relationship is not passive, immutable, or even unknowable, but probabilistic, chaotic, indeterminate and open.

As I have described it, information is neither simply a physical domain nor a social construction, nor the content of a communication act, nor an immaterial entity set to take over the real, but a specific reorientation of forms of power and modes of resistance. On the one hand, it is about a resistance to informational forms of power as they involve techniques of manipulation and containment of the virtuality of the social; on the other hand, it implies a collective engagement with the potential of such informational flows as they displace culture and help us to see it as the site of a reinvention of life. In every case, this reinvention today cannot really avoid the challenge of informational milieus and topologies. In as much as the network topos seems to match and embrace the turbulent involutions of such microcultural dynamics, the informational dimension of communication involves the emergence of a network culture. It is to the informational topos of the network, then, that we will keep turning in order to catch this active constitution of informational cultures.


2

Network Dynamics

Machines, the reality constructed by capitalism, are not phantasms of modernity after which life can run unscathed – they are, on the contrary, the concrete forms according to which reality organizes itself, and the material connections within which subjectivity is produced. Ordo et connexio rerum idem est ac ordo et connexio idearum. 1

NETWORK TIME

In 1998, taking its hint from McLuhan’s notion of the global village, the Swatch corporation decided to introduce some standards into the chaotic tangle of Internet culture – a world where successive waves of global netsurfers would crowd chatrooms and online gaming sites, meeting and parting at the intersection of overlapping time zones, gathering as if they were passing down the torch of a sleepless, always up and on networked planet. If the Internet was unifying the globe through a common electronic space, then Swatch thought of itself as the most obvious candidate to provide the single time to match. In the corporate imagination, the new medium that had captured the time and attention span of a fickle and affluent global youth could be nothing else than an electronic metropolis, in dire need of some kind of time standard. The Swiss corporation thus launched a new global or Internet time, divided into ‘swatch beats’, each beat corresponding to a little more than one minute. Thus if ‘a New York-based web-surfer makes a date for a chat with a cyber-friend in Rome, they can simply agree to meet at an “@ time”, because Internet time is the same all over the world’. 2 Swatch time was a sleek attempt to link the transcendental globalization of the planet achieved by 1990s global consumer culture to a new type of globalization – grafting the power of the brand onto that of the internetwork.

As Geert Lovink recounts, Swatch’s Internet time was just one of at least three attempts to propose a ‘spaceless, virtual time standard, located within networks no longer referring to Greenwich mean time’.
3 For Lovink, these attempts demonstrated how ‘[t]he legacy


of our inherited 19th-century temporal model segmenting the planet into 24 separate time zones (and two simultaneous dates) increasingly no longer fits well with our nascent third-millennium global temporal perceptions’. 4 This idea of a global time corresponding to the global space of the Internet goes to the heart of the problem posed by the Internet and its relation to the world of locality – where the local is often made to coincide with the real, the heterogeneous and the embodied. This debate has recently come to overlap with an earlier perspective that considered computer networks mainly as expressions of dematerialization and disembodiment. The everyday use of the Internet, its implication in the ordinary work of learning, working and communicating, has done much to dismantle the notion of cyberspace as virtual reality. On the other hand, it is undeniable that the Internet has joined media such as television and cinema as one of the great accused in the trial about the virtual globalization (and related technocultural imperialism) of the planet. 5

As geographers have pointed out, one of the most fundamental aspects of communication lies in the ways in which it forms and deforms the fabric of space and time. Communication technologies do more than just link different localities. Pathways and roads, canals and railways, telegraphs and satellites modify the speed at which goods, ideas, micro-organisms, animals and people encounter and transform each other. They actively mould what they connect by creating new topological configurations, thus effectively contributing to the constitution of geopolitical entities such as cities and regions, or nations and empires. The rise of the nation and nationalism in the nineteenth century, for example, would have been unthinkable without the centralized pull of the railway system, the homogenizing embrace of national newspapers and the synchronizing power of national broadcasting corporations. Of late, the layered communication system modelled on the nation state has witnessed another mutation with the rise of global, real-time communication networks such as satellite television and computer networks. As should be expected, such a reconfiguration of the overall communication system is linked to the emergence of new geopolitical formations, and in particular it seems inextricably linked to the open and unbounded space of the post-cold-war global empire, as described in Michael Hardt and Antonio Negri’s homonymous book. 6

The communication topology of Empire is complex, as it is woven together by aeroplanes, freight ships, television, cinema, computers and telephony, but what all these different systems seem to have in


common is their convergence on the figure not simply of the network, but of a kind of hypernetwork, a meshwork potentially connecting every point to every other point. As such, the network is becoming less and less a description of a specific system, and more a catchword to describe the formation of a single and yet multidimensional information milieu – linked by the dynamics of information propagation and segmented by diverse modes and channels of circulation.

If the network topos does not and cannot be made to coincide with the Internet, the latter nevertheless expresses an interesting mutation of the network diagram in its relation to the cultural and political assemblages of this twenty-first-century neo-imperial formation. A brainchild of a US Defense research programme and, for a while, the spearhead of another US-led economic revolution, a global medium with a fast rate of diffusion in Third World countries, a global means of organization, the medium of the multitude, a market for technological innovations, a soapbox for opinionated individuals, a means of collective organization, a challenge to the regime of intellectual property, a new publishing platform, electronic agora, cyber-bazaar, sleaze factory and global junkyard – some would argue that the Internet can only be described in a piecemeal, empirical fashion, and in any case as an ‘unrepresentative’ medium in its relation to wider processes of globalization.

It is true that in terms of the actual power to capture the passions of the global masses, the Internet is no match for the reach and power of television, which, from local and national broadcasting channels to satellite TV such as CNN and Al-Jazeera, can count on the wider accessibility of the necessary technology (the TV set) and on the high impact of images and sounds broadcast in real time. Neither can we deny the minority status of Internet users on a global scale, considering that no medium can transcend the economic chasm widened by the neoliberal policies of the ‘Washington Consensus’ in the 1980s and 1990s. 7 If the Internet appears to us as such a key global communication technology, it is not because of overwhelming numbers or mass appeal (although it is true that it has witnessed an explosive global growth in just over a decade). It is rather because, unlike the other global communication technologies mentioned above, it has been conceived and has evolved as a network of networks, or an internetwork, a topological formation that presents some challenging insights into the dynamics underlying the formation of a global network culture. As a technical system, the Internet consists of a set of interrelated protocols, abstract technical diagrams that


give the network consistency beyond the rapidly changing hardware environment of computers, servers, cables and wires. Even though basic Internet protocols have changed over time, the philosophy that has informed their design, and hence the architecture of the Internet, has been consistent overall, informed by a few key principles which have, up until this moment, survived scaling (such as a universal address space, a layered and modular structure, the distributed movement of data packets and the interoperability of heterogeneous systems). Such principles imply a strong conception of an informational milieu as a dynamic topological formation, characterized by a tendency towards divergence and differentiation, posing the problem of compatibility and the production of a common space as an active effort involving an unstable or metastable milieu. In other words, beyond being a concrete assemblage of hardware and software, the internetwork is also an abstract technical diagram implying a very specific production of space. As we will see, what characterizes the technical diagram and design principles that have driven the development of the Internet is a tendency to understand space in terms of the biophysical properties of open systems. By modelling such an open network spatiality, the Internet becomes for us more than simply one medium among many: a kind of general figure for the processes driving the globalization of culture and communication at large.

OF GRIDS AND NETWORKS

The relation between the Internet and the production of space is, not by chance, crucial to all theoretical and analytical engagement with Internet culture. A feature of this engagement has been its insistence on such informational space as being somehow characterized by a dangerous distance from the world of the flesh and of physical spaces. If the early debate on information networks was dominated by the image of a Gibsonian cyberspace in which users would lose consciousness of the real world and lose themselves in a universe of abstract forms and disembodied perspectives, the contemporary debate has shifted onto the terrain of globalization. Where the most common image of cyberspace used to be that of a virtual-reality environment characterized by direct interface and full immersion (data gloves, goggles, embedded microchips and electrodes), now the image is that of a common space of information flows in which the political and cultural stakes of globalization are played out. The


debate on a transcendental cyberspace in opposition to the world of the flesh has developed its counterpart in a political discourse that opposed the homogeneous pull of the global to the heterogeneous world of locality.

For geographers such as Manuel Castells, for example, the network makes explicit the dynamics by which a globally connected elite is coming to dominate and control the lives of those who remain bound to the world of locality, thus reinforcing a ‘structural domination of the space of flows over the space of places’. 8 According to this perspective, in network societies the concrete time of places, bound to a specific mode of duration, is increasingly subsumed by the imperium of a single, electronic and global space accessible at the click of a mouse: ‘the edge of forever or timeless time’. Paul Virilio has argued for the opposite and specular case: information networks are annihilating space in favour of time (thus the Gulf Wars were global not because they happened in a global space, as did World War II, but because they happened in global time, the single time or ‘real time’ of global television and the Internet). If world history is marked by a constant acceleration (from the age of horses and carriages to that of bullet trains and intercontinental missiles), the emergence of global information networks marks a limit point, as if with global communication we had hit a wall and started a detonation. Thus the simultaneity of actions has taken precedence over the succession of events, and the world has been reduced to one unique time and space – ‘an accident without precedent’. 9 The time of the network is ‘real time’: everything happens simultaneously and thus fatalistically, with a kind of after-the-event sense of inevitability.

When we relate such allegations to the abstract technical diagrams that make an electronic space such as the Internet possible, we find that they seem to correspond to a specific aspect of its information architecture. To be locatable on the Internet, in fact, a machine/host/user needs to have an address, and this address needs to be unequivocally situated within a common address space. This ecumenical function (the function of creating a single space) is performed by the Internet Protocol (IP) and the Domain Name System (DNS). The Internet Protocol has undergone a number of changes over the years but its main function has not really changed: it is the code that assigns to each machine an individual number. The Domain Name System associates each number with a cell in a table and also gives it a name. The DNS is thus an ideal single spatial map of the Internet,


comprising a system of unique addresses that makes each IP-coded host and server locatable. Whenever we type an email address or a URL into the appropriate program, we are in effect referring to a specific address in this global, electronic map. This feature of the Internet design confirms the image of a distance between a world of information and a world of embodied and bounded locality.

Furthermore, this informational and electronic space, as it is constituted within this single map, appears as uncannily reminiscent of a modern dream of a completely homogeneous and controllable space. If we compare the Internet to a global city, with its addresses and neighbourhoods, its overall layout as expressed by the DNS database structure is hypermodernist. Its global electronic address space is structured like a grid of discrete locations – all of which, from the point of view of the system, have an equal probability of being accessed. In informational terms, that is, the Internet is in principle a highly entropic system (hence tendentially homogeneous), in as much as it can be entered at any point and each movement is in principle as likely as the next. In principle, that is, each Internet browser or file transfer protocol or email programme is structurally free to jump to any street and house number whatsoever (to continue our urban analogy). In order to limit the demands posed on the technical system by such high levels of entropic randomness and indetermination, the DNS protocol divides this single space through a limited number of top-level domains (.com, .org, .net, .edu, and the national domains, such as .uk, .au, etc.), enclosing it, so to speak, at the top. 10 Each domain is infinitely divisible: it is divided into a series of subdomains, and each subdomain in its turn is potentially composed of an infinite number of smaller addresses, neatly branching out from its umbrella to identify individual users or machines, from servers to personal computers to all kinds of communication devices. (There is a movement to extend the IP protocol to Internet-connectable electric appliances and objects such as toasters, fridges and clothes.)

At the same time, however, this abstract and homogeneous space of cells and grids is not completely devoid of any physical relation to locality. To this abstract space able to contain all possible addresses corresponds a concrete assemblage of technical machines, the DNS servers, which are arranged in a hierarchical structure. Thirteen root servers – ten of which are currently located in the USA, two in Europe and one in Asia – contain information about the next set of DNS machines, that is, the authoritative name servers. There are as many authoritative name servers as there are domains, and each one


of them contains information about all the machines in that domain; the same is true for subdomains, and so on. Thus, if the abstract Internet space is a grid in principle equally accessible from all points, in practice the speed and even, as we shall see, the trajectory by which we can actually get from A to B is determined by the relation and state of traffic between the servers – a relation that crucially includes the differential speeds of bandwidth and the ‘weighting’ of connections (where some nodes or cell-spaces assume centrality when compared to others). Finally, to the relatively centralized structure of the naming system corresponds a centralized governing body – a kind of global regulatory board. While the DNS was famously run for years, and single-handedly, by Internet pioneer Jon Postel, since his death it has been supervised by a non-profit organization, ICANN (Internet Corporation for Assigned Names and Numbers) – a corporation that has typically been the subject of heated controversies about accountability and democratic governance of the Internet. 11

Another way in which the abstract space of the grid is modified and differentiated is through its relation to the semantic domain of the name (and specifically the semiotic economy of the brand name). The identification of IP addresses with names has introduced into Internet space the symbolic capital of brands – and hence has determined another differentiation at the heart of the universal information space, that of electronic real estate. Following the opening up of the Internet to commercial organizations, for example, the struggles around domain names have witnessed some spectacular lawsuits, as corporations, speculators and activists looking for a fight rushed to get their hands on valuable names and addresses. 12 Within the gridded space of the DNS, the brand re-emerges as a star, a centre of gravity, an identifiable name that guides the netsurfer through the anonymous space of the IP number world.
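The hierarchical structure described above – root servers that know only the top-level domains, authoritative servers that know the hosts within each domain, and so on down through the subdomains – can be sketched as a toy lookup over a nested table. This is an illustration only: all names and numbers below are invented, and a real resolver queries live root and authoritative servers over the network rather than a local dictionary.

```python
# Toy model of hierarchical name resolution: each level of a name
# (top-level domain, domain, host) is one branch of a nested table.
# All names and IP numbers here are invented for illustration.

TOY_ROOT = {
    "org": {                       # a top-level domain
        "example": {               # a domain with its own 'authoritative' table
            "www": "192.0.2.10",   # host name -> IP number
            "mail": "192.0.2.25",
        },
    },
    "uk": {                        # a national top-level domain
        "ac": {
            "essex": "192.0.2.80",
        },
    },
}

def resolve(name: str) -> str:
    """Walk the name right to left (org -> example -> www),
    descending one level of the hierarchy per label."""
    node = TOY_ROOT
    for label in reversed(name.split(".")):
        if not isinstance(node, dict) or label not in node:
            raise KeyError(f"{name}: no such name in the toy tree")
        node = node[label]
    if isinstance(node, dict):
        raise KeyError(f"{name}: names a domain, not a host")
    return node

print(resolve("www.example.org"))  # -> 192.0.2.10
print(resolve("essex.ac.uk"))      # -> 192.0.2.80
```

The point is the topology rather than the code: resolution is a descent through a tree, which is how a handful of root tables can anchor an address space of millions of hosts, and why each domain can branch indefinitely into subdomains.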
The tangled and heterogeneous meshwork that constitutes the Internet is thus not simply reconciled within the hieratic indifference of a universal information space, but also subjected to heated and controversial political debates, expensive litigations and cultural struggles. The Domain Name System, then, is both single and universal, but also formed and deformed by locality. For Tim Berners-Lee, the legal disputes around names correspond to a friction between electronic space and local space, which is where the DNS, overall, can be said to exist.

Trademark law assigns corporate names and trademarks within the scope of the physical location of businesses and the markets


in which they sell. The trademark-law criterion of separation in location and market does not work for domain names, because the Internet crosses all geographic bounds and has no concept of market area, let alone one that matches the existing conventions in trademark laws. 13

And yet, beyond the distortions introduced in the realm of Internet domains by the injection of symbolic capital, we cannot deny that, at least in principle, the Internet is organized through the figure of the grid, and that this grid constitutes one of the most privileged references in theoretical understandings of electronic space. The grid is a fascinating figure, and one with a particularly strong resonance within social and cultural theory because of its strong association with the space of reason and modernity. The modernist grid, as defined by the intersection of two Cartesian axes, is a triumph of a mind able to extract a homogeneous and ordered space out of the ruggedness and heterogeneity of topological space. There is always something both utopian and dystopian about a grid. Whether it is a city plan, a prison layout or an accountant’s spreadsheet, the grid is a principle of division and order, making possible the counting and location of things. If the Internet is ultimately reducible to a modernist form such as the grid, then the main movement that traverses it and organizes it is the vectorial movement of a tele-command. 14 An electronic address does not simply indicate a location within cyberspace (I am @ anyplace) but also the possible movement of a direct line traced between two points. (You can find me @ anytime. This document is at www.anyplace.org; you can find it there whenever.) Information is divided and allocated a space, each node is assigned a unique number/name, and all information is instantly retrievable by way of a simple command line.
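The ‘simple command line’ has a precise technical counterpart: an address such as www.anyplace.org (the text’s own placeholder) decomposes into a name to be resolved and a single retrieval command to be issued. A minimal sketch in Python, which only composes the command without sending any request:

```python
from urllib.parse import urlsplit

def tele_command(url: str) -> str:
    """Decompose a URL into the single retrieval command that a client
    would address to the machine answering for that name. The request
    is only composed here, not sent."""
    parts = urlsplit(url)
    path = parts.path or "/"
    # One command line (method, location on the host, protocol version)
    # plus the Host header naming the addressed machine.
    return f"GET {path} HTTP/1.1\r\nHost: {parts.hostname}\r\n\r\n"

print(tele_command("http://www.anyplace.org/document.html"))
# GET /document.html HTTP/1.1
# Host: www.anyplace.org
```

Whatever the document and wherever the server, the shape of the operation is the same: name, location, command – a vector traced between two points on the grid.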
Information is uploaded and downloaded as in a kind of electronic warehouse, where new content is deposited and disposed of, deleted, updated, or simply left there to rot.

The connection between different locales on the grid is activated by the tele-command – by the click of a link activating the server’s call for a response from the corresponding machine. It is in this sense, as some have remarked, that the Internet might not be an immersive virtual reality as the cyberpunks imagined it, but an alternative space existing ‘at the edge of forever’, as Manuel Castells put it. Cyberspace exists in the omni-equal distance that lies at the end of a mouse click. Regardless of the semantic differentiation of the IP address system, regardless of the geopolitical distribution of servers, within such a


common informational plane a site in South Korea is ultimately within the same vectorial reach as a page in Rio. The whole planet feels as if it were compressed into the same virtual space just the other side of a computer screen, but it is as if such a space were ultimately a static one, absorbing and neutralizing all differences on a single plane of communication.

Within this understanding, the Internet is thus nothing more than an extended database, crossed by repeatable sequences of commands enabling the retrieval of documents located at different points on the planet. This chilling picture of a single information space, divided and distributed on a single grid containing all the possible addresses of all possible machines, underlies many of the more damning descriptions of the Internet and its relationship to the world of locality and embodiment. From this perspective, the single information space is an extension of a modern instrumental rationality driving towards the ultimate goal of the disappearance of the irreducibly heterogeneous in the homogeneous space of the global network. The Internet thus appears to give form to a space of connections without transformations, where vectors of communication link up different electronic spaces outside of any real possibility for becoming. But does the database structure really exhaust all aspects of network communication? Or does an over-reliance on the database model blind us to the more dynamic aspects of the Internet diagram and its relation to network culture as such?

THE PARADOX OF MOVEMENT

The debate about space and time in the age of communication is not necessarily limited to the Internet as such, but is a variation on the larger theme of cultural globalization. A communication technology such as the Internet participates in the emergence of a globalized culture, following and expressing the fractal folds of a spatiality that twists and knots together different scales of interaction – the local and the global, but also the regional and the national. In as much as the Internet is an informational diagram, form here should not be understood in the sense of a mould, imprinting its stamp on a world of locality already weakened by decades of global popular culture. The Internet informs a globalized planet by reproducing some of its most individuated and stable forms as well as its potential to diverge, to pass over into new formations through the combined power of the fluctuation and the mutation.


Physicists such as Duncan J. Watts and Albert-László Barabási, for example, have mapped the ‘small worlds’ of networks in terms of a relation between ‘structure’ and ‘dynamics’. 15 Within the same field, Steve Lawrence and Lee Giles at the NEC Research Institute in Princeton have produced a model of the Web based on the data brought back by a meta-search engine, or robot, about its size and topology. In this way, they have reconstructed the virtual geography of the World Wide Web by mapping the number of links that connect different web sites to each other. Replicating an action that search engines carry out all the time, algorithms have been let loose on the network to come back with a picture not only of how many pages and sites are actually out there on the Web, but also of the overall movement of information flows within the network. This approach downplays the links to locality (mapping the global distribution of Internet access) for an internal snapshot of the web world. The researchers thus looked not only for the number of pages and their location in the DNS grid (as a search engine bot would do), but also for the overall map drawn by the active movement of the link.

The result is a kind of parallel global map of an informational planet, produced on the basis of outgoing and incoming links, mapping the directed movement linking sites to sites. One such map pictured the informational space of the web-planet through the topology of continents, archipelagos and islands. 16 It mapped the gravitational pulls of portals and brands (at the heart of the core continents lie all the major websites – the likes of Yahoo, MSN, Google, CNN and the BBC – which collected the largest number of incoming links) and also the existence of peripheral information land masses, tied to a central core but also independent from it.
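The kind of census behind such maps can be imitated in miniature: tabulate the directed links between sites and read the ‘gravitational pull’ of each node off its count of incoming links. The toy graph below is invented for illustration; the actual studies worked from crawls of hundreds of millions of pages.

```python
from collections import Counter

# A toy directed web: each site maps to the sites it links out to.
# All site names are invented for illustration.
links = {
    "portal": ["news", "search", "shop"],
    "news": ["portal", "search"],
    "shop": ["portal"],
    "search": ["portal", "news"],
    "zine-a": ["zine-b"],   # a small archipelago: two sites
    "zine-b": ["zine-a"],   # linking only to each other
    "intranet": [],         # an info-island: nothing links in or out
}

# Count incoming links: the 'continents' are the most linked-to nodes,
# the islands those no crawl following links would ever reach.
incoming = Counter(target for targets in links.values() for target in targets)

for site in links:
    print(f"{site}: {incoming.get(site, 0)} in, {len(links[site])} out")
```

Even at this scale the topology of the maps discussed above is legible: ‘portal’ draws the most incoming links and plays the continent, the two zines form a closed archipelago, and ‘intranet’ stays off the radar.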
Beyond these massive continents signalling a centralization of Internet traffic, they pictured a sprinkling of small archipelagos made up of web sites that connect only to each other, and large info-islands which corresponded to Gibson’s Black Ice – the firewalls hiding the high-security intranetworks of military and financial institutions. At the same time, the researchers admitted that it was hard to claim that their map of web space was exhaustive, in as much as a great number of web sites appear to be off the radar. If the portals act as centripetal forces of attraction in an unstable and disorienting network space, producing the effect of an informational land mass, this does not exclude the existence of other movements of divergence and disconnection, which characterize, for example, the choice of some groups to communicate only with each other within a closed network of sites shielded from outside access by


obscure addresses or corporate firewalls. In this sense, the global appears as a site of accumulation of resources that manifests itself as a mass, which distorts the homogeneous informational milieu by exercising a kind of gravitational pull that draws other spatial scales (such as the national or the regional) to itself. Any interface with the medium, therefore, implies some kind of relation to such centripetal movement.

On the other hand, however, this centripetal and homogenizing pull of the global mass is not the only movement active within the Internet as an informational milieu. In this sense, we can draw a useful parallel with the debate on globalization. If a structural domination of the space of flows (the global) over that of places (the local) exists, together with attendant forms of cultural imperialism, it is one that does not deny the fluidity of places as such, their constitution as local reservoirs endowed with a productive capacity for difference. The study of global popular culture in the 1990s has gone some way towards mapping some of the features of this ‘virtual global’. When seen spatially, a global culture has often appeared as split between the opposing pulls of homogenizing (global) and heterogenizing (local) forces. The relationship between the opposing poles of the global and the local has been shown to produce all kinds of mutant cultural forms – ranging from familiar patterns of pseudo-individuation (the French McDonald’s as distinct from the American McDonald’s, as depicted in the memorable dialogue between John Travolta and Samuel L. Jackson in Pulp Fiction), to more complex nonlinear dynamics of mutual feedback (as in the relationship between the cinemas of Hong Kong and Hollywood). 17

If the local, in fact, were nothing but a reservoir of frozen differences; if the global were only the homogenizing pull of the likes of McDonald’s, Microsoft and Coca-Cola; if the Internet were nothing but an electronic grid or database where all locations lie flat and movement is mainly that of vectors of fixed length but variable position linking distant locations to a few centres – where would the potential for struggle and change, becoming and transformation come from? In the case of the Internet, for example, where would its dynamism come from? How can we reconcile the grid-like structure of electronic space with the dynamic features of the Internet, with the movements of information? How do we explain chain emails and listservs, weblogs and webrings, peer-to-peer networks and denial-of-service attacks? What about the rising clutter of information, the scams and the spam, the


endless petitions, the instantaneous diffusion of noise and gossip, the network as permanent instability? It is possible, that is, that by thinking of the Internet in terms of the grid we might have fallen into a classic metaphysical trap: that of reducing duration to movement, that is, of confusing time with space. 18

The notion that cyberspace is nothing more than the intersection of the grid and the vector reminds us of some classic paradoxes of movement – paradoxes that Henri Bergson referred to repeatedly in his dissection of Western metaphysics’ relation to duration. The Zeno paradox, for example, marked a high point of confrontation between the pre-Socratic philosophy of qualitative change and the Euclidean geometry of position. The challenge of the former to the latter was thrown down on the basis of the geometrical argument that between a point A and a point B lie an infinite number of points (A… B… C… D…). Zeno’s paradox was that of applying the geometrical method to motion: if an arrow has to pass through an infinity of points, how will it ever reach its target? How could Achilles catch up with a tortoise if, in order to do so, he has to go through an infinity of points (which, in Euclidean geometry, compose a line)? Won’t he be caught up in the infinite passage from point A to point B to point C and so on? Bergson’s reading of Zeno’s paradoxes is that they showed how the specificity of duration is unaccountable on the basis of the notion of an infinitely divisible space, a notion that deprives space of its qualitative dimension. Movement does not so much imply a simple passage between points as involve duration, that is, a qualitative becoming that affects the arrow, the target, the archer and the overall context alike.
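For reference, the purely quantitative side of the paradox is resolved by a convergent series: if the arrow must first cross half the distance, then half of the remainder, and so on, the infinitely many sub-distances still sum to a finite whole:

```latex
\sum_{n=1}^{\infty} \frac{1}{2^{n}} \;=\; \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots \;=\; 1
```

Bergson’s objection is untouched by this arithmetic: summing positions still treats movement as a set of points traversed, and says nothing about the duration in which the crossing is actually lived.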
Space is subdivided into discrete points only because the pragmatic orientation of our bodies in the world privileges space as a homogeneous container of objects and underestimates the fact that extension and duration are related within the process of becoming.

Bergson suggested that Western metaphysics (and hence also the popular metaphysics that gives rise to what we think of as 'common sense') is particularly troubled by the notion of an intensive space, a space that endures. Indeed Western metaphysics for Bergson has persistently misunderstood duration, almost as if it constituted a kind of unthinkable other. When we think about movement, Bergson argued, we make the common mistake of thinking of it as always the movement of an object through a space. We tend to think of something that moves as something that crosses a space that can be neatly assigned a position between a point of departure (A) and
a point of arrival (B) through a whole series of intermediate points in between. Such a conception of movement divides it, that is, into two abstract formations: on the one side a homogeneous time (the time, that is, that says that a plane leaves at 17:00 and arrives at 21:00); on the other side a homogeneous space (the space in which the plane can always be located at a definite longitude and latitude). The notion that the Internet annihilates the heterogeneity of times onto a single space can only make sense within such a metaphysical understanding of the space–time relation. If a command, such as that which calls up a web page in India or California, employs more or less the same time to reach any-destination-whatsoever, then there is no time at all. Only a single, hypercontracted, supergrid of a space. Bergson explained this understanding of movement as a line linking two positions by the necessary illusion through which our perception screens out the blurred complexity of duration in order to isolate definite objects that can be manipulated. Necessary as this reduction sometimes is, for Bergson it cannot produce a proper understanding of becoming. What is time, then, if it is not the seconds or hours or days that it takes to go from A to B? What is movement within electronic space if it is not the linear command of a client–server exchange, the instant flashing of a message, the click on a link, the immediate openness of everybody and everything to everybody and everything else?

For Bergson, by thinking of movement as a linear translation of an object through space we miss a fundamental element: the virtuality of duration, the qualitative change that every movement brings not only to that which moves, but also to the space that it moves in and to the whole into which that space necessarily opens up. A plane journey, for example, is not simply about bringing a big, metal, flying machine with a bunch of passengers from A to B. The plane's movement affects the space it moves in and modifies it. It transforms the chemical composition of the atmosphere. It affects the passengers and staff through a transformation or qualitative change in their relationship with what they have left while they wait to change what they are moving towards. A piece of information spreading throughout the open space of the network is not only a vector in search of a target, it is also a potential transformation of the space crossed that always leaves something behind – a new idea, a new affect (even an annoyance), a modification of the overall topology. Information is not simply transmitted from point A to point B: it propagates and by propagation it affects and modifies its milieu.
While Western metaphysics looks at movement as a function of a homogeneous, hence unchangeable space and time, Bergson suggests that we should understand movement always in relation to the open whole, that is the whole duration that it affects. Duration implies a qualitative transformation of space and space itself is nothing but an ongoing movement opening onto an unbounded whole.

Thus in a sense movement has two aspects. On the one hand that which happens between objects or parts; on the other hand that which expresses the duration or the whole… We can therefore say that movement relates the objects of a closed system to open duration, and duration to the objects of the system, which it forces to open up. 19

Although grounded in some aspects of its own technical structure, the notion that the Internet is just a new stage in the constitution of a global culture where distance and locality are annihilated does not do justice to such informational dynamics. At a practical level, for example, we might point out that the space of the grid is much more mobile and dynamic than it appears – with individual users using more than one machine, or disappearing and reappearing from the network at different times using different aliases or identities. At the same time the linkages established by the tele-command of electronic space, by the call-and-response mode of the distributed database, do not lead to a single time or space, but to a multiple duration where linkages constitute a fluid dynamic of connection and differentiation. However, even a partial modification of our description of the grid–vector model does not do justice to the dynamic capacity of network culture to renew and modify the medium – not simply by moving information around, but by deforming and differentiating the overall network milieu. If the Internet is a form linking the bounded with the unbounded, the local with the global, then it is a particularly dynamic one. To take the duration of the network seriously demands that our analysis should relate its most stratified and organized moments (the grid space of the DNS; the vectors of tele-command; the stellar power of the brand system) to other aspects of internetworking (packet switching; open architecture; network culture). It is not by chance that we find this conception of space as duration within two of the most basic and emblematic Internet design principles: open architecture and packet switching.
A TENDENCY TO DIFFER

… all innovative, creative systems are divergent, conversely, sequences of events that are predictable are, ipso facto, convergent. 20

If we managed the impossible task of freeze-framing the Internet for a second, we would be faced with a bewildering picture. Starting with the most popular Internet activity, emailing, we would undoubtedly be struck by the sheer magnitude of a traffic that eludes all measurement (estimates point to billions of messages exchanged per day). 21 A change of perspective would reveal how this popular Internet protocol entails a small galaxy of different modes – ranging from the exchanges of individual emails to the pack movement of spam to the mobilizing command of office mailing lists to the open spirals scattered throughout by discussion groups. Besides the movement of email packets, we would also undoubtedly be struck by the vast expanse of the World Wide Web, a veritable info-planet that computer models have reconstructed as entailing its own geography of a densely populated land mass (the portal phenomenon), surrounded by smaller continents, and little archipelagos of disconnected islands. 22 A close-up of the fringes would reveal staggering clusters of file-sharing, peer-to-peer networks and parallel computing. If we zoomed in to look at the details we would be able to see the electronic bulletin board systems and the social software zones – from dating agencies to community web logs and wikis. Finally, if we looked hard enough, we would probably be able to see the open-source and free software programming sites, spinning the web from within their relatively small but highly effective network enclaves. This synchronic slice of internetworking would highlight the layered and overlapping topologies of the single communication space. However it is with time, that is in its dynamic dimensions, that this topology reveals a larger picture of merging micro-waves of technological innovation that over relatively short spans of time have considerably enriched the social experience of electronic communication (from telnet, ftp, email, irc, muds to http, streaming, blogs, p2p, wi-fi, wikis, etc.).

What makes the Internet a challenging medium is not only the nature of its technological components but more generally the design principles that have informed its ongoing evolution. The Internet, in fact, is not just a global computer network, but a network of networks, the actualization of a set of design principles entailing the interoperability of heterogeneous information systems. Not only,
that is, is there no central control of the Internet (although there are many control centres), but the whole space of communication has been designed and conceived in terms of dynamic and variable relations between different communication networks. 23 The Internet was conceived from its inception as a heterogeneous network, able to accommodate in principle, if not in actuality, not only diverse communication systems, but also drifting and differentiating communication modes.

If we look at the architecture of the Internet as a turning-point within the history of communication, in fact, this turning-point does not simply coincide with the set-up of the first ARPANET connection in 1969 – traditionally held to be the birth of the Internet. The first dedicated computer network to span a number of institutions across the USA, ARPANET was certainly an important technological breakthrough, but it was still not the Internet. ARPANET carried through the consequences of a conceptual revolution in computing that shifted the emphasis from computers as calculating machines to computers as communication devices – at first by way of time sharing and subsequently through local and wide area networks.

Around the time when the first computer network was devised, the whole approach to computing was undergoing a singular revolution. The old notion of a computer as a task-oriented, mechanical device to which white-coated lab scientists would respectfully take their more complex calculations was giving way to a focus on interactivity within a cybernetic and informational perspective on communication. Among those scientists who were involved with research at DARPA, for example, we find J. C. R. Licklider, who advocated the notion of a 'man–computer' symbiosis that could both speed up some of the more routine aspects of scientific work and free up a new capacity for conceptual innovation. Others, like Douglas Engelbart, pioneered devices such as the mouse, time sharing and the visual display of information, intended to facilitate the collectivity of technical thinking. The ARPANET team tested and introduced packet switching – a technical innovation that allowed the maximum exploitation of bandwidth for the purposes of data communication; designed the system by thinking of network users as potential contributors to the system's development, thus paving the way for the endlessly mutating future Internet; publicized the results of the experiment outside the DARPA network in computer conferences and public journals; and trained PhD students who were later to make an
essential contribution to the emergence of business and grassroots networks.

In spite of the fact that ARPANET was the first computer network, however, it did not really include some of the key principles of network architecture that characterize the Internet today. Some people claim that the real 'birthday' of the Internet came fourteen years later, on 1 January 1983. This change is identified with the switch-over from NCP to TCP/IP, that is, from a closed network model that drew clear boundaries around the network and maintained a measure of centralized control to an open architecture model that was designed as intrinsically open to new additions. As an article in the net-magazine Telepolis argued,

[t]he change from one big packet switching network under the control of one administrative or political structure to an open architecture allowing for communication among dissimilar networks under diverse forms of political or administrative structures, is the change that has made it possible to have an international Internet today. 24

Open architecture was part of a more general effort by computer scientists to think anew the question of the organization of networks and electronic space. Introduced by Robert Kahn and Vinton Cerf at DARPA, open architecture networking assumes that individual networks may be separately designed and developed, using their own specific and unique interfaces to fit the user requirements and the environment in which they operate. Whatever the interface or scope of individual networks, whether small, local area networks or intercontinental, wide area ones, the design philosophy of open architecture dictates that they should all be equally allowed to connect to the internetwork and hence to each other by way of a system of gateways and routers directing traffic between them on a best-effort basis.
This process involves the design of common protocols that are meant to impose no internal change on the participating networks. Each network is thus assumed to be autonomous, that is, able to stand on its own even outside its Internet connection (thus if the Internet were to break down, individual networks would simply lose their connections to each other but internal use should still be possible). No internal changes or global control are thus required (except, as we have seen, for some coordination at the DNS level and open technical boards in charge of maintaining consistency of standards).
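The open-architecture principle described here can be sketched in a toy program (illustrative only: the network names, formats and functions are all invented, and this is not real networking code). Each network keeps its own native frame format; gateways exchange only a minimal common datagram, so joining the internetwork requires no internal change to any participating network, and delivery is best-effort.

```python
# Toy sketch of open architecture: autonomous networks with incompatible
# internal formats, connected by gateways that speak only a shared datagram.
from dataclasses import dataclass


@dataclass
class Datagram:
    """The common protocol: just addressing plus an opaque payload."""
    src: str
    dst: str
    payload: bytes


class Network:
    """An autonomous network with its own native frame format."""
    def __init__(self, name, decode):
        self.name = name
        self.decode = decode      # translates the common payload into native frames
        self.inbox = []

    def receive(self, datagram):
        # The payload is re-wrapped in the native format; internals are untouched.
        self.inbox.append(self.decode(datagram.payload))


def gateway(datagram, networks):
    """Best-effort forwarding: route to the destination network if known."""
    dst_net = datagram.dst.split(".")[0]
    if dst_net in networks:
        networks[dst_net].receive(datagram)   # delivered
        return True
    return False                              # silently dropped: best effort

# Two dissimilar networks: one speaks uppercase text, the other hex strings.
nets = {
    "ring": Network("ring", lambda b: b.decode().upper()),
    "ether": Network("ether", lambda b: b.hex()),
}

d = Datagram("ring.host1", "ether.host9", b"hello")
assert gateway(d, nets) is True
print(nets["ether"].inbox)   # ['68656c6c6f'] — hex, ether's native form
```

Note that the gateway never inspects or alters a network's internals; if the shared `Datagram` layer disappeared, each network would still work on its own, which is exactly the autonomy condition described above.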


The individual networks might even decide to keep part of their operations closed or to filter out most outside traffic. 25 The decision of individual networks to close the flow of information, however, would not affect the overall topology, which tends towards the production of a smooth, open and unbounded space.

The development of open architecture within computing stands in marked contrast to parallel developments in architecture of the bricks-and-mortar variety. While in the late 1970s urban architecture went through the postmodern moment of Venturi's Learning from Las Vegas, the still marginal field of information architecture introduced an alternative conception of spatial organization. If architects were moving away from the purist excesses of high modernism towards the heterogeneous pastiches of postmodernity, information architects and engineers were working out the blueprints of a similar and yet alternative problematic of space. Within open architecture, in fact, electronic space is not conceived as composed of different fragments, juxtaposed together in a pastiche mode, as in postmodern architecture. The different components accommodated by open architecture are not inert fragments of living or dead styles, but autonomous networks in continuous expansion and modification. Open architecture provided the field of computer networks with a common framework based on a pragmatic grasp of the inevitability of spatio-temporal differentiation. 26 As an RFC (Request for Comments) document of the Network Working Group put it, 'The [Internet's] architectural principles … aim to provide a framework for creating cooperation and standards, as a small "spanning set" of rules that generates a large, varied and evolving space of technology.' 27

In as much as there is no limit to the number of networks that open architecture can accommodate, the development of internetworking technologies is crucially concerned with modulating the relationship between differentiation and universality.
As Paul Baran demonstrated in his 1960s research on packet switching, centralized networks are extremely vulnerable both to technical failures and to targeted enemy attacks. Command functions need to be distributed, thus allowing a communication network to survive the destruction of a high percentage of its nodes. At the same time, such distribution of command functions, once applied to a system that is conceived as always potentially open to new additions, carries within itself a tendency to divergence and differentiation that in the absence of a coherent design strategy can easily lead to catastrophic transformations (such as the breakdown of the network into secluded territories). Removed from the central controlling gaze of a single centre, space tends not so much to fragment into individual cells, as to diverge, hybridizing itself around the peculiar features of different milieus and cultures. Decentralized and distributed networks, although intrinsically more robust and resilient than centralized ones, present the intrinsic problem of a tendency towards differentiation and drift that threatens to turn the open network into an archipelago of disconnected and isolated islands.

This tendency of decentralized networks to diverge to the point of disconnection is described by open architecture as a tendency towards the production of incompatibilities. Divergence brings with it the tendency towards disconnection and disconnection produces incompatibilities. The adaptability and flexibility inherent in the shift away from expensive mainframes towards microcomputers and eventually personal computers makes computer networks particularly liable to modifications and mutations, to specialized uses inherent in the multiplicity of contexts into which computing spreads.

The networking of computers, the emergence of different networking cultures in the 1970s and 1980s, only confirmed the intuition of open architecture: resilience needs decentralization; decentralization brings localization and autonomy; localization and autonomy produce differentiation and divergence. Within open architecture, such divergent movements are not supposed to disappear once and for all once the initial incompatibilities are overcome. Incompatibilities set up by the drift of divergent movements are the limits that open networks continuously generate and must also overcome. An open network should always be potentially extensible, and therefore should be structurally equipped to deal with irreconcilable tensions by leaping to a new level of generality that would thus allow such differences to connect within a common space.
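Baran's comparison of topologies can be sketched in a toy simulation (an illustration invented for this passage, not Baran's own model): a centralized star network loses everything with its hub, while a distributed mesh survives the same loss almost intact.

```python
# Toy sketch of Baran's point above: destroy one node in a centralized (star)
# network and in a distributed mesh, then measure what stays connected.
from collections import deque


def largest_component(nodes, edges):
    """Size of the biggest connected cluster, via breadth-first search."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in adj and b in adj:     # ignore edges touching destroyed nodes
            adj[a].add(b)
            adj[b].add(a)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        queue, comp = deque([start]), 0
        seen.add(start)
        while queue:
            n = queue.popleft()
            comp += 1
            for m in adj[n] - seen:
                seen.add(m)
                queue.append(m)
        best = max(best, comp)
    return best


N = 20
star_edges = [(0, i) for i in range(1, N)]                      # hub-and-spoke
ring_edges = [(i, (i + 1) % N) for i in range(N)]               # distributed ring
mesh_edges = ring_edges + [(i, (i + 5) % N) for i in range(N)]  # ring + shortcuts

# Destroy node 0 (the star's hub) in both topologies.
survivors = [n for n in range(N) if n != 0]
print(largest_component(survivors, star_edges))  # 1: the spokes are isolated
print(largest_component(survivors, mesh_edges))  # 19: the mesh stays connected
```

In the toy model the redundant shortcut links play the role of the bridging protocols discussed in this chapter: resilience comes from distribution, but only so long as something keeps the divergent parts reachable from one another.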
This level of generality must involve a structural openness able to accommodate the duration of the network at large.

A brief technical history of the development of internetworking, from ARPANET to the World Wide Web, clearly shows the recurrence of the problem of divergence and incompatibility. The ARPANET researchers, for example, have often insisted that the military needs for a resilient communication network were not the direct motivation behind the development of computer networks. Resilience was important, of course: the Internet was built to be robust. But, they say, this was not the main drive behind the development of internetworking technologies, not even behind the controversial adoption of packet switching. The main problem, they explain, was not a nuclear attack, but the tendency towards the production of incompatibilities:

The core problem of getting computers to communicate with each other is, by definition, one of compatibility. As the network grows bigger, incompatibilities must be overcome. As separate networks present the prospect of interconnection, compatibility hurdles arise. And as the pressure grows to connect all data resources together and make them universally accessible, the key technological obstacle is incompatibility. 28

This tendency of an open space to endure, that is to diverge and differentiate, is observable from the very beginning of computer networking. With the dissemination of the results of the ARPANET experiment into the larger community of computer scientists, new types of computer network started to spring up, in a relationship of active mimesis, or innovative copying of the ARPANET model. These networks were local adaptations to different institutional demands or to diverse grassroots cultures. The development of private networks, business-oriented, grassroots, and educational, coupled with the fact that use of ARPANET was barred for a long time to institutions without Department of Defense sponsorship, initiated a movement of divergence or disjunction. By the late 1980s, although all networks adopted the basic technology of packet switching, they were also using widely different protocols and systems: Local Area Networks (LANs) based on 3Com Ethernet technologies; the academic network of Unix machines; the office networks of personal computers that used Novell file-sharing technology; the anarchic and grassroots networks of bulletin board systems; the powerful Sun workstations and the Cisco routers.
To each of these networks corresponded different network architectures and cultures of use, but after the switch-over from the research network NSFNET to the commercial Internet in 1995 and the lifting of the ban on commercial uses, the principle of open architecture eventually allowed all these differences to connect, thus forming a single meshwork.

At a basic level, the complex operations that enable the existence of such an internetwork are managed by way of a hierarchic and modular division of labour among layers.
The functions are called layers because they are arranged in a conceptual hierarchy that proceeds from the most concrete and physical functions (such as handling electrical signals) to the most abstract functions (e.g. interpreting human-language commands by users). Each higher-level function builds on the capabilities provided by layers below. 29

These layers go from the concrete levels of cables and wires, to communication protocols, to desktop applications such as browsers, email programs, audio-visual software, etc. Each layer also corresponds to different aspects of the government of the Internet, involving ad hoc arrangements that include professional associations of computer scientists, telecommunication companies, national governments, educational institutions, the software industry and the ISP sector. Layers can also be linked to different political economies of the Internet (from that of the telecommunication industry that looks after the telephone lines, cables and fibre optics to the software industry for desktop applications) and to different levels of technical expertise (from communication engineering to simple end-user capabilities). Innovations can thus be isolated within different layers in such a way as to keep the ripple effect of such transformations limited. In all these cases, even in the layered structure, the concern is with the propagation of differentiations and incompatibilities.

To minimize such problems, new protocols are usually inserted between systems or added to them, as an additional layer, without asking the current system to discard its old components and substitute them immediately with new ones. If an incompatibility emerges, it produces a 'trigger for change' requiring new technical and social negotiations. Generally, however, a new protocol or level is introduced that, by operating between or on top of different layers, will allow them all to coexist under a single common framework.
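The layered division of labour described above can be sketched as a chain of encapsulations (a schematic illustration: the layer names follow the conventional Internet model, everything else – the function names and header format – is invented for the sketch). Each layer only adds its own header on the way down and strips it on the way up, which is why any single layer can be replaced without touching the others.

```python
# Schematic sketch of protocol layering: each layer wraps the one above it
# in its own header on send, and removes only that header on receive.
def make_layer(name):
    """Build the send/receive pair for one layer of the stack."""
    header = f"<{name}>"

    def send(payload: str) -> str:
        return header + payload          # encapsulate

    def receive(frame: str) -> str:
        assert frame.startswith(header), f"{name}: wrong framing"
        return frame[len(header):]       # decapsulate

    return send, receive

# From the most abstract function down to the most physical one.
stack = [make_layer(n) for n in ("application", "transport", "network", "link")]

message = "GET /index.html"
frame = message
for send, _ in stack:                    # sender: wrap top-down
    frame = send(frame)
print(frame)   # <link><network><transport><application>GET /index.html

received = frame
for _, receive in reversed(stack):       # receiver: unwrap bottom-up
    received = receive(received)
assert received == message
```

Swapping, say, the `link` layer for a different one changes only its own header; the layers above never see the difference, which is the sense in which innovations can be 'isolated within different layers'.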
This is what happened, for example, with the protocols ruling the operations of gateways and routers; or with the World Wide Web; or with the X.25 protocol favoured by European telecoms companies when it was absorbed under TCP/IP. This incompatibility between different protocols, however, is never resolved once and for all, but is a recurrent and pervasive motif in the development of internetworking technologies. Incompatibility, understood as a tension between divergent moments, is not relinquished but brought into the network through a process of horizontal addition and/or vertical subsumption: a network is added to other networks; a new protocol is inserted between layers. In order to expand, an open network has to be able to extend both upwards and sideways.

Open architecture not only prescribes that internal differences should be maintained, but also that they should not be allowed to develop without an accompanying elaboration of bridging devices or protocols. A set of bodies (the Internet Architecture Board, the Internet Society and the Internet Engineering Task Force) is in charge of regularly checking that new developments do not threaten the interoperability of the medium. Different networks should thus be able to preserve an original peculiarity, but should also be monitored so that such a peculiarity would not run the risk of becoming an incompatibility.

Heterogeneity is inevitable and must be supported by design. Multiple types of hardware must be allowed for, e.g. transmission speeds differing by at least 7 orders of magnitude, various computer word lengths, and hosts ranging from memory-starved microprocessors up to massively parallel supercomputers. Multiple types of application protocol must be allowed for, ranging from the simplest such as remote login up to the most complex such as distributed databases. 30

Network engineers argue that the internetwork was designed 'to appear seamless. Indeed they were so successful that today's Internet users probably do not even realize that their messages traverse more than one network.' Open architecture requires an active effort to build bridges between what is separated to start with and to bring together again what has diverged too far from a common line. The result is a smooth space that is infinitely crossable by flows of information detached from enclosed milieus and allowed to spread throughout an electronic maze of coaxial and fibre-optic cables and now, increasingly, also wireless frequencies.

It would seem reductive to read this dynamic feature of the open space of networking exclusively in terms of a technological necessity.
The technical principles offer support to a tendency that is not simply inherent in the flexibility and reprogrammability of information technologies. It is almost as if the open space of internetworking were a technical solution, not only to hardware and software incompatibilities, but also to the tensions introduced by the postmodern celebration of difference. If, within postmodern organization theory, difference is celebrated as a positive source of
added value (as in the New Labour campaign for a New Multicultural Britain in the mid 1990s), it also presents another side, which constitutes a kind of underlying problematic of postmodern theory. How can difference be productively engaged with when it also expresses a tendency of the social to decompose into closed enclaves or identities, coexisting but not interacting with each other outside the mediation of symbols or the hostility of cultural tensions?

Within the field of information architecture, the rigidity of cultures, territories, interests, languages, and egos has been as much a material concern as the technical incompatibility of machines. In the 1940s, Vannevar Bush complained that the scope and differentiation of knowledge was exceeding the rigidity of disciplinary divisions; new devices and structures were thus needed so that thinking would be able to withstand and take advantage of the exponential rise and yet specialization of knowledge. The problem arises, that is, from institutional demands for cooperation at the moment of emerging hegemony of immaterial labour over the mass factory mode. 31 This tendency towards the extension of the productive powers of cooperation is also present in J. C. R. Licklider's vision of an 'intergalactic' computer network as a technical expedient intended to liberate the full potential of collective thinking from the narrow boundaries of petty narcissisms and territorial attachments. For Berners-Lee, the European particle physics laboratory CERN in Geneva in the 1980s was a microcosm of the globalization to come, in as much as '[p]eople brought their machines and customs with them, and everyone else just had to do their best to accommodate them. Then teams went back home and, scattered as they were across time zones and languages, still had to collaborate.' 32 From the incompatible systems and the different time zones and languages at work at CERN, Berners-Lee moved to the separate areas that constituted the Internet (email, FTP, WAIS, gopher, IRC, telnet) and subsumed them all under a new protocol, http, and the URL space of the Web. In all these cases, too, we find that the technical solutions implemented to overcome the question of divergence and incompatibility start from the principle that it is both unrealistic and wasteful to think or desire that heterogeneous and divergent systems should disappear. Larry Roberts at DARPA 'viewed the diversity of computers not as an unfortunate necessity but as a strength of the system, since a network that connected the heterogeneous systems could offer users a wider range of resources'. 33 And Tim Berners-Lee realized 'that the diversity
of different computer systems and networks could be a rich resource – something to be represented, not a problem to be eradicated'. 34

This openness, then, constitutes the conditions that give rise to the most general of the political concerns expressed by a network culture. The tension between universality and divergence that informs the open space of internetworking in fact produces a rich cultural dynamics and a set of political questions that are taken up again and again across network culture at large. The history and prehistory of internetworking is thus rife with pragmatic and political questions such as: How does one avoid the openness of virtual space being overruled by its tendency to reinforce specialized interests and narrow group identities? How does one undermine the rigid lines of territorialization that divide electronic space into disconnected islands of specialized interests and firewalled domains? In this sense, Geert Lovink has rightly pointed out that Michael Hardt and Antonio Negri's popular description of network power appears to describe the space of the internetwork more accurately than it describes the political organization of global government. For Hardt and Negri, network power operates by way of a structural opening to difference and divergence, whilst being simultaneously concerned with their recuperation within the horizon of an eternal empire.

Network power must be distinguished from other purely expansionist and imperialist forms of expansion. The fundamental difference is that the expansiveness of the immanent concept of sovereignty is inclusive, not exclusive. In other words, when it expands, this new sovereignty does not annex or destroy the other powers it faces but on the contrary opens itself to them, including them in the network. 35

The fundamental characteristic of imperial sovereignty is that 'its space is always open'. 36 If we see the Internet as a mode of network power as described by Hardt and Negri, we have to count as a dimension of its openness not only a benevolent welcoming of differences but also a more general drive towards expansion. The Internet is an expansive and imperial medium, not in the sense that the number of connected users is bound to increase (we cannot know the events that will shape the future), but more in terms of an active openness of network spatiality. There is nothing to stop every object from being given an Internet address that makes it locatable in electronic space. Every object and device can, in principle, be networked to the network
Network Dynamics 63of networks in a kind of ubiquitous computational landscape (asdescribed by Bill Gates for example in his vision of the home of thefuture). At the same time, the addition of new networks (of objects,devices, machines and people) does not leave its space unaffectedbut involves the whole duration of the network. An outcome of thistendency to acentered interoperability is that the topological anddynamical features of the Internet suggest a much more complexpicture than that of a single, electronic space–time – linking the localand the global according to a mode of simultaneous interaction ortele-command. At the same time, the general notion of an open spacesubjected to the double tension between compatibility and divergencestill leaves us with a very generic understanding of network duration.What is this dynamic movement that network architecture bothproduces and attempts to contain? And what is its relationship to thepolitical and cultural questions raised above about the microphysicsof the internetwork topology?FRINGE INTELLIGENCETemporalization penetrates the machine from all sides, theemergence of a machine marks a date, a change, different from astructural representation. 37To say that sociologists and cultural theorists have tended tooverlook the duration of electronic space does not mean that thestudy of network dynamics is a neglected field. While sociologistsand philosophers have thoroughly debated the relation betweenspace and time in network societies, mathematicians and physicistshave been busy modelling the dynamics of Internet traffic and itsrelation to the topology of cyberspace. What the former mostly seeas a single electronic space causing a space–time implosion, the lattersee as the epiphenomenal manifestation of hidden physical lawsthat make the Internet part of a more general class of biophysicalsystems. 
On the one hand, a technology implicated in the social collapse of distances, the imperialist homogenization of times, and the reduction of the heterogeneity of the world to the one dimension of communication; on the other hand, a type of dynamical physical system characterized by a specific topological distribution, whose laws must be discovered and formalized. In between these different visions of the network lies the sprawl of Internet culture – with its vast digital archives, its mutating landscape of search engines and


corporate pages, networked home pages, mailing lists, electronic newsletters, blogs and wikis, news sites and newsletters, spam and porn, peer-to-peer networking, bulletin boards, chatlines and ICQ. Between the scientific search for laws and the sociological need to name macro-shifts in social experience, the transversal line of culture suggests another way of framing the problem.

The bewildering variety and dynamism of cultural expression on the Internet has often been understood as an effect of a new mode of communication (distributed and many-to-many rather than centralized and few-to-many). If we consider the technical form of the media, one of the basic ways in which this network of networks differs from the mass media system is that it does not operate by synchronizing a closed space of receivers around a single or limited number of frequencies so that a particular message flow can be streamed from a central point (involving a handful of broadcasters) to the margins (involving a segmented multiplicity of viewers). If modern communication organizes space by the principle whereby messages are beamed linearly to a segmented and privatized social space, distributed communication breaks this model down at the receiver level as well as at the level of the message itself.

One of the major points of departure that distinguishes the Internet from other modern decentralized media (from telegraphy to radio) is that messages are not beamed or transmitted through a channel, but broken down and let loose in the network to find their destination. There is mostly no straight line connecting a point A to a point B, but a multiplicity of potential routes through which data packets have to find their ways. The packet-switching mode contributes to a peculiar quality of information flow within the internetwork: a diffuse and chaotic movement marked by gradients of openness and closure.
This feature of the movement of information through the internetwork is probably the most mythologized of all the technical aspects of the Internet, at least since the day when John Gilmore suggested that ‘the Internet treats censorship as damage and routes around it’. It is at the basis of the notion that the Internet is in principle uncensorable and endowed with its own vitality.

Particularly important among the contributions of ARPANET researchers to the emergence of a new electronic space freed of the limitations of isolated computers was their demonstration of the feasibility of Paul Baran’s vision of a packet-switched network. 38 As is well known, Paul Baran argued for two features of such networks: messages should be broken down into equally sized packets, enveloped


by an electronic tag carrying information about the sender’s and receiver’s addresses and the position of the packet within the overall message; and they should be sent out on their own to find their destination in the best possible way by travelling from node to node. In other words, in a packet-switched network messages are fragmented, divided into packets, and sent off into the big wide network to find the quickest way to their destination.

This mode of communication was initially deemed unsuitable for voice networks, because telephonic conversations need the continuous use of a line in order to carry the high information content of the voice. In the light of the technical limitations of the 1950s and 1960s, when the model was devised, to packet-switch a telephone conversation or to break down the voice into several packets would have involved substantial corruption in the quality of information. Thus telephone conversations required a ‘circuit switching’ approach that opens up and reserves a line for the whole duration of the conversation (circuit switching of early telephony was entrusted to the figure of the telephone operator, usually a young woman). Information traffic among computers, on the other hand, tended to happen in bursts, and this left a considerable amount of bandwidth unused. Baran thought that by implementing a packet-switching network, bandwidth could be more effectively managed and made full use of. If a line was not used for a while, then it could be used by other packets looking for all possible places of passage in their search for their destination. Significantly, Baran proposed that a packet-switched network should be digital, in as much as analogue signals would lose quality once submitted to the lengthy relay journey of data packets in packet-switched mode.
Unlike telegraphy and telephony, then, the communication of information in computer networks does not start with a sender, a receiver and a line, but with an overall information space, constituted by a tangle of possible directions and routes, where information propagates by autonomously finding the lines of least resistance. Messages are broken down into packets and each packet is sent out into the network to find its destination by being relayed around through a network of autonomous and decentralized nodes. If any obstacles arise along the main lines, the various packets can be sent out in different directions to find their own best possible routes. Messages are broken down so as to be able to maximize the communication capacity of telephone lines, avoid interference and accidental or deliberate obstructions and ruptures in the communication flow.
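Baran's scheme, as described above, lends itself to a compact sketch. The following Python fragment is an illustration only: the packet size and the fields of the ‘electronic tag’ are arbitrary choices, not those of any real protocol. It breaks a message into equally sized, addressed packets, shuffles them to stand in for independent routing through the network, and reassembles them at the receiving end:

```python
import random
from dataclasses import dataclass

PACKET_SIZE = 8  # illustrative payload size in bytes

@dataclass
class Packet:
    sender: str     # the 'electronic tag': sender's address,
    receiver: str   # receiver's address,
    index: int      # and the position of the packet within the overall message
    total: int
    payload: bytes

def fragment(message: bytes, sender: str, receiver: str) -> list[Packet]:
    """Break a message into equally sized packets, each enveloped by a tag."""
    chunks = [message[i:i + PACKET_SIZE] for i in range(0, len(message), PACKET_SIZE)]
    return [Packet(sender, receiver, i, len(chunks), c) for i, c in enumerate(chunks)]

def reassemble(packets: list[Packet]) -> bytes:
    """The receiver restores order from the tags, whatever route each packet took."""
    assert len(packets) == packets[0].total  # all fragments have arrived
    return b"".join(p.payload for p in sorted(packets, key=lambda p: p.index))

packets = fragment(b"messages are broken down and let loose", "A", "B")
random.shuffle(packets)  # each packet travels its own route and may arrive out of order
assert reassemble(packets) == b"messages are broken down and let loose"
```

The shuffle is the crux: no channel is reserved, and the message exists only as a multiplicity of tagged fragments until the receiving node reconstitutes it.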


The space of communication is thus radically modified: yes, there are still messages being transmitted, but the messages are broken down and granted a kind of autonomous movement. The linearity which channelled the turbulent flow of information in the modern media is supplemented by another principle, where there are not so much senders, channels and receivers, as nodes, relays and packets, all lying flat within a space of more or less open, or obstructed, flow. If a node is blocked, the packet or datagram is sent off to a different node, which in its turn will pass it on until the shortest possible route is found. This shortest possible route, however, is variable and overall dependent on the traffic and usage of the network.

While the distributed movement of packets holds a key position within the mythology of the self-organizing Internet, the other corollary of packet switching (fringe intelligence) is less remarked upon but equally important. In a distributed network, we do not have a relation between aerials and transmitters, but among an assemblage of semi-autonomous nodes that are programmed with a kind of basic intelligence. As the Network Working Group put it, the intelligence is end to end. 39 This is a pragmatic choice, in as much as traffic of the magnitude and nonlinear complexity of that crossing computer networks on a daily basis cannot operate without distributing the responsibility to the end systems themselves.

A specific case is that any network, however carefully designed, will be subject to failures of transmission at some statistically determined rate. The best way to cope with this is to accept it, and give responsibility for the integrity of communication to the end systems. 40

It is this delegation of responsibility to the margins, or fringe intelligence, that is, so to speak, the technical engine on which the tendency to divergence can ride.
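The end-to-end principle quoted above can be illustrated with a toy model in which the network is allowed to fail and the end systems alone guarantee integrity. The corruption rate, the checksum scheme and the retransmission loop here are hypothetical simplifications, not the mechanics of any actual protocol:

```python
import hashlib
import random

def unreliable_network(payload: bytes) -> bytes:
    """A network 'subject to failures of transmission at some statistically determined rate'."""
    if random.random() < 0.3:  # illustrative 30 per cent corruption rate
        return b"XX" + payload[2:]  # garble the first bytes in transit
    return payload

def send_with_end_to_end_check(message: bytes) -> bytes:
    """The end systems, not the network, take responsibility for integrity.

    For simplicity we assume the checksum itself arrives intact.
    """
    checksum = hashlib.sha256(message).digest()  # computed by the sending end system
    while True:
        received = unreliable_network(message)
        if hashlib.sha256(received).digest() == checksum:  # verified by the receiving end
            return received
        # the check failed: the receiving end simply requests retransmission

random.seed(0)
assert send_with_end_to_end_check(b"the intelligence is end to end") == b"the intelligence is end to end"
```

The point of the sketch is architectural: nothing inside `unreliable_network` is asked to be reliable; correctness lives entirely at the two ends, which is the delegation of responsibility to the margins described above.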
In Baran’s model each node should be able to select the quickest route from A to B (the line of least resistance), but it should also record this route in its routing tables, in order to maximize the transmission rate. When another node is taken down, each node should be able to ‘forget’ that node and that route; find and remember a new one; and update its routing tables for next time. In this way, for Baran the network should be able to survive the destruction of up to 50 per cent of its nodes, after which even a distributed network would be hard-pressed to recover. Electronic space is made adaptive through the selective introduction


of a ‘network memory’, capable of remembering and forgetting in relation to the perception that nodes have of the network at any given moment. ‘What is envisioned is a network of unmanned digital switches implementing a self-learning policy at each node so that overall traffic is effectively routed in a changing environment … without need for a central and possibly vulnerable control point.’ 41

The combination of packet switching, end-to-end intelligence and routing makes computer networks a kind of distributed neural network able to remember and forget. This fringe intelligence produces a space that is not just a ‘space of passage’ for information, but an informational machine itself – an active and turbulent space. This quality of informational space makes the Internet a space suitable to the spread of contagion and transversal propagation of movement (from computer viruses to ideas and affects).

In a packet-switched network, then, there is no simple vector or route between A (for example a computer in Taiwan) and B (a computer in Cape Town). Even as such space is perceived as simultaneous from the point of view of the subjects engaged in receiving and sending messages in real time from all over the world, that does not imply an actual simultaneity – as if the whole of the electronic space were a blank support for the transmission of messages. Beneath the level of desktop applications such as browsers and email, the space of the internetwork is continuously although unevenly agitated, constrained and transformed by the movement of packets. The route which a message might take has to take into account at all points the overall traffic of the network, the points of blockage, the local catastrophes, political repressions, cultures of secrecy, the blind alleys and the open channels.
The various cultural formations and forces that over the years have engaged with the medium have enthusiastically taken over this feature of network space and made of it a fundamental aspect of its overall culture. This movement is the condition within which Internet culture operates and it constitutes an important interface with the world of locality. The relation between local and global, the territory and the network, is thus that of fluctuation, of an increased or decreased, obstructed or relayed flow.

The layered structure of the Internet emerges, so to speak, as a system of channels and microchannels that can structure such basic turbulent flow. As we have seen, the nonlinear movement of information does not produce a random and homogeneous informational milieu, but it is given form and organized by semiotic economies, client–server relationships, local points of centralization,


and international boards in charge of monitoring and assessing the viability of its technical protocols in the face of the massive scaling process undergone by the medium at different stages of its short history. The nonlinear movement of information, then, makes the Internet not so much a unified electronic grid as a chaotic informational milieu – whether it is about the movement of email, the diffusion of viruses, the constitution of ad hoc communication arrangements (such as mirror sites for sites that are censored) or the periodic surfacing and disappearance of pirate islands of connectivity (as for example in ‘warez’ or pirate software sites). 42 It is here that the practices of appearance and disappearance, assembling and dismantling, forwarding and deleting unfold within a veritable informational ecology of viruses and parasites, petitions and cries for help, scams and junk mail. There is no cultural experimentation with aesthetic forms or political organization, no building of alliances or elaboration of tactics, that does not have to confront the turbulence of electronic space. 43 The politics of network culture are thus not only about competing viewpoints, anarchic self-regulation and barriers to access, but also about the pragmatic production of viable topological formations able to persist within an open and fluid milieu.

Internet culture has thus given us some important topoi of network space, able to capture and impart a specific speed and consistency to the potential indeterminacy of information flows. We have had boards and domains, lists and webs, but also spheres and rings binding local areas of connectivity within an open information space. These figures express the power of a local movement able to bind the turbulence of the flows within a new type of informational structure.
If bulletin board systems offered a steady platform from which to launch oneself into an open info-space, the fragmentation of web space that favours the power of attraction of the portals is counteracted by web rings, where small sites with similar content form a circular vortex able to capture and channel the attention of the web surfer. The sphere has also been used as another model for the multidimensional aggregation of web rings, or simply as a way to designate a kind of informational gravitation around a common orbit. The term ‘blogosphere’ for example designates the ways in which all the personal web logs, or hyperlinked online journals, can be considered as ultimately related to each other within the informational orbit of the blogging movement. 44 These figures are traced not so much by a link connecting two different sites across the grid of a common domain name, as by transversal movements,


continuously spilling out of the grid, constituting the network as a space of centripetal and centrifugal movements, of spirals and vortexes, in various overall states of contraction and dilation.

While the topology of the web maintains a certain level of solidity that makes the notion of rings and spheres appropriate, the overall space of the internetwork is also crossed by vortical movements that betray a microscopic fluidity and instability – informed by the power of rippling centrifugal and centripetal forces. A network microphysics is also made of temporary and unstable alliances and relations – such as the temporary chat channels that are opened up and closed down for the duration of a conversation or the fleeting email contacts that randomly link distant and even opposed areas of network space. These figures and contacts involve a crucial experimentation with the peculiar semi-fluid mechanics of network space at large. Internet forms such as list servers and majordomo lists actively experiment with such instability to build vortical structures able to combine permanence and impermanence, dynamism and consistency. A successful mailing list has to be able to exercise enough of a pull on its members and the material circulated so as to keep a certain level of consistency and dynamism, durability and renewal. 45 It opens up the political model of the ‘group’, with its tendency to sectarianism and infighting, to an open informational milieu. The production of collective modes of organization (from a ‘group’ to ‘collaborative filtering’ and ‘collective intelligence’) requires not only a knowledge of social and psychological dynamics informing a collective mode of production, but also an active engagement with the larger ecology of the Internet – screening out the junk, balancing the passive energy of the lurkers with the hyperactivism of the regular posters and keeping the vandals out.
46 The history of the Internet is littered with failed experiments in network hydrodynamics, vortexes that dissolved under the tides of the network. We can only mention here a few classic stories of centripetal dissolution: Allucquère Rosanne Stone’s CommuniTree, a 1980s community bulletin board eventually dismantled by the charge of hordes of teenage hackers; 47 Inke Arns and Andreas Broeckmann’s activist list Syndicate, done in by net parasites and anti-censorship campaigners; 48 and other classic examples of community networks being pulled apart by powerful economic forces, such as Howard Rheingold’s Electric Minds, Amsterdam’s public Digital City project and the famous net-art list ‘rhizome’. Each of these cases expresses a specificity which needs to be analysed in its own particular circumstances (thus for CommuniTree the problem was that of the


immaturity of cyberculture; for the Digital City it was the dot-com wave that proved to be too much; and so on). However, overall, we can claim that what all these examples share is the fact that for a while they maintained a precarious balance between centripetal and centrifugal forces which allowed them to prosper in spite of what Phil Agre has called ‘the always imminent threat of heat death’ that befalls all those informational experiments that are unable to deal with the entropic dynamics of the network milieu. 49

This difficult balancing act is obviously far from being an exclusive feature of the Internet (indeed, in as much as the informational dimension crosses all communication spaces, we might use a similar model to analyse the emergence and disappearance of cultural formations as such). On the other hand, it is within a culture of internetworked, informational and distributed communication that the dynamics through which cultural assemblages are formed explicitly become the condition for a kind of network micropolitics. Any local, that is bounded, cultural phenomenon within the network is always caught on one side by the danger of being overwhelmed by the open network ecology; and on the other side by that of solidifying to the point of becoming a self-contained and self-referential archipelago of the like-minded (and hence succumbing to a kind of heat death). On the other hand, this mobility of cyberspace can also be productively exploited – hackers have tuned the art of forming and deforming cyberspace to a fine degree.
Warez boards offering pirate software, for example, are particularly evanescent and mobile informational islands, appearing and disappearing, springing out of nowhere, signalled only to insiders, only to dissolve as soon as the frantic transactions are carried out.

We also need to emphasize that while these centripetal and centrifugal movements are central to the evolution of electronic space, they do not take place within an isolated and self-referential infosphere. On the contrary, they are related to the overall informational dimension that cuts across the global matrix of communication of which the Internet is part. As an open space, the Internet is open not only to the addition of new nodes, but also to the informational flows relayed by television, radio and popular culture, as well as by political passions involving social antagonisms and conflicts. The centripetal/centrifugal movements that determine the fate of informational cultures are open to the overall plane of communication, and as such the Internet can also be said to be characterized by another duration, that which relates it to the


various states of contraction and dilation experienced by the global communication system.

The openness of the Internet in relation to the network matrix of which it is a part was foregrounded, for example, in the wake of the attacks on the World Trade Center on 11 September 2001. If 9/11 was at some level a quintessential televisual global event, it also brought to light some of the dynamic features of the overall configuration of communication that links in a relationship of resonance or direct interconnection a multiplicity of media outlets and communication devices. The attacks, in fact, famously brought the Internet to a state of virtual standstill. It was not simply that they destroyed a key communication centre located underneath the Twin Towers, but that they also provoked an unprecedented and simultaneous use of the Internet as a way of gathering news, contacting friends and acquaintances, or simply exchanging reactions and opinions. This synchronized assault on bandwidth did not so much paralyse the network as contract it – at first towards the great Internet portals, but successively also in a dense transversal traffic of news and commentary, direct reporting and critical interventions, of new web pages and web logs, an activity that whipped the Internet into a kind of electronic frenzy. The network, that is, is not a closed electronic space, but it is literally contracted by the intensity of the informational flows that reach it from the outside, an intensity which rises and declines, disperses and diversifies again to the rhythms of the geopolitical events, social debates and cultural trends that are the whole onto which a network duration opens.
50

AFTERTHOUGHT

When looking at the socio-technical organization of information operated by Internet protocols, it is difficult not to be struck by how a military and scientific technology has come to model so well a fractal and turbulent mode of globalization that exceeds on all sides the early rhetoric of immateriality and timelessness. The polarity between levels of universality and movements of differentiation that concerned network architects and engineers mirrors that of a process of globalization that, beneath the shadow cast by neo-imperial formations, is similarly striving for some kind of pragmatic politics able to conjugate such tendencies within a common, constitutive movement. The tension arising out of the incompatibilities and divergences produced by pre-existing differences and an ongoing


process of differentiation becomes a tendency towards the production of new structures that temporarily resolve some of the incompatibilities at stake. But in a way, we can say that the whole plane of network culture is crossed by social undercurrents that pose the problem of a global geopolitics not only at the level of a technical infrastructure, but, more importantly, at the speed of cultural and informational flows. A network culture can never be a unitary formation, describing a homogeneity of practices across a global communication matrix. On the contrary, if such a thing exists, it can only describe the dynamics informing the cultural and political process of recomposition and decomposition of a highly differentiated, multi-scaled and yet common global network culture.


3

Free Labour 1

The real not-capital is labour.
(Karl Marx, Grundrisse)

Working in the digital media industry was never as much fun as it was made out to be. Certainly, for the workers of the best-known and most highly valued companies, work might have been a brief experience of something that did not feel like work at all. 2 On the other hand, even during the dot-com boom the ‘netslaves’ of the homonymous webzine had always been vociferous about the shamelessly exploitative nature of the job, its punishing work rhythms and its ruthless casualization. 3 They talked about ‘24/7 electronic sweatshops’, complained about the 90-hour week and the ‘moronic management of new media companies’. Antagonism in the new media industry also affected the legions of volunteers running well-known sites for the Internet giants. In early 1999, seven of the 15,000 ‘volunteers’ of America Online rocked the info-loveboat by asking the Department of Labor to investigate whether AOL owed them back wages for the years of playing chat hosts for free. 4 They used to work long hours and love it; but they also felt the pain of being burned by digital media.

These events pointed to an inevitable backlash against the glamorization of digital labour, which highlighted its continuities with the modern sweatshop and the increasing degradation of knowledge work. Yet the question of labour in a ‘digital economy’ as an innovative development of the familiar logic of capitalist exploitation is not so easily dismissed. The netslaves are not simply a typical form of labour on the Internet; they also embody a complex relation to labour, which is widespread in late capitalist societies. In this chapter, we call this excessive activity that makes the Internet a thriving and hyperactive medium ‘free labour’ – a feature of the cultural economy at large, and an important, yet unacknowledged, source of value in advanced capitalist societies.
By looking at the Internet as a specific instance of the fundamental role played by free labour, we will also highlight the connections between the ‘digital


economy’ and what the Italian autonomists have called the ‘social factory’ (or ‘society–factory’). 5 The ‘society–factory’ describes a process whereby ‘work processes have shifted from the factory to society, thereby setting in motion a truly complex machine’. 6 Simultaneously voluntarily given and unwaged, enjoyed and exploited, free labour on the Net includes the activity of building web sites, modifying software packages, reading and participating in mailing lists and building virtual spaces. Far from being an ‘unreal’, empty space, the Internet is animated by cultural and technical labour through and through, a continuous production of value which is completely immanent in the flows of the network society at large.

Support for this argument, however, is immediately complicated by the recent history of Anglo-American cultural theory. How should we speak of labour, especially cultural and technical labour, after the demolition job carried out by 30 years of postmodernism? The postmodern socialist feminism of Donna Haraway’s ‘Cyborg Manifesto’ spelled out some of the reasons behind the antipathy of 1980s critical theory for Marxist analyses of labour. Haraway explicitly rejected the humanistic tendencies of theorists who see the latter as the ‘pre-eminently privileged category enabling the Marxist to overcome illusion and find that point of view which is necessary for changing the world’. 7 Paul Gilroy similarly expressed his discontent at the inadequacy of the Marxist analysis of labour to the descendants of slaves, who value artistic expression as ‘the means towards both individual self-fashioning and communal liberation’. 8 If labour is ‘the humanizing activity that makes [white] man’, then, surely, this ‘humanising’ labour does not really belong in the age of networked, posthuman intelligence.

However, the ‘informatics of domination’ which Haraway describes in the ‘Manifesto’ is certainly preoccupied with the relation between technology, labour and capital.
In the 20 years since its publication, this triangulation has become even more evident. The expansion of the Internet has given ideological and material support to contemporary trends towards increased flexibility of the workforce, continuous reskilling, freelance work, and the diffusion of practices such as ‘supplementing’ (bringing supplementary work home from the conventional office). 9 Advertising campaigns and business manuals suggest that the Internet is not only a site of disintermediation (embodying the famous death of the middle man, from bookshops to travel agencies and computer stores), but also the


means through which a flexible, collective network intelligence has come into being.

I will not offer here a judgement on the ‘effects’ of the Internet on society. What I will rather do is map the way in which the Internet connects to the autonomist ‘social factory’. We will look, that is, at how the ‘outernet’ – the network of social, cultural and economic relationships which criss-crosses and exceeds the Internet – surrounds and connects the latter to larger flows of labour, culture and power. It is fundamental to move beyond the notion that cyberspace is about escaping reality in order to understand how the reality of the Internet is deeply connected to the development of late postindustrial societies as a whole. It is related to phenomena that have been defined as ‘external economies’ within theoretical perspectives (such as the theory of transaction costs) suggesting that ‘the production of value is increasingly involving the capture of productive elements and social wealth that are outside the direct productive process …’. 10

Cultural and technical work is central to the Internet but is also a widespread activity throughout advanced capitalist societies. Such labour is not exclusive to so-called ‘knowledge workers’, but is a pervasive feature of the postindustrial economy. The pervasiveness of such diffuse cultural production questions the legitimacy of a fixed distinction between production and consumption, labour and culture. It also undermines Gilroy’s distinction between work as ‘servitude, misery and subordination’ and artistic expression as the means to self-fashioning and communal liberation. The increasingly blurred territory between production and consumption, work and cultural expression, however, does not signal the recomposition of the alienated Marxist worker. The Internet does not automatically turn every user into an active producer, and every worker into a creative subject.
The process whereby production and consumption are reconfigured within the category of free labour signals the unfolding of another logic of value, whose operations need careful analysis. 11

THE DIGITAL ECONOMY

The term ‘digital economy’ emerged in the late 1990s as a way to summarize some of the processes described above. As a term, it seems to describe a formation which intersects on the one hand with the postmodern cultural economy (the media, the university and the arts)


and on the other hand with the information industry (the information and communication complex). Such an intersection of two different fields of production constitutes a challenge to a theoretical and practical engagement with the question of labour, a question which has become marginal for media studies as compared with questions of ownership (within political economy) and consumption (within cultural studies).

We will distinguish here between the New Economy, ‘a historical period marker [that] acknowledges its conventional association with Internet companies’, 12 and the digital economy – a less transient phenomenon based on key features of digitized information (its ease of copying and low or zero cost of sharing). In Richard Barbrook’s definition, the digital economy is characterized by the emergence of new technologies (computer networks) and new types of worker (such as digital artisans). 13 According to Barbrook, the digital economy is a mixed economy: it includes a public element (the state’s funding of the original research that produced ARPANET, the financial support to academic activities which had a substantial role in shaping the culture of the Internet); a market-driven element (a latecomer that tries to appropriate the digital economy by reintroducing commodification); and a gift economy (the true expression of the cutting edge of capitalist production which prepares its eventual overcoming into a future ‘anarcho-communism’).

What Barbrook proposed was that the vision of politicians and corporate leaders who linked the future of capitalism to the informational commodity involved a basic misunderstanding. Pointing to the world of discussion groups, mailing lists and the distributed learning of programmers, he suggested that the Internet was far from simply being a new way to sell commodities.
The predominance of relationships of collaboration across distance and exchange without money suggested that this was a practised relationship with a viable and alternative political and economic model.

Unrestricted by physical distance, they collaborate with each other without the direct mediation of money and politics. Unconcerned about copyright, they give and receive information without thought of payment. In the absence of states or markets to mediate social bonds, network communities are instead formed through the mutual obligations created by gifts of time and ideas. 14


Barbrook’s vision of the informational commons was only reinforced by the subsequent explosion of peer-to-peer, file-sharing networks – a huge network phenomenon that had the music and film industry up in arms.

From a Marxist–Hegelian angle, Barbrook saw the high-tech gift economy as a process of overcoming capitalism from the inside. The high-tech gift economy is a pioneering moment which transcends both the purism of the New Left do-it-yourself culture and the neoliberalism of the free-market ideologues: ‘money–commodity and gift relations are not just in conflict with each other, but also co-exist in symbiosis.’ 15 Participants in the gift economy are not reluctant to use market resources and government funding to pursue a potlatch economy of free exchange. However, the potlatch and the economy ultimately remain irreconcilable, and the market economy is always threatening to reprivatize the common enclaves of the gift economy. Commodification, the reimposition of a regime of property, is, in Barbrook’s opinion, the main strategy through which capitalism tries to bring back the anarcho-communism of the Net into its fold.

This early attempt to offer a polemical platform from which to think about the digital economy overemphasized the autonomy of the high-tech gift economy from capitalism. The processes of exchange which characterize the Internet are not simply the re-emergence of communism within the cutting edge of the economy, a repressed other which resurfaces just at the moment when communism seems defeated. It is important to remember that the gift economy, as part of a larger informational economy, is itself an important force within the reproduction of the labour force in late capitalism as a whole. The provision of ‘free labour’, as we shall see later, is a fundamental moment in the creation of value in the economy at large – beyond the digital economy of the Internet.
As will be made clear, the conditions that make free labour an important element of the digital economy are based on a difficult, experimental compromise between the historically rooted cultural and affective desire for creative production (of the kind more commonly associated with Gilroy's emphasis on 'individual self-fashioning and communal liberation') and the current capitalist emphasis on knowledge as the main source of added value.

The volunteers for America Online, the netslaves and the amateur web designers did not work only because capital wanted them to; they were also acting out a desire for affective and cultural production which was no less real for being socially shaped.


The cultural, technical and creative work which supported the New Economy had been made possible by the development of capital beyond the early industrial and Fordist modes of production, and is therefore particularly abundant in those areas where post-Fordism has been at work for several decades. In the overdeveloped countries, the end of the factory has spelled out the marginalisation of the old working class, but it has also produced generations of workers who have been repeatedly addressed as active consumers of meaningful commodities. Free labour is the moment where this knowledgeable consumption of culture is translated into excess productive activities that are pleasurably embraced and at the same time often shamelessly exploited.

Management theory has also been increasingly concerned with the question of knowledge work, that indefinable quality which is essential to the processes of stimulating innovation and achieving the goals of competitiveness. For example, Don Tapscott, in a classic example of New Economy managerial literature, The Digital Economy, wrote about a 'new economy based on the networking of human intelligence'. 16 Human intelligence provides the much-needed added value, which is essential to the economic health of the organization. Human intelligence, however, also poses a problem: it cannot be managed in quite the same way as more traditional types of labour. Knowledge workers need open organizational structures in order to produce, because the production of knowledge is rooted in collaboration; this is what Barbrook had defined as the 'gift economy'.

… the concept of supervision and management is changing to team-based structures. Anyone responsible for managing knowledge workers knows they cannot be 'managed' in the traditional sense. Often they have specialized knowledge and skills that cannot be matched or even understood by management.
A new challenge to management is first to attract and retain these assets by marketing the organization to them, and second to provide the creative and open communications environment where such workers can effectively apply and enhance their knowledge. 17

For Tapscott, therefore, the digital economy magically resolves the contradictions of industrial societies, such as class struggle: whereas in the industrial economy the 'worker tried to achieve fulfillment through leisure [and] … was alienated from the means of production which were owned and controlled by someone else', in the digital economy the worker achieves fulfillment through work and finds in her brain her own, unalienated means of production. 18 Such means of production need to be cultivated by encouraging the worker to participate in a culture of exchange, whose flows are mainly kept within the company but also need to involve an 'outside', a contact with the fast-moving world of knowledge in general. The convention, the exhibition and the conference – the traditional ways of supporting this general exchange – are supplemented by network technologies both inside and outside the company. Although the traffic of these flows of knowledge needs to be monitored (hence the corporate concerns about the use of intranets), the Internet effectively functions as a channel through which 'human intelligence' renews its capacity to produce.

Is it possible to look beyond the totalizing hype of the managerial literature, but also beyond some of the conceptual limits of Barbrook's gift economy model? We will look at some possible explanations for the coexistence, within the debate about the digital economy, of discourses which see it as an oppositional movement and others which see it as a functional development of new mechanisms for the extraction of value. Is the end of Marxist alienation wished for by the management guru the same thing as the gift economy heralded by leftist discourse?

We can start undoing this deadlock by subtracting the label 'digital economy' from its exclusive anchorage within advanced forms of labour (we can start, then, by de-pioneering it). This chapter describes the 'digital economy' as a specific mechanism of internal 'capture' of larger pools of social and cultural knowledge. The digital economy is an important area of experimentation with value and free cultural/affective labour.
It is about specific forms of production (web design, multimedia production, digital services and so on), but it is also about forms of labour we do not immediately recognize as such: chat, real-life stories, mailing lists, amateur newsletters and so on. These types of cultural and technical labour are not produced by capitalism in any direct, cause-and-effect fashion; that is, they have not developed simply as an answer to the economic needs of capital. However, they have developed in relation to the expansion of the cultural industries and they are part of a process of economic experimentation with the creation of monetary value out of knowledge/culture/affect.

This process is different from that described by popular, left-wing wisdom about the incorporation of authentic cultural moments: it is not, then, about the bad boys of capital moving in on underground subcultures or subordinate cultures and 'incorporating' the fruits of their production (styles, languages, music) into the media food chain. This process is usually considered the end of a particular cultural formation, or at least the end of its 'authentic' phase. After incorporation, local cultures are picked up and distributed globally, thus contributing to cultural hybridization or cultural imperialism (depending on whom you listen to). Rather than capital 'incorporating' from the outside the authentic fruits of the collective imagination, it seems more reasonable to think of cultural flows as originating within a field which is always and already capitalism. Incorporation is not about capital descending on authentic culture, but a more immanent process of channelling collective labour (even as cultural labour) into monetary flows and of structuring it within capitalist business practices.

Subcultural movements have stuffed the pockets of multinational capitalism for decades. Nurtured by the consumption of earlier cultural moments, subcultures have provided the look, style and sounds that sell clothes, CDs, video games, films and advertising slots on television. This has often happened through the active participation of subcultural members in the production of cultural goods (independent labels in music; small designer shops in fashion). 19 This participation is, as the word suggests, a voluntary phenomenon, although it is regularly accompanied by cries of 'Sell-out!' The fruits of collective cultural labour have been not simply appropriated, but voluntarily channelled and controversially structured within capitalist business practices. The relation between culture, the cultural industry and labour in these movements is much more complex than the notion of incorporation suggests.
In this sense, the digital economy is not a new phenomenon, but simply a new phase of this longer history of experimentation.

KNOWLEDGE CLASS AND IMMATERIAL LABOUR

In spite of the numerous, more or less disingenuous endorsements of the democratic potential of the Internet, its links with capitalism have always been a bit too tight for comfort to concerned political minds. It has been very tempting to counteract the naive technological utopianism by pointing out how computer networks are the material and ideological heart of informated capital. The Internet advertised on television and portrayed by the print media seems not just the latest incarnation of capital's inexhaustible search for new markets, but also a full consensus-creating machine, which socializes the mass of proletarianized knowledge workers into the economy of continuous innovation. 20 After all, if we do not get online soon, the hype suggests, we will become obsolete, unnecessary, disposable. If we do, we are promised, we will become part of the 'hive mind', the immaterial economy of networked, intelligent subjects in charge of speeding up the rhythms of capital's 'incessant waves of branching innovations'. 21 Multimedia artists, writers, journalists, software programmers, graphic designers and activists, together with small and large companies, are at the core of this project. For some they are the cultural elite, for others a new form of proletarianized labour. 22 Accordingly, digital workers are described as resisting or supporting the project of capital, often in direct relation to their positions in the networked, horizontal and yet hierarchical world of knowledge work.

Any judgement on the political potential of the Internet, then, is tied not only to its much-vaunted capacity to allow decentralized access to information, but also to the question of who uses the Internet and how. If the decentralized structure of the Net is to count for anything at all, the argument goes, then we need to know about its constituent population (hence the endless statistics about income, nationality, gender and race of Internet users, the most polled, probed and yet opaque survey material in the world). If this population is still largely made up of 'knowledge workers', a global elite with no ties to a disenfranchised majority, then it matters whether these are seen as the owners of elitist cultural and economic power or the avant-garde of new configurations of labour which do not automatically guarantee elite status.

The question of who uses the Internet is both necessary and yet misleading.
It is necessary because we have to ask who is participating in the digital economy before we can pass a judgement on the latter. It is misleading because it implies that all we need to know is how to locate the knowledge workers within a 'class', and that knowing which class it is will give us an answer to the political potential of the Net as a whole. If we can prove that knowledge workers are the avant-garde of labour, then the Net becomes a site of resistance; 23 if we can prove that knowledge workers wield the power in informated societies, then the Net is an extended gated community for the middle classes. 24 Even admitting that knowledge workers are indeed fragmented in terms of hierarchy and status won't help us that much; it will still lead to a simple system of categorization, in which the Net becomes a field of struggle between the diverse constituents of the knowledge class.

The question is further complicated by the stubborn resistance of 'knowledge' to quantification: knowledge cannot be exclusively pinned down to specific social segments. Although the shift from factory to office work, from production to services, is widely acknowledged, it just isn't clear why some people qualify and some others do not. 25 The 'knowledge worker' is a very contested sociological category.

A more interesting move is possible, however, by not looking for the knowledge class within quantifiable parameters but by concentrating instead on 'labour'. Although the notion of class retains a material value which is indispensable to make sense of the experience of concrete historical subjects, it also has its limits: for example, it 'freezes' the subject, just like a substance within the chemical periodic table – one is born as a certain element (working-class metal) but might then become something else (middle-class silicon) if submitted to the proper alchemical processes (education and income). Such an understanding of class also freezes out the flows of culture and money which mobilize the labour force as a whole. In terms of Internet use, it gives rise to the generalized endorsements and condemnations which I have described above and does not explain or make sense of the heterogeneity and yet commonalities of Internet users. It seems therefore more useful to think in terms of what the Italian autonomists, and especially Maurizio Lazzarato, have described as immaterial labour. For Lazzarato, the concept of immaterial labour refers to two different aspects of labour:

On the one hand, as regards the 'informational content' of the commodity, it refers directly to the changes taking place in workers' labor processes … where the skills involved in direct labor are increasingly skills involving cybernetics and computer control (and horizontal and vertical communication).
On the other hand, as regards the activity that produces the 'cultural content' of the commodity, immaterial labor involves a series of activities that are not normally recognized as 'work' – in other words, the kinds of activities involved in defining and fixing cultural and artistic standards, fashions, tastes, consumer norms, and, more strategically, public opinion. 26


Immaterial labour, unlike the knowledge worker, is not completely confined to a specific class formation. Lazzarato insists that this form of labour power is not limited to highly skilled workers, but is a form of activity of every productive subject within postindustrial societies. In the highly skilled worker, these capacities are already there. In the young worker, however, the 'precarious worker', and the unemployed youth, these capacities are 'virtual'; that is, they are there but are still undetermined. This means that immaterial labour is a virtuality (an undetermined capacity) which belongs to the postindustrial productive subjectivity as a whole. For example, the obsessive emphasis on education of 1990s governments can be read as an attempt to stop this virtuality from disappearing or from being channelled into places which would not be as acceptable to the current power structures. In spite of all the contradictions of advanced capital and its relation to structural unemployment, postmodern governments do not like the completely unemployable. The potentialities of work must be kept alive; the unemployed must undergo continuous training in order to be both monitored and kept alive as some kind of postindustrial reserve force. Nor can they be allowed to channel their energy into the experimental, nomadic and antiproductive lifestyles which in Britain have been so savagely attacked by the Criminal Justice Act since the mid-1990s. 27

However, unlike the post-Fordists, and in accordance with his autonomist origins, Lazzarato does not conceive of immaterial labour as purely functional to a new historical phase of capitalism:

The virtuality of this capacity is neither empty nor ahistoric; it is rather an opening and a potentiality that have as their historical origins and antecedents the 'struggle against work' of the Fordist worker and, in more recent times, the processes of socialization, educational formation, and cultural self-valorization. 28

This dispersal of immaterial labour (as a virtuality and an actuality) problematizes the idea of the 'knowledge worker' as a class in the 'industrial' sense of the word. As a collective quality of the labour force, immaterial labour can be understood to pervade the social body with different degrees of intensity. This intensity is produced by the processes of 'channelling' of the capitalist formation, which distributes value according to its logic of profit. 29 If knowledge is inherently collective, this is even more the case in the postmodern cultural economy: music, fashion and information are all produced collectively but are selectively compensated. Only some companies are picked up by corporate distribution chains in the case of fashion and music; only a few sites are invested in by venture capital. However, it is a form of collective cultural labour which makes these products possible, even though the profit is disproportionately appropriated by established corporations.

From this point of view, the well-known notion that the Internet materializes a 'collective intelligence' is not completely off the mark. The Internet highlights the existence of networks of immaterial labour and speeds up their accretion into a collective entity. The productive capacities of immaterial labour on the Internet encompass the work of writing/reading/managing and participating in mailing lists/websites/chat lines. These activities fall outside the concept of 'abstract labour', which Marx defined as the provision of time for the production of value regardless of the useful qualities of the product. 30 They witness an investment of desire into production of the kind cultural theorists have mainly theorized in relation to consumption.

This explosion of productive activities was, for various commentators, undermined by the globally privileged character of the Internet population. However, we might also argue that to recognize the existence of immaterial labour as a diffuse, collective quality of postindustrial labour in its entirety does not deny the existence of hierarchies of knowledge (both technical and cultural) which pre-structure (but do not determine) the nature of such activities. These hierarchies shape the degrees to which such virtualities become actualities; that is, they go from being potential to being realized as processual, constituting moments of cultural, affective and technical production. Neither capital nor living labour wants a labour force which is permanently excluded from the possibilities of immaterial labour. But this is where their desires cease to coincide.
Capital wants to retain control over the unfolding of these virtualities and the processes of valorization. The relative abundance of cultural/technical/affective production on the Net, then, does not exist as a free-floating postindustrial utopia but in full, mutually constituting interaction with late capitalism.

COLLECTIVE MINDS

The collective nature of networked, immaterial labour was exalted by the utopian statements of the 1990s cyberlibertarians. Kevin Kelly's popular thesis in Out of Control, for example, suggested that the Internet is a collective 'hive mind'. According to Kelly, the Internet is another manifestation of a principle of self-organization that is widespread throughout technical, natural and social systems. The Internet is the material evidence of the existence of the self-organizing, infinitely productive activities of connected human minds. 31 From a different perspective, Pierre Levy drew on cognitive anthropology and poststructuralist philosophy to argue that computers and computer networks enable the emergence of a 'collective intelligence'. Levy, who is inspired by early computer pioneers such as Douglas Engelbart, argues for a new humanism 'that incorporates and enlarges the scope of self-knowledge and collective thought'. 32 According to Levy, we are passing from a Cartesian model of thought based upon the singular idea of cogito (I think) to a collective or plural cogitamus (we think):

What is collective intelligence? It is a form of universally distributed intelligence, constantly enhanced, coordinated in real time, and resulting in the effective mobilization of skills … The basis and goal of collective intelligence is the mutual recognition and enrichment of individuals rather than the cult of fetishized or hypostatized communities. 33

Like Kelly, Levy frames his argument within the common rhetoric of competition and flexibility which dominates the hegemonic discourse around digitalization: 'The more we are able to form intelligent communities, as open-minded, cognitive subjects capable of initiative, imagination, and rapid response, the more we will be able to ensure our success in a highly competitive environment.' 34 In Levy's view, the digital economy highlights the impossibility of absorbing intelligence within the process of automation: unlike the first wave of cybernetics, which displaced workers from the factory, computer networks highlight the unique value of human intelligence as the true creator of value in a knowledge economy. In his opinion, since the economy is increasingly reliant on the production of creative subjectivities, this production is highly likely to engender a new humanism, a new centrality of man's [sic] creative potentials.

Especially in Kelly's case, it has been easy to dismiss the notion of a 'hive mind' and the self-organizing Internet-as-free-market as 'Internet gold rush' rhetoric, promptly demolished by the more or less unexpected events of 2001 (dot-com crash, resurgence of international terrorism and imperialism). It was difficult to avoid a feeling of irritation at such willing oblivion of the realities of working in the high-tech industries, from the poisonous world of the silicon-chip factories to the electronic sweatshops of America Online, where technical work is downgraded and workers' obsolescence is high. 35 How can we hold on to the notion that cultural production and immaterial labor are collective on the Net (both inner and outer) after the belated Y2K explosion in 2001 and without subscribing to the idealistic and teleological spirit of the wired revolution?

We could start with a simple observation: the self-organizing, collective intelligence of cybercultural thought captured the existence of networked immaterial labour, but was weak in its analysis of the operations of capital overall (including the coexistence of different capitalist lobbies and their relation to institutional governance). Capital, after all, is the unnatural environment within which the collective intelligence materializes. The collective dimension of networked intelligence needs to be understood historically, as part of a specific momentum of capitalist development. The Italian writers who are identified with the post-Gramscian Marxism of Autonomia Operaia have consistently engaged with this relationship by focusing on the mutation undergone by labour in the aftermath of the factory. The notion of a self-organizing 'collective intelligence' looks uncannily like one of their central concepts, the 'general intellect', a notion that the autonomists 'extracted' out of the spirit, if not the actual wording, of Marx's Grundrisse.
The 'collective intelligence' or 'hive mind' captures some of the spirit of the 'general intellect', but removes the autonomists' critical theorization of its relation to capital.

In the autonomists' favorite text, the Grundrisse, and especially in the 'Fragment on Machines', Marx argues (as summarized by Paolo Virno) that

knowledge – scientific knowledge in the first place, but not exclusively – tends to become, precisely by virtue of its autonomy from production, nothing less than the principal productive force, thus relegating repetitive and compartmentalized labor to a residual position. Here one is dealing with knowledge … which has become incarnate … in the automatic system of machines. 36

In the vivid pages of the 'Fragment', the 'other' Marx of the Grundrisse (adopted by the social movements of the 1960s and 1970s against the more orthodox endorsement of Capital) describes the system of industrial machines as a horrific monster of metal and flesh:


The production process has ceased to be a labour process in the sense of a process dominated by labour as its governing unity. Labour appears, rather, merely as a conscious organ, scattered among the individual living workers at numerous points of the mechanical system; subsumed under the total process of the machinery itself, as itself only a link of the system, whose unity exists not in the living workers, but rather in the living (active) machinery, which confronts his individual, insignificant doings as a mighty organism. 37

The Italian autonomists extracted from these pages the notion of the 'general intellect' as 'the ensemble of knowledge … which constitute the epicenter of social production'. 38 Unlike Marx's original formulation, however, the autonomists eschewed the modernist imagery of the general intellect as a hellish machine. They claimed that Marx completely identified the general intellect (or knowledge as the principal productive force) with fixed capital (the machine) and thus neglected to account for the fact that the general intellect cannot exist independently of the concrete subjects who mediate the articulation of the machines with each other. The general intellect is an articulation of fixed capital (machines) and living labour (the workers). If we see the Internet, and computer networks in general, as the latest machines – the latest manifestation of fixed capital – then it won't be difficult to imagine the general intellect as being alive and well today.

However, the autonomists did not stop at describing the general intellect as an assemblage of humans and machines at the heart of postindustrial production. If this were the case, the Marxian monster of metal and flesh would just be updated to that of a world-spanning network, where computers use human beings as a way to allow the system of machinery (and therefore capitalist production) to function.
The visual power of the Marxian description is updated by the cyberpunk snapshots of the immobile bodies of the hackers, electrodes like umbilical cords connecting them to the matrix, appendixes to a living, all-powerful cyberspace. Beyond the special-effects bonanza, the box-office success of The Matrix series validates the popularity of the paranoid interpretation of this mutation.

To the humanism implicit in this description, the autonomists have opposed the notion of a 'mass intellectuality', living labour in its function as the determining articulation of the general intellect. Mass intellectuality – as an ensemble, as a social body – 'is the repository of the indivisible knowledges of living subjects and of their linguistic cooperation … an important part of knowledge cannot be deposited in machines, but … it must come into being as the direct interaction of the labor force'. 39 As Virno emphasizes, mass intellectuality is not about the various roles of the knowledge workers, but is a 'quality and a distinctive sign of the whole social labor force in the post-Fordist era'. 40

The pervasiveness of the collective intelligence within both the managerial literature and Marxist theory could be seen as the result of a common intuition about the quality of labour in informated societies. Knowledge labour is inherently collective: it is always the result of a collective and social production of knowledge. 41 Capital's problem is how to extract as much value as possible (in the autonomists' jargon, to 'valorize') out of this abundant, and yet slightly intractable, terrain.

Collective knowledge work, then, is not about those who work in the knowledge industry. But it is also not about employment. The mass layoffs in the dot-com sector have not stopped Internet content from growing or its technologies from mutating. The acknowledgement of the collective aspect of labour implies a rejection of the equivalence between labour and employment, which was already stated by Marx and further emphasized by feminism and the post-Gramscian autonomy. 42 Labour is not equivalent to waged labour. Such an understanding might help us to reject some of the hideous rhetoric of unemployment which turns the unemployed person into the object of much patronizing, pushing and nudging from national governments in industrialized countries (accept any available work or else …). Often the unemployed are such only in name, in reality being the life-blood of the difficult economy of 'under the table', badly paid work, some of which also goes into the new media industry. 43 To emphasize that labour is not equivalent to employment also means to acknowledge how important free affective and cultural labour is to the media industry, old and new.

EPHEMERAL COMMODITIES AND FREE LABOUR

There is a continuity, and a break, between older media and new media in terms of their relationship to cultural and affective labour. The continuity seems to lie in their common reliance on their public/users as productive subjects. The difference lies both in the mode of production and in the ways in which power/knowledge works in the two types. In spite of different national histories (some of which stress public service more than others), the television industry, for example, is relatively conservative: writers, producers, performers, managers and technicians have definite roles within an industry still run by a few established players. The historical legacy of television as a technology for the construction of national identities also means that television is somehow always held more publicly accountable than the news media.

This does not mean that the old media do not draw on free labour; on the contrary. Television and the print media, for example, make abundant use of the free labour of their audiences/readers, but they also tend to structure the latter's contribution much more strictly, in terms of both economic organization and moralistic judgement. The price to pay for all those real-life TV experiences is usually a heavy dose of moralistic scaremongering: criminals are running amok on the streets and must be stopped by tough police action; wild teenagers lack self-esteem and need tough love; and selfish and two-faced reality-TV contestants will eventually get their come-uppance. If this does not happen on the Internet, why is it, then, that the Internet is not the happy island of decentred, dispersed and pleasurable cultural production that its apologists claimed it to be?

The most obvious answer to such questions came spontaneously to the early Internet users, who blamed it on the commercialization of the Internet. E-commerce and progressive privatization were blamed for disrupting the free economy of the Internet, an economy of exchange which Richard Barbrook described as a 'gift economy'. 44 Indeed, the Internet might have been a different place from what it is now.
However, it is almost unthinkable that capitalism could have stayed forever outside of the network, a mode of communication which is fundamental to its own organizational structure.

The outcome of the explicit interface between capital and the Internet is a digital economy which manifests all the signs of an acceleration of the capitalist logic of production. During its dot-com days, the digital economy was the fastest and most visible zone of production within late capitalist societies. New products, new trends and new cultures succeeded each other at an anxiety-inducing pace. It was a business where you needed to replace your equipment/knowledge, and possibly staff, every year or so.

At some point, the speed of the digital economy, its accelerated rhythms of obsolescence and its reliance on (mostly) 'immaterial' products seemed to fit in with the postmodern intuition about the changed status of the commodities, whose essence was said to be meaning (or lack of it) rather than labour (as if the two could be separable). 45 The recurrent complaint that the Internet contributes to the disappearance of reality is then based both in humanistic concerns about 'real life' and in the postmodern nihilism of the recombinant commodity. 46 Hyperreality confirms the humanist nightmare of a society without humanity, the culmination of a progressive taking over of the realm of representation. Commodities on the Net are not material and are excessive (there is too much information, too many web sites, too much spam, too many mailing lists, too much clutter and noise) with relation to the limits of 'real' social needs.

It is possible, however, that the disappearance of the commodity is not a material disappearance, but its visible subordination to the quality of the labour behind it. In this sense the commodity does not disappear as such; it rather becomes increasingly ephemeral, its duration becomes compressed, it becomes more of a process than a finished product. The role of continuous, creative, innovative labour as the ground of market value is crucial to the digital economy. The process of valorization (the production of monetary value) happens by foregrounding the quality of the labour which literally animates the commodity.

The digital economy, then, challenged the postmodern assumption that labour disappears while the commodity takes on and dissolves all meaning. In particular, the Internet foregrounds the extraction of value out of continuous, updateable work, and it is extremely labour-intensive. It is not enough to produce a good web site; you need to update it continuously to maintain interest in it and fight off obsolescence. Furthermore, you need updateable equipment (the general intellect is always an assemblage of humans and their machines), which in its turn is propelled by the intense collective labour of programmers, designers and workers.
It is as if the acceleration of production has increased to the point where commodities, literally, turn into translucent objects. Commodities do not so much disappear as become more transparent, showing throughout their reliance on the labour which produces and sustains them. It is the labour of the designers and programmers that shows through a successful web site and it is the spectacle of that labour changing its product that keeps the users coming back. The commodity, then, is only as good as the labour that goes into it.

As a consequence, the sustainability of the Internet as a medium depends on massive amounts of labour (which is not equivalent
to employment, as we have said), only some of which was hypercompensated by the capricious logic of venture capitalism. Of the incredible amount of labour which sustains the Internet as a whole (from mailing list traffic to web sites to infrastructural questions), we can guess that a substantial amount of it is still 'free labour'.

Free labour, however, is not necessarily exploited labour. Within the early virtual communities, we are told, labour was really free: the labour of building a community was not compensated by great financial rewards (it was therefore 'free', unpaid), but it was also willingly conceded in exchange for the pleasures of communication and exchange (it was therefore 'free', pleasurable, not-imposed). In answer to members' requests, information was quickly posted and shared with a lack of mediation which the early netizens did not fail to appreciate. Howard Rheingold's book, somehow unfairly accused of middle-class complacency, is the most well-known account of the good old times of the old Internet, before the net-tourist overcame the net-pioneer.47

The free labour which sustains the Internet is acknowledged within many different sections of the digital literature. In spite of the volatile nature of the Internet economy (which yesterday was about community and portals, today is about P2P and wireless connections, and tomorrow, who knows … ?), the notion of users' labour maintains an ideological and material centrality which runs consistently throughout the turbulent succession of Internet fads. Commentators who would normally disagree, such as Howard Rheingold and Richard Hudson, concur on one thing: the best way to keep your site visible and thriving on the Web is to turn it into a space which is not only accessed, but somehow built by its users.
48 Users keep a site alive through their labour, the cumulative hours of accessing the site (thus generating advertising), writing messages, participating in conversations and sometimes making the jump to collaborators. Out of the 15,000 volunteers which keep AOL running, only a handful turned against it, the others stayed on. Such a feature seems endemic to the Internet in ways which can be worked on by commercialization, but not substantially altered. The 'open-source' movement, which relies on the free labour of Internet tinkers, is further evidence of this structural trend within the digital economy.

It is an interesting feature of the Internet debate (and evidence, somehow, of its masculine bias) that users' labour has attracted more attention in the case of the open-source movement than in that of mailing lists and websites. This betrays the persistence of
an attachment to masculine understandings of labour within the digital economy: writing an operating system is still more worthy of attention than just chatting for free for AOL. This in spite of the fact that in 1996, at the peak of the volunteer moment, over 30,000 'community leaders' were helping AOL to generate at least $7 million a month.49 Still, the open-source movement has drawn much more positive attention than the more diffuse user-labour described above. It is worth exploring because of the debates which it has provoked and its relation to the digital economy at large.

The open-source movement is a variation of the old tradition of shareware and freeware software, which substantially contributed to the technical development of the Internet. Freeware software is freely distributed and does not even request a payment from its users. Shareware software is distributed freely, but incurs a 'moral' obligation for the user to forward a small sum to the producer in order to sustain the shareware movement as an alternative economic model to the copyrighted software of giants such as Microsoft. 'Open source' refers to 'a model of software development in which the underlying code of a program – the source code, a.k.a. the "crown jewels" – is by definition made freely available to the general public for modification, alteration, and endless redistribution'.50

Far from being an idealistic, minoritarian practice, the open-source movement has attracted much media and financial attention. In 1999, Apache, an open-source web server, was the 'Web-server program of choice for more than half of all publicly accessible Web servers'51 and has since then expanded to the point where Bavaria in Germany and the whole of China have recently announced a switchover to it.
Open-source conventions are anxiously attended by venture capitalists, informed by the digerati that open source is a necessity 'because you must go open-source to get access to the benefits of the open-source development community – the near-instantaneous bug-fixes, the distributed intellectual resources of the Net, the increasingly large open-source code base'.52 Open-source companies such as Cygnus convinced the market that you do not need to be proprietary about source code to make a profit: the code might be free, but technical support, packaging, installation software, regular upgrades, office applications and hardware are not.

In 1998, when Netscape went open source and invited the computer tinkers and hobbyists to look at the code of its new browser,
fix the bugs, improve the package and redistribute it, specialized mailing lists exchanged opinions about the implications.53 Netscape's move rekindled the debate about the peculiar nature of the digital economy. Was it to be read as being in the tradition of the Internet 'gift economy'? Or was digital capital hijacking the open-source movement exactly against that tradition? Richard Barbrook saluted Netscape's move as a sign of the power intrinsic in the architecture of the medium.54 Others such as John Horvarth did not share such optimism. The 'free stuff' offered around the Net, he argued,

is either a product that gets you hooked on to another one or makes you just consume more time on the net. After all, the goal of the access people and telecoms is to have users spend as much time on the net as possible, regardless of what they are doing. The objective is to have you consume bandwidth.55

Far from proving the persistence of the Internet gift economy, Horvarth claimed, Netscape's move is a direct threat to those independent producers for whom shareware and freeware have been a way of surviving exactly those 'big boys' that Netscape represents:

Freeware and shareware are the means by which small producers, many of them individuals, were able to offset somewhat the bulldozing effects of the big boys. And now the bulldozers are headed straight for this arena. As for Netscrape [sic], such a move makes good business sense and spells trouble for workers in the field of software development. The company had a poor last quarter in 1997 and was already hinting at job cuts. Well, what better way to shed staff by having your product taken further by the freeware people, having code-dabbling hobbyists fix and further develop your product? The question for Netscrape [sic] now is how to tame the freeware beast so that profits are secured.
56

Although it is tempting to stake the evidence of crashes and layoffs against the optimism of Barbrook's gift economy, there might be more productive ways of looking at the increasingly tight relationship between an 'idealistic' movement such as open source and the venture mania for open-source companies.57 Rather than representing a moment of incorporation of a previously authentic moment, the open-source question demonstrates the overreliance of the digital economy as such on free labour, free both in the sense
of 'not financially rewarded' and of 'willingly given'. This includes AOL community leaders, the open-source programmers, the amateur web designers, mailing list editors and the netslaves who for a while were willing to 'work for cappuccinos' just for the excitement and the dubious promises of digital work.58

Such a reliance, almost a dependency, is part of larger mechanisms of capitalist extraction of value which are fundamental to late capitalism as a whole. That is, such processes are not created outside capital and then reappropriated by capital, but are the results of a complex history where the relation between labour and capital is mutually constitutive, entangled and crucially forged during the crisis of Fordism. Free labour is a desire of labour immanent to late capitalism, and late capitalism is the field which both sustains free labour and exhausts it. It exhausts it by undermining the means through which that labour can sustain itself: from the burnout syndromes of Internet start-ups to under-compensation and exploitation in the cultural economy at large. Late capitalism does not appropriate anything: it nurtures, exploits and exhausts its labour force and its cultural and affective production. In this sense, it is technically impossible to separate neatly the digital economy of the Net from the larger network economy of late capitalism. Especially since 1994, the Internet has always and simultaneously been a gift economy and an advanced capitalist economy. The mistake of the neoliberalists (as exemplified by the Wired group) was to mistake this coexistence for a benign, unproblematic equivalence.

As we stated before, these processes are far from being confined to the most self-conscious labourers of the digital economy. They are part of a diffuse cultural economy which operates throughout the Internet and beyond.
The passage from the pioneeristic days of the Internet to its 'venture' and 'recession' days does not seem to have affected these mechanisms, only intensified them. Nowhere is this more evident than on the World Wide Web.

THE NET AND THE SET

In the winter of 1999, in what sounded like another of its resounding, short-lived claims, Wired magazine announced that after just five years the old Web was dead:

The Old Web was a place where the unemployed, the dreamy, and the iconoclastic went to reinvent themselves … The New Web isn't
about dabbling in what you don't know and failing – it's about preparing seriously for the day when television and Web content are delivered over the same digital networks.59

The new Web was made of the big players, but also of new ways to make the audience work. In the new Web, after the pioneering days, television and the web converge in the one thing they have in common: their reliance on their audiences/users as providers of the cultural labour which goes under the label of 'real life stories'. Gerry Laybourne, executive of the web-based media company Oxygen, thought of a hypothetical show called What Are They Thinking?, a reality-based sketch comedy show based on stories posted on the Web, because 'funny things happen in our lives everyday'.60 As Bayers also adds, '[u]ntil it's produced, the line separating that concept from more puerile fare dismissed by Gerry, like America's Funniest, is hard to see'.61

The difference between the puerile fare of America's Funniest and user-produced content does not seem to lie in the more serious nature of the new Web as compared to the vilified output of 'people shows' and 'reality television'. From an abstract point of view there is no difference between the ways in which people shows rely on the inventiveness of their audiences and the web sites rely on users' input. People shows rely on the activity (even amidst the most shocking sleaze) of their audience and willing participants to a much larger extent than any other television programmes.
In a sense, they manage the impossible; they create monetary value out of the most reluctant members of the postmodern cultural economy: those who do not produce marketable style, who are not qualified enough to enter the fast world of the knowledge economy, are converted into monetary value through their capacity to affectively perform their misery.

When compared to the cultural and affective production on the Internet, people shows and reality TV also seem to embody a different logic of relation between capitalism (the media conglomerates which produce and distribute such shows) and its labour force – the beguiled, dysfunctional citizens of the underdeveloped North. Within people shows and reality TV, the valorization of the audience as labour and spectacle always happens somehow within a power/knowledge nexus which does not allow the immediate valorization of the talk show participants: you cannot just put a Jerry Springer guest on TV on her own to tell her story with no mediation (indeed
that would look too much like the discredited access slots of public service broadcasting). There is no real 24/7 access to reality TV, but increasing/decreasing levels of selective editing (according to the different modalities of a communication spectrum that goes from terrestrial to digital TV and the Internet). In the case of talk shows, various levels of knowledge intervene between the guest and the apparatus of valorization, which normalize the dysfunctional subjects through a moral or therapeutic discourse and a more traditional institutional organization of production. So after the performance, the guest must be advised, patronized, questioned and often bullied by the audience and the host, all in the name of a perfunctory, normalizing morality. In reality television, psychologists and other experts are also brought in to provide an authoritative perspective through which what is often a sheer voyeuristic experience may be seen as a 'social experiment'.

TV shows also belong to a different economy of scale: although there are more and more of them, they are still relatively few when compared to the millions of pages on the Web. It is as if the centralized organization of the traditional media does not let them turn people's productions into pure monetary value. TV shows must have morals, even if those morals are shattered by the overflowing performances of their subjects.

Within the Internet, however, this process of channelling and adjudicating (responsibilities, duties and rights) is dispersed to the point where practically anything is tolerated (sadomasochism, bestiality, fetishism and plain nerdism are not targeted, at least within the Internet, as sites which need to be disciplined or explained away). The qualitative difference between people shows and a successful web site, then, does not lie in the latter's democratic tendency as opposed to the former's exploitative nature.
It lies in the operation, within people shows, of majoritarian discursive mechanisms of territorialization, the application of a morality that the 'excessive' abundance of material on the Internet renders redundant and, even more, irrelevant. The digital economy cares only tangentially about morality. What it really cares about is an abundance of production, an immediate interface with cultural and technical labour whose result is a diffuse, non-dialectical antagonism and a crisis in the capitalist modes of valorization of labour as such.

As we shall see in the next chapter, it would be a mistake to think of such trends as constituting an automatic process of liberation from the tyranny of capitalist exploitation. On the contrary, as we have
also suggested here, this open and distributed mode of production is already the field of experimentation of new strategies of organization that start from the open potentiality of the many in order to develop new sets of constraints able to modulate appropriately the relation between value and surplus value – or, as we will refer to them, the entanglement of emergence and control.


4

Soft Control

BIOLOGICAL COMPUTING

Writing in 1934, a few years before the first digital computers were assembled in the United States, technology historian Lewis Mumford called for an end to the numbing power of the industrial technological milieu and for it to be replaced by a new technological age – free from the dominance of the mechanical rationality of the clock and the deadening sensorial influence of materials such as iron and coal. Mumford thought that the future of technological development lay in a return to the organic, a return that he also significantly saw at the heart of research into modern mass media:1 the study of the ear, throat and tongue, he remarked, had been fundamental to the development of the phonograph; and research on motion in horses, oxen, bulls, greyhounds, deer, and birds provided the basis for the scientific study of the relationship between images and movements that produced the motion picture. Mumford suggested that technological innovation was not intrinsically tied to the domination of nature, as the Baconian model implies, but also entailed a more challenging relationship with the artificiality of the natural world. Human technicity does not so much construct increasingly elaborate extensions of man, but rather intensifies at specific points its engagement with different levels of the organization of nature. Such levels are abstracted and redeployed within the complex social machines within which human and technical segments arrange themselves. The nature that emerges out of this interaction is itself not only complex, but also 'artificial', that is inventive and productive. Far from being synonymous with an eternal and immutable essence, the natural world comes across as multiple and complex, endowed with its own ingenious, adaptive and inhuman creativity.
2 And machines, as George Canguilhem and Felix Guattari would put it later, can be more than mere mechanisms.

Mumford's call for an organic complication of the mechanical would not sound out of place in an age where communication networks are often described as self-organizing, evolutionary and
bottom-up. Above all, the explosion of the Internet phenomenon has induced a rush to compare and contrast its workings with those of other systems endowed with a similar logic (from swarms to markets). Drawing on the insights of population biology, apologists for free markets and bottom-up organization have pointed out the ubiquity of the latter in the realm of the 'made and the born'.3 'New Economy' endorsers claimed to be inspired by the ubiquity of evolutionary processes and their capacity not only to sort out the fit from the unfit, but also actively to produce the variety of life as such. This use of evolutionary theory pointed to an artificial nature – that is a nature that was made and unmade by specific and complex techniques that it produced immanently and without a predefined purpose or aim.

Inspired by the work of formidable computing pioneers such as John von Neumann and Stanislav Ulam, the field of biological computing that is the focus of this chapter has engaged with the technicity of nature as expressed in evolutionary processes and has thus been criticized as a sustained and misleading attempt to naturalize technical and social relations – giving support to the notion of a self-organizing Internet intrinsically given over to the beneficial action of free-market forces. Biological computing, in fact, is centrally concerned with understanding phenomena of bottom-up organization by simulating the conditions of their emergence in an artificial medium – the digital computer. The term refers to a cluster of subdisciplines within the computer sciences – such as artificial life (which aims to develop lifelike behaviour within computer simulations); mobotics (the engineering of mobile robots that are able to learn from their mistakes); and neural networks (a bottom-up approach to artificial intelligence that starts with simple networks of neurons rather than with top-down sets of instructions).
What these subdisciplines share is a common reference to John von Neumann's work in the 1950s with cellular automata – a go-like game entailing an open chequerboard and a population of squares bound by local rules of interaction. Von Neumann's cellular automata have been demonstrated to be capable of universal computation (just like the universal Turing machine).

Since von Neumann's times, the field of biological computing has developed into a well-funded and profitable field of research with important applications in areas as different as animation and cancer research. It has absorbed insights from chaos theory, molecular
biology, population thinking and, of course, evolutionary theory. Its field of interest can be described as the capacity of acentred and leaderless multitudes of simple elements, bound only by local rules, to produce emergent phenomena able to outperform the programmers' instructions. Biological computing explores the larger plane of abstract machines of bottom-up organization, of which the Internet appears as a specific instance and product. What makes such machines abstract is their lack of qualities: they are no more technical than they are natural, nor could they be described as biological rather than social. Their simulation involves the description of an abstract diagram that brings into relation almost indefinite entities, laws and capacities – acentred multitudes; local rules; global dynamics; capacity to engender emergence; relative unpredictability; refractoriness to control. What biological computing asks is: How do such systems come to be? What are they made of? What rules explain them? How can they be recreated and what kind of control modes are better suited to their immense potential and refractory tendencies?

If the network is a type of 'spatial diagram' for the age of global communication, the self-organizing, bottom-up machines of biological computation capture the network not simply as an abstract topological formation – but as a new type of production machine. In this sense, as we shall see, the processes studied and replicated by biological computation are more than just a techno-ideological expression of market fundamentalism. Biological computation implies an informational milieu where each point is directly connected to its immediate neighbours (on whom it acts and to whom it reacts); and is indirectly, but no less effectively, affected by the movements of the whole.
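Such a milieu of purely local rules can be made concrete in a few lines of code. The sketch below uses Conway's Game of Life, a later and far simpler relative of von Neumann's cellular automaton; the choice of Life, and of the 'glider' pattern, is purely illustrative and not drawn from the sources cited here.

```python
from collections import Counter

def step(live):
    """Advance one generation; `live` is a set of (x, y) cells."""
    # Count live neighbours for every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Purely local rules: a cell lives on with 2 or 3 neighbours,
    # and a dead cell is born with exactly 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A 'glider': five cells whose purely local interactions move the
# whole pattern diagonally across the open chequerboard.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
gen = glider
for _ in range(4):
    gen = step(gen)

# After four generations the same shape reappears, shifted by (1, 1).
print(sorted(gen))
```

Nothing in `step` mentions the glider or its diagonal drift; the movement of the whole pattern is an emergent effect of rules that only ever consult a cell's eight immediate neighbours.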
A self-organizing system engendering emergent behaviour (that is, behaviour that has not been explicitly programmed or commanded) expresses a mode of production that is characterized by an excess of value – an excess that demands flexible strategies of valorization and control. In the next pages, then, we will explore that entanglement of the organic and the inorganic, the physical and the biological, and the natural and the technological, in order to catch a glimpse of the emergence of a kind of abstract machine of soft control – a diagram of power that takes as its operational field the productive capacities of the hyperconnected many. But first, we shall proceed to a preliminary outline of what the shift to biological computation implies and its relation to the larger epistemic field of contemporary scientific knowledge.
FROM ORGANISMS TO MULTITUDES

The biological turn in computing, with its interest in natural and artificial bottom-up organization, can be thought of as part of a larger techno-scientific reconceptualization of life – beyond the mechanical laws of classical science, but also beyond the organized forms of modern anatomy and biology. For artificial life theorists such as Charles Taylor and David Jefferson, all natural life can be understood in terms of the interactions of a large number of simple elements from levels below. The living organism is no longer mainly one single and complicated biochemical machine, but is now essentially the aggregate result of the interaction of a large population of relatively simple machines.4 These populations of interacting simple machines are working at all levels of the biophysical organization of matter. They are active at the molecular level, the cellular level, the organism level, and the population–ecosystem level.5 As a consequence, 'to animate machines, … is not to "bring" life to a machine; rather it is to organize a population of machines in such a way that their interactive dynamics is "alive"'.6 Mary Shelley's Frankenstein, therefore, started from the wrong premises. If you want to reproduce the complexity of life, you do not start with organs, stitch them all together and then shock them into life. You start more humbly and modestly, at the bottom, with a multitude of interactions in a liquid and open milieu.

The computational disintegration of the organism into a multitude of interacting, simple machines is also observable in current connectionist approaches to the mind, at the basis of recent artificial intelligence work that attempts to model the behaviour of the central nervous system (CNS). By studying the dynamics of neural cells in the CNS, recent work in artificial intelligence is hoping to reproduce some of the complex features of the mind, such as its capacity to recognize patterns and to hold an 'indefinite memory'.
Very simplified models of neural nets, such as networks of fixed threshold neurons, 'have learned to distinguish a variety of complex patterns; faces, handwriting, spoken words, sonar signals and stock market fluctuations have all served as grist for such networks.'7

A bottom-up approach, as the expression indicates, implies that one does not start with the already formed and stratified organ, such as the brain, or with a faculty, the mind, but that one reconstructs the conditions that underlie and produce, as an after-effect, that organ or faculty. Modern science conceived of thinking as a capacity
located in a specific part of the human body, which could be damaged and easily destroyed if the organ was attacked. Thus if by accident or experiment a part of the brain is destroyed, with it go some of our memories and capacity to grasp the world. Neural nets are inspired by connectionist approaches to cognition that reject an understanding of the mind in terms of the morphology of the brain (or imply the presence of a representational cognitive box, as Rodney Brooks has put it).8 Here the brain and the mind are dissolved into the dynamics of emergence, the outcome of a multitude of molecular, semi-ordered interactions between large populations of connected neurons.

While organisms haven't stopped being damaged by the destruction of individual organs, and the rational and perceptive capacities of individuals can still be dramatically affected by a physical blow to the head, such a conception of cognition is no longer occupying the centre stage of contemporary research into the artificiality of the mind. From the conception of the brain as a specialized organ that acts like a storage for memories and a coordinator for the whole body, connectionism moves on to outline the feature of a mind that is no longer located anywhere specific. As Gregory Bateson put it:

[W]e may say that 'mind' is immanent in those circuits of the brain that are completely within the brain. Or that mind is immanent in circuits which are complete within the system brain plus body. Or finally, that mind is immanent in the larger system – man plus environment.9

In biological computing, the organism is sidestepped from above and from beneath: from above it dissolves into the collectivity of its connections (the mind plus its environment); and from beneath it is sidestepped by the parallel-processing features of the neurons in the brain and the central nervous system. Thus cognitive science becomes Bergsonian.
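The 'fixed threshold neurons' quoted above admit an equally minimal sketch. The unit below is a single threshold neuron trained with the classic perceptron rule; the three-input pattern it learns is an invented illustration, not an example from the research cited in this chapter.

```python
def fires(weights, threshold, inputs):
    # The unit's entire 'program': a weighted sum compared to a threshold.
    return sum(w * x for w, x in zip(weights, inputs)) >= threshold

def train(samples, epochs=200, rate=0.1):
    # Perceptron rule: after each mistake, nudge the weights and the
    # threshold towards the correct answer. No global plan, only
    # repeated local corrections.
    weights, threshold = [0.0, 0.0, 0.0], 0.5
    for _ in range(epochs):
        for inputs, target in samples:
            error = target - fires(weights, threshold, inputs)
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]
            threshold -= rate * error
    return weights, threshold

# An invented toy 'pattern': fire when at least two of the three
# inputs are active (linearly separable, so the rule converges).
samples = [
    ((0, 0, 0), 0), ((1, 0, 0), 0), ((0, 1, 0), 0), ((0, 0, 1), 0),
    ((1, 1, 0), 1), ((1, 0, 1), 1), ((0, 1, 1), 1), ((1, 1, 1), 1),
]
weights, threshold = train(samples)
print(all(fires(weights, threshold, i) == t for i, t in samples))  # True
```

The connectionist point survives even at this scale: the discrimination is nowhere written down in the program; it settles into the weights as an after-effect of many small corrections.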
Memories are not images that are stored somewhere in the brain, but emergent events assembled out of many discrete fragments in an act of partial reinvention. 'These pieces of half-thoughts have no fixed home; they abide throughout the brain … The act of perceiving and the act of remembering are the same. Both assemble an emergent whole from many distributed pieces.'10

In artificial life research, the attempt to breed artificial forms of life is explicitly related to the dynamic of populations, whose local interaction produces a lifelike effect. 'It is this bottom-up distributed, local determination of behavior that AL employs in its primary
methodological approach to the generation of lifelike behavior.'11 This can include the capacity of a robot to walk by letting its legs interact according to simple rules or the artificial evolution of software through genetic algorithms. Rather than being pre-programmed sequentially and inbuilt in hardware, these experiments aim to 'evolve' lifelike behaviour. The biological turn is thus unabashedly mechanistic and reductionist: all life is reduced to a simple set of local rules governing the behaviour of simple machines. But, it claims, its mechanicism is deeply different from the old one, 'based as it is on multiplicities of machines and on recent results in the field of nonlinear dynamics, chaos theory, and the formal theory of computation'.12

Biological computation is thus concerned with the power of the small. Small here should not be taken to indicate size and weight in the metrical sense. Smallness is not measured by rulers and scales, but it is exterior and relational: it is described by an overall relation to a large number of variables, with no ultimate determination or central control. What determines the ultimate lack of any distinction between the natural and the artificial is ultimately the indetermination of a multitude. An individual broker within a large and turbulent stock market is as small as a molecule within a turbulent fluid. What makes the components of an open system small is not their size but the fact that they are grasped in terms of their overall relation to a large number of interchangeable components that interact on each other by way of recursive feedback loops. These systems do not simply die or reproduce themselves by way of an autopoietic movement, but they are always becoming something else: 'They are forever dynamic and can be considered dead and of little interest when they come to thermodynamic equilibrium.
It is really the dynamic properties of complexity, the motion pictures, not the snapshots, which characterize the systems in which we are interested.'13

All these processes are grasped in terms not only of their risks but also of their potential conceived from the perspective of the replicability of open and productive structures. Thus Silicon Valley in the San Francisco Bay Area has been similarly analysed as a kind of ecosystem for the development of 'disruptive technologies', 'whose growth and success can be attributed to the incessant formation of a multitude of specialized, diverse entities that feed off, support and interact with one another.'14 Such concerns have preoccupied different governments keen to replicate the success of the Silicon Valley postindustrial ecosystem. If the latter was not planned in any
traditional sense, but emerged as a hotbed of technological innovations out of a multiplicity of different factors and connections, how can one replicate the same process elsewhere? A combination of the very small and the very large implies a shift in the organization of productivity, but how should this be accomplished if there is no formula that will fit all cases?

The matter is then not so much about cracking the secret of the micro in its entirety, but understanding the initial conditions that, once got right, allow a certain kind of outcome to emerge spontaneously. 'Emergence must somehow be bound up in the selection of the rules (mechanisms) that specify the model, be it game or physical science.'15 The outcome is not programmed step by step, but is, rather, carefully prepared at the beginning and there is no guarantee of success – as Manuel Castells and Peter Hall's study of failed attempts at recreating the conditions for the emergence of milieus of innovation has also shown.16

The most important feature of the systems studied and simulated within biological computing is that they are not easily controllable or predictable. A multitude of simple bodies in an open system is by definition acentred and leaderless. There is no single control structure that can be manipulated or attacked: the sheer multiplicity of nonlinear interactions, feedback loops, and mutations make the behaviour of such systems very hard to analyse, because it is impossible to control them completely and unequivocally (even the simple activity of observing them alters them). They cannot be known completely because they cannot be studied by dissection: once the connection and mutual affection with other elements is removed, the individual element becomes passive and inert.
In the shift from the bug to the hive colony, from the individual to the population, from the Internet user to a network culture, something happens and this something, although somehow inherent in the bug/individual, cannot be found in it by any of the traditional means (it is both pre-individual and collective). You can observe and kill an individual entity, anatomize it, and you still won't find out what it is that will make it act in a certain way once it acts as an element within a population open to flows. You can collect as much data as you want about individual users, but this won't give you the dynamic of the overall network.

This leads us to wonder what else is packed into the bee that we haven't seen yet? Or what else is packed into the hive that has not
yet appeared because there haven't been enough honeybee hives in a row all at once? And for that matter, what is contained in a human that will not emerge until we are all interconnected by wires and politics?17

Knowing what it is that is packed into an individual but that is not reducible to it is a matter of some importance considering how inherently unstable such systems are. A multitude can always veer off somewhere unexpected under the spell of some strange attractor. On the other hand, while difficult to control, these systems are characterized by a potentially enormous productivity, what the literature on the subject describes as their dynamic capacity to support 'engaging events', while acting with a high degree of distributed 'autonomy and creativity'. This autonomy and creativity is produced by a process of recursive looping that generates divergent and transmittable variations at all points. Such systems, that is, are characterized by their tendency to escape from themselves, continually diverging; at the same time, such divergence does not generate complete differentiation because such mutations are spread by way of diagonal and transversal dynamics. Finally, there is no guarantee that such bottom-up dynamics will lead to emergent behaviour: systems can also come to a standstill or veer off catastrophically in unexpected directions.

A crucial problem for the simulation of the behaviour of such entities is that of reproducing the right speed, that is the right degree of boundedness, of a multitude. A bottom-up system, in fact, seems to appear almost spontaneously not so much as a result of a change in the composition of individual elements, but more in relation to how loosely such elements can interact with each other (it is a function of their overall speed).
A multitude, for example, is quite foreign to sequentiality, whether it is the linear and closed sequentiality of the assembly line or the one-directional flow of broadcasting. When segments are connected together in a single line, they become immediately bound to each other and to the overall structure and hence geared towards reproduction rather than becoming. Similarly, a transversely-connected multitude is quite alien to the logic of mass societies, in as much as the solidity and boundedness of the mass tend towards the production of homeostasis, that is, an increasing homogenization, while a multitude tends to engender, multiply and spread mutations. 'We should learn more about predicting and controlling critical stages in the process of emergence. This knowledge,
in turn, should help us to understand better the processes that crown human intellectual undertakings: innovation and creation.'18

SEARCHING A PROBLEM SPACE

New Economy capitalism made much of the relationship between bottom-up organization and speed and focused on the importance of fluidity as a physical condition. At a certain level of speed, in a semi-ordered or liquid phase, large numbers are subject to a different kind of rules than solid and bound entities.

It has long been appreciated by science that large numbers behave differently than small numbers. Mobs breed a requisite measure of complexity for emergent entities. The total number of possible interactions between two or more members accumulates exponentially as the number of members increases. At a high level of connectivity, and a high number of members, the dynamics of mobs takes hold. More is different.19

Moreness, then, as Kevin Kelly put it, is explicitly linked to the need for a different immanent logic of organization that demands new strategies of control to take advantage of its potentially infinite productivity while controlling its catastrophic potential. So it should not surprise us how much biological computing owes to the mechanics of fluids.

In his eulogy of bottom-up organization written at the height of the 'free market' digital wave, Kevin Kelly quoted the writings of C. Lloyd Morgan, who in his 1923 book Emergent Evolution defined emergent phenomena as 'a different variety of causation': '[t]he emergent step, though it may seem more or less saltatory [a leap], is best regarded as a qualitative change of direction, or critical turning-point, in the course of events.'20 In this sense, the biological turn entails a rediscovery, that of the ancient clinamen (or swerve) – the explanation given by the Greek philosopher Epicurus for the existence of a principle of indeterminacy in the form of chance in atomic theory.21 This swerve or clinamen, a principle of chance and indetermination, is identified by Ilya Prigogine and Isabelle Stengers as a crucial intuition into the features of 'far-from-equilibrium systems, systems, that is, that are very sensitive to fluctuations'.22 Michel Serres associates the clinamen with the ancient Lucretian notions of turbo and turba.
The minimal angle of turbulence produces the first spirals here and there. It is literally revolution or it is the first evolution toward something else other than the same. Turbulence perturbs the chain, troubling the flow of the identical as Venus had troubled Mars.23

For Michel Serres, the sciences of the clinamen take as their central concern fluctuations, deviations, and instability. In the biological turn such features are not discarded or pushed outside the perimeter of legitimate scientific investigation, as was the case, as Prigogine and Stengers argued, with classical science from Aristotle to Clausius. The acknowledgement of an original difference/clinamen/deviation (a microdeterritorialization or line of flight) is taken as the object of study in order to develop a better knowledge of, and implicit control over, the action of randomness and chance in populations at the point where they have lost all social qualities and qualifications.

Within financial and investment banking, for example, in the randomness and chance inherent in markets characterized by high fluidity of currency and investment patterns, such uncertainty is considered a source not only of potentially immense profitability but also of dreaded, abrupt changes – the kind of changes that can depress stock markets for years or trigger new types of global wars. For management theory, autonomy in the workplace is a similarly volatile compound: it represents a highly productive source of value but can also dangerously veer in unprofitable directions. Within bureaucratic organizations such as governments, banks, and corporations, flows are looked at with mixed feelings, from glee and greed to suspicion.

The biological turn in computing gives such mixed feelings a solid basis in scientific knowledge and technical experimentation, where fluid states are considered essential conditions for emergence.
The synthesis of emergent phenomena out of a fluid and hence uncertain organization is described as a search for a particular 'phase space', which is characterized by a specific level of speed:

… there was a certain area where information changed but not so rapidly that it lost all connection to where it had just been previously. This was akin to a liquid state. … it was the liquid regime that supported the most engaging events, those that would support the kind of complexity that was the mark of living systems.24

The question is how to maintain the productivity of a fluid space, dynamically perched between the unproductive extremes of solidity
and gaseous chaos. Or as Tim Berners-Lee put it, '[w]e certainly need a structure that will avoid those two catastrophes: the global uniform McDonald's monoculture, and the isolated Heaven's Gate cults that understand only themselves.'25 It is a matter then of finding the right speed that facilitates activity.

To use somewhat anthropomorphic language: in equilibrium matter is 'blind', but in far-from-equilibrium conditions it begins to be able to perceive, to take into 'account' in its way of functioning, differences in the external world (such as weak gravitational or electrical fields).26

A fluid state is thus defined as a relation of speed determining the level of connection loosely binding a multitude of simple bodies or machines. A fluid space is characterized by a loose relation between molecules or components which allows them the capacity to deviate and spontaneously produce turbulent phenomena. A space of flows engenders emergent phenomena but does not guarantee that they will always be of use or even advantage to the experimenter or the planner, because they are open to sudden transformations or catastrophes. Since planning is confined to the initial conditions, preferred outcomes can only be hoped for rather than counted on. Emergent output needs to be carefully collected and valorized at the end of the process with a very fine-toothed comb; it must be modulated with a minimum amount of force. It cannot be analysed, but only synthesized, by experimenting with the set of constraints that facilitate it. This is a control of a specific type: it is a soft control. It is not soft because it is less harsh (often it has nothing gentle about it) but because it is an experiment in the control of systems that respond violently and often suicidally to rigid control.

GLOBAL COMPUTATION

Biological computing applies the algebraic logic of Boolean functions to local interactions among artificial populations of code.
In this, biological computing follows population thinking – a perspective on life that brings together evolutionism and genetics.

In a nutshell what characterizes this style may be phrased as 'never think in terms of Adam and Eve but always in terms of larger reproductive communities'. More technically, the idea is that
despite the fact that at any one time an evolved form is realized in individual organisms, the population not the individual is the matrix for the production of form.27

Population thinking considers a given form, such as an animal, as the statistical result of a larger process of diffusion of genetic material at different rates and different times. Population thinking has produced a great awareness of transversal genetic lines cutting across biological forms, their variations and the coevolutionary processes that have produced the variety of populations and complex organisms on earth. Although some populations can become isolated by geological accidents (as with the Moa in New Zealand, for example), generally speaking populations operate as open systems. The differential rates of diffusion of genetic material across and within populations can be computed over time and such computation highlights the fluctuations and leaps that characterize the evolution of life.28 For Stuart Kauffman, the meeting between population biology, Darwinism and genetics is crucial to contemporary understandings of life. Contemporary evolutionary biology has learned to connect the historical variation of species and populations in terms of the diffusion of genes and genetic variations under the aegis of natural selection. The link between evolutionary biology and computation dates back to John von Neumann's work in the aftermath of World War II with a computational experiment called 'cellular automata' (or CAs) – 'a relatively new field that appeared in the midst of the intellectual dust that accompanied the development of the first digital computers'.29 This interest was a recurring concern of information theory and computer science (Claude E. Shannon's mentor, Vannevar Bush, also encouraged him to write his MA thesis on the relationship between information and genetics). Von Neumann's CAs, however, are the most common point of reference for biological computation.
They were devised as an alternative approach to computation that was originally developed as a means to formalize the characteristics of life – in tune with the formalist spirit that we know informed the emergence of the Turing machine. CAs can also be seen as a complex kind of game, and it is as a game, such as John Conway's game of life, that CAs are probably best known outside computational circles.

According to John Holland, a leading researcher in artificial life, the original impulse for the construction of cellular automata came from Stanislaw Ulam, who was interested in 'model physics', that
is mathematical models of the physical universe that obeyed the two main laws of physics: they had to have a geometry and a set of locally defined laws that held at every point of the geometry. In a series of lectures entitled 'General and Logical Theory of Automata', von Neumann showed how such model physics could be used to replicate one of the key formal features of life, that is, self-reproduction. He thus embarked on building a formal model that could be shown as capable of making copies of itself by simply following a set of mechanical and local rules. His models of self-reproducing automata, or CAs, were shown to correspond to the mechanisms driving the replication of genetic material in the cell and are also considered today as serious alternatives to Alan Turing's universal computer. Variations of von Neumann's CAs (such as quantum cellular automata) are also researched in terms of their potential to take computing power beyond the limits of current microchip manufacturing technology. Cellular automata machines (such as Tom Toffoli and Norman Margolus's dedicated CA computer CAM-8) have been shown to be able to simulate dynamical systems that are beyond the reach of conventional computers and even supercomputers, such as fluid mechanics. 'The particles in fluids can be represented in such detail that Toffoli now considers that he is not working with a computational system but instead he says he is manipulating "programmable matter".'30

The key idea of CAs is that there are formal structures that are able to perform global computation through a system of local rules that simply dictate the relationship of each cell/particle/node with its neighbours. In a cellular automata system, every cell reacts exclusively to the state of its neighbours and it is on this basis that it changes. The cumulative result of these changes has been shown to be capable of universal computation, with some classes of cellular automata able to model the behaviour of chaotic and open systems.
CAs can be played like a game, in as much as they involve the invention of rules and their application to a population of cells. In spite of their random appearance, cellular automata have been shown as capable of computing. This is no mean feat, since to use cellular automata to solve an equation requires asking hundreds of thousands of components to produce a reliable result by reacting individually but in a strictly determined manner to the behaviour of their neighbours. And yet computer simulations of CA systems have proved that you can solve equations in this way, even though it involves a process of trying, testing and running hundreds of
simulations until one hits on the solution. A classic example of a cellular automata system able to perform very complex calculations is the nervous system – with its millions of simultaneous local interactions producing conscious perception by way of emergence.31 CAs have also been shown to be able to successfully model complex phenomena ranging from the role of neutral and selective mutations in cancer to the dynamics of neural nets; from call-routing using adaptive pricing to the emergence of social morals and computer-assisted design in architecture.

Biological computing envisages an abstract computational diagram able to simulate (and hence capture) the productive capacities of multitudes of discrete and interacting elements. The most productive challenge of CA systems to the sequential computer lies in the fact that they do not start with the easily controllable linearity of a tape, but with the multiplicity of a population. What produces the computation is not a sequence of instructions carried out by an individual probe head, but a multitude of simultaneous interactions performed by a potentially infinite population. A CA system, in fact, can be imagined as an open chequerboard space divided into a potentially infinite number of cells/nodes, each one of which can be in a number of states (von Neumann devised a very complex CA with 29 states, but the game of life has only two: dead or alive). In a two-dimensional CA, each node is linked to eight neighbouring nodes. The single node will change its state by following a rule table that dictates what it should do in relation to each specific configuration of the other eight cells. Thus, if the eight neighbouring cells are blue, for example, the central cell would obey an instruction that says that if they are all blue, it should turn yellow. Its turning yellow would in turn affect the states of neighbouring cells, and so on. All changes of phase happen simultaneously according to discrete time steps.
CA researchers have shown that simply by playing around with the initial configurations and the rule table, one can get such populations of cells to replicate any other machine – exactly like a universal Turing machine, but using a different logic that involves the collective and decentralized work of populations in synchronized, local and nonlinear relations. Different CA systems differ in terms of the number of dimensions considered (there are one-dimensional and two-dimensional CAs although they could theoretically have any number of dimensions); and the states which the cells can be in (they range between the dead/alive state of the game of life to von
Neumann's 29 states). More importantly, CA systems can differ in terms of the rule tables that dictate the changes of state of each node (what Christopher Langton has called the 'genotype' of the CA). Each rule table will produce different configurations depending on the initial state of the cells (in ALife jargon, the name for these different configurations engendered by identical rule tables is 'phenotypes'). This simple model constitutes a type of computation that is locally connected but spatially extended and capable of global computation. There is no central control determining what each cell should do according to a general blueprint or master plan. Each cell is subject to the same table of rules according to which it changes state at every time step together with all its neighbours. All the interactions take place at the local level, depending simply on the relationship between each cell and its neighbours. And yet this decentralized system of rules has been shown as capable of producing a high-level computation that cannot be explained through the action of individual cells. CAs underplay the importance of the individual in favour of collective dynamics. This collective dynamics, however, is not related back to the action of a central agency in charge of regulating the fluctuations, but is capable of spontaneous self-regulation. CAs form dynamic milieus, space–time blocks, that have no real territorial qualities but do have rich topographies and challenging dynamics.

Unlike Turing machines, which George Caffentzis has likened to a kind of Taylorism of intellectual work, CA machines cannot really be programmed in the usual sense.32 It is impossible, that is, to determine a priori the sequence of configurations that running a CA experiment will produce, once a set of rules and an initial configuration are given. There is no prediction involved, but a strategy of proliferation and selection. If one runs a CA genotype long enough, something good may come out of it.
If it doesn't, however, there is no point in trying to fix it. Other genotypes, involving different sets of rules, will be found that are more successful in completing the task. CA systems can be classified only a posteriori, that is after they have been tried out.

Researchers such as Stephen Wolfram and Christopher Langton have tried to produce some kind of classification of such distributed computational systems. Wolfram was the first to accomplish an empirical classification of CA dynamical systems by cataloguing 100 runs of the game of life. As a result of his survey, he found that most CAs fit within four classes. Class I CAs were those programs that for
some reason ended up running out of computational capabilities very quickly by reaching a limit or end point after which no computation was possible. Class II CAs, on the other hand, are characterized by what chaos theorists describe as 'limit cycles': they produce self-replicating structures (such as gliders) that spread across the computational space by self-replication. Class III and IV CAs, however, express a qualitative jump in complexity with relation to the other two. Class III CAs produce fractal structures, that is forms that do not simply repeat themselves, like those of Class II, but that are also capable of scaling and hence of progressively structuring the CA space. Class III CAs have proved to be good models of how the basic function of metabolism is carried out in similar ways in organisms of very different sizes (from oxen to squirrels). Finally, Class IV systems are chaotic, that is unstable and random but with no predictable time limits (a Class IV CA, for example, is the weather, where specific rules produce a high degree of randomness and unpredictability).

Artificial Life pioneer Chris Langton also attempted Wolfram's repeated runs of CA systems (this time running them thousands rather than hundreds of times). What he came up with was not only a different order of classification (I, II, IV and III), but also a key metric, lambda, which is correlated to the rate of information flow within a CA system. What lambda measures, that is, is the fluctuation of different CA systems with relation to their computational abilities. What Langton found was that this metric could vary between 0 (defining a state where a system is so random that it is incapable of computing) and 1 (defining a highly structured system that is not flexible enough to compute).
A key finding was that the most interesting computational activity takes place at around the value of 0.5 – a value that within chaos theory is associated with phase transitions, that is with the points at which a system changes its state, such as for example when water starts to boil. Within CAs, then, the key area of computation is identified with a border zone fluctuating between highly ordered and highly random CAs. Within this phase, significantly, the behaviour of each node ceases to be strictly determined by its immediate neighbours and starts being affected by the overall fluctuation and propagation of information across the CA space. These fluid CAs have been shown to be capable of the most complex computation, thus constituting a real alternative to Turing's universal machine, but one that does not respond to direct control.
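In one common formulation, lambda can be read directly off a rule table as the fraction of entries that send a neighbourhood configuration to a state other than the quiescent one. The sketch below is a minimal illustration of the parameter itself, not of Langton's experimental procedure; modelling a rule table as a mapping from neighbourhood tuples to next states, and the state counts and neighbourhood size, are arbitrary choices.

```python
import itertools
import random

# Sketch of the lambda parameter read off a CA rule table. A rule table
# is modelled as a mapping from neighbourhood configurations (tuples of
# cell states) to the cell's next state; lambda is then the fraction of
# entries that lead away from the quiescent state.

def lambda_parameter(rule_table, quiescent=0):
    """Fraction of rule-table transitions to a non-quiescent state."""
    hits = sum(1 for nxt in rule_table.values() if nxt != quiescent)
    return hits / len(rule_table)

def random_rule_table(n_states=2, neighbourhood=3, lam=0.5, seed=0):
    """Random table whose entries are non-quiescent with probability lam."""
    rng = random.Random(seed)
    configs = itertools.product(range(n_states), repeat=neighbourhood)
    return {c: (rng.randrange(1, n_states) if rng.random() < lam else 0)
            for c in configs}

# Sweeping lam from low to high moves the generated tables from the
# ordered end of the spectrum towards the disordered end.
mid_table = random_rule_table(lam=0.5, seed=42)
```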
SOCIAL EMERGENCE

The outcome of a CA run, then, cannot be predetermined or planned in its entirety (and as such, as Kevin Kelly put it, it is 'out of control'). Being out of control does not mean to be beyond control. The type of control that such fluid populations respond to, however, is quite different from the negative control of autopoietic living organisms that self-reproduce within closed boundaries. The fluidity of populations, their susceptibility to epidemics and contagion, is considered an asset: at a certain value or informational speed, the movement of cells turns liquid and it is this state that is identified as the most productive, engendering vortical structures that are both stable and propagating. It is these dynamic structures, as they are produced by the propagation of movement in a CA world, that are considered computationally interesting. This liquid behaviour is typically characterized by a swerve – that between the moment when the model is constructed, through a rule table and an initial configuration, and the moment of emergence of useful or pleasing forms. Ideally, CAs should always produce a level of surprise and unexpectedness for the human observer.

Getting CAs to compute, therefore, is no easy feat, for it requires a very careful fine tuning: the rules that define the transition functions between the different possible states of the cell must be determined with the utmost care. This fine tuning can only be carried out by successive runs until one is able to find the right computational level able to carry out a specific task. On the other hand, the fact that a CA system is in principle capable of carrying out a computation does not mean that it will actually do it spontaneously.
This is why a new level of control is introduced – not just the fine tuning of initial conditions but also the modulation of the global aims and objects of the computation.

A common way in which such global modulation of aims is carried out in CA systems is through the 'genetic algorithm' model. Genetic algorithms are a special type of 'search algorithm', like the ones that run popular search engines such as Google. Genetic algorithms work by searching the computational space and measuring the success rate of different CAs (defined by their genotypes or rule tables) on the basis of their phenotypes (that is the actual performance produced by the rules when starting from different configurations of the state of the nodes). What determines the outcome of the competition among CAs is a fitness function that defines the different
scores attributed to different CAs. Inefficient CAs are screened out, while the most efficient systems are left to compete and even swap traits with each other. This model has proved to be highly effective in determining optimal solutions to specific problems, in as much as this CA race usually ends up by reaching an optimal zone where the task is computed efficiently and in a minimum amount of time. Successful genotypes, whose phenotypes have proved more successful than others, are thus selected for analysis or targeted use.

Genetic algorithms thus act as virtual sieves whose meshes can be adapted to the specific purposes of simulation. 'All that remains after this process is the cells that do not conform to the patterns and so form boundaries between the regions of cells that do.'33 The behaviour of these active boundary particles is then analysed to see if it conforms to any kind of rule.

If such a rule does exist (and can be found), then the CA has been explained … If no such rule exists, then the process must be repeated, once again trying to extract patterns out of the mess of particles this time, yielding perhaps meta-particles and so on.34

The genetic algorithm approach to CAs has been criticized as being unable to produce true emergence, that is phenomena of self-organization and computation that are not explicitly programmed by a human agent. The genetic algorithm model of control is thus deemed insufficient by some ALife researchers in as much as it does not leave enough space to genuine emergence. If the fitness functions are too strict and too task-oriented, CAs can compute a task that has been given to them, but they cannot produce genuine novelty, that is events that are not prescribed by initial rules and conditions.35
The genetic algorithm thus describes both a mode of control and its limits (those of 'true' emergence, that is of a potential for transformation that cannot be programmed or even aimed for).

The control of acentred multitudes thus involves different levels: the production of rule tables determining the local relations between neighbouring nodes; the selection of appropriate initial conditions; and the construction of aims and fitness functions that act like sieves within a liquid space, literally searching for the new and the useful. These sieves separate those configurations that seem to produce static outcomes from those dynamic particles that deviate the most from the structure. These dynamic particles do not obey laws of statistical regularity. Because they do not fall within the regular (or the ordinary)
they can perform computations at the cutting edge. At the same time, these configurations cannot be allowed to pass all the way into randomness, where the turbulence is so strong that no real control can be exercised. The particles selected thus come to constitute a mobile and dynamic boundary zone or active limit capable of emergent computation. What the von Neumann machine reveals is a kind of content for a cybernetic machine of social control: not a fluctuating life threatened by entropic forces (as in the first cybernetic wave), but a level of indeterminate production entrusted to acentred multitudes that potentially never run out of energy. The content of such control is not a population understood as a mass that must be kept from undergoing dangerous swerves, but a multiplicity of computational milieus spread over a fluid and dynamical space.

HACKING THE MULTITUDE

In the late 1990s, the 'out of control' techniques of biological computations found particular favour with capitalist corporations, which have been important sponsors, for example, of the Artificial Life Center at the Santa Fe Institute. Cellular automata, in fact, model with a much greater degree of accuracy the chaotic fringes of the socius – zones of utmost mobility, such as fashions, trends, stock markets, and all distributed and acentred informational milieus. Biological computation parallels the emergence of a larger set of social techniques that are concerned with inducing and controlling the formation of bottom-up milieus of self-organization. Outside the computer medium, that is, biological computation expresses a socio-technical diagram of control that is concerned with producing effects of emergence by a manipulation of the rules and configurations within a given milieu.

If we were to look for human CA experiments, for example, we would find them in the organizational models of the New Economy, such as those documented by Andrew Ross's ethnography of New York's 'Silicon Alley' company Razorfish.
For Ross, Razorfish, with its open-plan offices and its enthusiastic labour force who did not consider work as 'labour', constituted an important moment of experimentation with alternative modes of organization able to capitalize on the productive capacities of an educated and alienated GenX milieu. A dynamic and turbulent company that enjoyed an exponential expansion in the late 1990s, Razorfish was proud of its stimulating working environments that guaranteed a high level of
innovation and professionalism in digital design. Razorfish's working environment, in its turn, expressed an encounter between the American East Coast bohemian profile and the Silicon Valley start-ups ecology – a highly successful model of a decentralized hotbed of technological innovations. In this sense, the social experiment of the New Economy, like biological computation itself, expressed an encounter between the innovative edges of the capitalist economy and the reflux of a 1960s counterculture and its rejection of dull and repetitive forms of labour.

Another description of the Razorfish offices by Lev Manovich can further illustrate the point:

The large, open space houses loosely positioned workspaces occupied mostly by twenty-something employees … He [the manager] proudly points out that the workers are scattered around the open space regardless of their job titles – a programmer next to an interface designer next to a Web designer.36

Manovich describes the space design as defined by computer culture's key themes: 'interactivity, lack of hierarchy, modularity'. But these features are also those of the CA machines: a world without qualities, where all elements of a population are conceived exclusively from the point of view of local interactions with a view to the modulation of global aims and goals (such as staying on top of the digital design game). The work culture of the New Economy was informed by a movement of reform in management theory, which emphasized the value of letting teams of workers control the production process by introducing a new set of management rules (decentralization, delegation, deadline, etc.). The New Economy CA proved itself a highly turbulent and productive one, but was ultimately selected out of existence by the algorithms of a capitalism undergoing another of its energy crises.
Whatever the fate of singular social CAs, biological computation is indicative of a mode in which the productivity of the milieu, as opposed to the territory, is highlighted: not a closed system (a people; a group; a class), but an open milieu (a dynamic multitude spreading across a smooth space).

Biological computing offers us an insight into a wider mode of soft control that takes as its focus the space–time of the swerve – that between the moment when the model is constructed, through the positioning of constraints and a local determination of behaviour,
and the moment of emergence of useful or pleasing forms that are selected by exercising pressure or looked for through the elaboration of searching devices. Control is located at the two ends of the process: at the beginning, when a set of local rules is carefully put together and fine-tuned; and at the end, when a searching device or a set of aims and objectives aims at ensuring the survival of the most useful or pleasing variations.

It is in this larger framework that we should understand the definition of a network offered by Kevin Kelly as 'the least structured organization that can be said to have any structure at all' but also 'one of the few structures that incorporates the dimension of time. It honors internal change. We should expect to see networks wherever we see constant irregular change, and we do.' 37 It is not just any network that will do to support the interaction of such large numbers, of this multitude in continuous variation. It is the network as a 'grand mesh', a form able to accommodate all variation and its mutations – an abstract machine that goes beyond the model to become the actual terrain for the study and engineering of complex and innovative behaviours. The open network is a global and large realization of the liquid state that pushes to the limits the capacity of control mechanisms effectively to mould the rules and select the aims: 'Running genetic mechanisms online puts heavy constraints on the selectionist mechanisms that can be used but it brings the experimental conditions closer to real autonomous robotic agents.' 38

The great discovery of the biological turn is not only that there exists an abstract machine that can facilitate, contain and exploit the creative powers of a multitude (human and inhuman). It is also about the discovery of the immense productivity of a multitude, its absolute capacity to deterritorialize itself and mutate.
What gives the biological turn its mystical tone is the discovery of this productive line of flight, associated with the unpredictability of this middle zone, a relative autonomy and creativity, that is decoded, freed up from the constraints of sequential programming, almost at the same time as it is recoded, brought back into the fold by selection in the form of fitness functions. None of these considerations, however, takes away from the fact that a new content has entered the control of production: not simply a vague assemblage of information flows and feedback loops, but a spontaneously productive and autonomous force, endowed with its own specific activity that can be modelled and determined only at certain points, by exercising pressure selectively and moderately. Between local microdeterminism and the
transcendental fitness functions, we find that the power of the middle zone can only be partially controlled. It is this middle, autonomous and productive zone, in between local determination and global selection, that the term 'emergence' partially captures.

One of the reasons why modelling emergence is important is that it offers the key to a mode of control that does not require an absolute and total knowledge of all the states of each single component of the system, or a rigid specification that rules behaviour exactly and sequentially. This new mode of control is 'soft': it applies a minimum amount of force, and modulates 'specification vs. creativity, closure and replicability vs. open-endedness and surprise'. 39 The abstract machine of soft control is thus concerned with fine-tuning the local conditions that allow machines to outperform the designers' specifications, that surprise the designers but spontaneously improve on them, while also containing their possible space of mutation.

Thus, the founder of the Institute for Bionomics, for example, confidently argues for the necessity of abandoning the dependence of economic science on Newtonian physics:

Where mainstream economics is based on concepts borrowed from classical Newtonian physics, bionomics is derived from the teachings of modern evolutionary biology. Where orthodox thinking describes the economy as a static, predictable engine, bionomics sees the economy as a self-organizing, 'chaotic' information ecosystem. Where the traditional view sees organizations as production machines, bionomics sees organizations as intelligent social organisms. Where conventional business strategy focuses on physical capital, bionomics holds that organizational learning is the ultimate source of all profit and growth.
40

We have come to associate such statements with a short-lived phase of capitalist euphoria, but notions of bottom-up organization of large numbers within fluid spaces are still central categories in our apprehension of a medium such as the Internet. To many, in fact, the strange behaviour of the Internet, especially its capacity to expand and mutate with no plan and no central controller in charge, has appeared uncannily lifelike. The Internet as a medium and a culture appears to many as a macroscopic demonstration of the existence and feasibility of acentred and leaderless forms of organization that mirror some of those that we are familiar with
in the natural world. After all, the Internet mainly developed as a parallel, piecemeal, localized activity, so that it can be classified, according to Sadie Plant for example, as 'bottom-up'. 41 Electronic Frontier Foundation pioneer John Gilmore's claim to immortality is probably to have coined the sentence: 'The Net interprets censorship as damage and routes around it.' 42

This catchy statement, a refrain of network culture, is grounded in important technical features of the Internet and guaranteed by a socio-technical culture that similarly emphasizes autonomous and distributed forms of organization of labour. The popularity of peer-to-peer networks, open-source software or recent phenomena such as web logs is only the most recent example of what seems to many an intrinsic vitality of a bottom-up, piecemeal, parallel approach to organization and the culture that it supports. This spontaneous productivity is said to be intrinsically related to the distributed and decentralized organization of large numbers of interacting peers and to be a feature of social, technical and natural systems. It is an excessive production of cooperation and interaction that has brought forth the development of new techniques of control.

Does this imply, then, that the Internet as a medium and as a cultural multiplicity, by virtue of its loose, bottom-up technical structure, has managed to replicate some of the features of the natural world? The description of the Internet as an ecosystem, inhabited by knowledge, and substantially self-organizing was common in the mid 1990s, when neoliberal and conservative writers such as Alvin Toffler, George Gilder, Esther Dyson and Newt Gingrich used it to forcefully reject the Clinton administration's description of cyberspace as an 'information superhighway'.
43

In Being Digital, Nicholas Negroponte, popular columnist on Wired and director of the Media Lab at MIT, similarly considered the Internet to be a remarkable 'example of something that has evolved with no apparent designer in charge, keeping its shape very much like the formation of a flock of ducks.' 44

Controversy around such statements marked the 'California ideology' wars of the early 1990s, controversies that pitted the techno-utopianism of Californian hippy entrepreneurs against the critical objections of sociologists and cultural activists. The controversy around the self-organizing and lifelike nature of the Internet split the 1990s cybercultures along neat ideological lines. Such an opposition was only possible in an atmosphere where social scientists and humanities scholars had mostly aligned themselves with a radical
social constructionism, according to which nothing social can also be natural. Starting from the notion that everything that is natural is fixed and predetermined, many writers rejected the analogy between the Internet and natural systems on the basis of the idea (often justified) that such statements implied a kind of neo-social Darwinism. To say that the Internet might be lifelike was the equivalent of sanctioning the ravages brought by rampant free-market capitalism on the 'excluded' masses.

But wasn't such an exclusive and often vehement rejection of 'natural metaphors' and 'analogies', as they were called, missing something as well? The notion that the Internet presented features and behaviours that could also be observed amongst natural phenomena was not simply a set of statements meant to organize the perception that gave social form to a new medium. The study of lifelike behaviour, in fact, is not simply a rhetorical exercise but has been accompanied by a larger move that connects social to natural and technical components. The biological turn is, as we have seen, not simply a new approach to computation, but it also aspires to offer a social technology of control able to explain and replicate not only the collective behaviour of a distributed network such as the Internet, but also the complex and unpredictable patterns of contemporary informational capitalism. Thus, the simulation of the behaviour of 'a multitude of simultaneous actions' is also seen as the key to understanding not only the behaviour of stock markets, but also that of 'fashion and fads'.
45

The biological turn thus seems to extend from computing itself towards a more general conceptual approach to understanding the dynamic behaviour of the Internet, network culture, milieus of innovation and contemporary 'deregulated' markets – that is, of all social, technical and economic structures that are characterized by a distributed and dynamic interaction of large numbers of entities with no central controller in charge. These systems are not unstructured or formless, but they are minimally structured or semi-ordered. Since the turbulent flow of information ensuing from a 'multitude of nonlinear simultaneous actions' is not exclusive to the Internet and market economies, but can also be observed in a variety of natural phenomena, it is quite appropriate that this specific potential of the natural world should become the object of intense technical, cultural and economic interest. The 'business network' of the Institute for Complexity in Santa Fe, for example, includes Citibank/Citicorp, Coca-Cola, Hewlett-Packard, Intel, Interval, John
Deere, Shell International B.V., Xerox and a variety of financial advising companies. 'Early on, the CEO of Citibank/Citicorp became interested in SFI and helped begin SFI's program in understanding the world economy as a complex evolving system.' 46

Any discussion of the Internet as a lifelike phenomenon seems thus to be entangled with the reformulation of the problem of control, in terms which are more appropriate to the behaviour of the new entities that contemporary scientific and technological research is literally discovering, as Ilya Prigogine put it, after years of neglect: open systems subject to a large variety of semi-autonomous variables. Control here is cybernetically defined in two ways: as the opposite of mechanical rationality (step-by-step programming), because the latter is too rigid and ultimately too brittle to operate on such terrain; and also as the antithesis of centralized government, because the latter presupposes a complete knowledge of each individual component of the overall system, which is impossible to achieve in these types of structure.

Taylorism and governmentality are thus both rejected as unsuited to this new turbulent, but also hugely productive, terrain. At the same time, however, cybernetic control as defined by the first wave of cybernetics, that associated with the work of Norbert Wiener, is also rejected. It is no longer sufficient to neutralize all positive feedback, that is, all new variations and mutations, by bringing the system back to a state of equilibrium (negative control is acknowledged as ultimately ineffective in staving off the forces of chaos). The open and productive systems studied by the biological turn are, by definition, always operating in far-from-equilibrium conditions, dynamically perched between two layers and conditions: one rigid, unmovable and ultimately sterile, associated with permanence and stasis; and another chaotic and turbulent, opening up onto unexpected and potentially catastrophic transformations.
The problem of contemporary modes of control is to steer the spontaneous activities of such systems to plateaus that are desirable and preferable. What we seem to have, then, is the definition of a new biopolitical plane that can be organized through the deployment of an immanent control, which operates directly within the productive power of the multitude and the clinamen.

THE UNHAPPY GENE

It seems, then, as if the science of multitudes has definitely given up on the individual, which it dismisses as an epiphenomenon that is
simply too coarse and rigid to be more than a by-product of emergence. If there is an abstract social machine of soft control, it takes as its starting point the productivity of an acentred and leaderless multitude. However, we might also say of the individual what Michel Foucault said of the family in his analysis of the rise of governmentality in the modern state. Talking about the new place of the family in the mode of governmentality, Foucault comments that the family disappears as a model but is kept as a tool of government. We could say a similar thing for the development of soft control. The new place of the individual in the mode of immanent control is not as a model for the organization of a multitude, but as a tool that allows the overcoding and the ultimate containment of the productive power of flows. To the decoding of the mass into a network culture, to the dissolution of the individual into the productive powers of a multitude, corresponds an overcoding of the multitude onto the individual element, understood as a unit of code modelled on the biological notion of the gene.

Among the mixture of disciplinary insights drawn upon by biological computing, in fact, we find the controversial thesis of sociobiological thinkers such as Richard Dawkins, author of pop-science bestsellers such as The Selfish Gene, The Blind Watchmaker, and River Out of Eden. To simplify Dawkins' work somewhat, we might say that he understands the variations of populations as ultimately determined by the 'selfish' drive of individual genes. This selfish drive compels them to replicate themselves at the expense of other competing genes. The human body, or for that matter the whole of the world of organic and inorganic life, is simply the set of devices through which selfish genes manage to replicate and protect themselves by competing with other genes in an environment characterized by scarce resources.
Dawkins himself experimented with biological computation (and he wrote about it in The Blind Watchmaker).

The concept of the selfish gene is crucial to biological computing, and therefore relevant to our understanding of soft control. It is here not simply a matter of ideological affinity between the white male milieu of Alife researchers (as described by Stefan Helmreich) and the sociobiological perspective. It is not so much, in fact, that biological computing is influenced by sociobiology, as that they both share a keen understanding of the necessity of introducing some kind of 'cut' in the fluid fabric of a population for the purposes of artificial synthesis of the computational capacities of natural life. In
this sense, sociobiological work such as Dawkins' does not so much inform as clarify the modalities according to which the individual is given a new role to play within the open plane of emergence.

Dawkins defines a gene in computational terms as 'a sequence of nucleotide letters lying between a START and an END symbol and coding for one protein chain'. 47 What characterizes this unit is its capacity to replicate itself and to survive through a large number of successive individual copies. 48 There is no fixed measure for such units:

I am using the word gene to mean a genetic unit that is small enough to last for a large number of generations and to be distributed around in the form of many copies. This is not a rigid all-or-nothing definition, but a kind of fading-out definition, like the definition of 'big' or 'old'. The more likely a length of chromosome is to be split by crossing-over, or altered by mutations of various kinds, the less it qualifies to be called a gene in the sense in which I am using this term. 49

This unit is endowed with a minimum set of capacities: the capacity to replicate itself, where replication is a kind of dynamic mobility, because by replicating, genes also tend to mutate; the capacity to compete for scarce resources; and the capacity to collaborate, but only if collaboration suits the selfish aims of the gene, that is its freedom of replication.

What I have done is to define a gene as a unit which, to a high degree, approaches the ideal of indivisible particulateness. A gene is not indivisible, but it is seldom divided. It is either definitely present or definitely absent in the body of any given individual. 50

What Dawkins' theory allows is the replacement of the individual by the unit or, as Deleuze named it, a 'dividual', resulting from a 'cut' within the polymorphous and yet nondeterministic mutations of a multitude.
51

Dawkins is very explicit in defining the individual as an unsuitable basic unit for the kind of giant computational capabilities that underlie the evolutionary process. It is not a matter of immortality, because individual genes or units of code are not immortal. They have emerged at some point out of the chemical interactions of a turbulent matter–energy continuum and will die eventually, even if their lifespan can be measured in thousands or
even millions of years. Genes, however, are not individuals but units in the sense that they do not grow senile; they are never young or old; that is, they are not subjected to the second law of thermodynamics that decrees that the individual organism is bound to die and decay. As informational units, all that matters is their capacity to replicate themselves and survive in their many copies – or fail to replicate successfully and hence disappear. In this sense, the life of a unit is binary: it is either there or not there; it does not grow any older or younger with time; it is a cut in the body of the multitude that makes it more manageable from the point of view of the replicability and synthesis of a specific type of control diagram.

Dawkins' formulation of the gene is thus perfectly suitable to find its application in the biological turn, because it defines genes by means of the cut that gives a program a functional beginning and an end. The computational abilities of the gene unit are not constructed or attributed by Dawkins to the gene but, on the basis of contemporary scientific knowledge and research, they have been shown to correspond to some of the capacities of genetic molecules. They have also provided the basic concept through which the biological turn has managed to translate these ideas into actual working pieces of software that are capable of producing their own emergent phenomena. In this sense, and at its most basic level, Dawkins' understanding of the gene is of genuine relevance to any attempt at modelling natural laws in a technical machine. If a gene is a unit of code that is identifiable as lying between two symbols, one designating Start and the other End, then it can be easily coded by a computer. If one writes several units of code and lets them be free to pursue their survival by replication, they will at some point manifest different degrees of emergent behaviour.
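The claim that units of code left free to pursue their survival by replication will manifest emergent behaviour can be sketched as a toy genetic algorithm (my illustration, not drawn from Dawkins or from any of the Alife systems discussed; the target string, population size and mutation rate are arbitrary assumptions). Control sits at the two ends described above: local rules of replication and mutation at the beginning, a transcendent fitness function selecting the 'useful' variations at the end.

```python
import random

random.seed(0)  # deterministic run, for illustration

TARGET = [1] * 16  # the 'pleasing form' rewarded by the fitness function

def fitness(unit):
    """Transcendent selection: reward resemblance to the target."""
    return sum(a == b for a, b in zip(unit, TARGET))

def replicate(unit, mutation_rate=0.05):
    """Local rule: copy with occasional mutation (replication as dynamic mobility)."""
    return [bit ^ 1 if random.random() < mutation_rate else bit for bit in unit]

# A population of 'units of code', each lying between a start and an end.
population = [[random.randint(0, 1) for _ in range(16)] for _ in range(30)]

for generation in range(60):
    # Selection at the end of the process: the fittest survive and replicate;
    # the rest are selected out of existence.
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    population = [replicate(random.choice(survivors)) for _ in range(30)]

best = max(population, key=fitness)
print(fitness(best), '/ 16')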
At the level of simulation, identifying a unit of code as an individual allows better manipulation, and greatly enhances the possibility of determining and applying local rules of behaviour. It also allows the identification of units that can be rewarded and/or punished, selected and/or rejected.

Biological computing suggests that the bounded organism contains both the pre-individual and the collective – two levels of being that are infinitely more productive than the individual as such. They are more productive because they do not produce thermodynamically, like the organism, for example, that burns heat and progressively drives itself to a slow decay and finally death. As we have seen, units of code are mortal, but they do not grow old. They are either there or
not there, and when they are there they are always productive, always doing something.

The most common critique of Dawkins' theory from sociologists and cultural theorists is that this little unit of code is classified as 'selfish', that is, juxtaposed with a category that belongs to very different planes of organization – those of a Protestant–capitalist apparatus of subjectification. 52 Why should a unit of code be subjected to the moral universe of good and evil, which is where 'selfishness' and 'altruism' are located? In the passage from the description of the gene as a unit of code to the unconvincing description of the gene as 'selfish' like a 'trade unionist' or a 'Chicago Gangster', something else happens. The unit of code that we know as the gene has been returned to individuality so that it might assume the attributes of selfishness and altruism, competition and collaboration. The mode of existence of the selfish gene (the individual) as opposed to the 'gene' (unit of code; genetic algorithm) is the distance that separates the simulation of molecular life and the capture of the powers of a multitude in a network culture. If the gene is a unit of code that makes evolution a computing machine, the selfish gene is the subjectifying function that turns a multitude into an assemblage of isolated individuals.

The selfishness of the subject of informational capitalism, which is the underlying metaphor here, has little to do with actual genes. As Dawkins himself admits, genes have no 'purposes'; they obey obscure impulses dictated by complex chemical laws. With no sense of purpose, arguably, there is no self and hence no selfishness. What the term 'selfishness' does, however, is to betray some of the ways in which the social powers of the multitude are captured.
Selfishness is defined by Dawkins as a sociobiological tension between competition and collaboration – where the gene is like a calculating machine always weighing the advantages of collaborating or competing in order to gain an advantage of survival. If the selfish gene is a subject, it is because it thinks, and it can think only two thoughts: in a particular situation, do I increase my chances of survival by collaborating with other units? Or am I better off looking after number one, to the exclusion of and in competition with others? Selfishness closes the open space of a multitude down to a hole of subjectification.

The selfish gene is a simple diagram of the apparatuses of subjectification that the abstract machine of soft control distributes and perpetuates not so much among molecules as among collectivities.
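Part of the point here is that, defined this way, the 'calculating machine' of the selfish gene is trivially codable: its two thoughts reduce to a single comparison. A deliberately minimal sketch (my illustration; the payoff numbers are arbitrary assumptions, not Dawkins'):

```python
# The selfish gene as a calculating machine with exactly two thoughts:
# given expected survival payoffs (illustrative numbers only), do I
# replicate better by collaborating with other units, or by competing?

def selfish_choice(payoff_collaborate, payoff_compete):
    """Return whichever strategy maximizes the unit's own replication."""
    return 'collaborate' if payoff_collaborate > payoff_compete else 'compete'

# Collaboration is chosen only when it suits the unit's 'selfish' aim:
print(selfish_choice(payoff_collaborate=3.0, payoff_compete=2.0))  # collaborate
print(selfish_choice(payoff_collaborate=1.0, payoff_compete=2.0))  # compete
```

The triviality is the point: once the open space of a multitude has been cut into units, 'subjectivity' reduces to a branch between two options.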


Television, as the culturally sensitive medium that it is, has been quick to pick up and amplify these mechanisms in ways which complement our analysis of soft control in the biological turn. Reality TV games such as Big Brother, Survivor, Desert Island, and Pop Idol dramatize the schizoid tensions that emerge when the subject is placed within an abstract machine that requires the coexistence of competition and collaboration under the aegis of selfishness. There has been much talk about how the New Economy turned around the classic Malthusian emphasis on scarcity in order to promote the unlimited promise of abundance of the digital domain. Within reality TV games, we find some of the outlines of the peculiar forms of post-scarcity competition, and their psychic drives, that are widespread in informational capitalist cultures.

Reality games can be seen as cellular automata that operate by capturing a segment of the audience within a space that is both closed (a house, a competition) and open (subjected to the whims of ratings and popular votes). As such, they demand the impossible from their willing participants: that they relinquish their individuality by being forced to interact continuously with a group which will decide their fate (hence they must become a unit ready for selection); that they relinquish their privacy by being continuously placed under the surveillance gaze of a camera; and, at the same time, that they hold onto and reinforce such individuality as part of the competitive structure of punishments and rewards of the game. They demand, then, a self that is stripped down to the capacity to collaborate and compete by a strict set of rules operating within an economy of punishments and rewards, which determines the persistence or disappearance of the self as such.
The group dynamics that are engendered by the distribution of the space, the set of initial conditions, the state of the cells within the system (the contestants), and the rules applied by a transcendent entity (Big Brother's voice …) produce a kind of 'emergent entertainment'. In reality game shows, the competition is, at an immediate level, for the prize that only one of the contenders will be able to gain. More fundamentally, however, the competition is for the fleeting, rather than scarce, flow of the audience's attention and sympathy that from the outside keeps impinging on the game and repeatedly pushes it towards a claustrophobic instability. Franco Berardi has accurately described some of the effects of these contradictory pressures on informational subjectivities as the 'unhappiness factory'. 53


It is understandable, then, that rebellion against the claustrophobic selfishness structure (with its two poles of competition and collaboration) should be explicitly marked by the rejection of being subjectified as selfish genes striving for survival at the expense of others. The most challenging areas of network culture in terms of control are those that emerge out of a choice for the chemical interaction of relationships of affinity and/or war within a space that is radically open, over that of selfishness/cooperation within a closed subjectivity structure. Not altruism against selfishness, but relationships of affinity and war (a wholly different economy of relations) that cut through the space of the individual without reducing it to a unit – freeing up a potential for transformation and even catastrophe. From the point of view of the abstract machine of soft control, there is no ontological difference between the threat of a global network of terrorists able to carry out devastating attacks on the heart of empire and the threat of a global network of anticapitalist activists (hence the recurrent and contested claims, after 11 September, that the movement for global justice was potentially terrorist), or the behaviour of connected peers exchanging copyrighted files without payment on peer-to-peer networks. I am not implying that they are all the same, of course, or that the punishments can be compared. The potentials for destruction and creation are also very differently weighted. But from the perspective of this mode of cybernetic control they do express different sides of the same rebellion: the rejection of the existential condition of living as a stripped-down selfish gene, endowed with the intoxicating capacity to form a multitude, but recoded within the claustrophobic black hole of the selfishness structure (cooperation/competition).
The threat of these swerves, from the perspective of the engineers of control, is that by rejecting the system's most basic sets of constraints, by rejecting the micromoulding of dividualism, they might push it out of control, towards a new plateau whose outcome cannot be predetermined and which might veer the system violently towards catastrophic transformations.

There is a big gap, of course, between the small pieces of code that we know as genetic algorithms and cellular automata, and the dynamics of political resistance in network societies – a gap that actualizes itself in divergence and turbulence. The selfish gene, however, is not just a metaphor, or a moralization of natural life, or an ideological justification of cut-throat competition in the 'free' market economy, but more insidiously a technique. It is a mode of capture of value produced by an increasingly interconnected and
interdependent culture in as much as the latter is also an industry – and hence a mode of labour. This excessive value that can never be really reabsorbed in a logic of exchange and equivalence is frequently referred to in autonomist Marxist writings as a kind of biopower of labour – that is, a power of making and remaking the world through the reinvention of life. That nothing could be further away and yet so close to the models of biological computation says something about the political stakes involved in the emergence (and control) of network societies.

CODA ON SOFT CONTROL

If we open up a path for critical inquiry and conceptual engagement with biological computation beyond the deconstructive critique, are we necessarily playing the game of power, that is, accepting the naturalization of social relations? In a way, we are, in the sense that we accept that the game of power is the only game in town in as much as it identifies and enacts an indetermination of the social and the natural across a microphysical continuum that denies the human the ontological status of an exception. On the other hand, beyond the easy rhetoric of popular accounts of self-organization, the natural that emerges out of biological computing is as artificial as the social – indeed it is the artificiality of the natural that the social takes over and reinvents.

In this sense, biological computing opens up two interesting questions for a cultural politics of network culture. On the one hand, it challenges us to think about how a certain mode of distributed organization can also become the milieu for the development of new modes of control. Thus it takes the notion of self-regulation and organization in large numbers outside any mythological landscape of a utopia to be realized and places it firmly within the horizon of emerging modes of power.
Self-organization, in other words, is not incompatible with transcendent control or with the 'unhappiness factory' assembled by informational capitalism.

On the other hand, and more intriguingly, an engagement with biological computation and the sciences of emergence offers us a way to engage with the political concept of the 'multitude' beyond the temptation of reconstituting a new, indefinite subject of history. As defined by Hardt and Negri in Empire, and as adopted within the activist milieus of network culture, a multitude defines a political mode of engagement that is located outside the majoritarian and
representative model of modern democracies in their relation with the recomposition of class experience. Unlike class, however, a multitude is not rooted in a solid class formation or a subjectifying function (although it is also a matter of class composition). It is too indefinite a concept to carry such power. For Franco Berardi, '[t]he notion of the multitude describes a tendency to dissolution, the entropy that is diffused in every social system and which renders impossible ("asintotico", infinite, interminable) the labour of power but also the labour of political organisation'. 54 Like the smooth milieus of biological computation, the multitude too is a necessarily vague term that is defined mainly by a fluidity of movement and by the formations that such fluidity leaves behind as a kind of after-effect. As such, it does not deny the existence of the stratifications of identity and class, but it opens up another dimension where such positions are caught in terms of other types of capacity. If this is the case, then biological computation (in its widest possible sense) is an attempt to 'hack the multitude' – to hack the social at its most fluid and least stratified, wherever it escapes the constrictions of rigid forms of organization but also of identity and class. As such, and beyond some of its most simplistic applications, the CA model has much to offer to any attempt to think about processes of bottom-up organization and emergence in network culture, their relationship to the reorganization of capitalist modes of production and the political potentials that such reorganization opens up. Hacking the multitude is still an open game.


5

Communication Biopower

THE NEW SUPERPOWER

In his bestselling critical account of his time as the chief economist and senior vice-president of the World Bank, Joseph Stiglitz repeatedly suggests that one of the problems with international institutions is their lack of transparency and hence accountability. The systematic manner in which the International Monetary Fund managed effectively to undermine growth in developing countries and enrich foreign investors was for him a result of a culture of secrecy in which the actions of the Fund were not subject to a sustained public scrutiny (whilst also being skewed in favour of US lobbies' economic interests).

Secrecy also undermines democracy. There can be democratic accountability only if those to whom these public institutions are supposed to be accountable are well informed about what they are doing – including what choices they confronted and how those decisions were made. 1

Thus for Stiglitz, freedom of information is paramount and 'sunshine is the best antiseptic', that is, exposure of such dealings to the light of public opinion is conducive to healing the malaise of international governance. Or, as it has also been put, within a one-world power system, 'public opinion is the new superpower'.

In asking for more transparency and better accountability, Stiglitz is echoing one of the most fundamental assumptions of modern political thought, in which the relationship between transparency of communication and democracy is foundational. 2 From Diderot and Voltaire to Thomas Paine, modern conceptions of democracy start from the demands of bourgeois revolutionaries for free speech and political representation. A democracy does not just guarantee but is guaranteed by the rights of its citizens to representation in the spheres of both politics and communication. These rights include that of accessing information concerning the exercise of public authority


(as expressed for example in the US Freedom of Information Act (1966)) and that to have one's position represented in the spectrum of positions. Freedom of information and communication sustains freedom of speech and freedom of speech supports democracy. Without access to a public space of information and communication, citizens would not be able to learn how the res publica is run, to develop informed opinions, and to express them and exercise pressures on governments. Without a public space in which to express and communicate ideas and form a shared opinion, there is no democracy. 3

In The Structural Transformation of the Public Sphere, Jürgen Habermas outlined the original model for such a conception of the relationship between communication and the political – the bourgeois public sphere emerging in Western Europe in the late eighteenth century. Habermas defines the bourgeois public sphere as a distinctive space where private individuals assemble to form a public body. 'They then behave neither like business or professional people transacting private affairs, nor like members of a constitutional order subject to the legal constraints of a state bureaucracy.' 4 As constituted within a public space of communication (such as the press, clubs and societies of eighteenth-century Europe), the public mediates between constituted power, that is the state and its institutions, and the private autonomous interests of free economic agents. A political public sphere is based on freedoms of assembly and association, including the freedom to express and make public one's opinions. For Habermas, in its early manifestations when it coincided with the emergence of the bourgeoisie as an ascending class, the public sphere allowed an independent space from where to 'rationalize' the public exercise of authority. Thus the public 'transforms political into rational authority within the medium of the public sphere'.
Public opinion accepts that the power of government must be delegated, but it reserves the right to check and monitor the actions of politicians. 'In a large public body, this kind of communication requires specific means for transmitting information and influencing those who receive it.' 5 At the same time, the overwhelming power of the media in the political life of social democracies has led some to argue that the media have now become the 'new public sphere'. This mediation of politics by the media in mass democracies has been a constant motif of crisis for the liberal axis linking communication to reason and progress. 6 It is as if communication had been returned to a pre-Enlightenment mode – that of the spectacle, gossip and manipulation


which are seen as undermining reason, rather than being a medium for its expression.

The relationship between communication and democracy has never actually been the exclusive domain of a 'public sphere' in which citizens can monitor the action of elected and accountable governments. Communication has never only been about the sunshine of reason illuminating the dark secrets of governance, but it has always cast its own shadows – those of a manipulation that takes as its object the blind passions of the masses. At least since the totalitarian regimes of the mid twentieth century proved the power of mass communication, the relationship between communication and the political has been a murky affair that cannot but puzzle enlightened reason. It has become impossible to ignore the way in which much communication is not simply about access to information and public debate, but is also about manipulation by way of positive (spin, propaganda, hegemony) and negative tactics (censorship, exclusion, distortion, etc.). The impossible task of the public sphere thus becomes that of returning communication to an older, purer function by combating the corrupting influence of manipulation, censorship, propaganda and spin. However, this activity would be pointless if this public sphere did not aim to represent and address an inherently interested and enlightened public to whose opinion governments are bound. If such reasonable public opinion was shown not to exist or to be ineffectual in influencing the actions of politicians, then the organs of public opinion would lose much of their power. The problem is that this entity, this public which is deemed to exist somewhere at the end of the communication process, those citizens/audiences/readers, often do not seem to embody the qualities of the 'enlightened citizen' at all. Media power, specifically the power of the mass media, appears as partially incompatible with the eighteenth century's model of political communication.
How else to explain the fact that it is predominantly populist and/or authoritarian voices that seem to be almost natural masters of the mediascape? How does one explain right-wing talk-show hosts, Ronald Reagan, Margaret Thatcher, tele-evangelists and tele-marketeers, Silvio Berlusconi, George W. Bush, Pop Idol, Osama bin Laden and Tony Blair?

The most common way to explain all of the above (with all due respect to the differences) is to point out how material access and control of the media is restricted to those who can afford it. The centrality of communication to political life has made a massive investment into media culture by corporate actors and institutional


parties both rational and inevitable. It is simply a matter of capital expenditure: it pays off to control the media, and after all, if you have money and power, access to the media almost comes automatically. Communication and media culture reproduce the opinions of their owners and it is no wonder that they should support conservative ideological formations. With control of the media firmly in the hands of the ruling classes, the masses can be seduced into a consensus. Whether it is about TV tycoons turned politicians or about direct political control of the media by corporate lobbies, communication is today deemed to be driven by private interests and thus on the whole to be unabashedly manipulative and openly populist. The domination of our mediasphere by gossip, celebrities, fashion, salesmen, propagandists and special effects is thus a signal of this 'perversion of communication'.

The manipulation of public opinion today is no amateur business but is a field of systematic research, corresponding to the development of specific techniques that make the formation of hegemonic consensus an affair for professionals. 7 Thus, the public sphere of the welfare state and mass democracy is described by Habermas in terms that are markedly different from those of the bourgeois public sphere. While the bourgeois public sphere comprises individuals engaged in public discussion, within mass democracy, the state and private interests are directly involved, as the pressure on journalists and the aggressively televisual nature of politics demonstrates. The current public sphere is not a sphere of mediation between state and civil society, but the site of a permanent conflict, informed by strategies of media warfare. Communication is not a space of reason that mediates between the state and society, but is now a site of direct struggle between the state and different organizations representing the private interests of organized groups of individuals.
The corruption of communication that many see at the heart of the corruption of democratic life is thus blamed on the illegitimate interference of private interests in the public sphere. But if this is the case, and considering the entrenched interests that rule the media industry with ferocious determination and expansionist aims, and the passivity of the masses, can communication be saved at all? And if not, how do we switch it off?

THE MASSES' ENVELOPMENT

There is always another solution of course. If communication has been 'corrupted' by private interests, the argument goes, then the


reconstitution of a free and open space of communication should be a key force in driving the return of a more authentic democratic life. If the worst danger to democratic life today is the political and oligopolistic control of the media system, then a new medium of communication that would be somehow free of this 'old media' baggage is extremely important. It is on these foundations that the hopes for 'new media' (and more specifically the Internet) were laid. As soon as the Internet started to materialize as a set of relays and links between different computer networks, it produced a widespread and hopeful expectation of a resurgence of the public sphere in a 'cyberdemocratic mode'. A networked multitude, possessing its own means of communication, freed from the tyranny of broadcasting, would rise to challenge the phony public sphere of television and the press. Since then, the Internet, as we shall see, has proved to be an effective political medium in terms of its power of mobilization and the openness of its information space. It has thus made visible the existence of a global networked mass with a stake in regional, national and global political processes. At the same time as such masses have appeared on the global mediasphere, however, the problem has become that of the other mass, the silent majority, the television public held hostage by the powerful media monopolies in a topsy-turvy world of propaganda and simulation.

It is not so much a question here of opposing a good, networked mass with a bad, couch-potato mass (there are plenty of couch potatoes on the Internet). It is rather about understanding what kind of relation might exist between these two formations as they are given to us within the sphere of communication.
The political category of the mass, or even that of the silent majority, is not very popular within media and cultural studies – which, from Raymond Williams onwards, has tended to identify it with a kind of conservative modernity, a-populist and thus implicitly anti-working class. It was also one of Jean Baudrillard's most unsettling propositions (at least for his critics), that the masses do not need or want a 'political–intellectual class' (including activists and critics) to teach them how to avoid manipulation by the media or to coalesce behind another consensus. On the contrary, for Baudrillard, one should always keep in mind that the masses are a stronger medium than the media and that the masses have never been on the side of reason, to which they always preferred the seductive power of the spectacle – whether of gladiators' fights, public executions, sports, games, ceremonies, fireworks or special effects. For Baudrillard the media do not


manipulate the masses, but it is the masses who 'envelop the media' because they are themselves already a medium.

Indeed it does seem at times as if the media manipulation by spin doctors, cultural populists and public relations officers were actually giving in to a disturbing request, coming from somewhere on the other side of the camera and the screen. The power that is formed by a mass or that gives rise to a mass is amorphous and demanding, always implicated in the rise of a desire that demands distraction – more and better images; better and bigger effects. For Baudrillard, the spectacularization of communication, and hence the spectacularization of politics, is thus not imposed on the masses, but demanded by the formation of a mass. The masses are not specific social classes, but more of a generalized dynamics that takes over when you take away all attributes, predicates, qualities or references from a large number of people. The mass, that is, is a lowest common denominator, not in the sense of a loss of quality but as a kind of pre-individual and collective potential to be affected. This is about the physical capacity of a large number of bodies to form a kind of passive mass – a receptacle for the affective power of images. No longer the mass congealed and energized by the containment strategies of the industrial revolution and its disciplinary enclosures, but a kind of terminal mass – atomized and dispersed at the end of communication receivers, deprived of its revolutionary power in a kind of entropic dispersion.

In Baudrillard's astute analysis of the mass, however, the latter retains throughout its history a kind of passive power – that of excess. Baudrillard's own example is that of medicine: the medicalization of life devised as a means of chemical control of the masses' bodies is pushed to the extreme by patients' demands for more and more drugs: 'an excessive, uncontrollable consumption of medicine, a panicked conformity to health injunctions'.
8 We might imagine Baudrillard's take on the recent vicissitudes of the stock market. Look at what happened when the masses entered the market, he might say. They demanded more stocks, larger and quicker profits, and they almost broke it! They inflated the market, causing it to crash. The hyperconformity of the masses did not spare anybody: CEOs and regular employees; small and large investors; investment firms and national governments; directors of national banks and private investors. As a result of the entry of the masses into the New Economy, the whole stock market went through an unsustainable acceleration until it crashed. 9 If one really thought it through, a public sphere,


understood as the space of free formation of public opinion, would be incompatible with mass democracy. Wherever there is a mass, there is the preponderance of an economy of spectacles, hyperconformity and excess.

It would be easy to 'correct' Baudrillard's provocations by pointing out that there is a good deal of speculation going on, both in the pharmaceutical and in the financial industry; and that in the end, it was the small investors who paid dearly for some cynical operations performed by large investment banks. Or we might object that such an understanding of the masses is aristocratic in nature and does not do justice to the active relationship between individuals and media power. On the other hand, such arguments would miss the point of Baudrillard's challenge. What his understanding of the masses or the silent majority explains is a certain quality of communication that is usually perceived as a problem of media societies, at least from the perspective of a modern conception of politics based on the exercise of reason within a transparent public sphere. Baudrillard's argument could be reinterpreted to indicate that the powerful presence of the masses, or of silent majorities, in mainstream media or in mass democracies poses a repeated problem for postmodern political thinking. This problem is that of imagining a 'political without the social' – if we understand the masses as a nonsociological category, a category that does not possess any social qualifications such as class or gender or ethnicity or even a geographical place. Understood in this sense, as a political entity with no social foundations (either in the relations of production or in the economy of gender or ethnicity or race), the mass appears as an inertial force and a zone of implosion of social energies. (As the a.f.r.i.k.a group put it in a posting to the mailing list nettime, 'Everybody knows that the Ozone belt is fading away.
Everybody knows that the rich are getting richer and the poor are getting poorer …'.) 10

At the same time, however, this asocial quality of the masses is what makes it a crucial entry point into another relation between communication and the political – beyond and beneath the play of reason. Baudrillard calls the moment when the masses take over the zero degree of the political, the moment where the spectacle meets 'the grey eminence of politics'. The paradox is that such zero degree of the political is defined as the moment when society has reached a stage of maximum socialization – where everything has become social, that is mediated, signified and subjectified. For social and cultural theory, in fact, in order to become a subject, in order to be


able to act, we must first be 'socialized' – that is, our raw subjective experience must be made social. We are socialized when we learn who we are and where we belong by acknowledging and internalizing our place in the great differential grid of society. We are socialized when we learn about the difference between self and other, subject and object, between the way 'we' are (women, white, children, workers, managers, abuse survivors, single mothers, single men, etc.) and the way 'they' are (men, black, adults, capitalists, workers, abusers, husbands, single women, etc.). Socialization implies the intervention of mediations and signifying chains in the formation of subjectivity. We become what we are by assuming a role that is defined for us by another subject – that is, somebody who re-presents and makes that role meaningful to us. However, for theorists of late modernity from Fredric Jameson to Anthony Giddens, these images, beckoning to us, asking us to identify with them, have simply become too many (the mirror of the social has multiplied in a fun-house effect). From every side, the social demands our attention.

Socialization thus implies an overproduction of meaning, a state of always being told and asked too much. It is a matter of having been told too many truths and too many opinions and perspectives, so that communication ceases to be representational and becomes tactical and strategic. It does not simply represent and make meaningful, but it shifts, it touches and it commands. This situation has not just produced a state of 'reflexivity', where we continually police ourselves trying to produce the right identity, but also a state of being a mass, the relief of being in a mass. Masses ('you, me and everybody else') are thus not definite sociological categories like classes. The masses are everywhere and in everybody in as much as they lie at the points where all mediations have collapsed and meaning no longer takes hold.
The masses are the place where meanings and ideas lose their power of penetration, the place of fascination and dismediation where all statements, opinions and ideas flow through without leaving a mark. The masses 'disperse' and 'diffuse' meaning, and this is their political power.

Baudrillard is not really implying here that people do not understand media messages or that they do not make meanings with them. He acknowledges that from the point of view of individuals and groups, we still have resistant readings of media messages, such as, for example, when a group decodes a message by translating it into its own code in acts of frontal resistance (trade unionists listening to the official news of a strike; ethnic minorities listening to


mainstream media coverage, etc.). What appears more relevant, however, is that the overall effects of all these little acts often end up being dispersed in the mass domain. When seen from the perspective of government, that is of the macropolitical control of majorities, the masses are always constituted as silent. If they communicate at all, it is by way of focus groups and opinion polls in a language of forever fluctuating and often contradictory percentages. In this emergence of the masses as a statistical source, Baudrillard would argue, there is not just the mass's manipulation, but also its complicity. Hence his uncomfortable hypothesis that the centrality of meaning formation and the exchange of ideas to political life are mainly the sites of investment by a 'political class' (including critical intellectuals) who needs them to justify their existence.

The political sphere also only survives by a credibility hypothesis, namely that the masses are permeable to action and discourse, that they hold an opinion, that they are present behind the surveys and statistics. It is at this price alone that the political class can still believe that it speaks and that it is politically heard. 11

The question, then, is not that the public sphere has been corrupted by private interests or even by the contamination of images. There is no inherent condemnation of the visual in favour of speech and writing; on the contrary. Images are always predominant in the formation and emergence of a mass and the mass is not only survey material, but also a kind of 'zero degree of the political', that is the moment where the political starts again, as from a zero degree or a state of fullness and potential. For Baudrillard, the degree zero of the political is the moment of maximum depoliticization, the moment when the social and its mediation appear to have exhausted the play of social forces, when everything appears as a representation and representation becomes a hyperreality.
At the same time, however, such a point of absolute socialization is also seen as a turning point, the moment when the political makes its comeback.

Whatever one thinks of Baudrillard's analysis, we cannot avoid considering that the current reconfiguration of the relationship between communication and the political is also connected to this relationship between masses and images. What complicates matters is that, in a way, we cannot really say that the masses 'see' images. To see something (unlike to watch or to look) implies some kind of residual social qualities and we have determined that as long as we


constitute a mass, whether we like it or not, or even if we resist it, we have none. Neither can we really talk about a 'mass perspective', in as much as the idea of perspective implies a subject which surveys and organizes, and traditionally the mass has always been described as a passive, 'feminine' entity. The notion of the mass implies a kind of distracted perception, of the kind, for example, that Walter Benjamin associated with architecture and the age of mechanical reproduction. The mass, that is, represents not so much a sociological perspective that identifies it with a specific class or class composition, as a relationship with the image. By all accounts, the relationship to the image that is expressed by the mass is one of fascination, that is, a perception that deprives images of fixed qualities in order to amplify their intensities. Images are not so much decoded for meaning as consumed, that is, absorbed and relayed. What the mass perceives in the image is its excess power of holding the gaze in fascination. As we have seen, we cannot really assimilate the mass to a historical subject in its modern sense. The way in which we have used this term here is to explain a certain relationship to images and media culture that constitutes a problem, for example, for the way we think about the politics of communication. If the masses are what within society resist the social, what resist mediation by relating to images in ways that neutralize their social qualities and meanings, then this is not a subject on which reason really takes hold. The masses imply a distracted perception that can be related to only as such – that is, as perception.
It is only in so much as the masses perceive that their material composition and political disposition can be affected. Rational debates and communicative action have a marginal effect on this dynamic of fascination and distracted perception that we associate with mass culture.

This centrality of the relationship between perception and distraction within mass culture is recognized by communication managers and experts. Among the techniques and fields of expertise included in the area, for example, we count 'perception management'. The term was first coined within military and intelligence circles (the CIA under the directorship of William J. Casey) 12 but it is also expanding in the commercial sector with the emergence of perception management consultancy firms (interestingly enough many of them located in the Islamic world). 13 Perception management includes public relations work, knowledge of local conditions, information warfare and media manipulation, but in ways that explicitly recognize that what needs to be managed is not simply the knowledge that


surrounds a certain event, but its 'perception'. Perception management is thus not mainly addressed to what we might recognize as a 'public opinion'. Although it does make some concessions to public opinion, it does not address it as a public that is able to discriminate between true and false and develop an informed position on this basis. In this sense, perception management belongs to the order of simulation. There is no scandal, but everything is more or less performed in the light (it is an obscene form of power, as Baudrillard remarked). What is important is not to convince public opinion of a truth that is demonstrated on the basis of logical arguments so much as the manipulation of an informational milieu. Images are not representations, but types of bioweapons that must be developed and deployed on the basis of a knowledge of the overall informational ecology. If the relationship between masses and images is characterized by the erasure of social qualities from the image, then all we are left with is not just a mass, but a universe of images, acting and reacting on each other – an artificial informational ecology of image flows. Like Henri Bergson, perception management too starts from the notion that perception is, first of all, in the images.

Seen from this perspective, it is not surprising that the most significant feature of contemporary mediascapes is their oversaturation with image and information flows (including the acoustic image or sound). To all effects these flows form a hyperreality or a cybernetic space, where images act and react on each other, giving rise to phenomena of parasitism, overcrowding, local image niches, underground micro-ecologies and so on.
Thus, for example, one could analyse the image ecologies of global media culture by asking what is the rate of distribution of images in different locales; what kind of images achieve a kind of global dominance and which others are kept locally confined; what kind of networks organize different image flows and how these networks relate to each other; what characterizes a successful image and what happens to images that are not selected for mass diffusion; which images reinforce each other, which ones must be kept separate, which ones wage war on each other and so on.

When perceived by a mass, therefore, the universe of images seems to be more rather than less material. It is not a question of the media universe coming to constitute a map that replaces the real (as Baudrillard claimed, to think in terms of images as a copy of the real is to hark back to representation). It is more as if the universe of images has ceased to play the game of appearances and mediation


to be openly displayed as a field for the propagation of intensities or affects – a battlefield for the war staged on the terrain of perception. What is important of an image, in fact, is not simply what it indexes – that is, to what social and cultural processes and significations it refers. What seems to matter is the kind of affect that it packs, the movements that it receives, inhibits and/or transmits. The place of an image is thus always within an ecology, not only because images derive their meanings from the overall semiotic system, but also because literally they act and react on each other – they wage war on each other or establish alliances (the bin Laden vs. Bush duets of the Afghan War; suicide bombers in city centres and army tanks in refugee camps; images of starvation side by side with images of conspicuous consumption); they find niches in which to proliferate and mutate (forgotten conspiracy theorists' fanzines and websites; idiosyncratic sexual fetishes); they infiltrate alien image environments and disrupt them by introducing new types of intensities (the graffitied trains delivering hip-hop intensity from Brooklyn into the heart of Manhattan in the late 1970s). This is the sense in which the hyperreal does not really involve a metaphysics of being and appearance so much as a kind of information ecology which also includes a dimension of warfare – a warfare to determine the differential power and dominance of some types of images over others. It is no longer a matter of illusion or deception, but of the tactical and strategic deployment of the power of affection of images as such.
It is no longer a matter of truth and appearance, or even of the alienating power of the spectacle as 'opium of the masses', but of images as bioweapons, let loose into the informational ecology with a mission to infect.

If this world could appear to some as a world where appearances or spectacles have triumphed over reality, this is only because of a metaphysical prejudice that needs images to uphold the value of a truth that must always be uncovered. But is there anything that is really left uncovered and secret within a distributed culture of communication? Is our problem really that media propaganda is used to cover up the truth? Or is it more the case that the truth is not even covered up any more because what is important is not to shield people from the truth but to have an effective strategy that is able to capture and hold together a certain type of intensity?

From this point of view, the emergence of a mass, of social entropy, always implies an intensification of communication strategies that focus on the intensity of the image and the afterlife that such


intensities carry. The context of the first elaboration of theories of mass culture was notoriously that of twentieth-century totalitarian regimes for which the aesthetic management of perception was a key political technique (as the Frankfurt School theorists noticed and described). For Deleuze and Guattari, the totalitarian experiences of the twentieth century have also taught us about the power of one type of image over others – the face or the machine of faciality. Not only does the emergence of mass culture within modernity coincide with the rise of a star system that gravitates around a fascination with faces, but fascism as well (understood as a configuration of desire giving rise to authoritarian political systems) is inconceivable without the hegemony of a face. Within the ecology of images, faces play an important role and some faces become veritable black holes of social energies that are sunk into the empty space linking the eyes to the mouth. In this sense, George Orwell's Big Brother was about a surveillance society as much as it was about the imperialism of the face – the interplay of the eyes and mouth forming a closed circuit of communication in which the masses so to speak sink. Within the ecology of images, faces are like cluster nodes in the informational milieu and the succession of faces marks thresholds of passages and transformations in authoritarian political cultures (not only the iconic totalitarianism of Hitler, Mussolini, Stalin and Saddam Hussein, but also the populist authoritarianism of Ronald Reagan and Margaret Thatcher, the neo-imperialism of Tony Blair and George Bush, the theocratic pull of Osama bin Laden's face and so on).

But can we really reduce the relationship between communication and the political today to the perception of the masses – a pure perception that frees images of their socio-indexical anchorage in order to give free rein to an ecology of images understood as a propagation of intensity?
Or wouldn’t this be a reduction like that which says that all political communication is an appeal to reason?

This is not to deny the existence either of a mechanism of formation of public opinion or of a relationship between images that is fundamentally ecological and microbiological. On the other hand, the mass culture or simulational hypothesis seems to overestimate the actual disappearance of the socio-indexical qualities of images in favour of a pure perception whereby images only act and react on each other. If on the one hand the configuration of the relationship between communication and the political that we call the masses indicates an important strategic target (involving a whole technology of perception management and information warfare), on the other hand this is not the whole story. A network culture, in fact, implies a complexification of the mass media environment – it is the dimension that envelops the multiple durations of disparate cultural formations and milieus. We are no longer in the mass-culture regime where the mass could be opposed on the one hand to a high culture of aesthetic discernment and reasonable debate and on the other to a folk culture that authentically expressed the power of the people. But neither can we oppose the TV masses to the Internet multitude without conveniently cutting out their intersections and relays in a common informational milieu. The image ecology of network culture is highly differentiated and this implies, pace Baudrillard, that images do not just flow through, but are channelled through a segmented and capillary system of communication. There is not simply an amorphous mass, but a fractal ecology of social niches and microniches. We are not living, that is, in a pure mass culture, but in a configuration of communication where a pure mass perception clashes and interacts with a fractured and microsegmented informational milieu.

AN INTOLERANT WORLD

As Armand Mattelart has pointed out, communication is a modern invention. It is modern, not because communication did not exist before the European eighteenth century, but rather because, as a term, communication belonged to another semiotic order. Emerging four centuries ago alongside the ideas of reason and progress, communication had older religious connotations of sharing, community, contiguity, incarnation and exhibition.
In the 1753 entry to the Encyclopédie (one of the crowning achievements of the Enlightenment), the negative definition of communication explicitly points to the religious implications of the term.

Written by a clergyman, it has the double merit of making us realize how much the original matrix of ‘communication’ owes to the language of the church while not being confined to it. Excommunication is defined in this article as the ‘separation from communication or trade with a person with whom one previously enjoyed it… [any] man excluded from a society or a body, and with whom the members of that body no longer have communication, may be said to be excommunicated’.14

This does not simply make today’s ‘socially excluded’ the excommunicated of the information age, but also explains the affinity between religious sensibility and the universe of communication. We can only mention here the key function of communication in the production of a global community in the work of Marshall McLuhan – a converted Catholic;15 and take in how Muslim thinkers immediately grasped the Internet in terms of its potential to produce an electronic Ummah.16

The early wave of theorization on the phenomenon of computer-mediated communication (CMC) also pointed to the importance of computer networks in the formation of ‘virtual communities’, as they became controversially known.17 In spite of attempts to link them back to the pioneering times of the North American frontier, virtual communities looked more like the technological successors to the ‘imagined communities’ of modernity – including the national communities materialized by media events such as Franklin Delano Roosevelt’s radio chats; presidential funerals and royal weddings; or films and TV series exploring the problems of a nation’s history.18

The relationship between communication and community, which within modernity was mostly confined to the boundaries of the nation, is today problematized by a kind of geographical dispersion. Mediated communities are no longer mostly enclosed by national boundaries, but increasingly materialize at the intersection of manifold connections.
Whether we are considering the multichannelled and multilingual universe of satellite television; or digitally encoded texts, images and sound on the Internet; or the visual icons of a global consumer culture devoted to brands and blockbuster action movies; or gossip and news relayed through phone calls and face-to-face communication – the nonlinear and distributed movement of information across a global communication matrix makes it hard to determine its relationship to a ‘community’ in the modern sense of the word.

The emergence of a global and differentiated communication matrix has also foregrounded the power of communication to undo bonds, rather than simply reinforce them – a feature that is particularly troubling for the modern association between communication and community. Referring to a comment by Abdel Monem Said on the inflammatory effects of images of the Palestinian Intifada on Arab youth, New York Times journalist George Packer was struck by how communication technologies (such as global television and Internet access) seem to have produced what he calls ‘a world less tolerant’.19

From Marshall McLuhan to Ted Turner, Packer explains, media gurus have been predicting the emergence of a global village held together by a single communication infrastructure. Packer translates this to mean a world where information about human rights violations in the South would provoke the indignation of public opinion in the affluent North. Ideally, the resulting protests by indignant Western citizens would produce a synchronization of the planet on the values of the most ‘advanced’ societies – that is on the liberal values of tolerance, democracy and the recognition of fundamental human rights. Thus the communication of information about child labour in Thailand or Mexico would cause outrage in London and New York and this outrage would force politicians and corporations to change their policies. This was to be expected as the result of the formation of a global public opinion coinciding with the emergence of a global communication infrastructure. A smaller, more liberal world, then, would be a world where flagrant injustices could not be tolerated, and within which new international coalitions of activists, members of the public, and benevolent governments would gradually address global wrongs.

Cultural imperialism always displayed a civilizing flair, but there is also another side to this failure of communication to pursue its supposedly civilizing mission. The failure of communication involves the crisis of the modern ‘science of communicating’, a science that Diderot in the same Encyclopédie article named ‘rhetoric’: the ‘mode of understanding through reason’.20 It is because the science of communicating is not simply that of understanding through reason that, for liberal public opinion, communication is making things worse. Who could have predicted that Third World youth would be driven to desire for and hatred of the West by the images beamed by Western sources to their TV sets and computer screens?
Rather than providing fodder for indignation and reasons for action to citizens of the West, they are watching the media themselves and getting angry. Packer’s examples are those of the Egyptian youth who want to become ‘rock throwers’ after seeing images of the Palestinian kids; and the Sierra Leonean teenager, who reportedly became a fighter because he was enraged by images of Western wealth and inflamed by American icons such as Rambo. Communication, the journalist concludes, is failing the world.

What the media provide is superficial familiarity – images without context, indignation without remedy. If the world seems to be growing more, rather than less, nasty these days, it might have something to do with the images all of us now carry around in our heads.21

Packer’s article (and the long discussion that it engendered on the slashdot BBS immediately after its publication) is a good example of a recurring set of questions that are troubling the professionals of public opinion in relation to global communication. The notion that a more harmonic world would emerge out of better means of communication and open access to information and that this better world would lead to a progressive adaptation by ‘backward’ countries to advanced ‘democratic’ values has backfired. In the West, images of pain and suffering by citizens of less privileged countries reveal the limits of ‘public opinion’ actually to determine substantial political changes; in the South, disillusionment with Western cultural and political values produces widespread disenchantment and a return to ideas of religious purity.

How does the notion of an ecology of images compare to the thesis of communication as something that is making the world less tolerant? The answer to this question must be related somehow to the different dimensions that compose a network culture. A network culture is not a simple entity, but a composite and complex one: it includes a mass sensibility which characteristically deprives images of their socio-indexical qualities; a postmodern or late modern panache for fragmentation and difference; and a highly differentiated communication matrix in which images are continuously circulated, transformed and relayed at different times and across a variety of channels (Starsky and Hutch are always chasing villains somewhere in the mediascape …). At this level, we have to acknowledge that even if images can no longer be considered primarily at the level of representation, they have not completely lost their indexical relationship to the social.
Thus it would be hard to claim that there are no differences between images or that the differences of intensity that determine the relationship of images to each other have no relation to socially constructed meanings referring to specific socio-cultural segments. There are social and geopolitical reasons why Egyptian youth should react to images of the Palestinian Intifada the way they do. We thus do not simply have a mass, but also a fractured mass, and even a microsegmented one. The mass is not simply massaged by the medium, as Marshall McLuhan argued, but also segmented by the media. Image flows are not simply determined by the internal relationship between images as such, but also by the external relations that different images have with the social world.

This segmentation of the mass by way of the differentiation of image flows was summarized by Manuel Castells in his discussion of the sociological literature on ‘the new media and the diversification of mass audience’. Castells identifies this mutation of the communication system with the diffusion of personal media devices such as the Sony Walkman, specialized radio, VCRs, and the explosion of cable and satellite television. Quoting Françoise Sabbah’s assessment of new trends in the media in 1985, Castells acknowledges the enduring significance of her analysis for understanding contemporary cultural trends.

In sum, the new media determine a segmented, differentiated audience that, although massive in terms of numbers, is no longer a mass audience in terms of simultaneity and uniformity of the message that it receives … Because of the multiplicity of messages and sources, the audience itself becomes more selective. The targeted audience tends to choose its messages, so deepening its segmentation …22

Image flows are thus not floating signifiers, but work as material forces by virtue of their very differentials. They move at different speeds, have different paces, and their relationship with the world of solids is rather complicated. They do not just ‘smooth’ solids (as in pebbles or in the solid borders of national territories), but they sort them out (as rivers sort out different types of stone or as communication channels sort out different audiences).

An example of the complex relation linking these movements of segmentation to socio-cultural indexing is the emergence of what have been called ‘migrant media’.
Migrant media are media that cater specifically to the informational and cultural requests of the migrant communities that have coalesced all over the world as a result of widespread economic and political displacement of local populations. In a sense, that is, migrant media define all cultural consumption of media products that unify migrant communities across dispersed geographical spaces. Some of these migrant media have also caused a crisis within media activist circles, in as much as they seem to defy some of the tactics that worked well when the configuration of communication corresponded to a simple opposition between radical groups of anarchist or socialist disposition and the mainstream media as such. Tactical media activist and theorist David Garcia has given us an interesting example of the complexity inherent in the different speeds of image flows in a brief article on the relationship between mainstream media, tactical media and ‘migrant media’ in the Netherlands in the early years of the twenty-first century.

The background of the paper is the complex and overlapping arrangement of information flows and media systems in the Netherlands at the beginning of the twenty-first century (an arrangement that is both typical of countries of comparable wealth and atypical in as much as the Dutch context is marked by a particularly active grassroots media movement). The variety of communication networks that coexist in Amsterdam ranges from the world of mainstream media, including Dutch national media and international broadcasting (including French, German and British national television) to global media networks (from CNN to MTV and SKY); a thriving ‘public access’ cable television spearheaded by an active grassroots media movement of video producers and media activists; a high rate of Internet penetration, including the usual share of cybercafés and public access; and the whole network matrix of communication with which we are familiar (including intranets, telephony, wireless etc.). Rather than constituting a single communication space equally accessible to all parties concerned, this (a)typical network matrix is challenged, in Garcia’s story, by the emergence of separate image flows, what he calls ‘migrant media’ – that is the increasing use by migrant communities of specific communication networks that do not overlap either with national or with global television.23

Garcia recounts how the progressive consolidation of such migrant media as self-enclosed islands catering to the needs of local migrant communities (and in particular Muslim communities) is implicated with the Dutch attitude to cultural differences (the epitome of liberalism in the West).
Garcia suggests that in the Dutch model, differences are allowed to coexist, but only if they remain discrete and bounded, that is if they stay within their own confines. For Garcia, this situation underlines the development of the cultural and media milieu within the Dutch territory. On the one hand, a native population enjoying a more or less common media experience, one that involves the consolidation of a national identity, but also of a specific experience of the global as mediated by the multinational media companies (such as CNN, Sky, BBC, etc.). On the other hand, the informal networks of migrant communities and local migrant media cable channels, with a strong presence of theocratic Islamic groups.

In explaining the feedback loop between these separate media and cultural networks dividing migrant from native, the use by a theocratic Islamic group of public access cable TV and the emergence of an anti-immigrant Dutch right, Garcia recounts the confusion of the tactical media movement. While the latter started by advocating the necessity of public access to the media, it did not predict the formation of parallel media networks or that some of the forces taking over such networks would be so problematic from the point of view of a classic ‘leftist’ and ‘new-leftist’ politics.

It is not simply a matter here of showing how access to different media (let’s say, to follow the Islam/West example, Al-Jazeera vs. CNN) produces different perspectives on issues such as the conflict between Israel and Palestine, or the Gulf War. As Packer argued, it is not just a matter of experience, but of what material relations of synchronization determine the dynamic emergence and becoming of national identities, migrant perspectives, emotions of solidarity and indignation, empathic identification and anger. This overall communication dynamics makes the world ‘less tolerant’ in as much as it aggravates the friction of differences that are both emphasized and left to rub off against each other outside of any mediation. This is why a network culture is not the United World of Benetton, where all differences are simply allowed to coexist and aesthetically enjoy each other. Communication networks latch onto the segmentation of the social, undermine and reinforce cultural identifications and release social antagonisms fed by shared experiences of injustice or indignation at imagined or real wrongs.

A network culture cannot thus be simply separated and opposed to the domain of the manipulated ‘mass’.
In as much as the dimension enveloped by a network culture crosses disparate domains of communication, it also seems to undercut and undermine even the volatile mass that occasionally lifts a pop icon, a TV format, or a brand logo out of the mostly flat input of the culture industry machine. There are thus mass phenomena within a network culture, but it is almost as if they always expressed only a dimension of the overall dynamics of communication. In a network culture, a mass is a transversal cut in the body of an informational milieu that never ceases to be microsegmented, highly differentiated and at the same time interconnected. If this mutual although uneven segmentation (where the masses segment and are segmented by the media) is possible at all, it is because differences have not simply become interchangeable, and social and cultural processes have not lost their capacity to qualify and differentiate image flows. At the same time, it has become necessary to think through the relationship between these social qualities and the mass perception identified above – we are always simultaneously both mass and class, mass and multitude, mass and race, mass and nation; and so on.

Psychoanalytic theory has of course spent a great deal of time analysing this relation, but mostly from the point of view of the individual subject rather than of the mass as such. Similarly, the old field of mass psychology developed according to very different modes of communication and was inflected by the political preoccupations of the time. Our starting point is not the problem of the ‘revolutionary’ or ‘blind’ masses, but the interplay between intensity and meaning as it takes place within a segmented mass. This interplay between intensity and meaning within an admittedly small mass (but a mass in a way has no size) has been recently described by Brian Massumi in ‘The Autonomy of Affect’.24 In his rereading of a behaviourist experiment on a group of children and their perception of different versions of a TV short (silent; with a matter-of-fact explanatory commentary; or with an emphatic and sentimental voice-over), Massumi shows that the affective perception of an image seems to be characterized by a gap. This gap is that between the content of the image (that is the social indexing or quality) and its effect (the strength or duration of the image, that is its intensity). In the experiment recounted by Massumi, the data produced by the electronic monitoring of the children’s bodily reactions to different versions of the short film, and their stated reactions as articulated in their answers to the questionnaires, contradicted each other.
Electrodes and questionnaires, the skin and the mouth, yielded different answers. Massumi argues that the experiment confirms the hypothesis of a gap between content and effect, a gap that results in a kind of ‘autonomic remainder’ of affect. In their collective response to the TV short, the children formed a ‘mass’, a bad conductor of meaning, but a fine conductor of intensities.

This gap between the social quality and the intensity of image perception is not simply a matter of sliding or floating signifiers, but is also about the autonomic potential of bodies and the chaotic dynamics of perception as an essential component of any cultural politics of communication. In the missing half-second between the moment when the skin reacts to the image and the moment when the brain registers the stimulus, all kinds of crossings of wires take place. The whole body is filled by the vibrations produced by the impact of images on sensory organs, including eyes, ears and skin. What we actually come to perceive consciously is only a fraction of what has touched us. In the movement from the first impact of perception to the moment of conscious elaboration, a whole chaotic dynamic unfolds, yielding extraordinarily stable results (social meanings consistent overall with the layout of social stratifications) but also pointing to the existence of autonomic bodily remainders, of unrealized, or virtual, potentials. This implies that the ‘absorption of images’ that characterizes the relationship between the masses and the media is far from being a simple process of manipulation and fascination.

If images are not the metaphysical cause of the corruption of communication, then we might have to consider them as the basic conditions within which a return of the political might take place. We might argue that the codes which organize media messages cannot be understood simply in terms of signification, that is as if the matter was simply that of manipulating a mass or mediating a meaning. Before such a capture of the image can be accomplished, a whole set of other operations must have taken place, operations that are both material and semiotic, but nevertheless microphysical. The power of communication and the media is not only the power of imposing an ideology, forming a consensus or manipulating the opinion of the majority, but also a biopolitical power, that is, a power of inducing perceptions and organizing the imagination, of establishing a subjective correspondence between images, percepts, affects and beliefs.25 What appears challenging for cultural and media theory is that these flows of images/perceptions/sensations/intensities are not necessarily anchored in cultural and social identities narrowly conceived.
Of course, this does not mean that people do not identify (with their race and gender, class and religion but also with celebrities and objects, pop stars and political leaders). These identifications qualify the images, but their social meanings do not completely capture the play of intensities – their autonomic remainder, as Massumi put it. It is this field of intensity that is invested by communication biopower, but, at the same time, this is also a site of emergence for another mode of politics that is not dependent on the modern problematic of communication. It is at this point, then, that we can return to our networked multitude: not as a new subject rising to defy both the passivity of the mass and the corruption of communication, but as a mass mutation involving an experimentation with the zero degree of the political.

NETWORKED MULTITUDES

There is a mass, then, in network culture, as well as segments and microsegments, and an informational dimension that links them all. It is this peculiar feature of the overall communication system that a medium such as the Internet captures with uncanny fidelity. As in network culture at large, there is a mass psychology of the Net, unfolding, as Geert Lovink put it, within ‘large-scale systems, filled with amorphous, more or less anonymous user masses’.26 There are mass phenomena such as portal sites, the big search engines and free email services, but also the entertainment giants and the corporate news providers. At the same time, the segmentation of the audience that is observable in the new media landscape of satellite and local TV, or VCRs and DVDs, is replicated and intensified by the Internet – with its thousands of specialized web channels catering for niche audiences, but also with its myriad little electronic soapboxes with zero or so traffic and its firewalled networks and pushy newsletters. The concentration of portals, that is, does not preclude a high level of microsegmentation of usage across the dust-like galaxies of minor and specialized nodes. At the same time, however, this separation can never really neutralize the interconnectedness of the whole space, the overall vulnerability to informational dynamics, chain reactions, viral infections, the pollution of spam, or the powerful ripples of nonlinear information flows (the forwarding of links, petitions, cries for help, group emails, warnings and bugs, postings, and so on).

The Internet, that is, seems to us to capture (and reinforce) a feature of network culture as a whole – the way it combines masses, segments and microsegments within a common informational dimension in which all points are potentially even if unevenly affected by all other points.
Within the Internet medium, this peculiar combination of masses and segments does not produce a peaceful coexistence of two different modes (the amorphous majority massing somewhere towards the middle of a Bell curve and the rigid segments produced by socio-cultural processes of stratification). In a network culture, the differentiating power of image flows achieves a kind of hydrodynamic status characterized by a local sensitivity to global conditions. Rather than being dispersed at the receiving ends only to re-emerge as survey fodder, a networked mass displays a kind of active power of differentiation. It is still a mass, but it cannot be made to form a stable majority around some kind of average quality or consensus. The segments have lost some of their rigidity (whether of social or cultural identity) under the recombinant assault of informational flows. The result seems to be a political field that cannot be made to unite under any single signifier (such as the working class) or even under a stable consensus; while at the same time it cannot really split off into separate segments with completely separate socio-cultural identities (even hybrid ones) – a space that is common, without being homogeneous or even equal. As such, the Internet gives visibility to a larger feature of our communication milieu – and one that is on its way to becoming hegemonic, as all communication systems become ever more interconnected.

There is nothing idyllic about this political configuration. As a political milieu, a network culture looks more like a permanent battlefield than like a neo-socialist utopia. It is the plane over which battles for market shares and for the determination of public opinions are fought; it is a field of research into and deployment of advanced techniques and strategies of manipulation and control; it is the theatre of violent attacks and group hatreds. And yet, it also offers plenty of opportunities for experimentation with political tactics and forms of organization.
This experimentation addresses both the dimension of an overall informational dynamics (that is the necessity to develop informational tactics able to counteract the overbearing power of corporate and governmental actors in the communication sectors); and also a political and cultural milieu that can no longer be subsumed (if it ever was) under a majority, led by a class/avant-garde/idea or even made to form a consensus that is not inherently fractured and always explosively unstable.

The recent movements against the neoliberal policies of the great international economic bodies such as the International Monetary Fund (IMF), the World Bank, the World Trade Organization (WTO) and the group of the eight most powerful economies in the world (the G8); the Social Forum meetings assembling an international movement of mayors, political parties, NGOs, indigenous groups, media activists and others (from Porto Alegre, Brazil 2000 to Mumbai, India 2004); the anti-war coalitions giving rise to the global demos against the Second Gulf War in February 2003; these are all examples of such experimentation (and of the potentials and problems that are inherent in such a process). All these events have given temporary, but powerful visibility to a process of horizontal and diffuse communication that draws both upon the latest technologies (from video cellular telephony to wireless Internet access) and also upon more established strategies, such as conferences, talks, camps, workshops, meetings and travelling caravans (such as in the preparation of the 1998/2000 anti-IMF, G8 and WTO demos).27 These multiple modes of communication correspond to a mostly uncoordinated proliferation of organizations, micro-organizations and groups, with more or less shifting boundaries, but with common interests and, most importantly, operating within a common communication matrix.

As a result, these first years of the twenty-first century have consistently displaced the familiar opposition of the political spectrum (inherited from the cold war) between left and right. What has displaced it, however, is neither the fetish of difference (as in post-1960s social movements) nor that of public opinion as a new superpower, but a more general compossibility of relations within a fluid and yet segmented bio-informational milieu. In this sense, the encounter between the spectacle and politics suggested by Baudrillard opens up onto challenging scenarios. If the degree zero of politics, as Sylvère Lotringer put it in a different context, is related to ‘the desire to allow differences to deepen at the base without synthesizing them from above, to stress similar attitudes without imposing a general line, to allow points to co-exist side by side’, then this desire is tested by a communicational milieu that demands it pragmatically rather than simply discursively or ideologically.28

Theorists of network politics have repeatedly pointed out how this impossibility of building a consensus or stable forms of organization is a key resource (rather than a limit) of its political potential.
For Hardt and Negri, a network culture constitutes a new occasion for the re-emergence of the multitude – a political category that they oppose to the preconstituted unity of a ‘people’ (the multitude is ‘a multiplicity, a plane of singularities, an open set of relations, which is not homogeneous or identical with itself and bears an indistinct, inclusive relation to those outside of it… an inconclusive constituent relation…’).29 For the Critical Art Ensemble, the absence of a unitary purpose or shared meaning is an advantage:

conflicts arising from the diversity of the cells would function as a strength rather than a weakness; this diversity would produce a dialogue between a variety of becomings that would resist bureaucratic structures as well as provide a space for happy accidents and breakthrough inventions.30

For Harry Cleaver, computer-linked social movements form a ‘hydrosphere’, a fluid space ‘changing constantly and only momentarily forming those solidified moments we call “organizations.” Such moments are constantly eroded by the shifting currents surrounding them so that they are repeatedly melted back into the flow itself.’31

The virtual movements of this early twenty-first century have offered a challenging glimpse of the political field opened up by communication biopower. A network micropolitics able to traverse the global space of communication is not some kind of easy utopia, where differences are allowed to coexist or go their separate ways – the domain of a blissfully unproblematic self-organization. On the contrary, it is the ways in which the global communication matrix allows such connections and organizations to take place that reveal the hard work implied. This scattering, this tendency to diverge and separate, coupled with that of converging and joining, presents different possible lines of actualization: it can reproduce the rigid segments of the social and hence its ghettos, solipsisms and rigid territorialities. And it also offers the potential for a political experimentation, where the overall dynamics of a capillary communication milieu can be used productively as a kind of common ground – allowing relations of compossibility as well as concerted actions.

Within this context, a cultural politics of communication involves not simply the exercise of an abstract faculty of reason, but also a very material engagement with relations of composition and decomposition between affectively charged and often competing beliefs. Thus it cannot simply dismiss or despair at the state of the mass, that is, of those that reason believes to be misled and hoodwinked.
What this ultimately boils down to is a capacity to synthesize not so much a common position (from which to win the masses over) as a common passion giving rise to a distributed movement able to displace the limits and terms within which the political constitution of the future is played out. What this effort starts with, however, is not Reason, in the sense of a universal faculty that, thanks to interactivity and a decentralized distribution of communication capabilities, finally resurfaces after a long slumber. As with the masses, this political mode cannot but start with affects – that is, with intensities, variations of bodily powers that are expressed as fear and empathy, revulsion and attraction, sadness and joy. It does not see such affects as an aberration of communication (as the 'communication is making the world less tolerant' thesis would argue) but as its beginning – as expressing the zero degree of the political as such. It is a reason that cannot be moulded by an effort to transcend and regulate the body's affects (which was always implicit in the modernist politics of the avant-garde, where the question was that of enlightening a mass). On the contrary, it arises out of affective investments and works through an inventive and emotive political intelligence on the terrain of the common – the constituent terrain of the contemporary politics of communication.32


Notes

INTRODUCTION

1. Virilio is referring here to Albert Einstein's famous statement in the 1950s that humanity would have to face three kinds of bombs: 'The first bomb, the atomic bomb, was manufactured in the United States during the Second World War and dropped on Hiroshima in Japan in 1945. The second bomb was the information bomb. The third bomb was the population bomb set to explode in the twenty-first century' (John Armitage in Paul Virilio and Friedrich Kittler 'The Information Bomb: a conversation', esp. p. 81). In Virilio's reading, the information bomb is a 'technological and political weapon… largely the product of US-military and America-owned multinational firms' (ibid.). The effects of this bomb are 'the acceleration of world history and unprecedented technological convergence together with the appearance of “real time,” the disappearance of physical space, and the rise of “technological fundamentalism,” and “social cybernetics”' (ibid.).
2. Virilio and Kittler 'The Information Bomb', p. 85. See also Paul Virilio The Information Bomb.

CHAPTER 1

1. Friedrich A. Kittler 'A History of Communication Media'.
2. Ibid.
3. C. E. Shannon and W. Weaver The Mathematical Theory of Communication, p. 1.
4. Jérôme Segal Théorie de l'information.
5. Norbert Wiener Cybernetics, p. 39.
6. Jeremy Campbell Grammatical Man, p. 17.
7. Jacob D. Bekenstein 'Information in the Holographic Universe', esp. p. 50.
8. Wiener Cybernetics, p. 39.
9. Claude E. Shannon 'A Mathematical Theory of Communication', esp. p. 5.
10. Warren Weaver 'Recent Contributions to the Mathematical Theory of Communication', in Shannon and Weaver The Mathematical Theory of Communication, p. 99.
11. F. J. Crasson 'Information Theory and Phenomenology', p. 100.
12. Ibid., p. 128.
13. Ibid., p. 129.
14. Michel Serres Hermes: Literature, Science, Philosophy, p. 67.
15. Serres Hermes, p. 67.
16. J. C. R. Licklider and Robert W. Taylor 'The computer as a communication device', p. 21.


17. Gilbert Simondon L'individuation psychique et collective.
18. See William Bogard 'Distraction and Digital Culture'.
19. Shannon 'A Mathematical Theory of Communication', p. 15.
20. Ibid., p. 11.
21. Claude Shannon apparently stumbled on Ludwig Boltzmann's formula for entropy as an effective measure of information, but was very reluctant to use the two terms together or even use them at all (he had doubts about adopting the term information as such because of its use in common parlance; and entropy because of its controversially metaphysical status within the natural sciences). An anecdote suggests that it was John von Neumann who suggested that Shannon refer to entropy anyway. Von Neumann apparently convinced him that 'since nobody knows what entropy is, in a debate you will be sure to have an advantage' (quoted in Howard Rheingold Tools for Thought, p. 125). This did not stop Shannon from decrying the abuse that he felt was directed towards information theory and the 'bandwagon effect' that, especially after Watson and Crick's claim in 1953 that they had cracked the genetic code, made information the next big thing after energy.
22. See Shannon and Weaver The Mathematical Theory of Communication, pp. 100–1.
23. Wiener Cybernetics, pp. 10–11.
24. The example is taken from W. Ross Ashby Introduction to Cybernetics.
25. Weaver 'Recent Contributions', p. 100.
26. Wiener Cybernetics, p. 10.
27. Thus for Brian Massumi,

[w]hat characterizes communication is that it is designed to be 'transparent': no conversion is supposed to take place by virtue of the connection in and of itself… information is a feed. Neutral packets ('data') are consumed on one side of the window (or screen) to feed a process already understood or under way, with known effect and intent. Nothing new… The connection is segregated from the conversion. (Brian Massumi 'Sensing the Virtual, Building the Insensible', p. 1081)

28. Marco d'Eramo 'L'abisso non sbadiglia più', p. 29 (my translation).
29.
See Henri Bergson Matter and Memory; Gilles Deleuze Bergsonism; and also Brian Massumi 'Sensing the Virtual, Building the Insensible'; and Pierre Lévy Becoming Virtual.
30. Shannon 'A Mathematical Theory of Communication', p. 20.
31. Fisher quoted in Jérôme Segal Théorie de l'information (my translation).
32. Campbell Grammatical Man, p. 39.
33. Ibid.
34. Ibid., p. 44.
35. See Luciana Parisi Abstract Sex.
36. Quoted in Armand Mattelart The Invention of Communication, p. 228.
37. See Scott Lash and John Urry The End of Organized Capitalism.
38. Norbert Wiener The Human Use of Human Beings, p. 64.
39. Ibid., p. 65.


40. Gregory Bateson Mind and Nature, p. 49.
41. Wiener The Human Use of Human Beings, p. 11.
42. Pierre Lévy Collective Intelligence, p. 48.
43. James Gleick 'Push Me Pull You'.
44. Michel Foucault's analysis of Velázquez's Las Meninas is still the best analysis of the relation between perspectival space, representation and the subject in modernity (in M. Foucault The Order of Things).
45. See Rodney Brooks Flesh and Machines.

CHAPTER 2

1. Antonio Negri 'On Gilles Deleuze and Félix Guattari, A Thousand Plateaus', p. 1188.
2. See http://www.swatch.com/fs_index.php?haupt=itime&unter= (last accessed 12 March 2003).
3. Beyond Swatch, the two other virtual times quoted by Lovink are the open source venture XTime and the art project TIMEZONE. See Geert Lovink Dark Fiber, esp. p. 143.
4. Lovink Dark Fiber, p. 142.
5. See David Holmes Virtual Globalization.
6. Michael Hardt and Antonio Negri Empire.
7. Joseph Stiglitz has described the 'Washington consensus' that ruled the economic governance of globalization as a kind of 'market fundamentalism' (see Joseph Stiglitz Globalization and Its Discontents).
8. Manuel Castells The Rise of the Network Society, p. 398.
9. See Paul Virilio The Information Bomb.
10. A threat to the navigability of the Internet is constituted by 'alternate roots', that is, domain spaces that are not approved by ICANN or even the Internet Architecture Board. The threat of such domains is that of a fragmentation of network space, inasmuch as being located on a competing grid would affect how accessible a document is. On the problem of 'alternate roots', see 'The Domain Name System: A Non-Technical Explanation – Why Universal Resolvability Is Important', InterNIC, http://www.internic.net/faqs/authoritative-dns.html (last updated 25 March 2002; last accessed 10 April 2002).
11. On ICANN and the problems inherent in Internet governance, see Stefaan Verhulst 'Public legitimacy: ICANN at the crossroads'.
12.
On the relationship between net art and the corporate Internet, see also Josephine Berry 'The Thematics of Site-Specific Art on the Net'.
13. Tim Berners-Lee Weaving the Web, p. 128.
14. On the vectorial dynamics of global communication spaces see McKenzie Wark Virtual Geography.
15. See Duncan J. Watts Small Worlds; and Albert-László Barabási Linked.
16. See Barabási Linked, pp. 166–7.
17. See Ien Ang 'Global Media/Local Meanings'; and also George Ritzer The McDonaldization of Society.
18. On this subject, see Keith Ansell Pearson and John Mullarkey 'Introduction', in Henri Bergson: Key Writings.


19. Gilles Deleuze Cinema 1: The Movement-Image, p. 11.
20. Gregory Bateson Mind and Nature, p. 174.
21. Reliable statistics about email traffic are notoriously difficult to acquire. However, we have access to some other data, such as spam statistics, which offer at least a glimpse of the sheer scale of overall email traffic. The Korean Information Security Agency, for example, has estimated that in 2002 alone the number of spam messages received by Korean email users on a daily basis was about 915 million. The annual figure was 333.9 billion. (See Korean Information Security Agency 'Spam causes W2.6 tril. in damage a year', http://www.kisa.or.kr/english/trend/2002/trend_20020501_01.html [last accessed 17 June 2003].)
22. See Barabási Linked.
23. See Janet Abbate Inventing the Internet.
24. 'Celebrating the Birthday of the Internet January 1, 1983, the Cutover from NCP to TCP/IP', Telepolis, http://www.heise.de/tp/english/inhalt/te/14017/1.html (last accessed 28 January 2003). Thanks to Ronda Hauben for signalling this on the nettime list.
25. On content censorship in authoritarian regimes see Shanthi Kalathil and Taylor C. Boas 'The Internet and State Control in Authoritarian Regimes'.
26. Thirty years later, with the routine use of Computer Assisted Design in architectural practice, something of the very same pliability and dynamism of space would make its way back into the world of non-electronic space; see Brian Massumi 'Sensing the Virtual, Building the Insensible'.
27. Network Working Group (ed. B. Carpenter) 'Architectural Principles of the Internet'.
28. Stephen Segaller Nerds 2.0.1, p. 22.
29. Abbate Inventing the Internet, p. 51.
30. Ibid., p. 128.
31. Antonio Negri has made this shift, from the mass worker to immaterial labour as the centre of growth, a cornerstone of his understanding of the shifting terrain of political antagonism in the post-1970s period.
As is well known, he has argued that immaterial labour implies a shift from an 'economicist' (or quantitative) understanding of value to a biopolitical model (addressing the active powers of cooperation of the many). (On this subject see Antonio Negri Guide.)
32. Berners-Lee Weaving the Web, p. 14.
33. Abbate Inventing the Internet, p. 48.
34. Berners-Lee Weaving the Web, p. 15.
35. Hardt and Negri Empire, p. 166.
36. Ibid., p. 167.
37. Ibid., p. 112.
38. As usual, technological breakthroughs are characterized by multiple points of emergence. Thus the principles of a packet-switched network were also outlined by a British engineer, Donald Davies, at the British National Physical Laboratory; he had come up with a similar idea for communication between computers (see Segaller Nerds 2.0.1).
39. Network Working Group 'Architectural Principles of the Internet'.


40. Ibid.
41. Paul Baran 'On Distributed Communications'.
42. See David Tetzlaff 'Yo-Ho-Ho and a Server of Warez'.
43. Sociologists of the Internet have already pointed out some of the social reasons for this feature of the Internet, such as the early constituency of computer scientists and engineers. On the relation between Internet architecture and social groups, see Tim Jordan Cyberpower; and also Manuel Castells The Internet Galaxy.
44. See Rebecca Blood 'Weblogs'.
45. See Phil Agre on the entropic dangers of mailing lists in 'Subject: Avoiding heat death on the Internet'.
46. See Geert Lovink 'The Moderation Question: Nettime and the Boundaries of Mailing List Culture', in Dark Fiber, pp. 68–121.
47. See Allucquère Rosanne Stone 'Will the Real Body Please Stand Up?'
48. Inke Arns and Andreas Broeckmann 'Rise and Decline of the Syndicate'.
49. Agre 'Subject: Avoiding heat death on the Internet'.
50. On this subject, see Richard W. Wiggins' account of the effect of 11 September on Google (Richard W. Wiggins 'The Effects of September 11 on the Leading Search Engine'); for an account of the propagation of news about the disaster, see also Michael Blakemore and Roger Longhorn 'Communicating Information about the World Trade Center Disaster'.

CHAPTER 3

1. This chapter has been made possible by research carried out with the support of the 'Virtual Society?' programme of the Economic and Social Research Council (ESRC) (grant no. L132251050). I shared this grant with Sally Wyatt and Graham Thomas, Department of Innovation Studies, University of East London. The chapter was previously published as 'Free Labor: producing culture for the digital economy' in Social Text volume 18, number 2 (2000), pp. 33–58.
2. See Andrew Ross's ethnography of NYC digital design company Razorfish, No-Collar.
3. http://www.disobey.com/netslaves/. See also Bill Lessard and Steve Baldwin's playful classification of the dot-com labour hierarchies in NetSlaves.
4.
Lisa Margonelli 'Inside AOL's "Cyber-Sweatshop"', p. 138.
5. See Paolo Virno and Michael Hardt Radical Thought in Italy; and Toni Negri The Politics of Subversion and Marx Beyond Marx.
6. Negri The Politics of Subversion.
7. Donna Haraway Simians, Cyborgs, and Women, p. 159.
8. Paul Gilroy The Black Atlantic, p. 40.
9. Manuel Castells The Rise of the Network Society, p. 395.
10. Antonio Negri Guide, p. 209 (my translation).
11. In discussing these developments, I will also draw on debates circulating across Internet sites such as nettime, Telepolis, Rhizome and Ctheory. Online debates are one of the manifestations of the surplus value engendered
by the digital economy, a hyperproduction which can only be partly reabsorbed by capital.
12. Ross No-Collar, p. 9.
13. See Richard Barbrook 'The Digital Economy'; and 'The High-Tech Gift Economy'. See also Anonymous 'The Digital Artisan Manifesto'; and Andrew Ross's argument that the digital artisan was an expression of a short-lived phase in the Internet labour market corresponding to a temporary shortage of skills that initially prevented a more industrial division of labour (Andrew Ross No-Collar).
14. Barbrook 'The High-Tech Gift Economy', p. 135.
15. Ibid., p. 137.
16. Don Tapscott The Digital Economy, p. xiii.
17. Ibid., p. 35 (my emphasis).
18. Ibid., p. 48.
19. For a discussion of the independent music industry and its relation with corporate culture, see David Hesmondhalgh 'Indie'. Angela McRobbie has also studied a similar phenomenon in the fashion and design industry in British Fashion Design.
20. See the challenging section on work in the high-tech industry in Josephine Bosma et al. Readme! Filtered by Nettime.
21. Martin Kenney 'Value-Creation in the Late Twentieth Century: The Rise of the Knowledge Worker', in Jim Davis et al. (eds) Cutting Edge: Technology, Information Capitalism and Social Revolution; in the same anthology see also Tessa Morris-Suzuki 'Capitalism in the Computer Age'.
22. See Darko Suvin 'On Gibson and Cyberpunk SF', in Storming the Reality Studio, ed. Larry McCaffery (London and Durham: Duke University Press, 1991), pp. 349–65; and Stanley Aronowitz and William Di Fazio The Jobless Future. According to Andrew Clement, information technologies were introduced as extensions of Taylorist techniques of scientific management to middle-level, rather than clerical, employees. Such technologies responded to a managerial need for efficient ways to manage intellectual labour.
Clement, however, seems to connect this scientific management to the workstation, while he is ready to admit that personal computers introduce an element of autonomy much disliked by management (Andrew Clement 'Office Automation and the Technical Control of Information Workers').
23. Barbrook 'The High-Tech Gift Economy'.
24. See Kevin Robins 'Cyberspace or the World We Live In'.
25. See Frank Webster Theories of the Information Society.
26. Maurizio Lazzarato 'Immaterial Labor', in Saree Makdisi et al. (eds) Marxism Beyond Marxism, p. 133.
27. The Criminal Justice Act was popularly perceived as anti-rave legislation and most of the campaign against it was organized around the 'right to party'. However, the most devastating effects of the CJA have struck the neo-tribal, nomadic camps, basically decimated or forced to move to Ireland in the process. See Andrea Natella and Serena Tinari (eds) Rave Off.
28. Maurizio Lazzarato 'Immaterial Labor', p. 136.


29. In the two volumes of Capitalism and Schizophrenia, Gilles Deleuze and Félix Guattari described the process by which capital unsettles and resettles bodies and cultures as a movement of 'decoding' ruled by 'axiomatization'. Decoding is the process through which older cultural limits are displaced and removed, as with older, local cultures during modernization; the flows of culture and capital unleashed by the decoding are then channelled into a process of axiomatization, an abstract moment of conversion into money and profit. The decoding forces of global capitalism have thus opened up the possibilities of immaterial labour. See Gilles Deleuze and Félix Guattari Anti-Oedipus; and A Thousand Plateaus.
30. See Franco Berardi (Bifo) La Nefasta Utopia di Potere Operaio, p. 43.
31. See Kevin Kelly Out of Control.
32. Eugene Provenzo 'Foreword', in Pierre Lévy Collective Intelligence, p. viii.
33. Pierre Lévy Collective Intelligence, p. 13.
34. Ibid., p. 1.
35. See Little Red Henski 'Insider Report from UUNET', in Bosma et al. Readme! Filtered by Nettime, pp. 189–91.
36. Paolo Virno 'Notes on the General Intellect', in Makdisi et al. (eds) Marxism Beyond Marxism, p. 266.
37. Karl Marx Grundrisse, p. 693.
38. Paolo Virno 'Notes on the General Intellect', p. 266.
39. Ibid., p. 270.
40. Ibid., p. 271.
41. See Maurizio Lazzarato 'New Forms of Production', in Bosma et al. Readme! Filtered by Nettime, pp. 159–66; and Tessa Morris-Suzuki 'Robots and Capitalism', in Davis et al. (eds) Cutting Edge, pp. 13–27.
42. See Toni Negri 'Back to the Future', in Bosma et al. Readme! Filtered by Nettime, pp. 181–6; and Donna Haraway Simians, Cyborgs, and Women.
43. Andrew Ross Real Love.
44. See Richard Barbrook 'The High-Tech Gift Economy'.
45. The work of Jean-François Lyotard in The Postmodern Condition is mainly concerned with knowledge, rather than intellectual labour, but still provides a useful conceptualization of the reorganization of labour within the productive structures of late capitalism.
46.
See Arthur Kroker and Michael A. Weinstein Data Trash.
47. See Howard Rheingold The Virtual Community.
48. See Howard Rheingold 'My experience with Electric Minds', in Bosma et al. Readme! Filtered by Nettime, pp. 147–50; also David Hudson Rewired. The expansion of the Net is based on different types of producers adopting different strategies of income generation: some rely on more traditional types of financial support (grants, divisions of the public sector, in-house Internet divisions within traditional media companies, business web pages which are paid for like traditional forms of advertising); some generate interest in a page and then sell the user's profile or advertising space (freelance web production); others pursue innovative strategies of valorization such as book publishing (e-commerce).
49. See Margonelli 'Inside AOL's "Cyber-Sweatshop"'.


50. Andrew Leonard 'Open Season', p. 140. Open source harks back to the specific competencies embodied by Internet users in its pre-1994 days. When most net users were computer experts, the software structure of the medium was developed by way of a continuous interaction of different technical skills. This tradition still survives in institutions like the Internet Engineering Task Force (IETF), which is responsible for a number of important decisions about the technical infrastructure of the Net. Although the IETF is subordinated to a number of professional committees, it has important responsibilities and is open to anybody who wants to join. The freeware movement has a long tradition, but it has also recently been divided by the polemics between the free software or 'copyleft' movement and the open-source movement, which is more of a pragmatic attempt to make freeware a business proposition (see debates on www.gnu.org; www.salonmag.com).
51. Andrew Leonard 'Open Season'.
52. Ibid., p. 142.
53. It is an established pattern of the computer industry, in fact, that you might have to give away your product if you want to reap the benefits later on. As John Perry Barlow has remarked, '[F]amiliarity is an important asset in the world of information. It may often be the case that the best thing you can do to raise demand for your product is to give it away' (John Perry Barlow 'Selling Wine Without Bottles', p. 23). Apple started it by giving free computers to schools, an action which did not determine, but certainly influenced, the subsequent stubborn presence of Apple computers within education; MS-DOS came free with IBM computers.
54. … the technical and social structure of the Net has been developed to encourage open cooperation among its participants. As an everyday activity, users are building the system together.
Engaged in 'interactive creativity', they send emails, take part in listservers, contribute to newsgroups, participate within on-line conferences and produce websites (Tim Berners-Lee 'Realising the Full Potential of the Web'). Lacking copyright protection, information can be freely adapted to suit the users' needs. Within the hi-tech gift economy, people successfully work together through '…an open social process involving evaluation, comparison and collaboration'. (Richard Barbrook 'The High-Tech Gift Economy', pp. 135–6)

55. John Horvath 'Freeware Capitalism', posted to nettime, 5 February 1998.
56. Ibid.
57. Netscape started like a lot of other computer companies: its founder, Marc Andreessen, was part of the research group at the University of Illinois' National Center for Supercomputing Applications (NCSA) which developed Mosaic, one of the first browsers for the World Wide Web (itself developed at the CERN laboratory in Geneva). Like many successful computer entrepreneurs, he developed the browser as an offshoot of the original, state-funded research and soon started his own company. Netscape was also the first company to exceed the economic limits of the computer industry, inasmuch as it was the
first successful company to set up shop on the Net itself. As such, Netscape exemplifies some of the problems which even the computer industry met on the Net and constitutes a good starting point from which to assess some of the common claims about the digital economy.
58. Andrew Ross Real Love.
59. Chip Bayers 'Push Comes to Show', p. 113.
60. Ibid., p. 156.
61. Ibid.

CHAPTER 4

1. Lewis Mumford Technics and Civilization, p. 163.
2. On the artificiality of the plane of nature, see Luciana Parisi's discussion of molecular biology in Abstract Sex.
3. See Kevin Kelly Out of Control.
4. Christopher G. Langton 'Artificial Life', in C. G. Langton (ed.) Artificial Life, p. 5.
5. Charles Taylor and David Jefferson 'Artificial Life as a Tool for Biological Inquiry', p. 1.
6. Manuel De Landa, drawing on the scientific literature on the subject, has summarized this shift as one from the 'ideal type' to the 'population'. The concept of the 'ideal type' is of Aristotelian origin and dominated biological thought for over 2,000 years. In the concept of the 'ideal type', 'a given population of animals was conceived as being the more or less imperfect incarnation of an ideal essence'. We can also understand the notion of 'ideal type' according to the principles of semiotics, where each sign is composed of a 'signifier' (in De Landa's example the word 'zebra') that refers to a 'signified' (the 'ideal' zebra conceived as possessing all the essential features of zebrahood). All real zebras (the referent in Saussurean terminology) are therefore specific and necessarily imperfect incarnations of the ideal type 'zebra' that constitutes the signified or concept underlying the noun. The revolution introduced by 'population thinking' in the 1930s (and a necessary moment in the production of the current biological turn) is thus that the ideal type (or, we could say, the sign) is abandoned for the population.
At the centre of evolutionary theory we no longer find an ideal animal, made up of all those traits that allowed it to evolve and survive in a specific environment, but a set of differentiated traits, spread across the whole population of zebras, a relatively stable system crossed by continuous variation. The new object of evolutionary theory, then, is not the fit individual but the dynamic of populations. (Manuel De Landa 'Virtual Environments and the Emergence of Synthetic Reason'.)
7. John H. Holland Emergence, p. 88.
8. See Rodney Brooks Flesh and Machines.
9. Gregory Bateson Steps to an Ecology of Mind, p. 288.
10. Kevin Kelly Out of Control, pp. 16–18.
11. Langton 'Artificial Life', p. 3.
12. Ibid., p. 6.


13. George A. Cowan 'Conference Opening Remarks', in Cowan et al. (eds) Complexity, p. 3.
14. Homa Bahrami and Stuart Evans 'Flexible Recycling and High-Technology Entrepreneurship', in Martin Kenney (ed.) Understanding Silicon Valley, p. 166.
15. Holland Emergence, p. 113.
16. Manuel Castells and Peter Hall Technopoles of the World.
17. Kelly Out of Control, p. 13.
18. Holland Emergence, p. 241.
19. Kelly Out of Control, p. 21.
20. Ibid., p. 12.
21. Atomic theory, a pre-Socratic suggestion, claims that the universe is composed of all the possible combinations of tiny invisible and indivisible elements called 'atoms' falling freely through an unbounded and void space. As Lucretius, the author of the Epicurean poem–treatise De Rerum Natura, or On the Nature of the Universe, put it: 'When the atoms are travelling straight down through empty space by their own weight, at quite indeterminable times and places they swerve ever so little from their course, just so much that you can call it a change of direction. If it were not for this swerve [clinamen], everything would fall downwards like raindrops through the abyss of space. No collision would take place and no impact of atom upon atom would be created. Thus nature would never have created anything.' (Lucretius On the Nature of the Universe, p. 43.)
22. Ilya Prigogine and Isabelle Stengers Order Out of Chaos, p. 141.
23. Michel Serres Hermes: Literature, Science, Philosophy, p. 100.
24. Steven Levy Artificial Life, p. 109.
25. Tim Berners-Lee Weaving the Web, p. 203.
26. Prigogine and Stengers Order Out of Chaos, p. 14.
27. Manuel De Landa 'Deleuze and the Use of the Genetic Algorithm in Architecture'.
28. See Manuel De Landa 'Virtual Environments'.
29. Duncan J. Watts Small Worlds, p. 181.
30. Mark Ward Virtual Organisms, p. 78.
31. See Brian Massumi 'Chaos in the "total field" of vision', in Parables for the Virtual.
32. See George Caffentzis 'Why Machines Cannot Create Value', in J. Davis et al. (eds) Cutting Edge, pp. 29–56.
33.
See Watts Small Worlds, p. 186.
34. Ibid.
35. One of the ambitions of CA researchers, in fact, is not simply to build an abstract machine able to overcome the limits of the Turing machine, but also to model the logic of life. It is life, in fact, that is imagined as a great computational machine able to program matter and hence to engender the wide variety of forms that evolution has produced on earth. In particular, by taking on the perspective of populations (in this case populations of cells or particles), CA researchers aspire to copy the mechanical capacity of evolutionary dynamics to invent new forms of life. Artificial life scientists in particular are very keen to point out that there is no vitalism at stake here. They do not believe that life is a mysterious quality
able to produce living beings. On the contrary, as Langton remarked, the understanding of evolutionary dynamics in place in ALife is strictly mechanistic (although nondeterministic). Life is no mysterious quality that descends from above to the earth, but an emergent process that results from the interaction of a multitude of elements in relations of local connection. (See Langton 'Artificial Life'.)
36. Lev Manovich The Language of New Media, p. 213.
37. Kelly Out of Control, p. 27.
38. Luc Steels 'Emergent Functionality in Robotic Agents through On-line Evolution', in Rodney A. Brooks and Pattie Maes (eds) Artificial Life IV, p. 8.
39. Peter Cariani 'Emergence and Artificial Life', in C. G. Langton et al. (eds) Artificial Life II, p. 775.
40. See the Bionomics Institute site at http://www.bionomics.org/text/insttute/sop.html (last accessed 15 August 2000).
41. Sadie Plant 'The Virtual Complexity of Culture', p. 206.
42. Quoted in Howard Rheingold The Virtual Community, p. 7.
43. Ibid.
44. Nicholas Negroponte Being Digital, p. 181.
45. Taylor and Jefferson 'Artificial Life', p. 8.
46. Stefan Helmreich Silicon Second Nature, p. 47.
47. Richard Dawkins The Selfish Gene, p. 29.
48. Ibid., p. 26.
49. Ibid., p. 34.
50. Ibid., p. 35.
51. Gilles Deleuze Negotiations.
52. On this subject see Andrew Ross The Chicago Gangster Theory of Life.
53. Franco Berardi (Bifo) La fabbrica dell'infelicità.
54. Franco Berardi 'Social Entropy and Recombination', p. 20.

CHAPTER 5

1. Joseph Stiglitz Globalization and Its Discontents, p. 229.
2. On this subject, see Armand Mattelart The Invention of Communication.
3. For an influential critique of this notion of communication, see Jacques Derrida 'Signature, Event, Context', in P. Kamuf (ed.) A Derrida Reader.
4. See Jürgen Habermas 'The Public Sphere', p. 102.
5. Ibid.
6. See John Hartley The Politics of Pictures. For a discussion of cyberdemocracy, the Internet and the public sphere see also Mark Poster 'Cyberdemocracy'.
7.
On this subject, see Timothy Bewes 'Truth and appearance in politics: the mythology of spin', in T. Bewes and J. Gilbert (eds) Cultural Capitalism, pp. 158–76.
8. Jean Baudrillard In the Shadow of the Silent Majorities, p. 46.
9. See John Cassidy Dot.con.
10. See autonome a.f.r.i.k.a.-gruppe, Luther Blissett and Sonja Bruenzels 'What about Communication Guerrilla? A message about guerilla
communication out of the deeper German backwoods', Version 2.0, posted to nettime, 16 September 1998 (available at http://amsterdam.nettime.org/Lists-Archives/nettime-l-9809/msg00044.html; last accessed 23 June 2003).
11. Jean Baudrillard In the Shadow, p. 37.
12. See Robert Parry 'Lost History'.
13. For an example, see Mediators, a Pakistan-based firm of perception management consultancy, http://www.praffairs.com/perciption.htm, whose clients include Microsoft and Shell Pakistan.
14. Armand Mattelart The Invention of Communication, p. xiii.
15. See Arthur Kroker 'Digital Humanism: The Processed World of Marshall McLuhan', in Arthur and Marilouise Kroker (eds) Digital Delirium.
16. See Gary Bunt Virtually Islamic.
17. See Howard Rheingold The Virtual Community. For a sample of the polemic that greeted Howard Rheingold's book see also Kevin Robins 'Cyberspace or the World We Live In'.
18. On the subject of communication, community and nationalism, see Benedict Anderson Imagined Communities; and also Paddy Scannell Radio, Television, and Modern Life.
19. George Packer 'Where Here Sees There'.
20. Mattelart The Invention of Communication, p. xiii.
21. Packer 'Where Here Sees There'; see also the 'Communication Making the World Less Tolerant' thread at http://slashdot.org/article.pl?sid=02/04/21/1238236.
22. Quoted in Manuel Castells The Rise of the Network Society, pp. 339–40.
23. David Garcia 'Islam and Tactical Media on Amsterdam Cable', posted to nettime, 3 April 2002 (available at http://amsterdam.nettime.org/Lists-Archives/nettime-l-0204/msg00018.html).
24. Brian Massumi Parables for the Virtual.
25. To interrogate the relationship between images, affects and ideas involves an assessment of the relationship between bodies and minds, ethics and morality. From this perspective, it is useful to remember Gilles Deleuze's 'morality test', which suggests that we follow a simple method for defining the difference between an ethical and a moralistic perspective on the world.
Whenever you have an opposition of interests andtendencies between body and soul, there you have the moral law.Wherever you think that the body’s needs and desires are directly opposedto and in contrast with those of the consciousness/mind/soul, there youhave an ignorance of the power of the body and hence an emergingmorality (see Gilles Deleuze Expressionism in Philosophy).26. Geert Lovink Dark Fiber, esp. p. 137.27. See Tiziana Terranova ‘Demonstrating the globe: virtual and real actionin the network society’ in D. Holmes (ed.) Virtual Globalization.28. Sylvère Lotringer and Christian Marazzi (eds) Italy.29. Michael Hardt and Antonio Negri Empire, p. 103.30. Critical Art Ensemble ‘Electronic Civil Disobedience, Simulation, andthe Public Sphere’; see also their pamphlet Electronic Civil Disobedienceand other unpopular ideas.


170 Network Culture31. Cleaver prefers the notion of a ‘hydrosphere’ to that of the net in asmuch as the latter seems to him to be more appropriate to globalorganizations such as the NGOs that rely on stable nodes organized witha view to act on specific issues. Network-based movements, on the otherhand, seem to him to exceed the network because of the intrinsic mobilityof their elements, connected together by a multiplicity of communicationchannels, converging and diverging in mobile configurations. Harry M.Cleaver ‘Computer-linked Social Movements and the Global Threat toCapitalism’. On forms of agency in networked environments, see alsoAndreas Broeckmann ‘Minor Media – Heterogenic Machines’.32. It was a pre-Enlightenment materialist thinker, Baruch Spinoza, whoconvincingly spoke of the importance of affects and passions as the basicterrain of politics; and it was again Spinoza who considered the productionof common notions as the basic process through which the ethicalconstitution of the world takes place. On this subject, see Benedictus deSpinoza The Collected Work of Spinoza. See also Antonio Negri The SavageAnomaly; and Moira Gatens Imaginary Bodies.


Bibliography

Abbate, Janet Inventing the Internet (Cambridge, Mass.: MIT Press, 1999)
Agre, Phil 'Subject: Avoiding heat death on the Internet', in J. Bosma et al. (eds) Readme! Filtered by Nettime (New York: Autonomedia, 1999), pp. 343–56
Anderson, Benedict Imagined Communities: reflections on the origin and spread of nationalism (London: Verso, 1991, second edition)
Ang, Ien 'Global Media/Local Meanings', in Living Room Wars: Rethinking Media Audiences for a Postmodern World (London and New York: Routledge, 1996)
Anonymous 'The Digital Artisan Manifesto', posted to nettime, 15 May 1997
Arns, Inke and Andreas Broeckmann 'Rise and Decline of the Syndicate: the End of an Imagined Community', posted to nettime-l@bbs.thing.net, 13 November 2001
Aronowitz, Stanley and William Di Fazio The Jobless Future: Sci-Tech and the Dogma of Work (Minneapolis and London: University of Minnesota Press, 1994)
Ashby, W. Ross An Introduction to Cybernetics (London: Chapman & Hall, 1957)
Barabási, Albert-László Linked: The New Science of Networks (Cambridge, Mass.: Perseus Publishing, 2002)
Baran, Paul 'On Distributed Communications' (Memorandum RM-3420-PR), Rand Corporation, August 1964, http://www.rand.org/publications/RM/RM3420/ (last accessed 15 May 2003)
Barbrook, Richard 'The High-Tech Gift Economy', in Josephine Bosma et al. Readme! Filtered by Nettime: ASCII Culture and the Revenge of Knowledge (New York: Autonomedia, 1999), pp. 132–8
—— 'The Digital Economy', posted to nettime, 17 June 1997 (also at www.nettime.org)
Barlow, John Perry 'Selling Wine Without Bottles: the Economy of Mind on the Global Net', in Peter Ludlow (ed.) High Noon on the Electronic Frontier: Conceptual Issues in Cyberspace (Cambridge, Mass.: MIT Press, 1996)
Bateson, Gregory Mind and Nature: A Necessary Unity (New York: E. P. Dutton, 1979)
—— Steps to an Ecology of Mind (St. Albans: Paladin, 1973)
Baudrillard, Jean In the Shadow of the Silent Majorities (New York: Semiotext(e), 1983)
Bayers, Chip 'Push Comes to Show', Wired, volume 7, number 2 (February 1999), p. 113
Bekenstein, Jacob D. 'Information in the Holographic Universe', Scientific American, August 2003, pp. 48–55


Berardi, Franco 'Social Entropy and Recombination', in ØYES make-world paper no. 2 (November 2002) (translated by Erik Simpson and Arianna Bo), p. 20
—— La Nefasta Utopia di Potere Operaio (Roma: Castelvecchi/DeriveApprodi, 1998)
—— La fabbrica dell'infelicità. New Economy e movimento del cognitariato (Roma: DeriveApprodi, 2001)
Bergson, Henri Matter and Memory (translated by N. M. Paul and W. S. Palmer) (New York: Zone Books, 1988)
Berners-Lee, Tim Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web (New York: Harper Business, 2000)
—— 'Realising the Full Potential of the Web', http://www.w3.org/1998/02/Potential.html (last accessed 2 February 2003)
Berry, Josephine 'The Thematics of Site-Specific Art on the Net' (unpublished Ph.D. thesis, University of Manchester, 2001)
Bewes, Timothy and Jeremy Gilbert (eds) Cultural Capitalism: Politics After New Labour (London: Lawrence & Wishart, 2000)
Blakemore, Michael and Roger Longhorn 'Communicating Information about the World Trade Center Disaster: Ripples, Reverberations, and Repercussions', First Monday, volume 6, number 12 (December 2001), http://firstmonday.org/issues/issue6_12/blakemore/index.html (last accessed 22 August 2003)
Blood, Rebecca 'Weblogs: a history and perspective', Rebecca's Pocket, http://www.rebeccablood.net/essays/weblog_history.html (last updated September 2000; last accessed 22 April 2003)
Bogard, William 'Distraction and Digital Culture', Ctheory, Articles: A088, 10 May 2000, http://www.ctheory.net/text_file.asp?pick=131 (last accessed 21 August 2003)
Bosma, Josephine, Pauline Van Mourik Broekmann, Ted Byfield, Matthew Fuller, Geert Lovink, Diana McCarty, Pit Schultz, Felix Stalder, McKenzie Wark and Faith Wilding Readme! Filtered by Nettime: ASCII Culture and the Revenge of Knowledge (New York: Autonomedia, 1999)
Broeckmann, Andreas 'Minor Media – Heterogenic Machines', posted to nettime-l@Desk.nl, 13 November 1998 (also at http://www.nettime.org/Lists-Archives/nettime-l-9811/msg00029.html; last accessed 23 June 2003)
Brooks, Rodney Flesh and Machines: How Robots Will Change Us (New York: Vintage Books, 2002)
Brooks, Rodney A. and Pattie Maes (eds) Artificial Life IV: Proceedings of the Fourth International Workshop on the Synthesis and Simulation of Living Systems (Cambridge, Mass.: MIT Press, 1995)
Bunt, Gary Virtually Islamic: Computer-mediated Communication and Cyber Islamic Environments (Cardiff: University of Wales Press, 2000)
Campbell, Jeremy Grammatical Man: Information, Entropy, Language, and Life (New York: Simon & Schuster, 1982)
Cassidy, John Dot.con: The Greatest Story Ever Sold (London: Penguin Press, 2002)
Castells, Manuel The Internet Galaxy: reflections on the internet, business, and society (Oxford: Oxford University Press, 2001)


—— The Rise of the Network Society (Oxford: Blackwell, 1996)
Castells, Manuel and Peter Hall Technopoles of the World: The Making of 21st Century Industrial Complexes (London and New York: Routledge, 1994)
Cleaver, Harry M. 'Computer-linked Social Movements and the Global Threat to Capitalism', posted to aut-op-sy@lists.village.virginia.edu, 11 December 1999 (also at http://www.cseweb.org.uk/downloads/cleaver.pdf)
Clement, Andrew 'Office Automation and the Technical Control of Information Workers', in Vincent Mosco and Janet Wasko (eds) The Political Economy of Information (University of Wisconsin Press, 1988)
Clough, Patricia Ticineto Autoaffection: unconscious thought in the age of teletechnology (Minneapolis: University of Minnesota Press, 2000)
Cowan, George A., David Pines and David Meltzer (eds) Complexity: Metaphors, Models, and Reality, Proceedings Vol. XIX, Santa Fe Institute, Studies in the Sciences of Complexity (Reading, Mass.: Addison-Wesley, 1994), pp. 1–3
Crasson, F. J. 'Information Theory and Phenomenology', in Crasson and Sayre (eds) Philosophy and Cybernetics (University of Notre Dame Press, 1967)
Critical Art Ensemble 'Electronic Civil Disobedience, Simulation, and the Public Sphere', posted to nettime-l@Desk.nl, 11 January 1999
—— Electronic Civil Disobedience and Other Unpopular Ideas (New York: Autonomedia, 1996)
Davis, Jim, Thomas Hirsch and Michael Stack (eds) Cutting Edge: Technology, Information, Capitalism and Social Revolution (London: Verso, 1997)
Dawkins, Richard The Selfish Gene (New York: Oxford University Press, 1976)
De Landa, Manuel 'Deleuze and the Use of the Genetic Algorithm in Architecture', http://boo.mi2.hr/~ognjen/tekst/delanda2001.html (last accessed 21 June 2003)
—— 'Virtual Environments and the Emergence of Synthetic Reason' (1998), http://www.t0.or.at/delanda.htm (last accessed 20 May 2002)
Deleuze, Gilles Bergsonism (New York: Zone, 1988)
—— Cinema 1: The Movement-Image (London: Athlone Press, 1992), p. 11
—— Expressionism in Philosophy: Spinoza (New York: Urzone, 1990)
—— Negotiations: 1972–1990 (New York: Columbia University Press, 1995)
Deleuze, Gilles and Felix Guattari A Thousand Plateaus: Capitalism and Schizophrenia (London: Athlone Press, 1988)
—— Anti-Oedipus: Capitalism and Schizophrenia (London: Athlone Press, 1984)
d'Eramo, Marco 'L'abisso non sbadiglia più', in G. Baglione, F. Carlini, S. Carrà, M. Cini, M. d'Eramo, G. Parisi and S. Ruffo (eds) Gli Ordini del Caos (Roma: Manifesto Libri, 1991), pp. 19–70
Foucault, Michel The Order of Things: An Archaeology of the Human Sciences (London and New York: Routledge, 1992 [1970])
Garcia, David 'Islam and Tactical Media on Amsterdam Cable', posted to nettime, 3 April 2002 (also at http://amsterdam.nettime.org/Lists-Archives/nettime-l-0204/msg00018.html)
Gatens, Moira Imaginary Bodies: ethics, power and corporeality (London: Routledge, 1996)


Gilroy, Paul The Black Atlantic: Modernity and Double Consciousness (London and New York: Verso, 1993), p. 40
Gleick, James 'Push Me Pull You', New York Times Magazine, 23 March 1997 (also at http://www.around.com/push.html; last accessed 23 June 2003)
—— Chaos (Abacus, 1987)
Habermas, Jürgen 'The Public Sphere: An Encyclopedia Article', in Meenakshi Gigi Durham and Douglas M. Kellner (eds) Media and Cultural Studies: Keyworks (Oxford: Blackwell, 2001), pp. 102–7
Hall, Stuart 'Encoding, Decoding', in Simon During (ed.) Cultural Studies: A Reader (London and New York: Routledge, 1993), pp. 90–103
Haraway, Donna Simians, Cyborgs, and Women: The Reinvention of Nature (London: FA Books, 1991)
Hardt, Michael and Antonio Negri Empire (Cambridge, Mass.: Harvard University Press, 2000)
Hartley, John The Politics of Pictures: The Creation of the Public in the Age of Popular Media (New York: Routledge, 1992)
Helmreich, Stefan Silicon Second Nature: Culturing Artificial Life in a Digital World (Berkeley, Los Angeles and London: University of California Press, 2000)
Hesmondhalgh, David 'Indie: The Aesthetics and Institutional Politics of a Popular Music Genre', Cultural Studies, volume 13, number 1 (January 1999), pp. 34–61
Holland, John H. Emergence: From Chaos to Order (Cambridge, Mass.: Perseus Books, 1998)
Holmes, David (ed.) Virtual Globalization: Virtual Spaces/Tourist Spaces (London and New York: Routledge, 2000)
Hudson, David Rewired: a brief (and opinionated) net history (Indianapolis: Macmillan Technical Publishing, 1997)
Jordan, Tim Cyberpower: the culture and politics of cyberspace and the Internet (London: Routledge, 1999)
Kalathil, Shanthi and Taylor C. Boas 'The Internet and State Control in Authoritarian Regimes: China, Cuba, and the Counterrevolution', First Monday, volume 6, number 8 (August 2001), http://firstmonday.org/issues/issue6_8/kalathil/index.html (last accessed 16 June 2003)
Kamuf, Peggy (ed.) A Derrida Reader: Between the Blinds (New York: Columbia University Press, 1991)
Kelly, Kevin Out of Control: The New Biology of Machines, Social Systems, and the Economic World (Reading, Mass.: Addison-Wesley, 1994)
Kenney, Martin (ed.) Understanding Silicon Valley: The Anatomy of an Entrepreneurial Region (Stanford, Calif.: Stanford University Press, 2000)
Kittler, Friedrich 'A History of Communication Media', Ctheory, 30 July 1996, http://www.ctheory.net/text_file.asp?pick=45 (last accessed 21 August 2003)
Kroker, Arthur and Marilouise Kroker (eds) Digital Delirium (New York: St. Martin's Press, 1997)
Kroker, Arthur and Michael A. Weinstein Data Trash: The Theory of the Virtual Class (New York: St. Martin's Press, 1994)
Langton, Christopher G. (ed.) Artificial Life: The Proceedings of an Interdisciplinary Workshop on the Synthesis and Simulation of Living Systems Held September 1987 in Los Alamos, New Mexico, Vol. VI (Redwood City, Calif.: Addison-Wesley, 1989)
Langton, Christopher G., Charles Taylor, J. Doyne Farmer and Steen Rasmussen (eds) Artificial Life II: Proceedings of the Workshop on Artificial Life Held February 1990 in Santa Fe, New Mexico, Proceedings Vol. X, Santa Fe Institute, Studies in the Sciences of Complexity (Redwood City, Calif.: Addison-Wesley, 1992)
Lash, Scott Critique of Information (London: Sage, 2002)
Lash, Scott and John Urry The End of Organized Capitalism (Cambridge: Polity Press, 1987)
Lazzarato, Maurizio 'Immaterial Labor', in Saree Makdisi et al. (eds) Marxism Beyond Marxism (London: Routledge, 1996)
Leonard, Andrew 'Open Season', Wired, volume 7, number 5 (May 1999)
Lessard, Bill and Steve Baldwin NetSlaves: True Tales of Working the Web (New York: McGraw-Hill, 2000)
Levy, Pierre Becoming Virtual: Reality in the Digital Age (translated by Robert Bononno) (New York and London: Plenum Trade, 1998)
—— Collective Intelligence: Mankind's Emergent World in Cyberspace (Cambridge, Mass.: Perseus Books, 1999)
Levy, Steven Artificial Life: A Report from the Frontier where Computers meet Biology (New York: Vintage Books, 1992)
Licklider, J. C. R. and Robert W. Taylor 'The computer as a communication device', in In Memoriam: J. C. R. Licklider 1915–1990 (Palo Alto, Calif.: Systems Research Center, 1990)
Lotringer, Sylvère and Christian Marazzi (eds) Italy: Autonomia: Post-Political Politics (New York: Semiotext(e), 1980)
Lovink, Geert Dark Fiber: Tracking Critical Internet Culture (Cambridge, Mass. and London: MIT Press, 2002)
Lucretius On the Nature of the Universe (translated by R. E. Latham; revisions, introduction and notes by John Godwin) (London: Penguin Books, 1994)
Lyotard, Jean-François The Postmodern Condition: A Report on Knowledge (translated by Geoff Bennington and Brian Massumi) (Minneapolis: University of Minnesota Press, 1989)
McCaffery, Larry (ed.) Storming the Reality Studio (London and Durham: Duke University Press, 1991)
McRobbie, Angela British Fashion Design: Rag Trade or Image Industry? (London and New York: Routledge, 1998)
Makdisi, Saree, Cesare Casarino and Rebecca E. Karl (eds) (for the Polygraph collective) Marxism Beyond Marxism (London: Routledge, 1996)
Manovich, Lev The Language of New Media (Cambridge, Mass.: MIT Press, 2001)
Margonelli, Lisa 'Inside AOL's "Cyber-Sweatshop"', Wired, volume 7, number 10 (October 1999), p. 138
Marx, Karl Grundrisse (London: Penguin Books, 1973)
Massumi, Brian 'Sensing the Virtual, Building the Insensible', in Gary Genosko (ed.) Deleuze and Guattari: Critical Assessments of Leading Philosophers (London and New York: Routledge, 2001)


—— Parables for the Virtual: Movement, Affect, Sensation (Durham and London: Duke University Press, 2002)
Mattelart, Armand The Invention of Communication (Minneapolis: University of Minnesota Press, 1996)
Mumford, Lewis Technics and Civilization (London: George Routledge and Sons, 1934)
Natella, Andrea and Serena Tinari (eds) Rave Off (Roma: Castelvecchi, 1996)
Negri, Antonio 'On Gilles Deleuze and Felix Guattari, A Thousand Plateaus', in G. Genosko (ed.) Deleuze and Guattari (London and New York: Routledge, 2001)
—— Guide. Cinque Lezioni su Impero e dintorni (Milano: Raffaello Cortina Editore, 2003)
—— Marx Beyond Marx: Lessons on the Grundrisse (New York: Autonomedia, 1991)
—— The Politics of Subversion: A Manifesto for the Twenty-First Century (Cambridge: Polity Press, 1989)
—— The Savage Anomaly: The Power of Spinoza's Metaphysics and Politics (translated by Michael Hardt) (Minneapolis and Oxford: University of Minnesota Press, 1991)
Negroponte, Nicholas Being Digital (New York: Alfred A. Knopf, 1995)
Network Working Group (ed. B. Carpenter) 'Architectural Principles of the Internet', http://www.ietf.org/rfc/rfc1958.txt (June 1996; last accessed 16 June 2003)
Packer, George 'Where Here Sees There', New York Times, 21 April 2002, http://www.nytimes.com/2002/04/21/magazine/21WWLN.html?ex=1051070400&en=82a6368b52165a3b&ei=5070 (last accessed 1 December 2002)
Parisi, Luciana Abstract Sex: Philosophy, Biotechnology and the Mutations of Desire (London: Continuum Press, 2004)
Parry, Robert 'Lost History: CIA perception management', Consortium, http://www.consortiumnews.com/archive/lost12.html (last accessed 8 June 2003)
Pearson, Keith Ansell and John Mullarkey 'Introduction', in Henri Bergson: Key Writings (London and New York: Continuum, 2002), pp. 1–45
Plant, Sadie 'The Virtual Complexity of Culture', in G. Robertson et al. (eds) FutureNatural (London and New York: Routledge, 1996), pp. 203–17
Poster, Mark 'Cyberdemocracy: The Internet and the Public Sphere', in David Porter (ed.) Internet Culture (New York and London: Routledge, 1997), pp. 201–18
Prigogine, Ilya and Isabelle Stengers Order Out of Chaos: Man's New Dialogue with Nature (Toronto, New York, London and Sydney: Bantam Books, 1984)
Rheingold, Howard The Virtual Community: Homesteading on the Electronic Frontier (New York: Harper Perennial, 1994)
—— Tools for Thought: The History and Future of Mind-Expanding Technology (Cambridge, Mass.: MIT Press, 2000 [1985])
Ritzer, George The McDonaldization of Society (Thousand Oaks, Calif.: Pine Forge Press, 2000)
Robins, Kevin 'Cyberspace or the World We Live In', in Jon Dovey (ed.) Fractal Dreams: New Media in Social Context (London: Lawrence & Wishart, 1996)
Ross, Andrew No-Collar: the humane workplace and its hidden costs (New York: Basic Books, 2003)
—— Real Love: In Pursuit of Cultural Justice (London and New York: Routledge, 1998)
—— The Chicago Gangster Theory of Life (London and New York: Verso, 1994)
Scannell, Paddy Radio, Television, and Modern Life: a phenomenological approach (Oxford: Blackwell, 1996)
Segal, Jérôme Théorie de l'information: sciences, techniques et société de la seconde guerre mondiale à l'aube du XXIe siècle (Thèse de Doctorat, Faculté d'Histoire de l'Université Lyon II, Chaire Interuniversitaire d'Histoire des Sciences et des Techniques, 1998), http://www.mpiwg-berlin.mpg.de/staff/segal/thesis/ (last accessed 7 October 2003)
Segaller, Stephen Nerds 2.0.1: A Brief History of the Internet (New York: TV Books LLC, 1998)
Serres, Michel Hermes: Literature, Science, Philosophy (edited by Josué V. Harari and David F. Bell) (Baltimore and London: Johns Hopkins University Press, 1982)
Severin, Werner and James Tankard Communication Theories (fourth edition) (New York: Longman, 1997), p. 47
Shannon, Claude E. 'A Mathematical Theory of Communication', in N. J. A. Sloane and Aaron D. Wyner (eds) Claude Elwood Shannon: Collected Papers (New York: IEEE Press, 1999), pp. 5–83
Shannon, Claude E. and Warren Weaver The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1963 [1949])
Simondon, Gilbert L'individuation psychique et collective. À la lumière des notions de Forme, Information, Potentiel et Métastabilité (Paris: Editions Aubier, 1989)
Spinoza, Benedictus de The Collected Works of Spinoza (Princeton, N.J.: Princeton University Press, 1985)
Stiglitz, Joseph Globalization and Its Discontents (London: Penguin Books, 2002)
Stone, Allucquère Rosanne 'Will the Real Body Please Stand Up? Boundary stories about virtual cultures', in D. Bell and B. M. Kennedy (eds) The Cybercultures Reader (London and New York: Routledge, 2000), pp. 504–28
Suvin, Darko 'On Gibson and Cyberpunk SF', in Larry McCaffery (ed.) Storming the Reality Studio (London and Durham: Duke University Press, 1991), pp. 349–65
Tapscott, Don The Digital Economy (New York: McGraw-Hill, 1996)
Taylor, Charles and David Jefferson 'Artificial Life as a Tool for Biological Inquiry', in C. G. Langton (ed.) Artificial Life: An Overview (Cambridge, Mass.: MIT Press, 1995), pp. 1–13
Tetzlaff, David 'Yo-Ho-Ho and a Server of Warez: Internet Software Piracy and the New Global Information Economy', in A. Herman and T. Swiss (eds) The World Wide Web and Contemporary Cultural Theory (London and New York: Routledge, 2000), pp. 99–126


Verhulst, Stefaan 'Public legitimacy: ICANN at the crossroads', Open Democracy, http://www.opendemocracy.net/forum/document_details.asp?CatID=18&DocID=611&DebateID=109 (September 2001)
Virilio, Paul The Information Bomb (translated by Chris Turner) (London and New York: Verso, 2000)
Virilio, Paul and Friedrich Kittler 'The Information Bomb: a conversation', Angelaki: journal of the theoretical humanities (special issue Machinic Modulations, New Cultural Theory and Technopolitics), volume 4, number 2 (September 1999), pp. 81–90
Virno, Paolo and Michael Hardt (eds) Radical Thought in Italy: A Potential Politics (Minneapolis: University of Minnesota Press, 1996)
Ward, Mark Virtual Organisms: The Startling World of Artificial Life (London: Macmillan, 1995)
Wark, McKenzie Virtual Geography: Living With Global Media Events (Bloomington: Indiana University Press, 1994)
Watts, Duncan J. Small Worlds: The Dynamics of Networks between Order and Randomness (Princeton, N.J.: Princeton University Press, 1999)
Weaver, Warren 'Recent Contributions to the Mathematical Theory of Communication', in C. E. Shannon and W. Weaver The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1963 [1949])
Webster, Frank Theories of the Information Society (London and New York: Routledge, 1995)
Wiener, Norbert Cybernetics: or Control and Communication in the Animal and the Machine (Cambridge, Mass.: MIT Press, 1961)
—— The Human Use of Human Beings: Cybernetics and Society (London: FA Books, 1989)
Wiggins, Richard W. 'The Effects of September 11 on the Leading Search Engine', First Monday, volume 6, number 10 (October 2001), http://firstmonday.org/issues/issue6_10/wiggins/index.html (last accessed 22 August 2003)


Index

11 September 71, 128, 162n50
Advertising 14, 17, 74
Affect 151–2; affects 156–7
Agre, Phil 70
America Online 73, 86, 91–2
Analogue machines 8, 32, 33
Architecture 19, 56, 140
ARPANET 18, 54–5, 57–8, 64, 76
Artificial intelligence 36
  connectionism 36, 101–2
  neural networks 102
Atomic theory 167n21
Autonomia Operaia 4, 7, 83, 86–8; see also Marxism
Barabási, Albert-László 48
Baran, Paul 56, 64–6
Barbrook, Richard 76–7, 89, 93, 165n54
Bateson, Gregory 33, 102
Baudrillard, Jean 14, 32, 135–9, 141
Bell's Laboratories 11, 12
Benjamin, Walter 140
Berardi, Franco 127, 130
Bergson, Henri 26, 50–2
Berners-Lee, Tim 45–6, 61–2, 108, 165n54
Biological computation 4, 99–101, 108–14
  artificial life 102–3, 167–8n35
  cellular automata 109–13, 167–8n35
  connectionism 101
  emergence 100, 102, 105–6, 114, 115, 119
  genetic algorithms 114–15
  neural networks 67, 99, 101–2
Bionomics 119
Biophysical
  processes 27
  properties 42
  systems 63
Boltzmann, Ludwig 21, 29, 159n21
Bottom-up organization 99–100, 101, 102, 105, 116
Brillouin, Louis 29–30
Broadcasting 40, 64
Brooks, Rodney 36, 102
Bulletin Board Systems 68
Bush, Vannevar 61, 109
Caffentzis, George 112
'California Ideology', The 120
Canguilhem, George 98
Capital 84, 89
Capitalism 32, 73, 77, 80
  informational 121
  late 84
Capitalist valorization 94–6
Castells, Manuel 43, 104, 148
Cerf, Vincent 55
CERN 61
Chaos theory 32, 99, 106, 113
Class 82, 130, 140
  working 82, 154
Cleaver, Harry 156, 170n31
clinamen 106
Commodification 77
Communication
  biopower 151–2
  channel 2, 12, 14, 15, 16, 23–4, 25
  chaos 3
  code 21–2, 25
  and democracy 133
  distributed 64
  engineering 11–12, 29
  global 7
  management/managers 10, 14, 16, 17, 34, 140–1
  mathematical theory of 3, 10, 15, 22–4
  strategies 14
  technologies 40
CommuniTree Bulletin Board 69–70
Cooperation, productive powers of 61


Copyright 7
Crasson, F. J. 13–14
Criminal Justice Act 83, 163–4n27
Critical Art Ensemble 155–6
Cultural
  activism 17
  economy 73, 75–6, 83–4
  imperialism 146
  labour 84
  theory 8–9, 20
Culture industry 80
Culture jamming 17
Cybernetics 11, 29–30, 32
  control 4, 43, 100, 108, 114–16, 118, 122
  feedback 7, 18, 27, 122
Cyberpunk 87
Cyberspace 42–3, 46, 75, 87
DARPA 54–5, 61
Databases 34–5
Dawkins, Richard 123–7
  Selfish Gene 123–7
De Landa, Manuel 166n6
Deleuze, Gilles 17, 26, 34, 124, 143, 164n29, 169n25
Dematerialization 40
Democracy 131–2
  cyberdemocracy 135
d'Eramo, Marco 26
Dery, Mark 17
Design 19
Difference, cultural politics of 35–6, 60–1
Digital
  City, Amsterdam 69–70
  machines 33
  techniques 34
  media 7
  economy 4, 75–80, 89–91, 93–4
Digitization 34
Disintermediation 74
DNA 30–1
Domain Name System 43–7
  and alternate roots 160n10
Duration 50–2
Einstein, Albert 2, 158n1
Electric Mind 69
Email 53
  and statistics 161n21
e-mailing lists 69
Engelbart, Douglas 54, 85
Enlightenment 16, 144
Empire 40, 62, 129
Entropy 3, 7, 9, 18, 21, 27, 29–30, 32, 44, 159n21
Epicurus 106
Evolutionary theory 99
External economies 75
Face, imperialism of the 143
File-sharing 27, 53
Fluid mechanics 4, 32, 69, 106–8
Fordism/post-Fordism 78, 88
Foucault, Michel 123, 160n44
Frankfurt School 143
Freedom of Information Act (US) 132
Frequency 8, 14
Garcia, David 149–50
General Intellect 4, 86–7
Genetics 123–4
Gibson, William 42, 48
Giddens, Anthony 138
Gilmore, Jon 64, 120
Gilroy, Paul 74, 75
Global
  computation 109–10, 112
  culture 1, 39, 49, 108, 145
  /local 67
  networks 43
  village 39
  time 40
Globalization 43, 47, 49, 71
  communication systems 71, 145
  cultural 47
Governmentality 123
Grid topologies 45–6
Guattari, Felix 98, 143, 164n29
Habermas, Jürgen 132–3, 134
Hall, Peter 104
Hall, Stuart 1
Haraway, Donna 74
Hardt, Michael 40, 129, 155


Hegemony 8
Helmreich, Stefan 123
Hip hop 142
Holland, John 109–10
Horvath, John 93
Husserl, Edmund 13
Hypernetwork 41
Hyperreality 90, 139
ICANN 45, 160n10
Identity 31, 32, 34, 35, 130
  cultural 10
  politics 8, 35, 36
Image
  flows 148, 151; and network culture 153
  ecologies 142–4
  and warfare 142
Incompatibility 56–7, 59, 61, 71
Information
  architecture 53, 56, 61
  bomb 2, 158n1
  as content 6–7, 8, 13, 14
  cultural politics of 3, 9–10, 13, 14, 17–19, 25, 27, 34
  flows 2, 3, 8, 10, 15, 35, 37, 42, 60, 71, 154
  industry 75–6
  overload 1, 90
  societies 25
  superhighway 120
  technologies 6
  theory 3, 8, 9–13, 20–5, 29–30, 33–4
  transmission 22, 25
  warfare 4, 143
Informational
  active power of 19
  capitalism 121
  commodity 6, 7
  culture 6, 8, 9, 14
  dynamics 1–2, 3, 7–8, 9, 17, 28, 52, 153
  ecology 68, 141–2
  milieu 4, 5, 8, 9, 10, 16, 17, 25, 37, 41, 42, 68, 100, 116, 144
  topologies 10, 37, 48
Informationalism 31
Intellectual property 7
Intelligence
  collective 84–6
  fringe 66
  human 78–9
Interconnectedness 2
International Monetary Fund 131, 154
Internet
  architecture 42
  as database 47
  design philosophy 3–4
  culture 39, 63, 67
  and entropy 44
  as general intellect 87
  and multitude 144
  and microsegmentation 153
  layers 59
  portals 48
  protocols 39–43
  and self-organization 119–20
  statistics 161n21
  space 36
  time 39
Internetwork 42, 54, 58, 62, 64
Jameson, Fredric 138
Jefferson, David 101
Journalists 16, 17, 19
Kahn, Robert 55
Kauffman, Stuart 109
Kelly, Kevin 84–5, 106, 118
Kittler, Friedrich 8
Knowledge
  economy 85, 95
  workers 75, 78, 81–2, 88
Labour
  abstract 84
  free 4, 91–2, 93–4
  immaterial 61, 82–4
  living 87
  users' 91
Langton, Christopher 112–13
Lash, Scott 32
Lazzarato, Maurizio 82–3
Lévy, Pierre 27, 34, 85
Licklider, J. C. R. 18, 54, 61
Lotringer, Sylvère 155


Lovink, Geert 39–40, 62, 153
Lucretius 167n21
Lyotard, Jean-François 164n45
Management theory 78–9
Manovich, Lev 35, 117
Marketing 34–5
Marx, Karl 73, 84, 86–7
Marxism 4, 7, 83, 86–8
  real subsumption 4
  fixed capital/living labour 87
Mass
  audiences 34
  communication 5, 13, 64, 135–8
  culture 140
  democracies 132, 137
  and distracted perception 140
  factory 61
  formation 5
  intellectuality 87–8
  media system 64
  networked 135
  pull of the 49
  psychology 153
  segmented 147–8, 150–1
  society 31
  and statistics 28
Massumi, Brian 26, 151–2, 159n27
Matrix, The 87
Mattelart, Armand 144
Maxwell, James Clerk 28–9
Maxwell's Demon 30
McLuhan, Marshall 39, 145, 146, 147
Meaning 12–14
Media
  activists 148–9
  and cultural studies 13, 76
  culture 134
  Dutch 149
  effects 10, 19
  meanings 8, 138
  messages 2, 12–13, 14, 16
  modern 8, 15, 64
  migrant 148–9
  multinational 149
  power 133
  tactical 149
  technologies 98
Merleau-Ponty, Maurice 13
Meshwork 1–2, 41, 58
Microsoft 92
Migrant communities 150
MIT 11, 120
Mobotics 36–7
Modernity 28, 46, 64
Modulation 4, 19, 28, 34, 35, 108, 114
Multitude 104–6, 115–16, 118, 123, 129–30, 152, 155
Mumford, Lewis 96, 117
Nation state 40
Negentropy 3, 7, 30–1
Negri, Antonio 39, 40, 62, 129–30, 155, 161n31
Negroponte, Nicholas 120
Netherlands and media 149
Netscape 92–3, 165–6n57
Netslaves 73
nettime mailing list 137
Network
  architecture 3, 118
  culture 1–2, 3, 38, 62, 72, 130, 144, 147–8, 150–1, 153–4
  distributed 57
  duration 57, 63, 71
  dynamics 63, 69
  engineers 60
  intelligence 75
  memory 67
  micropolitics 70
  mutation 153
  neural 67
  open 57, 60, 118
  physics 3, 69–70
  politics 3, 155
  power 62
  as production machine 100
  societies 43, 129
  as structure 118
  time 39, 43
  topos 37
Network Working Group 56, 66
New Economy 4, 34, 73, 76, 99, 106, 116–17, 136


New Labour 61
Noise 2, 9, 10, 12, 13, 14, 15, 18, 23, 32
Nonlinearity 3, 27, 28, 31, 33, 37, 49, 68
Open
  architecture 55–7, 58, 60
  space 58
  systems 3, 28, 42, 122
Orwell, George 143
Packet-switching 56, 64–7
Palestinian Intifada 145, 147
Peer-to-peer networking 120
Perception 19
  and distraction 139
  management 140–1, 143
  war on 142
Phase space 107
Phenomenology 13
Plant, Sadie 120
Political, zero degree of 139, 153, 155, 157
Population
  biology 108–9, 166n6
  statistics 28
Postel, Jon 45
Postmodernism 13, 27, 34, 56, 60–1, 74, 83, 89–90, 137
Prigogine, Ilya 37, 106, 122
Probability theory 20–4, 25, 26, 29
Propaganda 4, 16, 133
Public
  opinion 131–4, 143
  sphere 4, 132–4, 136–7, 139
Quantum theory 30
Quetelet, Adolphe 28, 31
Razorfish 116–17
Redundancy 14
Representation 8, 20, 24, 27, 28, 31, 35
  cultural politics of 8, 10
Representational
  space 36
  strategies 32
Resistance 8, 37, 81
Rheingold, Howard 69, 91
rhizome mailing list 69
Roberts, Lenny 61
Roosevelt, Franklin Delano (radio chats) 145
Ross, Andrew 116–17
Ross, Ashby W. 22–3
Schrödinger, Erwin 30
Segal, Jérôme 11, 28
Self-organization 85, 99, 100, 116; see also Bottom-up organization
Self-organizing systems 4
Serres, Michel 15, 106–7
Shannon, Claude E. 3, 9–13, 17, 21, 29, 30, 109, 159n21
Signification 24
Silicon Alley 116
Silicon Valley 103–4
Simondon, Gilbert 18
Simulation 141, 143
slashdot BBS 147
Social constructionism 121
Social factory 4, 74
Social forums 154
Social physics 11, 28, 31, 32
Socialization 138
Sociobiology 123–4
Software
  freeware 53, 92–3
  open source 53, 91–4, 120, 165n50
  shareware 92–3
  social 53
Spinoza, Baruch 170n32
Statistical
  laws 21–2
  mechanics 21, 28, 29, 31
Statistics 28–9, 139
Stengers, Isabelle 37, 106
Stiglitz, Joseph 131, 160n7
Stock markets 6, 25, 103, 107, 136
Stone, Allucquère Rosanne 69
Subcultural movements 80
Subjectification 126–8
Subjectivity 39
Swatch time 39
Syndicate mailing list 69


Taylor, Charles 101
Taylor, Robert W. 18
Telegraphy 8, 65
Telephony 65
Telepolis 55
Television 41, 89
  debates 16
  industry 89
  people's shows 95–6
  reality TV 17, 89, 95–6, 127–8
  satellite 41
Thermodynamics 21, 28, 30, 31, 125
Turbulence 18, 68, 107, 124, 128
Turing machine 109, 112, 167n35
Turner, Ted 146
Ulam, Stanislav 99, 109
Ummah, Electronic 1, 145
Urry, John 32
Venturi, Robert 56
Virilio, Paul 2, 43, 158n1
Virno, Paolo 86, 88
Virtual 10, 20, 26, 152
  communities 91, 145
  movements 155–6
Virtuality 26, 37, 51, 83
Virtualization 27
Von Neumann, John 99, 110–11, 159n21
Washington Consensus 41
Watts, Duncan J. 48
Weaver, Warren 11–12, 23, 29
Web-logs 68, 120
  blogosphere 68
Web-rings 68
Western metaphysics 50–2
Wiener, Norbert 11, 22, 29, 32–3, 122
Williams, Raymond 135
Wired magazine 94–5
Wolfram, Stephen 112
World War II 11
World Wide Web 48, 53, 69, 94–5
Zeno's paradoxes 50
