1000 Days of Theory: td029
Date Published: 12/15/2005
www.ctheory.net/articles.aspx?id=500
Arthur and Marilouise Kroker, Editors




The Universal Viral Machine

Bits, Parasites and the Media Ecology of Network Culture


Jussi Parikka


"Organisms are adapted to their environments, and it has appeared adequate to say of them that their organization represents the 'environment' in which they live [...]."[1]

-- Humberto Maturana


Prologue: The Biology of Digital Culture

During the past few decades, biological creatures like viruses, worms, bugs and bacteria seem to have migrated from their natural habitats to ecologies of silicon and electricity. The media has also been eager to employ these figures of life and monstrosity in representing miniprograms, turning them into digital Godzillas and other mythical monsters. The anxiety these programs produce is largely due to their alleged status as near-living programs, as exemplified in this account of the Internet worm of 1988:

The program kept pounding at Berkeley's electronic doors. Worse, when Lapsley tried to control the break-in attempts, he found that they came faster than he could kill them. And by this point, Berkeley machines being attacked were slowing down as the demonic intruder devoured more and more computer processing time. They were being overwhelmed. Computers started to crash or become catatonic. They would just sit there stalled, accepting no input. And even though the workstations were programmed to start running again automatically after crashing, as soon as they were up and running they were invaded again. The university was under attack by a computer virus.[2]

Such articulations of life in computers have not been restricted to these specific programs, but have become a general way of understanding the nature of the Internet since the 1990s. Its complex composition has been depicted in terms of "grassroots" and "branching structures", of "growing" and "evolution." As Douglas Rushkoff noted in the mid-1990s, "biological imagery is often more appropriate to describe the way cyberculture changes. In terms of the way the whole system is propagating and evolving, think of cyberspace as a social petri dish, the Net as the agar-medium, and virtual communities, in all their diversity, as the colonies of microorganisms that grow in petri dishes."[3]

In this article, I examine computer worms and viruses as part of the genealogy of network media, of the discourse networks of the contemporary media condition. While popular and professional arguments concerning these miniprograms often see them solely as malicious code, worms and viruses might equally be approached as revealing the very basics of their environment. Such a media-ecological perspective relies on notions of self-referentiality and autopoiesis that problematize the often all-too-hasty depictions of viruses as malicious software, the products of juvenile vandals. In other words, worms and viruses are not antithetical to contemporary digital culture, but reveal essential traits of the techno-cultural logic that characterizes the computerized media culture of recent decades.

I place special emphasis on such functions of the past decades of digital culture as networking, automation, self-reproduction, copying and communication. These terms have been incorporated both into the vocabulary of media culture and into the practical engineering work performed by computer scientists and other professionals who implement the principles of computing across the globe. Since I have discussed the connection between computer viruses and information capitalism elsewhere [4], the present text focuses more on the socio-technological genealogy of the phenomenon, thus supplementing the work already carried out.

In 1994 Deborah Lupton suggested that computer viruses could be understood as metonyms "for computer technology's parasitical potential to invade and take control from within"[5], thus expressing the ambivalent reception -- vacillating between anxiety and enthusiasm -- with which the computer has been greeted during recent decades. In a similar manner, I ask whether viruses are a metonymy, or an index, of the underlying infrastructure, material and symbolic, on which contemporary digital culture rests. Whereas some biologists claim that "[a]nywhere there's life, we expect viruses,"[6] it seems to me that this can perhaps be extended to the world of digital culture too. Mapping the (historical) territories computer worms and viruses inhabit produces a cartography of these effective pieces of code that does not reduce them to the all-too-general class of malicious software but acknowledges the often neglected centrality such types of programs have in the network ecology of digital culture. Such pieces of viral code show us how digital society is inhabited by all kinds of quasi-objects and non-human actors, to adopt Bruno Latour's terminology.[7] In this sense, artificial life projects and the biological metamorphoses of the digital culture of recent decades provide essential keys to unravelling the logics of software that produce the ontological basis for much of the economic, societal and cultural transactions of modern global networks.

The contemporary cultural condition is often described as an essential coupling of war and media -- the cybernetic logistics of command, control, communications and intelligence, C3I -- extended from strictly military networks to also include the entertainment media.[8] I suggest, however, that "life" and ideas such as "ecologies" and "territories" can also act as valuable theoretical points of reference in understanding the paradigms of digital culture. Cybernetics, like the other scientific origins of modern-day digital networks, also focused on life and on coupling the biological with the technological, a theme that has gained ground especially during the past few decades along with an ever-increasing amount of semi-autonomous software. Instead of simple top-down design and control, we have more and more artificial yet life-like processes of self-organization, distributed processing and meshworking -- themes that, while key cultural symbols, are also real processes underlying the media ecology of digitality.

Viruses and worms present themselves as culminations of these cultural trends, while also functioning as novel "tools for thought"[9] for a media theory that focuses on complexity and connectionism. Complexity theories have found their niche within philosophy and cultural theory emphasizing open systems and adaptability. Similarly, theories that underline the co-evolution of the organism and its environment also provide important points of view for studying digital culture, allowing thought to bypass object-subject dichotomies and see this media cultural condition as one of continuous feedback and self-recreation. The ingenious realization of various projects of digital culture has been that their understanding of "life" was based on self-reproduction and a coupling of the outside with the inside, a process of folding. This essay follows this trace, and folds this theme with cultural theory concerning digital network culture. In short, even though terms such as "life" and "ecology" easily remain self-referential loops or, in other cases, formal models, I want to suggest a more subtle idea. When discussing the "life of network culture", life should not be taken as a form, but rather as movement and coupling, in the manner that Deleuze's reading of Spinoza affirms:

"The important thing is to understand life, each living individuality, not as a form, or a development of form, but as a complex relation between differential velocities, between deceleration and acceleration of particles." [10]

This ecological perspective does not, then, rely on formal characteristics of life, but is a tracing of the lineages of the virtual machinic phylum of digital network culture and also a tracing of the paths of organisms that move on this plane: a biophilosophy [11] or genealogy of digital life. Hence, although the focus here is on genealogies of network culture, this mapping is done in order to provide rewirings for futures and becomings, as the final part of this article will illustrate.


The Universal Viral Machine

Fred Cohen has become known as the pioneer who engaged in deciphering the potentialities of viral programs at the beginning of the 1980s. Cohen's 1983 experiments with viruses have since become famous, and Cohen, then a Ph.D. student in electrical engineering at the University of Southern California, has usually been cited as the one who realized the potential dangers of viral programs.[12] The "denial of services" attacks Cohen described and warned about have since been demonstrated to be a very feasible means of information warfare, a war taking place on the level of digital coding -- softwar(e), as a spy thriller from 1985 named it.[13] Cohen illustrated this in a piece of pseudo-code meant to give an idea of what viral programming might look like in principle:

subroutine infect-executable:=
{loop:file = get-random-executable-file;
if first-line-of-file = 1234567 then goto loop;
prepend virus to file;
}[14]
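
To make the logic tangible for a present-day reader, here is a minimal and deliberately inert sketch of my own in Python -- not Cohen's code -- that simulates the same prepend-and-check routine on toy in-memory "files"; the file names and marker are illustrative assumptions:

# Toy simulation only: "files" are strings in a dictionary, and "infection"
# means prepending a marker string; no real executables are touched.
import random

MARKER = "1234567\n"  # plays the role of Cohen's first-line signature
files = {"a.exe": "code of a\n", "b.exe": "code of b\n", "c.exe": "code of c\n"}

def infect_executable():
    # loop: pick random files until one without the signature turns up
    # (like Cohen's goto loop, this would spin forever once all are infected)
    while True:
        name = random.choice(list(files))
        if not files[name].startswith(MARKER):
            break
    # prepend the "virus" (here, just its marker) to the chosen file
    files[name] = MARKER + files[name]
    return name

print("infected:", infect_executable())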

Evidently, the stir about viruses and worms that arose at the end of the 1980s was due to the realization that far-from-inert code of this kind might be responsible for the "digital hydrogen bomb", as the cult 1980s cyberculture magazine Mondo 2000 noted.[15] As the Cold War period's anxiety over nuclear weapons seemed to be fading, computer miniprograms and malicious hackers proved a novel threat.

Fred Cohen was not, however, thinking merely of digital guerrilla war but of life in general, of the dynamics of semi-autonomous programs, highlighting that the two, war and life, are not contradictory modalities, in the sense that both are about mobilizing, about enacting. In this respect, his work has also been neglected -- and I am not referring to the objections his research received in the 1980s.[16] Instead of merely providing warnings about viruses, Cohen's work and Ph.D. thesis presented the essential connections between viruses, Turing machines and artificial life-like processes. We cannot be done with viruses as long as the ontology of network culture is viral-like. Viruses, worms and other similar programs that used the very basic operations of communicatory computers were logically part of the field of computing. The border between illegal and legal operations on a computer could not, therefore, be technically resolved -- a fact that led to a flood of literature on "how to find and get rid of viruses on your computer."

For Cohen, a virus program was able to infect "other programs by modifying them to include a, possibly evolved, copy of itself."[17] This allowed the virus to spread throughout the system or network, leaving every program susceptible to becoming a virus. The relation of these viral symbol sets to Turing machines was essential, similar to an organism's relation to its environment. The universal machine, presented by Alan Turing in 1936, has with its formal definition of programmability provided the blueprint for each and every computer there is: anything that can be expressed in algorithms can also be processed with a Turing machine. Thus, as Cohen remarks, "[t]he sequence of tape symbols we call 'viruses' is a function of the machine on which they are to be interpreted"[18], logically implying the inherency of viruses in Turing machine-based communication systems. This relationship makes all organisms parasites in that they gain their existence from the surrounding environment to which they are functionally and organizationally coupled.

Although Cohen was preoccupied with the practical problems of computer security [19], his work also has more important ontological implications. Security against malicious software (and the danger of someone using it to wage war) was only one aspect of computer viruses, expressed in the difference between the pseudo-code of

subroutine infect-executable:=
{loop:file = get-random-executable-file;
if first-line-of-file = 01234567 then goto loop;
compress file;
prepend compression-virus to file;
}

and

subroutine trigger-pulled:=
     {return true if some condition holds}

main-program:=
{infect-executable;
if trigger-pulled then do-damage;
goto next;}

Such pieces of pseudo-code have been used ever since in illuminating the general logic of how viruses work. The small difference between these two examples demonstrates that the activities of viruses are not reducible to the potential damage malicious software is capable of inflicting on national and international bodies of order; rather, the very logic of self-reproducing software proves a fundamental issue regarding the ontology of viruses and the digital media culture of networking. Even if Cohen's obvious point was to find models and procedures for secure computing -- to maintain the flow of information in a society -- this task was accompanied by something of a more fundamental nature. Thus, basically, viral routines were not confined to damage but enabled the idea of benevolent viruses as well: for example, a "compression virus" could function as an autonomous maintenance unit saving disk space.[20] In a similar sense another experimenter from the early 1990s, Mark Ludwig, believed that viruses were not to be judged solely in terms of their occasional malicious payloads but by the characteristics that made it reasonable to discuss them as artificial life: reproduction, emergence, metabolism, resilience and evolution.[21]
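
The one-line difference between damage and benefit can be made concrete in the same toy setting; the following sketch is again my own construction under stated assumptions, not Cohen's code, leaving the spreading routine aside and contrasting only the two payloads:

# Continuing the toy model above: identical spreading logic, two payloads.
import zlib

def benign_payload(body):
    # Cohen's "compression virus" idea: the host is stored compressed,
    # so infection actually saves disk space.
    return "zip:" + zlib.compress(body.encode()).hex()

def trigger_pulled():
    # "return true if some condition holds" -- e.g. a date or a counter
    return False

def malicious_payload(body):
    if trigger_pulled():
        return ""   # "do-damage": here, truncating the host (simulated)
    return body     # otherwise lie dormant and just keep spreading

host = "code of a\n" * 100
print(len(host), len(benign_payload(host)))   # the compressed host is smaller
print(malicious_payload(host) == host)        # dormant: the host is untouched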

This turns the focus to the virulence of virus programs. Being bits of code that, by definition, function only to infect, self-reproduce and activate from time to time, it is no wonder that a number of computer scientists have been unable to view them as passive material, seeing them instead as something acting, something spreading. Others have taken them to be examples of primitive artificial life in their capability to reproduce and spread autonomously (worms) or semi-autonomously (viruses).

I do not want to address the question of whether worms and viruses are life as we know it, but to underline that, in addition to being an articulation on the level of the cultural imaginary, this virality is also a very fundamental description of the machinic processes of these programs, and of digital culture in general. As a continuation of the theme of technological modernization, network culture is increasingly inhabited by semi-autonomous software programs and processes, which have often raised the uncanny feeling of artificial life expressed, for instance, in the various journalistic and fictional accounts of software program attacks. This uncanny feeling is an expression of the hybrid status of such programs, which transgress the constitutional (in Latour's sense of the word) boundaries of Nature, Technology and Culture. Whereas viruses and worms have come to be the central indexes of this transgression for popular consciousness, artificial life projects have also faced the same issue. As transversal disciplines such as ALife have for decades underlined, life is not to be judged as a quality of a particular substance (the hegemony of a carbon-based understanding of life) but as a model of the interconnectedness, emergence and behaviour of the constituent components of a(ny) living system. Chris Langton suggested in the late 1980s that artificial life focuses not on life as it is, or has been, but on life as it could be. This has been taken up as the key idea for projects that see life emerging on various synthetic platforms, silicon and computer-based systems and networks for example.[22] In a similar vein Richard Dawkins, when he viralized cultural reality with his theory of memes in 1976, referred to the possibilities of finding life even in "electronic reverberating circuits."[23]

Consequently, a more interesting question than whether some isolated software programs are alive is what kinds of novel approaches the field of artificial life can provide for understanding digital culture. Artificial life might at least provide us with an approach for thinking living systems not as entities in themselves, but as systems and couplings -- here Thomas S. Ray's Tierra virtual ecology from the 1990s provides us with a good example.[24] This ALife approach might also lead us to think of the contemporary media condition as an ecology of a kind, "living" in the sense that it is based on connectionism, self-reproduction, and couplings of heterogeneous elements. This also resonates with the above-mentioned Spinozian understanding of life as affectivity: relations of varying velocities, decelerations and accelerations between interconnected particles.

What Cohen established, and this might be his lasting contribution even if one does not want to downplay his achievements in computer science, was the realization that digital culture was on the verge of a paradigm shift from the culture of Universal Computing Machines to Universal Viral Machines. This culture would no longer be limited to the noisy capabilities of people designing the algorithms. Instead, these evolutionary concepts of computing provided a model for a digital culture that increasingly relied on capabilities of self-reproductive, semi-autonomous actors. To quote Cohen's all-too-neglected words on "viral evolution as a means of computation" that crystallize the media ecology of networking digital culture:

Since we have shown that an arbitrary machine can be embedded with a virus (Theorem 6), we will now choose a particular class of machines to embed to get a class of viruses with the property that the successive members of the viral set generated from any particular member of the set, contain subsequences which are (in Turing's notation) the successive iterations of the "Universal Computing Machine." The successive members are called "evolutions" of the previous members, and thus any number that can be "computed" by a TM [Turing Machine], can be "evolved" by a virus. We therefore conclude that "viruses" are at least as powerful a class of computing machines as TMs, and that there is a "Universal Viral Machine" which can evolve any "computable" number.[25]
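
What "evolution as a means of computation" amounts to can be sketched in a few lines. The following toy quine is my own illustration, not Cohen's Theorem 6 construction: each generation of a self-reproducing program carries one further step of a computation (here, successive Fibonacci numbers), so that the number is "evolved" by the viral lineage rather than computed by any single program:

template = '''a, b, n = {a}, {b}, {n}
print("generation", n, "has evolved the value", a)
template = {t!r}
offspring = template.format(a=b, b=a + b, n=n + 1, t=template)
print(offspring)  # source code of the next, "evolved" generation
'''
# Bootstrap generation 0 and run it; exec-ing each printed offspring in turn
# yields generation 1, 2, 3, ... and with them the successive values.
program = template.format(a=0, b=1, n=0, t=template)
exec(program)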


Code Environment

From an everyday perspective the question of technological evolution might seem oxymoronic, considering the violent intermingling of two such different spheres as "biology" and "technology." This issue has been thoroughly discussed ever since early cybernetics in the 1950s, and the articulations of biology and technology continue to prove their operationality when understood as a questioning of the dynamics of technology. As Belinda Barnet notes in her essay on the question of technological evolution and life, what is at hand is the need to grant "the technical object its own materiality, its own limits and resistances, which allows us to think technical objects in their historical differentiations."[26]

Barnet's agenda connects to my articulation of a media ecology. Computer worms and viruses, as well as other technical elements of digital culture for that matter, are not reducible to the discourses or representations attached to them, and in order to understand the complex manner in which they are intertwined with the material cultural history of digitality, one must develop alternative concepts and approaches. In this problematic, "life" and "dynamics" seem to resonate together in a manner proposed by complexity theories that value the processual nature of (open) systems based on the ongoing feedback loop between an organism and its environment. However, since these notions easily remain vague metaphors, they need to be addressed more thoroughly in order to amplify their implications for contemporary media ecology. Here I will approach the issue via the way Deleuze and Guattari have outlined the issues of the machine (as separated from technologies themselves) and a machinic ontology as interconnective and interactive. That is, media ecologies can be understood as machinic processes based on certain technological and social lineages that have achieved consistency. Machinic thus refers also to a production of consistencies between heterogeneous elements.[27] In such an ontology of flow, technological assemblages are partial slowdowns of flows into more discrete functional entities. There are no humans using technologies, nor are there any technologies determining humans, but a constant relational process of interaction, of self-organization, and hence the focus is moved to "subjectless subjectivities".[28] In this sense, the life of media ecology is definable as machinic.

Life as connectionism, not as an attribute of a particular substance, has been at the centre of viral theory as well:

The essence of a life form is not simply the environment that supports life, nor simply a form which, given the proper environment, will live. The essence of a living system is in the coupling of form with environment. The environment is the context, and the form is the content. If we consider them together, we consider the nature of life.[29]

I would like to especially emphasize the coupling of an entity with its environment as the essence of what constitutes "life." This has a very important implication. As scientists who have tackled the idea of computer viruses as artificial life have already noted, it is difficult, or perhaps even impossible, to fully adopt computer viruses under the criteria of (biological) life. If we take an entity and a list of the qualities it should display (reproduction, emergence, metabolism, toleration of perturbations and evolution), then nothing other than traditional life will succeed in meeting the criteria for life.[30] I want, however, to take the suggestions for viewing life and artificial life in terms of machinic connectionism as horizons and experimental ideas with which to think the contemporary media ecology.

Hence, viruses -- and non-organic life in general -- should be viewed as processes, not stable entities. Viruses, by definition, are machines of coupling, of parasitism, of adaptation. Admittedly they might not be "life" as defined by everyday usage or a general biological understanding, yet they are spectres of the media ecology that invite us to take them as, at least, "as-if-life." Considering a virus as an infection machine, "a program that can 'infect' other programs by modifying them to include a, possibly evolved, copy of itself"[31], signifies the impossibility of focusing on viruses per se, and demands that we take a wider cultural perspective on these processes of infection. As part of the logical circuits of Turing machines, viral infection is part of the computer architecture, which is part of the technical sphere and genealogy of similar technical media machines, which in turn connect to lineages of a biological, economic, political and social nature, and so forth. Viruses do not merely produce copies of themselves but also engage in a process of autopoiesis: they build themselves over and over again, reaching out to self-reproduce the very basics that make them possible; that is, they unfold the characteristics of network culture. In this, they are machinic subjects of a kind.[32] This viral activity can also be understood as the recreation of the whole media ecology, a reproduction of the organizational characteristics of communication, interaction, networking and copying, or self-reproduction.[33] This is where I tend to follow Maturana and Varela and their idea that living systems are part and parcel of their surroundings and work towards sustaining the characteristics and patterns of that ecology. They occupy a certain niche within the larger ecology: "To grow as a member of a society consists in becoming structurally coupled to it; to be structurally coupled to a society consists in having the structures that lead to the behavioral confirmation of the society,"[34] writes Maturana.

"Infections" or couplings were part of the genealogy of digital culture even before the 1980s in the form of John von Neumann's automata, which are often marked as the ancestors of modern day worms and viruses. Von Neumann engaged deeply in automata theory, automata referring here to "any system that processes information as part of a self-regulating mechanism."[35] Automata capable of reproduction included logical control mechanisms (modelled on the McCulloch-Pitts theory of neurons) together with the necessary channels for communication between the original automaton and the one under construction as well as the "muscles" for enabling the creation. This kinetic model of automata was soon discarded, however, as it proved to be hard to realize: a physical automaton was dependent on its environment for its supply of resources and providing it with such an ecology proved to be too cumbersome. Thus, with advice from his friend Stanislav Ulam, Von Neumann turned to developing cellular automata, formal models of reproductive systems with "crystalline regularities".[36] One of the models for formal self-reproductive patterns was the very primitive living organism bacteriophage.[37]

Nature, in the form of characteristics of simple organisms, became interfaced as part of these formal models for computation. Cellular automata as two-dimensional cell tables, with each cell being a finite automaton of its own, its state determined by the states of its neighbouring cells, were to be understood as neuronlike structures. Once put into action, the automata seemed to take on a life of their own, as demonstrated around 1970 by the Cambridge mathematician John Conway with his version, symptomatically called "Life", which was soon explored enthusiastically at the MIT laboratories. These were essentially coupling machines, bounded however by their formal characteristics as part of a two-dimensional habitat. While a single cell could not be thought to be alive in any sense of the word, the whole system, which was in constant interaction, seemed to contain remarkable powers of calculation and emergence.
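
How little machinery such emergence requires can be seen from a minimal sketch of the standard "Life" rules (a cell is born with exactly three live neighbours and survives with two or three); the grid size and the glider pattern below are my illustration, not Conway's notation:

# Minimal sketch of Conway-style "Life": each cell is a finite automaton
# whose next state depends only on its eight neighbours.
from collections import Counter

def step(alive, width, height):
    # count the live neighbours of every cell adjacent to a live cell
    counts = Counter(
        ((x + dx) % width, (y + dy) % height)
        for (x, y) in alive
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # birth on exactly 3 neighbours; survival on 2 or 3
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# A glider: a five-cell pattern that "walks" across the grid -- an emergent
# behaviour stated nowhere in the local rules themselves.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for generation in range(8):
    cells = step(cells, 16, 16)
print(sorted(cells))  # the glider has drifted two cells diagonally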

Such ideas, which became part of complexity theories, underscored the necessity of understanding the processual nature of (computational) life: formal mathematical models, computers and perhaps the ontology of the world as well were based on forms of interaction between quasi-autonomous units. This relates to the need to emphasize that even if modern digital culture, in the archaeology fastened to the importance of World War II and the military origins of cybernetics, computers and networking, is inherently employed as a technology of death, there is also another, up to now neglected, thematics that gives computers a role in the diagrams of life.[38] In addition to the military contexts underlying, for instance, von Neumann's and Wiener's work, there exists also the striving for the "design of relatively simple simulacra of organic systems in the form of mathematical models or electronic circuitry."[39] Such aspects should lead us to bring forth new genealogies of computing for the contemporary media condition. These perspectives should furthermore complexify our notions of the history of viruses and viral-like programs, as well as lead us to rethink some basic assumptions concerning the contemporary culture of technology, which is increasingly modelled and designed as a complex, interconnecting ecology.

But, considering the "nature of digital culture", are these lineages to be seen as metaphors that guided the research done at computer laboratories, or could the interconnection of life (or at least the science of life, biology) and technology be more fundamental? Instead of restricting the design work to the level of the metaphoric and language, one could also speak of the diagrammatics of computer design, piloting the research and implementation done. The research on biology and computers was coupled, both infected by each other during the latter half of the 20th-century so that the human being and nature in general were increasingly understood as informatics (especially so with the boom in DNA-research) and informatics were infiltrated by models adopted from brain research and, later, from ecological research. Thus, as Von Neumann himself thought, designing computers was a matter of designing organs and organisms[40], that is, machines that could function semi-independently as natural beings. Nature became the ultimate imaginary reference point for digital culture, not so much a mirroring but an active interfacing of the technological and the biological.

What I want to emphasize is that this interfacing is not solely linguistic: we should not talk merely about the metaphorics of computer culture (as a cultural studies perspective so often does), but see the biology of computers also as organizational, in that a certain understanding of biological organisms and of the ecological patterns and characteristics of life is entwined as part of the design and implementation of digital culture.[41] In this sense, the cultural theory of digital culture could also turn to biology as an aid, and interface with, for example, Maturana and Varela's notion of autopoietic living machines, where the component is structured as a functional part of the ambiance. As Guattari notes in Chaosmosis, this idea could be applied to an analysis of social machines as well -- and hence to analyzing the social machine of network culture, or the media ecology of networking.[42] The parts feed the structuring, while themselves being fed from the whole. Yet, the difference Guattari notes between mere mechanical repetition and creative living systems [43] is an important one -- one I will return to later with a discussion of the virtuality of the living system.


Distributed Life Processes

To repeat, computer viruses are machines in the Deleuzo-Guattarian sense of the word in that they are connection-makers, reaching out and beyond their seeming borders in order to find functional couplings. From a restricted perspective, this means that they couple themselves to the files they infect; by widening our horizon we see, however, that these couplings are inherently connections at the level of the Turing machine, that is, of the architecture of the computer in general.

The ideas of coupling and biological thinking in computing gained consistency especially during the 1970s, when several network projects started to bloom. ARPANET (1969) was the pioneer, of course, but several others followed. Networking meant new paradigms for programming, as well as providing a fertile platform for novel ideas of digital ontology. Viruses and worms were a functional element within this new trend of computing. Consequently, the first archived real virus incident seems to be the Creeper virus, which spread in the ARPANET in 1971. The Creeper was a utility program made to test the possibilities of network computing. Inspired by this first program, written by the network pioneer Bob Thomas, several programmers made similar virus-like programs.[44]

The worm tests made at the Xerox Palo Alto Research Center in the early 1980s were modelled on similar aspirations. As described by participating researchers John Shoch and Jon Hupp, worm programs basically meant copying parts of the program to idle machines on the network. The problem, as demonstrated by the Creeper, was of course how to control the spreading. The Palo Alto group, too, experienced control problems when a worm that was left running overnight "got away": "The worm would quickly load its program into this new segment; the program would start to run and promptly crash, leaving the worm incomplete -- and still hungry looking for new segments."[45]

However, the Palo Alto scientists designed these programs -- "laboratory worms" of a sort -- with useful goals in mind. The existential worm was a basic test program with no other aim than to survive and proliferate. The billboard worm was designed to distribute messages across a network. Other applications included the alarm clock worm, the multimachine animation utility using worm-like behaviour and the diagnostic worm.[46] What is important is that basic Arpanet network programs contained worm-like routines, making the distinction between "normal" programs and parasitic routines ambiguous.
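
As a rough illustration of the Shoch-Hupp scheme -- the sketch below is my own construction, not their actual code, and the machine names and probabilities are invented -- an "existential" worm can be modelled as a control loop that keeps a target number of segments alive on whatever idle machines it finds:

# Toy model of a multi-segment worm: machines are dictionary entries, a
# "segment" is a flag on an idle machine, users reclaiming machines kill
# segments, and the worm re-grows itself to its target size.
import random

TARGET_SEGMENTS = 3
machines = {name: {"idle": True, "segment": False}
            for name in ("alpha", "beta", "gamma", "delta", "epsilon")}

def worm_step():
    # segments on machines reclaimed by users die off...
    for m in machines.values():
        if m["segment"] and not m["idle"]:
            m["segment"] = False
    # ...and the worm copies itself onto idle hosts until back at target size
    alive = sum(m["segment"] for m in machines.values())
    idle_hosts = [n for n, m in machines.items()
                  if m["idle"] and not m["segment"]]
    for name in idle_hosts[:max(0, TARGET_SEGMENTS - alive)]:
        machines[name]["segment"] = True

for tick in range(5):
    for m in machines.values():           # users come and go at random
        m["idle"] = random.random() < 0.7
    worm_step()
    print(tick, [n for n, m in machines.items() if m["segment"]])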

Similarly, the idea of packet-switching that was pioneered with the ARPANET during the 1970s introduced local intelligence to communications: instead of being controlled from above, from a centralized, hierarchical position, network communications distributed control into small packets which found their own way from sender to recipient. In a way, such packets included the idea of the autonomy and local intelligence of bottom-up systems, while the network in general was formed into a distributed multiplexing system.[47] Since then, the basic architecture of the Internet has been based on data that is intelligent in the sense that it contains its own instructions for moving, using networks to accomplish its operations. In this sense, we can justifiably claim that the origins of worm-like -- and partly virus-like -- programs lie in the schematics of network computing in general. The ongoing ambivalence between anomalous and normal functionalities is part of the virus problem even today, as the same program can be defined as a utility program in one context and as a malware program in another, a fact that has not changed during the history of modern computer software.[48] Similarly, many basic utility programs have for years been viral-like, even though such programs often need the consent of the user to operate.[49]
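
The principle can be caricatured in a few lines -- the topology and routing tables below are invented for illustration and model no real protocol: each packet carries its own destination, and each node makes an independent, local forwarding decision with no central controller involved:

# Toy packet-switching: control travels with the data.
topology = {                    # node -> {destination: next hop}
    "A": {"C": "B", "D": "B"},
    "B": {"C": "C", "D": "C"},
    "C": {"D": "D"},
}

def deliver(packet):
    node = packet["at"]
    hops = [node]
    while node != packet["dst"]:
        node = topology[node][packet["dst"]]   # a purely local decision
        hops.append(node)
    return hops

print(deliver({"at": "A", "dst": "D", "payload": "hello"}))
# -> ['A', 'B', 'C', 'D']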

Of course, it can be argued that these early programs were merely minor experiments and their significance should not be overestimated. However, they demonstrate several traits of a new paradigm of computing, and of science in general. In computer science, ideas of distributed programming and, later, of neural network programming, for example, were gaining ground, becoming part and parcel of the new (non-linear) order of digital culture. This was due to the growing complexities of the new networks of computation and communication. As computers had -- since the 1970s -- no longer been seen as calculation machines but as "components in complex systems", where systems are built not top-down but from "subsystems" and "packages", the basic idea of a programmer designing algorithms for carrying out a task and achieving a goal had grown old-fashioned. Designing distributed program environments was seen as one solution.[50]

A genealogical account might argue that this was a follow-up to the problems the military had already encountered. The entire field of cybernetics and man-machine symbiosis might be seen as part of the complexification of the military command and control structures for which computers provided the long hoped-for prosthesis to supplement the normal training of generals, admirals and field personnel.[51] In this sense, these network ecologies are not merely complex systems of a self-organizing nature but also designed systems, which aim to control the complexity and feedback loops of the system. Computer viruses and worms as well as computer culture in general are at least partially intentionally constructed, yet they cannot be reduced to being a mere human construction. Instead, network ecologies are mixtures of top-down design and bottom-up self-organization; we have both stable linear structurings and states of complexity that evolve in a dynamic fashion.

So, in addition to military purposes, (artificial) life (or more precisely the science of life, biology) is another historical context to be accounted for. In addition to distributed programming, techniques of neural network programming were introduced during the latter half of the 1980s. While these issues had already been discussed years earlier, the real boom came with the newly stated interest in computer programs with a capacity for learning:

If several different factors have collaborated to this explosion of interest, surely the discovery of algorithms allowing a neural network with hidden layers to "learn" how to accomplish a given task has had a profound influence in recent developments on neural networks. This influence is so big that to many newcomers in the field the expression "neural networks" is associated to some sort of "learning" [...].[52]

Such thematics of computer science correspond well with the general change of emphasis from a top-down understanding of intelligence to bottom-up distributed systems of learning and adaptation, best illustrated in the Shoch-Hupp worm experiments, and perhaps even by the Creeper virus. The interest in such evolutionary patterns of viral learning continued all the way to the beginning of the 1990s, when a new emphasis took over. The early 1990s also witnessed the first polymorphic viruses, which seemed to be able to evolve in response to anti-virus actions.[53] Yet, as for example Fred Cohen accentuated, such live programs were alive only as part of their environment; in other words, as he had argued ten years earlier, a living system consisted of living components that could reproduce, while not every component had to be alive and produce offspring.[54] Viruses as adaptive, self-reproductive and evolutionary programs were thus at least part of something alive, even if not artificial life in the strongest sense of the word.[55] They were the new "Darwin machines"[56] that formed the ontology of a new digital culture, also incorporating the essential capitalist digital utopia of intelligent agents, semi-autonomous programs that ease the pressures put on the (in)dividual by the increasing information input.[57] Intelligent agents that take care of the ordinary tasks on your computer or run such errands as reserving tickets, arranging meetings and finding suitable information from the Net are, according to J. Macgregor Wise, telling of the changes in understanding agency in the age of digital culture,[58] and we might further emphasize that such programs are actually the culmination of key potentials within the ontology of digital culture. They represent a new class of actors and functions that roam across technological networks.

One way to grasp this change would be to talk of a Kuhnian paradigm shift in which "life" is no longer restricted to certain carbon-based organisms. As Manuel DeLanda stated in the early 1990s about artificial life applications in computer science:

The last thirty years have witnessed a similar paradigm shift in scientific research. In particular, a centuries-old devotion to "conservative systems" (physical systems that, for all purposes, are isolated from their surroundings) is giving way to the realization that most systems in nature are subject to flows of matter and energy that continuously move through them. This apparently simple paradigm shift is, in turn, allowing us to discern phenomena that, a few decades ago, were, if they were noticed at all, dismissed as anomalies.[59]

This, too, resonates with the shift of emphasis, referred to above, from top-down artificial intelligence paradigms in computing to seeing connectionism as the fruitful path to be followed in programming. Complexity and connectionism became the key words of digital culture during the 1980s and have remained so ever since. The non-linear processes of thought and computing expressed the "new ideas of nature as a computer and of the computer as part of nature,"[60] non-reducible to analytic parts but instead functioning as an emergent whole. Concretely this meant diagrams of digital ecology that depended increasingly on viral computing and semi-autonomous programs. As Tony Sampson describes this new vision of the digital culture of Universal Viral Machines:

The viral ecosystem is an alternative to Turing-von Neumann capability. Key to this system is a benevolent virus, which epitomises the ethic of open culture. Drawing upon a biological analogy, benevolent viral computing reproduces in order to accomplish its goals; the computing environment evolving rather than being 'designed every step of the way' [...] The viral ecosystem demonstrates how the spread of viruses can purposely evolve through the computational space using the shared processing power of all host machines. Information enters the host machine via infection and a translator program alerts the user. The benevolent virus passes through the host machine with any additional modifications made by the infected user.[61]

Thus, no more "Turing's" and "Von Neumann's" or any other male designers as demiurges of computer hardware and software, except as forefathers of a posthumanistic digital culture of viral organisms. Interestingly, such depictions at the beginning of the 1990s of a viral ecology of digital culture are in accordance with a number of other narratives of posthumanism and the automated media culture of artificial life.[62] The Universal Viral Machine also seems to fulfill Friedrich Kittler's views of machinic subjectivity in the age of Turing Machines: for Kittler, machine subjects were born with the realization of conditional jump instructions, known also as the IF/THEN-pairing of program code.[63] This implies that a program can autonomously change its mode of operation during its course of action. In Kittler's schema, when computers have detached their read/write-capabilities from human assistance, the entrance of a new kind of subjectivity on the level of society is entailed. In this view, Fred Cohen's ironical notion that the first widely reported virus incident, the so-called Morris Worm (1988), was in fact "the world's record for high-speed computation"[64] proves an apt description of the potentialities of the semi-autonomous computational processes of digital culture, which exclude the human operator from the circuit. Worms and viruses might, then, also be grasped as posthumanist actors of a kind.


Media Ecology: Life and Territory

Digital culture was occupied with a new breed of vital computer programs in the 1980s and 1990s, even though such programs were merely actualizations of tendencies and aspirations of computer culture since the Second World War. Seeing these programs and digital network culture as part of the novel field of artificial life was one key attempt to conceptualize and contextualize them. In addition to being interesting examples of the capabilities of programming languages and digital network architecture, computer viruses and worms can be seen as indexes or symptoms of a larger cultural trend that has to do with understanding the life of media and the networked digital media culture through the concept of media ecology. Specifically, the coupling of nature and biology as part of digital architecture has been a central trend since the pioneering work of von Neumann, Wiener and others. It gives an important clue to the genealogical traits of the modern media condition, emphasizing adaptability, automation, complexity, and bottom-up intelligence, or artificial life. Viruses and worms function as immanent expressions of network culture.

On the other hand, such a conceptual perspective on media as an ecology, as life, or as technological dynamism, provides a way of understanding the complexity, the connectionism and the flexibilities that function at the core of the contemporary media condition. In a way, this also accentuates the need to ground theories of digital culture in cybernetics (Wiener, von Neumann), and, even more urgently, in second-order cybernetics (Maturana, Varela, Luhmann, as well as Bateson), which might give an even more subtle and complex understanding of the connectionist technologies of contemporary culture. Such projects and orientations took as their main priority the couplings of systems and environments and the self-organization of complexity. Hence, approaching the issue of ecology with Gregory Bateson means apprehending ecology as the "study of the interaction and survival of ideas and programs (i.e. differences, complexes of differences, etc.) in circuits"[65], implying that prime importance should be given to the coupling of organisms and their environment as the basic unit of evolution.[66]

Ecologies should be understood as self-referential systems, or processes, where in order to understand (or observe) the functioning of the system, one cannot detach single elements from its synthetic consistency (and label some elements as purely anomalous, for example). Instead, one should focus on Humberto Maturana's question: "How does it happen that the organism has the structure that permits it to operate adequately in the medium in which it exists?"[67] In other words, attention should be on a systems approach that allows one to also think of digital culture as a series of couplings where "organisms" or "components" participate in the autopoiesis of the general system, which, in our case, is the digital culture of networking. The autopoietic system is a reproductive system, aiming to maintain its unity in organizational form:

This circular organization constitutes a homeostatic system whose function is to produce and maintain this very same circular organization by determining that the components that specify it be those whose synthesis or maintenance it secures. Furthermore, this circular organization defines a living system as a unit of interactions and is essential for its maintenance as a unit; that which is not in it is external to it or does not exist.[68]

From this perspective, computer worms and viruses are not so much anomalous, random or occasional breakdowns in a (closed) system that would otherwise function without friction; rather, on the contrary, they are part of the ecology they are coupled with. Yes, such programs are often sources of noise and distortion that can turn against the network principles, but more fundamentally they repeat the essentials of network ecology, in effect reproducing it. This of course refers to the fact that viruses and worms do not have to contain malicious payloads in order to be viruses and worms. Hence, one should also analyze such entities on the abstract (machinic) level of their ecological coupling to the machinic phylum of networking.

In this sense, the network ecology should be seen as consisting of both actual and virtual parts, in order to allow it a certain dynamism and to short-circuit the often too conservative focus on homeostasis found in some strands of systems theory. Where Maturana and Varela, for example, tend to emphasize that the circular system of homeostasis is self-enveloping, I would turn to a more Guattarian view, in which there is always an ongoing testing and experimenting with the limits of the organization to see what the potential virtual tendencies of an ecology are.[69] In this sense, media ecologies are not mere systems of empty repetition, but affecting and living entities looking for and testing their borders and thresholds.

Viruses and worms are tendencies within this machinic ecology of the digital culture of recent decades. They are part of the machinic phylum of network culture, which can be understood as the level of potential interactions and connections. It is a plane of virtuality where specific actualizations, or individuations, are able to occur. Thus there is always the perspective of (non-linear) evolution in such a comprehension of virtuality. The virtual as a plane of potentiality is something not actually existing (although real), for it is in a constant process of becoming. Just as nature cannot be grasped as something "given", media ecologies should be seen as planes of giving, as iterative reserves. Brian Massumi writes about nature as virtuality and as a becoming, which "injects potential into habitual contexts", where "nature is not really the 'given'", but in fact "the giving -- of potential."[70] As Massumi continues, this is Spinoza's "naturing nature", where nature cannot be reduced to an actual substance, a mere extensive and exhaustible state of being. This stance of active creation can also underscore the fact that media ecologies cannot be seen as static, hylomorphic structures of autonomous technologies, but as active processes of creation, or as a useful orientation, a horizon, with which to think the media condition of digital culture. The future of a media-ecological system is open-ended, making quite radical changes possible. Hence computer viruses as entropy-resisting instances of life can be seen as part of the autopoietic processes of a system, yet also as potential vectors of becoming, open-ended becomings for novel conceptualizations of network culture.[71]

In short, on the plane of media ecology as a self-referential system, it becomes irrelevant to label some elements as "anomalous", as not part of the system, for every element is given by the virtual system (which in itself and in its virtuality cannot be taken as a given, as a preformed platonic idea). Instead, "anomalies", if defined alternatively, are particular trackings of certain lineages, of potentials on that plane, not necessarily disruptions of a system. In addition, in accord with the communication theory of Shannon and Weaver, which recognizes that noise is internal to any communication system, it can be said that every media-ecological system has its white noise, essential to its functioning. At times, of course, the noise may become too great and enact a change to another constellation.[72] Yet, fundamentally, nature works via parasitism and contagion. Nature is in fact unnatural in its constant machinic adaptation.

From the point of view of a plane of immanence, Nature is not constituted around a lack or a transcendental principle of naturalness; instead it constantly operates as a self-creating process: "That is the only way Nature operates -- against itself."[73] This is also in accordance with the above-mentioned Spinozian understanding of life, which sees it as an affect: as movements, rests and intensities on a plane of nature (whether media-ecological or other). Nature is thus not merely a particular substance or a form, but a potential becoming, which connects to Guattari's project of virtual ecology, ecosophy: "Beyond the relations of actualized forces, virtual ecology will not simply attempt to preserve the endangered species of cultural life but equally to engender conditions for the creation and development of unprecedented formations of subjectivity that have never been seen and never felt."[74] This experimental ethos amounts to a project of ecosophy that cultivates "new systems of valorization, a new taste for life."[75]

A media ecology is not, then, based solely on technical or social elements, for instance, but on the relationships of heterogeneous fields in which the conjoining rhythm of such an ecology unfolds.[76] As technical quasi-objects (or vectors of becoming) are relational to their technical environment (in the way that a virus is part of the Turing environment), such technicalities interface with the so-called human elements of a system, leading us to realize the multifarious constitution of ecologies made up of social, political, economic, technical and incorporeal parts, to name a few.[77] In addition, some critics have underlined that computer worms and viruses are not comparable to biological phenomena because they are merely digital code, programmed by humans. Instead of embracing such a social constructivist perspective we must, rather, see how this shows that people (Kittler's "so-called human beings") are also part of media ecology: humans are part of the machinic composition, which connects and organizes humans and non-humans into functional systems. In this sense it would be an interesting agenda to analyze how virus-writing practices are related to general vectors of "viral autopoiesis", of the symbiotic network ecology. Or, to take another example: how the media technological logic of worms and viruses fits in with the logic of network organization, collaborative programming and "swarms" as analyzed by Hardt and Negri.[78]

The turbulent network spaces, as Tiziana Terranova refers to them, that support viral software, but also ideas and affects, are hence to be engaged head on and affirmatively. As Terranova notes, "the Internet is not so much a unified electronic grid as a chaotic informational milieu." This agrees well with my point concerning the notion of virtuality in media ecologies: media ecologies are not homeostatic grids or rigid structures, but only partially stable systems (multiplicities) with the potentiality for open-ended becomings. Discussing biological computing, concerned with the emergent bottom-up "power of the small"[79], Terranova notes that such systems do not follow any simple autopoietic movement of mechanical repetition; rather, "they are always becoming something else."[80] This "something else", this becoming at the heart of the machinic phylum, is what should be incorporated as part of our understanding of media ecologies as well: we are not dealing with rigid structures or platonic heavenly ideas, but potential tendencies to be cultivated and experimented upon in order to create alternative futures for digital network culture.


Notes
---------------

[1] Humberto R. Maturana and Francisco J. Varela. Autopoiesis and Cognition. The Realization of the Living, Dordrecht and London: D. Reidel, 1980, p.6.

[2] Katie Hafner & John Markoff. Cyberpunk: Outlaws and Hackers on the Computer Frontier. London: Fourth Estate, 1991, p.254.

[3] Douglas Rushkoff. Media Virus!, New York: Ballantine Books, 1996, p.247.

[4] On capitalism and computer viruses, see Jussi Parikka. "Digital Monsters, Binary Aliens - Computer Viruses, Capitalism and the Flow of Information." Fibreculture, issue 4, Contagion and Diseases of Information, edited by Andrew Goffey, http://journal.fibreculture.org/issue4/issue4_parikka.html.

[5] Deborah Lupton. "Panic Computing: The Viral Metaphor and Computer Technology." Cultural Studies, vol. 8 (3), October 1994, p. 566.

[6] "Scientists: Virus May Give Link to Life." SunHerald, May 12, 2004, http://www.sunherald.com/mld/sunherald/news/nation/8649890.htm.

[7] See Bruno Latour. We Have Never Been Modern. New York & London: Harvester Wheatsheaf, 1993.

[8] In addition to perspectives articulated by e.g. Friedrich Kittler and Paul Virilio, see e.g. Stephen Pfohl. "The Cybernetic Delirium of Norbert Wiener." CTheory 1/30/1997, http://www.ctheory.net/text_file.asp?pick=86. See also Paul N. Edwards. The Closed World. Computers and the Politics of Discourse in Cold War America. Cambridge & London: The MIT Press, 1996.

[9] See Pierre Sonigo & Isabelle Stengers. L'Évolution. Les Ulis: EDP Sciences, 2003, p. 149. The media-ecological approach is usually connected with works by Marshall McLuhan, Neil Postman and the so-called Toronto School. For a critical evaluation of some media-ecological themes, see Ursula K. Heise. "Unnatural Ecologies: The Metaphor of the Environment in Media Theory." Configurations, Vol. 10, issue 1, Winter 2002, pp. 149-168. See also Matthew Fuller's recent book Media Ecologies: Materialist Energies in Art and Technoculture. Cambridge, MA: MIT Press, 2005. Fuller discerns three strands of media ecology: 1) the organisational understanding of the information ecology at work places, etc., 2) the environmentalist media ecologies of e.g. McLuhan, Lewis Mumford, Harold Innis, Walter Ong and Jacques Ellul, which tend to emphasise homeostasis and equilibrium, and 3) the poststructuralist accounts of media ecology by e.g. N. Katherine Hayles and Friedrich Kittler, which can be seen as opening up the too humanistic emphasis of the second category. Fuller adds (pp. 3-5) Félix Guattari's emphasis on experimentation and probing as a key part of his project, an orientation that I also find very valuable, supplementing e.g. Kittler's perspectives.

[10] Gilles Deleuze. Spinoza: Practical Philosophy. Trans. Robert Hurley. San Francisco: City Lights Books, 1988, p. 123.

[11] See Eugene Thacker. "Biophilosophy for the 21st Century." CTheory 9/6/2005, http://www.ctheory.net/articles.aspx?id=472. In addition, I find Alex Galloway's notions of the protocological nature of viruses similar to my genealogical point. Viruses act as agents that take advantage of the Net architecture, yet their vectors exceed the predefined limits. See Alexander Galloway. Protocol. How Control Exists After Decentralization. Cambridge, MA & London: The MIT Press, 2004, p.186.

[12] Fred Cohen. "Computer Viruses - Theory and Experiments". DOD/NBS 7th Conference on Computer Security, originally appearing in IFIP-sec, 1984, Online: http://www.all.net/books/virus/index.html.

[13] Thierry Breton & Denis Beneich. Softwar. Paris: Robert Laffont, 1985.

[14] Cohen, "Computer Viruses - Theory and Experiments".

[15] Rudy Rucker, R.U. Sirius & Queen Mu (eds.). Mondo 2000: A User's Guide to the New Edge. London: Thames & Hudson, 1993, p.276. The 1984 Pentagon report "Strategic Computing," which aimed to bridge the "software gap" with Japan, was grounded in ideas of autonomous predatory machines and visions of the electronic software battlefields of the 1990s. Manuel DeLanda. War in the Age of Intelligent Machines. New York: Zone Books, 1991, pp.169-170.

[16] See Tony Sampson. "A Virus in Info-Space." M/C: A Journal of Media and Culture, 2004, http://journal.media-culture.org.au/0406/07_Sampson.php. Cohen's work was often dismissed as not addressing a real threat. Several commentators were very skeptical about the possibility of a wide-scale spread of such programs. Others regarded Cohen's tests as dangerous, in the sense that publishing the work would spread the knowledge needed to create viruses.

[17] Fred Cohen. "Computer Viruses." Dissertation presented at the University of Southern California, December 1986, p. 12.

[18] Cohen, "Computer Viruses," p. 25.

[19] This meant especially addressing the problems of transitivity, the flow of information and, in general, the trend of sharing and networking. Even if isolationism would have provided perfect security against viruses and other network problems, it was not an option in a world becoming increasingly dependent on the smooth flow of information as the end product of capitalism. Transitivity of information means that if information can flow from A to B and from B to C, there is in effect also a direct link from A to C. This describes basically an "open" system of flows (where the "openness" of the system is, however, subordinated to the logic of points). The partition model was conceptualized as a basic limit to this flow, closing a system into subsets and consequently restricting the free flow of information. Cohen cites the Bell-LaPadula security model (1973) and the Biba integrity model (1977) as policies that "partition systems into closed subsets under transitivity"; these models of controlling information flows were among the earliest technical paradigms of computer security. See Cohen, "Computer Viruses," p. 84: "Clearly, if there is no sharing, there can be no dissemination of information across subject boundaries, and a virus cannot spread outside a single subject."
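
To make the transitivity argument concrete, here is a minimal sketch in Python (my illustration, with invented subject names; it is not Cohen's formal notation). Information flow is modeled as a directed graph: the transitive closure of direct sharing shows everything a program executed by one subject can eventually reach, and a partition into closed subsets cuts that closure at the subset boundary.

    from itertools import product

    def transitive_closure(flows):
        """All pairs (a, c) reachable through chains of direct flows."""
        closure = set(flows)
        changed = True
        while changed:
            changed = False
            for (a, b), (c, d) in product(tuple(closure), repeat=2):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
        return closure

    # Direct sharing: A -> B, B -> C, C -> D ...
    flows = {("A", "B"), ("B", "C"), ("C", "D")}
    print(sorted(transitive_closure(flows)))
    # ... implies indirect flows such as (A, C) and (A, D): a virus run
    # by A can in principle reach every subject downstream.

    # A partition in Cohen's sense removes flows that cross subset
    # boundaries, leaving each subset closed under transitivity.
    subsets = [{"A", "B"}, {"C", "D"}]
    partitioned = {(a, b) for (a, b) in flows
                   if any(a in s and b in s for s in subsets)}
    print(sorted(transitive_closure(partitioned)))
    # Only (A, B) and (C, D) remain: sharing, and hence viral spread,
    # stops at the partition boundary.

The toy result restates Cohen's conclusion: short of partitioning or outright isolation, transitive sharing always leaves a path open for viral spread.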

[20] Cohen, "Computer Viruses," pp.13-14.

[21] Mark A. Ludwig. Computer Viruses, Artificial Life and Evolution. Tucson, Arizona: American Eagle Publications, 1993, p.22.

[22] Chris Langton. "Artificial Life." In: Artificial Life: The Proceedings of an Interdisciplinary Workshop on the Synthesis and Simulation of Living Systems, held September 1987 in Los Alamos, New Mexico, edited by Christopher G. Langton. Redwood City, CA: Addison-Wesley, 1989, p. 2. See also Claus Emmeche. The Garden in the Machine: The Emerging Science of Artificial Life. Princeton: Princeton University Press, 1994. Christopher G. Langton (ed.). Artificial Life: An Overview. Cambridge & London: The MIT Press, 1997.

[23] Richard Dawkins. The Selfish Gene. Oxford: Oxford University Press, 1977, p. 206.

[24] Tierra web page at http://www.his.atr.jp/~ray/tierra/

[25] Cohen, "Computer Viruses," pp.52-53.

[26] Belinda Barnet. "Technical Machines and Evolution." CTheory 3/16/2004, http://www.ctheory.net/text_file.asp?pick=414. In the essay, Barnet revives the question of technological evolution and life with the help of Bernard Stiegler, Niles Eldredge, André Leroi-Gourhan and Félix Guattari, among others. Her project is to argue for a dynamic view of technology; in other words, she is committed to finding alternatives to a more traditional cultural studies understanding of technology, which reduces its dynamics to intentions, projections and discourses of human origin.

[27] Gilles Deleuze & Félix Guattari. A Thousand Plateaus. Capitalism and Schizophrenia. Minneapolis & London: University of Minnesota Press, 1987, p. 330.

[28] Paul Bains. "Subjectless Subjectivities." In: A Shock to Thought: Expression After Deleuze and Guattari, edited by Brian Massumi. London & New York: Routledge, 2002, pp. 101-116. Andrew Murphie & John Potts. Culture & Technology. New York: Palgrave Macmillan, 2003, pp.30-35. This way of thinking about media ecologies could also be called eco-ethology, which underlines the connected nature of the world and applies not only to biological phenomena but to media-technological environments of connection as well. In this view, the being of an entity is only because of a world for which the entity is -- an affirmation of a certain theme of immanence that Isabelle Stengers sees flowing from the Stoics to Spinoza, Leibniz and Whitehead, and on from Marx to Deleuze. Sonigo & Stengers, pp.134-144.

[29] Cohen, "Computer Viruses," p.222.

[30] See Ludwig. Cf. Eugene H. Spafford. "Computer Viruses as Artificial Life." In: Artificial Life. An Overview, edited by Christopher G. Langton. Cambridge & London: The MIT Press, 1997.

[31] Cohen, "Computer Viruses," 12.

[32] See Bains.

[33] The idea of memes as cultural reproduction machines could also provide a fertile way of understanding the abstract machine of network culture. See Fuller, 111-117.

[34] Maturana and Varela, p.xxvii. Cf. Maturana and Varela, p. 9.

[35] William Aspray. John von Neumann and the Origins of Modern Computing. Cambridge, MA: The MIT Press, 1990, p.189.

[36] Aspray, pp.202-203.

[37] Steve J. Heims. John von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death. Cambridge, Massachusetts: The MIT Press, 1980, pp. 204-205, 212. Aptly, early "bacteria" programs on mainframe computers have been listed among the oldest forms of programmed threats. While not explicitly damaging, they were, however, designed to reproduce exponentially, potentially "clogging" the computer's processor capacity, memory and disk space. Thomas R. Peltier. "The Virus Threat." Computer Fraud & Security Bulletin, June 1993, pp. 13-19.
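
The "clogging" logic can be sketched as a toy simulation (not historical code; the memory budget and per-copy size are invented parameters): a population of instances that doubles every scheduling round exhausts any finite resource in very few rounds.

    MEMORY_BUDGET = 2**20   # hypothetical: 1 MiB of free memory
    COPY_SIZE = 64          # hypothetical: bytes consumed per instance

    population, rounds = 1, 0
    while population * COPY_SIZE <= MEMORY_BUDGET:
        population *= 2     # every instance reproduces once per round
        rounds += 1

    print(f"budget exhausted after {rounds} rounds ({population} copies)")
    # With these invented numbers the memory budget is gone after 15
    # rounds: exponential self-reproduction "clogs" a machine even when
    # no single instance is destructive.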

[38] Cf. Heims.

[39] Heims, p. 325.

[40] Aspray, p.191.

[41] On this topic see Nancy Forbes. Imitation of Life: How Biology is Inspiring Computation. Cambridge MA: The MIT Press, 2004. Cf. Tiziana Terranova. Network Culture: Politics for the Information Age. London: Pluto Press, 2004, 98-130.

[42] Félix Guattari. Chaosmosis: An Ethico-Aesthetic Paradigm. Sydney: Power Publications, 1995.

[43] Félix Guattari. The Three Ecologies. London: The Athlone Press, 2000, p. 61.

[44] Hafner & Markoff, p.280. Allan Lundell. Virus! The Secret World of Computer Invaders That Breed and Destroy. Chicago & New York: Contemporary Books, 1989, p.21. According to Lundell, the Creeper was a virus that got away, and a special Reaper program was designed to clean the network of Creeper programs.

[45] John F. Shoch & Jon A. Hupp. "The 'Worm' Programs -- Early Experience with a Distributed Computation." Communications of the ACM, Vol. 25, issue 3, March 1982, p.175.

[46] Shoch & Hupp, pp. 176-178.

[47] Robert E. Kahn. "Networks for Advanced Computing." Scientific American 10/1987. On the history of packet-switching, see Janet Abbate. Inventing the Internet. Cambridge, MA & London, England: The MIT Press, 2000, pp. 27-41. Matthew Fuller makes, however, a very important point in underlining the hierarchical nature of packet-switching techniques: even though the technique connotes self-organisation, it is at the same time controlled by protocols and other socio-technical dimensions. Fuller, pp. 128-129. See also Galloway.

[48] Cf. David Harley, Robert Slade & Urs Gattiker. Viruses Revealed: Understand and Counter Malicious Software. New York: Osborne/McGraw-Hill, 2001, p.189. There they emphasize that even if the Shoch-Hupp worm was a reproductive worm, it did not have security-breaking intentions, nor did it try to hide itself. Ibid., p. 21. Anti-virus researchers often underscore that even benevolent viruses are harmful, as they tie up computing resources (memory) needed for the normal operations of the system.

[49] Spafford. "Computer Viruses as Artificial Life," p. 263. Spafford's text, originally from the early 1990s, provides in general a useful discussion of the aliveness of computer viruses.

[50] Terry Winograd. "Beyond Programming Languages." Communications of the ACM, vol. 22, no. 7, July 1979, pp. 391-401. Jerome A. Feldman. "High Level Programming for Distributed Computing." Communications of the ACM, vol. 22, no. 6, June 1979, pp.353-368.

[51] See Heims, pp. 313-314.

[52] Jorge M. Barreto. "Neural network learning: a new programming paradigm?" Proceedings of the 1990 ACM SIGBDP conference on Trends and directions in expert systems, New York: ACM Press, p.434.

[53] Yet, as Mark Ludwig (p. 47) points out, these self-mutating viruses were merely able to camouflage themselves, not exactly to mutate or evolve.
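
Ludwig's distinction can be illustrated with a toy sketch (Python as a stand-in, purely illustrative; real polymorphic engines re-encode machine code and vary their decryption routines, and nothing here is taken from Ludwig's examples): each "copy" stores different bytes, yet decodes to a strictly identical payload -- camouflage, not mutation.

    import os

    PAYLOAD = b"the functional core never changes"

    def camouflage(payload):
        """Re-encode the same payload under a fresh random key."""
        key = os.urandom(1)[0]
        return key, bytes(b ^ key for b in payload)

    def decode(key, body):
        return bytes(b ^ key for b in body)

    k1, copy1 = camouflage(PAYLOAD)
    k2, copy2 = camouflage(PAYLOAD)

    assert decode(k1, copy1) == decode(k2, copy2) == PAYLOAD
    # The stored appearance (copy1, copy2) varies with the key, but the
    # decoded payload never changes: no variation is passed on, so nothing
    # is available for selection or evolution.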

[54] Frederick B. Cohen. It's Alive! The New Breed of Living Computer Programs. New York: John Wiley & Sons, 1994, p.21.

[55] Cf. Ludwig. See also Spafford.

[56] Cf. Simon Penny. "The Darwin Machine." Telepolis 09.07.1996, http://www.heise.de/tp/r4/artikel/6/6049/1.html.

[57] See Nicholas Negroponte. Being Digital. London: Hodder & Stoughton, 1995, pp. 149-159.

[58] J. Macgregor Wise. Exploring Technology and Social Space. Thousand Oaks: Sage, 1997, pp.150-157.

[59] Manuel DeLanda. "Nonorganic Life." In: Incorporations, edited by Jonathan Crary and Sanford Kwinter. New York: Zone Books, 1992, p. 129. In relation to this theme of non-organic life, see DeLanda's analyses of the computational mechanosphere in War in the Age of Intelligent Machines, pp.120-178.

[60] Sherry Turkle. Life on the Screen: Identity in the Age of the Internet. London: Weidenfeld & Nicolson, 1996, p. 136.

[61] Sampson, "A Virus in Info-Space."

[62] Cf. N. Katherine Hayles's mapping of the discourse of the posthuman in How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics. Chicago IL: University of Chicago Press, 1999.

[63] Friedrich Kittler. Gramophone, Film, Typewriter. Stanford CA: Stanford University Press, 1999, p. 258.

[64] Steven Levy. Artificial Life. A Report From the Frontier Where Computers Meet Biology. New York: Vintage Books, 1993, p. 324.

[65] Gregory Bateson. Steps to an Ecology of Mind. New York: Ballantine Books, 1972, p.483.

[66] Lynn Margulis is, of course, another pioneer of this strand of thought on symbiotic evolution.

[67] Maturana and Varela, p. xvi.

[68] Maturana and Varela, p. 9. Maturana and Varela define autopoietic machines as follows: "A machine organized (defined as unity) as a network of processes of production, transformation and destruction of components that produce the components which (i) through their interactions and transformations regenerate and realize the network of processes (relations) that produce them; and (ii) constitute it as a concrete unity in the space in which they exist by specifying the topological domain of its realization as such a network." (p.135)

[69] See Guattari. Chaosmosis, pp. 37, 91-93.

[70] See Brian Massumi. Parables for the Virtual: Movement, Affect, Sensation. Durham & London: Duke University Press, 2002, p. 237.

[71] See Elizabeth Grosz. "Thinking the New: Of Futures Yet Unthought." In: Elizabeth Grosz (ed.). Becomings: Explorations in Time, Memory, and Futures. Ithaca & London: Cornell University Press, 1999, pp. 15-28. On the machinic phylum, see Manuel DeLanda. "The Machinic Phylum." V2, 1997, online at http://framework.v2.nl/archive/archive/node/text/default.xslt/nodenr-70071. See also Fuller, pp. 17-20.

[72] See Michel Serres. The Parasite. Baltimore & London: The Johns Hopkins University Press, 1982.

[73] Deleuze & Guattari. A Thousand Plateaus, pp.241-242. Of course, there are anomalous organisms in any environment in the sense that they might be harmful to the very existence of the environment that supports them -- and we have examples of such programs as well. For example, the Lehigh virus (1987) was in fact so destructive that it undermined its own chances of spreading beyond the university computers where it was originally found. But it represents merely one actualization of viruses.

[74] Guattari. Chaosmosis, p. 91.

[75] Ibid., p. 92. Guattari places special emphasis on aesthetic machines in such a cultivation of a virtual ecology. Hence his ecological analysis could be connected to issues of tactical media and media art. Cf. Galloway, pp. 175-238.

[76] See Guattari. Three Ecologies. See also Chaosmosis, pp. 39-40.

[77] Such considerations of social and mental planes have also been part and parcel of the actualizations concerning the paths taken in the media ecology of the universal viral machine discussed above. In other words, while I argue for the centrality of viruses and worms in understanding this ecology, on a more official level the scientists and researchers who articulate ideas of "computer viruses as a form of artificial life" or "benevolent viruses" have been considered irresponsible. Tony Sampson. "Dr Aycock's Bad Idea: Is the Good Use of Computer Viruses Still a Bad Idea?" M/C Journal 8.1 (2005), 20 Jun. 2005, http://journal.media-culture.org.au/0502/02-sampson.php.

[78] Michael Hardt and Antonio Negri. Multitude: War and Democracy in the Age of Empire. New York: The Penguin Press, 2004, pp. 91-93, 340. Cf. Parikka.

[79] Terranova, p. 103.

[80] Ibid.


--------------------

Licentiate of Philosophy Jussi Parikka is a researcher at the University of Turku, Finland. He is writing his Ph.D. dissertation on the cultural history of computer worms and viruses. In addition, he is co-editing books on media archaeology and continental media theory, as well as on the cultural theory of spam and other anomalous objects of digital culture. Homepage: http://users.utu.fi/juspar.

The author is grateful to Matthew Fuller, Milla Tiainen, Pasi Väliaho, as well as the anonymous reviewer and the editors of CTheory, for their apt advice and comments. The TIESU project (http://www.hum.utu.fi/historia/kh/tiesu/) provided financial support.

© CTheory. All Rights Reserved