Date Published: 4/4/2002
Arthur and Marilouise Kroker, Editors

Why the Digital Computer is Dead

Chris Chesher

The digital computer is dead. The term 'digital computer' is no longer useful or appropriate. I will offer instead what I think is a better concept for these devices: invocational media. A different name is necessary because neither digitality nor computation is what distinguishes the class of electronic devices that have become so ubiquitous. What makes new media new is that they mediate powers of invocation: powers to call things up.

In the mid-20th century there were two quite different and competing approaches to designing computing machines: digital and analogue. Analogue computers used continuous variations of voltage or even mechanical movement to calculate answers to equations, whereas digital computers operated one command at a time using discrete values and logical addresses. At first, digital computers presented many design and cost problems that some felt would never be overcome. By comparison, analogue systems of the 1930s and 1940s such as Vannevar Bush's differential analyser were quite fast and powerful. However, once the general purpose design, usually attributed to Von Neumann, was widely accepted, digital systems began their extraordinary path to ubiquity. Analogue computers were increasingly relegated to narrower and narrower applications.

Digital computers ultimately had many advantages. They were more precise than analogue devices, because they counted values rather than measured quantities. They were more versatile because they worked with stored programs and data. They were more powerful because they offered language-like command sets and large addressable memories. As the technical and cost problems of storing data and instructions were resolved, the substantial advantages of digital design prevailed. In fact, analogue computers have very little in common with what we call computers today. The distinction between analogue and digital computers has become archaic - it is no longer relevant or useful.

There is a more generic distinction between 'digital' and 'analogue' that remains useful, even if it can be quite hard to understand. This is a philosophical, or semantic, distinction that extends beyond computing discourse. It refers to modes of representation. Analogue representations operate by continuous variations, where digital codes use discrete values. However, the two are not alternatives, but different layers or strata of meaning, since it is rare (if not impossible) for any sign to be purely digital.

One of the most notable digital codes is writing. Writing is predominantly digital because each character is discrete. As Derrida likes to point out, the letter 'e' is either the letter 'e' or it isn't.[1] However, in spite of this apparent precision at the level of the character, how writing creates meaning is not so simple. There is no single unambiguous meaning for any group of characters. Even though each reader might recognise the same letters, each one can make of them something quite different. As well as this huge problem of signification, there is always an analogue level to digital expression. In any piece of writing there is a typeface or a handwriting style which adds a non-digital layer of meaning to any text.

On the other hand, many 'analogue' signs are inflected by digital components. Painting is largely analogue, because it works with blocks of colour. It can have figurative elements viewers recognise. However, there are often some digital components associated with images: a title for a painting; a caption for a photograph; writing within the image etc.

It is necessary to ask in each instance whether a text (or part of a text) communicates by resemblance and recognition (in which case it is functioning analogically), or by inscription with an arbitrary code, and reading (in which case it is functioning digitally). Modern culture has tended to privilege digital systems of communication over analogue modes, because they are more precise, abstract and translatable. However, the distinction should never be considered as a simple opposition. Digital codes are always tied up in analogue substrates.

The term 'digital' when used in computing discourse, while necessary in the 1940s, has now become confusing. This confusion is compounded by its conflation with the wider digital/analogue distinction. The term 'digital' has lost contact with its early engineering application, and become fetishised. While the term 'digital computer' has dropped from common usage, the term 'digital' is often used quite loosely around new media technologies. It seems to refer to anything 'high-tech' or 'computerised' - digital futures, digital classrooms, digital images, digital revolutions. Now that computers function largely as media technologies - mechanisms for distributing and displaying texts, sounds and images - the fetishisation of digitality has become particularly confusing.

For example, the idea of 'digital images' is quite misleading because it is based on the archaic distinction between early types of computer design, and not on the broader philosophical concept. When people look at supposedly 'digital' images they can't easily distinguish them from other images. These images are different because they have been invoked from memory to a screen, and not layered onto a surface like paint, projected through celluloid, or played back from videotape. The difference does not mean they are 'digital' in the philosophical sense. This mistake leaves many people confused about the wider distinction between 'digital' and 'analogue'. It also tends to ignore some important distinctions between different ways that images can be invoked - raster and vector; computer generated, scans and digital cameras; 2D and 3D.

No matter how images are created, though, people recognise analogies or icons (in Peirce's sense)[2] because they have some resemblance to something else. It does not necessarily matter to viewers whether an image is stored and distributed 'digitally' (although it does matter to designers or photographers). As signs, digitised images still most often function analogically: viewers see patterns of line and colour, and recognise resemblances. The same is true of digital sound and digital video: the technical property of digitality is not as significant as the performative capacity to call up sounds and images at will.

The second word in the phrase 'digital computer' is also anachronistic. The term 'computer' originally referred to people whose job it was to perform tedious manual calculations, handed down to them from people above them in an organisation. Calling electronic calculating devices 'computers' was an anthropomorphism, equating machines with these humans. This usage was loaded with hierarchical connotations, since the human computers of the day were most often tied up in social relationships of delegation. Senior people called on juniors to do repetitive time-consuming calculations. This cultural derivation - by which non-human components substituted for human computers in chains of command - is also largely forgotten.

The term 'computation' is also problematic because it is very narrow. It suggests a process where symbolically expressed problems are subjected to mathematical and logical manipulation. Again, the term was once more appropriate than it is now. In the 1940s when mathematicians used these devices to calculate missile trajectories and H-bomb explosive forces, and even in the 1950s when UNIVACs were used by businesses to calculate insurance risks and profitability statistics, the term was reasonably accurate. After the 1960s, though, the term 'computation' became increasingly archaic, as applications like word processing (which is not really about processing at all), image generation, sound sampling and synthesis and so on clearly go well beyond any definition of computation.

The concept of computation is not only outdated, but carries epistemological assumptions that are worth questioning. Looking at the derivation of the term, it might be paraphrased as something that combines putative truths. The suffix '-puter' is related to the Latin putare, for pruning, cleansing or reckoning.[3] So computing brings things together and cleans them up for use in estimations. It takes data - literally 'givens' - and extracts them from the contexts out of which they were gathered. Cleansing givens so that they can be brought together erases the process of collection itself. Whenever someone starts reckoning anything, you can bet he has some task in mind. He will decide what is relevant to this task, and what is not. That's when the putare happens - the choices behind the computation are discarded, and the results are presented as clean and objective.

As Bruno Latour has argued, the Modernist project attempts (but fails) to draw a Great Divide between Modernity and everything before or outside.[4] Many advocates of modern science and technology ground knowledge claims on reductive processes of pruning or cleansing that supposedly create objective truths. They identify the bottom line of truth with measurability and quantification, and refuse to recognise any other forms of knowledge as valid in the same way. At the same time, they generate the most intricate theories, explanations and conceptualisations in an attempt to account for the phenomena measured. Anything outside the carefully constructed, highly specialised and atomised truths of techno-science is supposed to be discarded as subjective, superstitious and suspect.

'High' technology supposedly transcends non-modern 'low' technologies. By privileging the discoveries of science over superseded irrational beliefs, and categorising the developments of modern technology in a totally different class from craft objects, modern phenomena are seen as completely unprecedented. By this view, using a computer is completely unlike any previous cultural practice. Technology is always advancing, and, as Langdon Winner[5] points out, is often seen as largely out of control. People are supposedly stunned by future shock, constantly catching up with the most recent radically unpredictable events.

However, actually using a computer is not completely unlike other activities. The keyboard works like a typewriter. The screen looks like a television. The text on the screen looks like writing on a page. And in spite of all cleansing, the cultural residue around actual computers is far from pure. Although data are clean in theory, in practice they are filthy with residues from the processes of their collection. Modernity builds huge messy networks of measurements, theories, technological devices, human bureaucracy and systems of knowledge. Most of these processes are quite beyond rationality. As Thomas Kuhn[6] argues, science has never discovered universal truths, but has leaped from paradigm to paradigm through culturally and politically loaded processes.

In computing discourse there are many terms in circulation that bring high-tech back to earth. Many of them are metaphors - industrial (processing), anthropomorphic (intelligence), or spatial (cyberspace, virtual reality[7]). However, the tropes I find most interesting are about magic. References to deep magic, wizards, remote method invocations, or the invoking of daemons suggest these devices emerged from far darker and dirtier origins than engineering's clean rooms.

In spite of all the noise, Modern exceptionalism was only ever an aspiration. Superstition and religion have refused to disappear. Churches haven't closed down. There are still horoscopes in the paper. There have always been significant powerful groups directly opposing modernity. In their different ways, both Christian fundamentalist creationists and new age spiritualists directly reject modern science.

But the status of magic and mysticism in contemporary culture is not simply a war of science against superstition. Even among technology's strongest advocates magic has returned. Various manifestations of techno-millenarianism see technology as transcending earthly limits. Kurzweil's faith[8] in an approaching superhuman artificial intelligence, the 'spike' in technological advancement that Broderick predicts,[9] or the extropian movement's quest for immortality are only some examples. The promise of a moment that will rocket mankind into a time of unprecedented wealth, immortality and ubiquitous information is constantly repeated and rephrased.

The persistent cultural connection between technology and magic has attracted some significant attention recently from writers such as Erik Davis,[10] Margaret Wertheim,[11] David Noble,[12] Richard Stivers[13] and Michael Taussig[14]. And they are not the first. In Civilisation and its Discontents,[15] Freud argued that modern technology was displacing magic in its (never quite fulfilled) promises of power and libidinal satisfaction. Marcel Mauss[16] saw magic as primarily a cultural phenomenon. Even the notorious historian of magic Sir James George Frazer[17] identified a connection between primitive magic, religion, science and technology.

The subliminal return of magic in computer discourses, then, might not suggest anything paranormal or supernatural is at work. Rather, it is evidence of the resilience of non-modern cultural forms. Practices, beliefs, desires and affects usually associated with premodernity - often dismissed as irrational, superstitious or subjective - never went away. Instead, they have been translated into new expressions. The will to command can be manifest in political power, magic or technological systems. Desires for explanations of mysteries, control over the future, or a sense of community might equally be answered by religious faith, rational methods, or high-tech artefacts. The boundaries drawn between science and magic, nature and culture, technology and society are all quite artificial. There is no fundamental difference between a poet invoking the Muses for inspiration, and me invoking a search engine for material to use in this talk. Of course, they are different. But they are not so different that they cannot be compared.

Still, I hesitate in proposing to replace the term 'digital computing' with 'invocational media', and 'computer' with 'invocator'.[18] The dreaded techno-millenarians might like it too much. Or worse, Microsoft might use it in its next advertising campaign: 'Windows 2002 - invoke anything you can imagine'. Like Haraway's[19] cyborg, I present 'invocational media' as a blasphemous myth. It might offend people who take their magic, or their technology, seriously. It is partly ironic, echoing the ridiculous and exaggerated claims people sometimes make about the powers of computers. But it is also serious as a non-representational and pragmatic critical method with which to conceptualise new media.

My claim that invocation persists in technological form offers no promises of transcendence, but rather suggests cultural continuity. An invocation is characterised by a call to another for assistance or support. It is familiar particularly from Ancient, Classical and Romantic poetry - Homer, Chaucer, Milton and Shelley. The archetypical invocation is addressed to a Muse. The nine Muses were the minor goddesses of culture, history, music, science and the arts in Ancient Greece.[20] They provided inspiration and connection with things past and distant. Greek myths can be particularly canny in marking out cultural forms: it is significant that the father of the Muses is Zeus, the ruler of Olympus - the god of command, and that their mother is Mnemosyne, the goddess of memory. The invocational relationship marries command and memory. Invocational power is never direct. It is tempered by being mediated through the Muses - it calls to a past partly forgotten, to elusive inspiration, or to powers that are often unreliable.

Invocational media perform as substitute non-human others to which invocations are addressed. They too are distinguished by a coupling of command and memory. The interplay between central processing units and memory devices recalls the ancestry of Zeus and Mnemosyne. The CPU is the locus of command, a specialised collection of circuits that reads instructions from memory or other inputs, and then interprets and executes these instructions. Memory devices, including RAM, ROM, hard drives and so on, store programs and data, providing a repository for potential future invocations - invocable domains. Users intervene in the ongoing invocational processes by offering invocations of their own. They call on calculations, databases of texts and images or other documents, or peripheral devices such as printers. The services that invocational media offer also recall the forgotten delegation implicit in the original computer.

The consequent relationship between a human and an electronic invocator is a quasi-magical refrain of the ancient cultural form of invocation. When someone utters a properly formed command, the invocator seems to respond. A web page is summoned. A document curls out of a laser printer. A song begins to play. The general purpose of these devices is to mediate invocations. But like the Muses, invocators are notoriously capricious, not always producing what the invoker had hoped.

Irrespective of their reliability, invocational media are quite different from other media forms. Broadcasting might be said to be distinguished by signals radiating from antennae. Print media are characterised by mechanically produced marks on paper. Photography is characterised by the singular combinations of lenses, photosensitive chemicals and light. Of course there is more to media specificity than technical attributes. Deleuze[21] argues that cinema is characterised by the 'movement image'. He sees this as the 'genetic element' out of which arise many of the effects that cinema creates.

In invocational media, the genetic element is the electronic invocation between CPU and invocable domains. While all invocations are ultimately reducible to first order binary switchings of circuits, the most observable invocations are compound, second order events. Something as apparently simple as a user clicking on a web page link actually invokes a sequence of events at multiple levels: an application passes a request to the operating system, triggering signals in local and network hardware, passing a sequence of events through TCP/IP networks to the addressed server. The server responds to the HTTP request, and finally, within some milliseconds, the user reads the page she has invoked. For users, though, this is only one invocational event.
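The layering of that single click can be sketched schematically. This is not a real network stack - every function name here is invented for illustration - but it shows how one user-level invocation decomposes into a chain of lower-order invocations, each delegating to the next:

```python
# A schematic sketch (not a real network stack) of how one user-level
# invocation decomposes into lower-order invocations.

def server_respond(request):
    # The addressed server answers the HTTP request.
    return f"<html>page for {request}</html>"

def network_transmit(request):
    # Signals pass through local and network hardware via TCP/IP.
    return server_respond(request)

def os_dispatch(request):
    # The operating system relays the application's request.
    return network_transmit(request)

def click_link(url):
    # For the user, this whole chain is a single invocational event.
    return os_dispatch(f"GET {url}")

page = click_link("http://example.org/")
```

The point of the sketch is that the user sees only the outermost call; the compound, second order event conceals the first order switchings beneath it.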

Opening a web page can hardly be described as a process of computation. Nor can printing a word processor file, or playing a first person shooter. And while these operations are digital at the level of logical description, for users they are largely analogue: blocks of colour on the screen, the impact of hands on keys, soundwaves in the air. What makes this medium different is clearer in terms such as random access; non-linearity; simulation; interactivity and virtuality. These are all various manifestations of the same genetic element: invocation.

The digital computer is dead. Long live invocational media.

Invocational media have a different heritage from the recently departed digital computer. Computers belonged within histories of mathematical and logical abstraction. They inherited Aristotle's and Boole's logic, Turing's computable numbers and Von Neumann's automata. The digital computer's distinctiveness was said to come from its capacity to automate and accelerate symbol manipulation. Symbols supposedly stood in for objects and processes in the world and sometimes modelled changes across time. The computer inherently belonged within a rationalist, representationalist tradition, which has recently come under some challenge.

The theory of invocational media is an attempt to construct a non-representational concept for new media. Rather than simply critique the dominant tradition, I am attempting to create a productive alternative history, present, and future for this phylum of machine. Invocational media, by contrast with reductive rationalist digital computers, have pragmatic and material histories drawing together technology, language and magic. But computers were always invocational, and invocation to artefacts long predates computers.

Invocational media can be situated in a tradition of technologies that make the material world perform as language. A light switch allows a human hand to make a kind of a statement: let there be light. With a rudimentary gesture, often completely unconscious, a space is filled with light. The gesture performs like a command, and the command is immediately answered, because the assemblage (including not only the switch and the light, but the whole infrastructure of electrical power) has put light on call. As Heidegger[22] argues, modern technology presents the world as standing reserve, stored up to be available on demand. This almost magical event is a new form of invocation. The light and switch assemblage is what I call an invocatory device. Invocatory devices offer a single effect that can be summoned almost as easily as making a statement with a voice: a light switch, a door bell. But even the most ancient of technologies - the lever - anticipates the invocatory, because it begins to make physical action more like language.

Events of switching performed with invocatory devices are often almost invisible, because most significance is usually given to what the switching reveals. McLuhan[23] argues that the content of the electric light is what it illuminates: night baseball or brain surgery. Without electric light, these activities could not take place.

However, the act of switching often has symbolic or practical significance in itself. When a new bridge is opened, or a decorated Christmas tree unveiled, a dignitary might be invited to turn the lights on. When someone is in the electric chair, an executioner has the special task of throwing the switch. In each of these cases, the moment of switching is an event in itself. The invocatory act has a certain power. It calls on a reservoir of power, mobilising what it has called to transform and re-affirm the way things are.

Some invocatory devices use switching as language. Most notably, the telegraph and Morse code allowed messages to be transmitted by disconnecting and reconnecting signals. In this case, individual events of switching were still isolable and had discrete significance, but required intensive human intervention. Telephone switches developed quite high orders of complexity and speed in switching, but once the connection was made, the conversation took place through analogue modulations of signals.

Invocational media are significantly more powerful than invocatory devices. They combine command and memory into the same circuit, and work with highly abstract invocable domains. Where invocatory devices make only simple, finite, and relatively local statements, invocational media can make an infinite number of quite heterogeneous articulations. They perform more than single switchings by invoking programmed sequences of instructions, where the results of one invocation become inputs for others. They are open not only to inputs from outside through peripherals, but to distant events through networks and to records from the past on databases. This combination of components exponentially expands the range of invocations that become articulable.

Invocational media translate all events into a constant cycle of reading, interpreting and acting upon instructions. A program counter and a clock mark the place and timing for each new invocation. While each invocation is simple, in combination they rapidly become hypercomplex. The fetch-execute cycle abstracts switchings to a point where individual invocations interleave and merge into a constant stream. With millions of invocations per second, early 21st century invocational devices have become platforms for all manner of mediated events: well beyond calculating equations, invocational media are called upon to support an enormous range of cultural practices: reading, writing, viewing, playing, conversing, controlling and so on.
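The coupling of command and memory in the fetch-execute cycle can be sketched as a toy machine. The instruction set below is invented for illustration, not any real architecture; what matters is the shape of the cycle: a program counter marks the place, each instruction is fetched from memory, interpreted, and executed, and results flow back into memory as material for further invocations:

```python
# A minimal sketch of the fetch-execute cycle: command (the processing
# loop) and memory (the program and data stores) in one circuit.
# The instruction set is invented for illustration.

memory = [
    ("LOAD", 7),      # place a value in the accumulator
    ("ADD", 5),       # add to it
    ("STORE", None),  # write the result back to the data store
    ("HALT", None),
]
data = {}

def run(memory):
    acc = 0
    pc = 0  # the program counter marks the place for each new invocation
    while True:
        op, arg = memory[pc]  # fetch
        pc += 1
        if op == "LOAD":      # execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "STORE":
            data["result"] = acc
        elif op == "HALT":
            return acc

run(memory)
```

Scaled up to millions of such cycles per second, the individual invocations interleave and merge into the constant stream described above.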

However, invocational powers come at a price. As with any magical power, and in spite of their apparent precision, invocational systems always entail a degree of mystery. While the fetch-execute cycle operates at the level of first order invocations, users must make invocations of a second order, or what I call invocationary acts. Unlike the Morse Code operator, who can hear every dot and dash, users of invocational devices are largely unaware of most of the lower level processes that allow them to achieve their tasks.

Users are faced with a trade-off between power and precision. If they want to control invocation with any precision, they have to articulate it in great detail by learning a low level programming language. However, to do this would be so time-consuming and redundant as to be totally impractical. All users rely upon subroutines, programs and applications written and owned by others. When I invoke a software feature I'm never entirely sure how it does what it does.

When I start using a new program I have to take some time away from my immediate task to learn how to use its features. I am called away from my usual duties by the software until I become competent to invoke what I want when I want it. However, by enduring this process of training, I have transformed myself into the particular subjectivity of a user. I have tied myself into an upgrade path. The tasks become habitual, and I can no longer perform them without this software.

While software features give users greater power, they also call users away. This is a special example of cultural processes that Weber[24] refers to as avocation. An avocation is a minor form of vocation. It is not the life-long calling such as the calling to a career in politics, ministry or another profession, but a distracting call to deviate from one's original path. Software features are a special form of avocation. They allow users to perform second order invocations, or invocationary acts. Although they are never quite right for the job at hand, they are usually adequate. When users invoke something, it is not the 'original' expression of the intention of the users. Instead, invocations are always articulated through many layers of pre-formed, programmed avocations. Where computers always promised to empower the sovereign user subject, the relationship between users and invocational media is more ambiguous. Users are equally used. Invokers are also called.

This problem is not exclusive to users, though. There are many overlaps between the limits of avocation and the limits of natural language (or langue). Theorists of language such as Vygotsky[25] have shown that language is intersubjective - it does not belong to any speaker. Language speaks the speaker as much as the speaker speaks it. Communication, and even thought itself, is constrained by the limits of what can be expressed in language. Invocational avocations are only one among many technologies of the self - various machineries that produce subjects.

However, there are some specificities to the invocation/avocation relationship that are worth pursuing. Most significantly, software (and hardware) avocations are owned by large companies like IBM, Microsoft and Apple. The economy of invocational media relies upon the relationships of dependence that develop through the constant call of avocations. Avocations are an additional layer outside, but intimately connected with, so-called 'natural' language. They have created quite a new kind of power, based on building an installed base, and controlling standards. The lumbering anti-trust case against Microsoft, and the open-source software movement indicate the lines of force at play in avocational politics.

Another important parallel between electronic invocations and spoken language relates to the pragmatics of both natural language and computer use in context. Using invocational devices, like speaking, does more than carry information from one place to another.

Language has its own magic. The linguist J. L. Austin's[26] work on performative utterances suggests that language is not primarily a formal system of representation, but a means of making things happen. In How to Do Things with Words he proposes that many (if not all) statements are 'performative'. They are not simply descriptions of things in the world, but events with some power. Language operates first as force, and second as meaning.

The clearest examples of performatives are in formal situations, such as when a judge in a courtroom passes sentence: 'I sentence you to five years hard labour'; or when a dignitary names a new ship: 'I name this ship "The Titanic"'.

However, these are only special cases of a more general principle: all utterances (and this includes statements that are written or invoked) have a performative force. A statement such as 'I'll meet you at one o'clock tomorrow afternoon on the Town Hall steps' also does something.

Austin proposes three dimensions to performative utterances: the locutionary, the illocutionary and the perlocutionary.[27] The locutionary dimension refers to the actual uttering of the sounds, words and grammatical statements that prescribe the place and a time for the meeting. The illocutionary dimension of that statement refers to the promise or commitment that has been made - what is brought about in making the statement (I've committed to being on the town hall steps, or in the other examples, a sentence is passed and a ship is named). Finally, the perlocutionary dimension is the outcome in the context in which the speech act is performed. If we end up actually meeting, then that is the perlocutionary dimension. However, the perlocutionary can be seen as something virtual that inheres in the event of the statement itself. Whether or not we ever meet, the statement is inherently an intervention that serves to increase the probability that we will. Austin's approach calls into question the dominant assumption that language represents things in the world. His work has been taken up by many writers, including Deleuze and Guattari in developing the concept of the 'order word',[28] and Winograd and Flores[29] in talking specifically about computers.

I have found it useful to extend Austin's schema to propose another dimension specific to invocational media: the invocationary act. I would perform such an act if I were to make the same arrangement to meet tomorrow by sending you an email. In this case, all three of Austin's dimensions would still be involved, but this additional form of language/technology act would also be incorporated. If the email is to get to you, I need to enter your email address into the To: field in my email client software. If anything is out of place, the message will bounce. Like many avocations, email addresses are a hybrid of human and machine readable conventions. They serve as markers of personal name, institutional and national affiliation, and, at the same time, they are part of the instructions that assure the successful delivery of the message.
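The dual readability of the address can be sketched with a toy parser. The check below is illustrative only - a few lines, nothing like a full parse of the real addressing standard - but it shows how the same string is at once a social marker and a machine instruction, and how anything out of place makes the invocation fail:

```python
# A sketch of the machine-readable side of an email address. The
# address and the check are illustrative, not a real validator.

def parse_address(address):
    local, sep, domain = address.partition("@")
    # If anything is out of place, the message bounces.
    if not (sep and local and "." in domain):
        raise ValueError(f"undeliverable: {address!r}")
    # The same string carries social meaning: a personal name and
    # an institutional (and sometimes national) affiliation.
    return {"name": local, "affiliation": domain}

parse_address("c.chesher@example.edu.au")
```

For the machine, the parse is the whole of the address's meaning; for the correspondents, it is the least of it.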

The email address is both human readable, and machine readable. The act of sending is an invocationary act that has both social significance, and technical efficacy. By clicking the send button I both make a commitment and set in train a network process. But just as Austin ended up concluding that language almost always functions with some performative force, I've come to think that everything that users do with invocational devices involves performing invocationary acts: queries, saves, data entries, slick moves, double clicks and so on. Many of these involve calling on avocations owned by others, or even coalesce to become entire vocations: desktop publishing, or webmastery.

The final theme I want to examine briefly is third order invocations. At the level of the concept, any invocationary act draws upon a collection of cultural assumptions, discourses, ideologies or symbolic and material resources. Electronic mail draws on the cultural knowledges associated with the postal system. It invokes mail boxes, messages, addresses, addressees. It even invokes the business convention of sending carbon copies of messages to others with an interest in the communication who are not actually the addressee. These are often considered 'metaphors', where something stands for something else. But metaphor is not quite right. Electronic mail is a translation, or a remediation, of the cultural practice of mailing. It invokes traditional mail. It is a third order invocation.

Third order invocations have often caused great philosophical consternation: artificial intelligence, virtual reality, artificial life. These are far less mysterious if conceived as invoked behaviour, invoked spatiality and invoked evolution.

Now that we can put the digital computer to rest, it is easier to challenge the transparency of third order invocations. The question is no longer whether we have a true or accurate representation of something in the world. Rather, it invites other questions: What has been invoked? Who is invoked, and in what ways? Who is invoking, and why? Who designed the avocations? What users does this produce?

So the digital computer is dead. I have hopefully shown that invocational media are characterised not by digitality nor computation, but by calling things up. The first order of invocation is the fetch-execute cycle. By putting command and memory into the same circuit, the invocatory device becomes invocational. The second order of the invocation is the invocationary act. Users compose invocations to do things, but in doing so depend upon avocations and invocable domains that pre-exist that event. Finally, third order invocations are the concepts invoked to hold together invocational platforms.

The concept of invocational media is itself an intervention in the thought and discourse around new media. It thinks differently about new media technologies. Rather than stopping at critique, invocational media offers a new vocabulary for talking about the web, databases, AI, VR and so on. It reaffirms cultural continuities over Modernist exceptionalism, but at the same time draws attention to the significant ways in which new media differ from other media forms.


[1] Derrida, J., Of Grammatology, Trans. Gayatri Chakravorty Spivak, Baltimore: Johns Hopkins University Press, 1976.

[2] Peirce, C. S., The Collected Papers of C. S. Peirce, vols. 1-6, ed. Charles Hartshorne and Paul Weiss; vols. 7-8, ed. A. W. Burks, Cambridge: Harvard University Press, 1931-58.

[3] 'Word of the day for Sunday the 13th of December 2000', accessed August 2001.

[4] Latour, B., We Have Never Been Modern, Cambridge: Harvard University Press, 1993, 12.

[5] Winner, L., Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought, Cambridge: MIT Press, 1977.

[6] Kuhn, T. S., The Structure of Scientific Revolutions, Chicago and London: University of Chicago Press, 1996 [1962].

[7] See Chesher, C., "Colonising virtual reality", Cultronix, Pittsburgh, PA, 1993.

[8] Kurzweil, R., The Age of Spiritual Machines: When Computers Exceed Human Intelligence, New York: Penguin USA, 2000.

[9] Broderick, D., The Spike: How Our Lives Are Being Transformed by Rapidly Advancing Technologies, Kew: Reed, 1997.

[10] Davis, E., "Techgnosis: magic, memory and the angels of information" in M. Dery, Flame Wars: The Discourse of Cyberculture, South Atlantic Quarterly, Fall 93, Durham: Duke University Press, 1993, 584-616, and Davis, E., Techgnosis. Myth, Magic and Mysticism in the Age of Information, New York: Harmony Books, 1998.

[11] Wertheim, M., The Pearly Gates of Cyberspace, Sydney: Doubleday, 1999.

[12] Noble, D. F., The Religion of Technology: The Divinity of Man and the Spirit of Invention, New York: Knopf, 1997.

[13] Stivers, R., Technology as Magic. The Triumph of the Irrational, New York: Continuum, 1999.

[14] Taussig, M., Mimesis and Alterity. A Particular History of the Senses, London and New York: Routledge, 1993.

[15] Freud, S., "Civilisation and its Discontents" in P. Gay (ed.), The Freud Reader, London: Random House, 1995 [1930], 722-772.

[16] Mauss, M., A General Theory of Magic, London and New York: Routledge Classics, 2001 [1950].

[17] Frazer, J. G., The Golden Bough: A Study in Magic and Religion, London: Macmillan, 1960 [1922].

[18] 'Invocational media' is a collective term that encompasses all devices which incorporate the 'Von Neumann machine' circuitry or microprocessors. This includes embedded processors, network devices and so on. An actual computer that someone uses is a 'general purpose electronic invocator'.

[19] Haraway, D., "A Cyborg Manifesto" in Haraway, Simians, Cyborgs and Women: The Reinvention of Nature, New York: Routledge, 1991, 149-181.

[20] Bulfinch, T., The Golden Age of Myth and Legend, Ware, Hertfordshire: Wordsworth Editions, 1993.

[21] Deleuze, G., Cinema 1: The Movement Image, Minneapolis: University of Minnesota Press, 1986, 61.

[22] Heidegger, M., "The Question Concerning Technology", in The Question Concerning Technology and Other Essays, New York: Harper, 1977, 3-35. See also C. Chesher, "The Ontology of Digital Domains" in Holmes, D., Virtual Politics, London: Sage, 1997, 79-92.

[23] McLuhan, M., Understanding Media. The Extensions of Man, London and New York: Ark, 1964, 8.

[24] Weber, M., "Politics as a Vocation" in Gerth, H. H. and Mills, C. W. (eds), From Max Weber, New York: Oxford University Press, 1974 [1946], 77-128.

[25] Vygotsky, L. S., Thought and Language. Cambridge, MA: MIT Press, 1962.

[26] Austin, J. L., How to Do Things with Words, Oxford: Clarendon Press, 1975.

[27] Ibid., 94-120.

[28] Deleuze, G. and F. Guattari, A Thousand Plateaus, Minneapolis: University of Minnesota Press, 1987, 75-110.

[29] Winograd, T. and F. Flores, Understanding Computers and Cognition, Norwood, New Jersey: Ablex Publishing, 1986, 58-60.

Dr. Chris Chesher teaches in the School of Media and Communications at the University of New South Wales in Sydney, Australia. He is a facilitator of the Australian new media culture mailing list 'fibreculture'.
© CTheory. All Rights Reserved