rhizomes.06 spring 2003

Louis Armand

"Modernity today is not in the hands of the poets, but in the hands of the cops" -- Louis Aragon

[1] The mark, within the title given here, of substitutability, equivalence, variability, alternation, denomination or parataxis. It is not a matter of a simple assertion, symbolic or encoded by means of a graphic symbol, of identity or identicality, even as these terms or concepts (identity or identicality) are implied or implicated in or by the form of such an algorithm. The identity or identicality of certain forms of inscription, of "encryption," is nonetheless what is under contest in the various discussion pieces assembled here, for better or for worse, under the title CODEWORK / SURVEILLANCE.

[2] In his Introduction to the October 2001 "CODEWORK" issue of the American Book Review (22.6), Alan Sondheim cites the term "codework" as a metaphor for a type of idealised mode of writing in which the terminological and formal aspects of computer programming would assume an aesthetic or ornamental function which at the same time would be productive of aesthetic and semantic disturbance within the "field of signification." Codework, in this sense, is posed as a type of writing which de-textualises while at the same time exposing the material of textuality vis-à-vis the "programme": i.e. "works using the syntactic interplay of surface language, with reference to computer language and engagement" and "works in which the submerged code has modified the surface language -- with the possible representation of the code as well." This in many respects resembles the material considerations of contemporary textual genetics with its formal analysis of "avant-texts." The programme, as what comes in advance of writing, denotes that which, consequently, withdraws from writing, just as the avant-text denotes that which withdraws from the text at the same time as "haunting" it in the form of a geno-trace, a signature code, a spectre.

[3] With the mechanisation of codes and cryptography leading up to the second world war, in the form of the German Enigma encryption machine and the British Colossus (which was designed to break it), such traces -- like the data traces that haunt contemporary computing -- manifested themselves not merely as residual echoes within the machine, but as effects of transmission and therefore subject, in themselves, to a particular form of surveillance. From its advent, modern codework has always been concerned, not with "codes" per se, but with the spectrality of code and the haunting of the "pro-gramme."

[4] In "CODEWORLD," Sondheim discusses "codework, code writing" in terms of "the alterity of a substrate which supports, generates and behaves as a catalyst in relation to its production." The relation of codework to computer programming operates on an analogy, here, of the computer as auto-poietic machine -- a concept first developed by the British mathematician Alan Turing in the design of the prototype ACE (Automatic Computing Engine) in 1945. Following from his earlier work on the Universal Turing Machine in On Computable Numbers (1936), Turing developed a concept of linking data to sets of instructions, in such a way as to modify the existing instructions coded within the computer (something which today is taken entirely for granted). Hence the basis of the development of Artificial Intelligence and concepts of interactivity (such as interactive genetic algorithms).
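Turing's linking of data to instructions -- the stored-program principle -- can be sketched in miniature. The following is purely illustrative (the opcodes, memory layout and names are invented for this note, not Turing's ACE design): a machine whose instructions inhabit the same store as its data, so that one instruction may rewrite another before it executes.

```python
def run(memory):
    """Interpret a tiny instruction set held in the same list as the data.

    Opcodes (invented for this sketch):
      ("add", a, b, dst)   memory[dst] = memory[a] + memory[b]
      ("store", val, dst)  memory[dst] = val -- dst may itself hold an instruction
      ("halt",)            stop and return the store
    """
    pc = 0  # programme counter
    while True:
        op = memory[pc]
        if op[0] == "halt":
            return memory
        elif op[0] == "add":
            _, a, b, dst = op
            memory[dst] = memory[a] + memory[b]
        elif op[0] == "store":
            _, val, dst = op
            memory[dst] = val
        pc += 1

# Cell 1 initially holds one "add", but the instruction in cell 0
# overwrites it before it runs: the programme modifies its own code.
prog = [
    ("store", ("add", 4, 4, 5), 1),  # rewrite the instruction in cell 1
    ("add", 3, 4, 5),                # replaced before it ever executes
    ("halt",),
    10, 7, 0,                        # data cells 3, 4, 5
]
```

Run on this store, the machine executes the rewritten instruction (cell 5 receives 7 + 7, not 10 + 7): the "existing instructions coded within the computer" have been modified by the computation itself.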

[5] The mathematical means by which Turing conceived of what was ultimately the foundational logic of modern computing derives from the proofs of the Austrian mathematician Kurt Gödel, in response to Hilbert's positivistic claims (coded as three questions) regarding, in the first two instances, the completeness and consistency of mathematics (Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme [1931]). Gödel's method was to treat mathematical statements as integers -- in other words, to take advantage of the fact that mathematical propositions assume no external meaning beyond abstract symbolisation -- thereby treating "theorems" in the same manner as "numbers." It was in a similar spirit that Turing's On Computable Numbers responded to Hilbert's third question regarding decidability (the Entscheidungsproblem), and that Turing chose to treat computing "instructions" as though they were "numbers." This mode of transversality (disposing of Russell's insistence upon a rigid hypotaxis of classes and types) was not a mere theoretical posture but a practical demonstration of the fact that arithmetic processes in computing are merely an imitation -- there are no numbers in the machine, only electrical impulses: an input representing a numerical value and an output representing a function of that value. This point is something which Wittgenstein (and Sondheim) failed to grasp -- Wittgenstein insisting in his discussions with Turing that mathematical propositions must have a meaning (just as Sondheim insists that "codework is not an instance ... of mathematical Platonism or Gödelian-Platonism; if anything it relies on the breakdown of the ideal, pointing out the meaning-component of computation, programme, protocol, even the strictest formalisms" -- which appears to belie a serious confusion between the materialism of both Gödel's and Turing's approach to number, and the Platonistic insistence upon symbolic meaning in the relation of form to idea).
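Gödel's device of treating "theorems" in the same manner as "numbers" can be exhibited in a toy form (the alphabet and symbol codes below are invented for illustration; Gödel's own numbering is over the formulas of Principia Mathematica): each symbol receives a code, and a formula becomes the product of successive primes raised to those codes, so that statements about formulas become statements about integers.

```python
SYMBOLS = {"0": 1, "s": 2, "=": 3, "+": 4, "(": 5, ")": 6}  # arbitrary codes

def primes(n):
    """Return the first n primes by trial division."""
    found, candidate = [], 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_number(formula):
    """Encode a symbol string, e.g. "s0=0" -> 2**2 * 3**1 * 5**3 * 7**1."""
    n = 1
    for p, sym in zip(primes(len(formula)), formula):
        n *= p ** SYMBOLS[sym]
    return n

def decode(n):
    """Recover the formula by factoring out successive primes."""
    inverse = {v: k for k, v in SYMBOLS.items()}
    out = []
    for p in primes(len(str(n)) * 4):  # more primes than the encoding can use
        if n == 1:
            break
        exp = 0
        while n % p == 0:
            n //= p
            exp += 1
        out.append(inverse[exp])
    return "".join(out)
```

Because the encoding is uniquely invertible (by the fundamental theorem of arithmetic), arithmetic relations among such integers mirror syntactic relations among formulas -- precisely the sense in which propositions "assume no external meaning beyond abstract symbolisation."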

[6] "The danger of codework," Sondheim argues, "is in its delimitation ... Sometimes it breaks free, relates to the subjectivity behind its production, the subjectivity inherent in every presentation of the symbol-symbolic." If this appears to point towards some type of ghost in the machine, or a return to Cartesianism (Descartes, as Georges Canguilhem has pointed out, was very fond of machine models as an analogy of both the human body and the faculty of reason, much as Newton was), it may similarly be taken as a tentative claim upon the Lacanian notion of subjectivity with relation to what Lacan terms the Symbolic Order. In which case it is worth noting that Lacan sets out at the very beginning of his foundational lecture on this subject, "Le Stade du miroir comme formateur," to disavow any relation to Cartesian subjectivity as such. Elsewhere, Lacan most explicitly relates subjectivity to a materialist theory of consciousness -- perhaps most notoriously articulated in the work on Borromean knots (under the influence of the French mathematicians René Thom and Pierre Soury).

[7] If modern computing provides an expression of such a materialist conception of consciousness, it is not so much in the various forms of so-called artificial intelligence which have become increasingly pervasive in modern life, but in the conception and development of such things as the quantum computer. Using technology analogous to teleportation, the quantum computer functions by effecting an absolute equivalence between separate molecular states -- ostensibly, at present, for the purpose of factoring extremely large numbers in cryptanalysis (but with obvious applications to genome sequencing, information theory, astronomy, etc.). This is perhaps the logical development of ideas developed as early as 1930, with Vannevar Bush's "differential analyser" and later Turing's binary "scanner" of 1936, and with the understanding of semantic compression and redundancy advanced by Claude Shannon during the same period at Bell Laboratories (ideas which combined in the wartime project at Bletchley Park and which immediately led to the practical conception of an "electronic digital computer" or mechanical "brain").
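Shannon's notion of redundancy can be stated concretely (a minimal sketch; the function names are ours): the entropy of a source is its average information per symbol, and redundancy is the fraction of the maximum possible entropy which the source's statistical regularities leave unused.

```python
import math
from collections import Counter

def entropy_bits(text):
    """Shannon entropy per symbol: H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def redundancy(text):
    """1 - H/H_max: the share of maximum entropy the source does not use."""
    h_max = math.log2(len(set(text)))
    return 1.0 - entropy_bits(text) / h_max if h_max else 1.0

# A maximally "surprising" source has zero redundancy; ordinary prose, with
# its repetitions, has a great deal -- which is what makes both compression
# and cryptanalysis possible.
```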

[8] In "ETYMOLOGY OF CODE: DISTANCE, JUXTAPOSITION, AND FICTIVE SPACE," Tom Mackey looks into the "relationship between speed, movement, and our experience with the image mediated by technology." For Mackey, the World Wide Web -- invented in 1990 at the CERN research laboratories by Tim Berners-Lee -- functions as a "collaborative fictive space that forces us to question the nature of representation and simulation in the visual, verbal and text-based narratives we construct, encounter by chance, and connect through spontaneous threads and purposeful searches." As Mackey points out, this complex of relations is not fortuitous or mysterious, but the outcome of a genealogy beginning with electro-mechanisation and industrial forms of communication, beginning with Morse who "introduced a new way to experience information mediated through a technology of code, data and speed." In short, Morse's invention "enabled the transmission of information via 'code' that abstracted language to a binary of component parts" and effected a digital revolution in the process.
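Morse's abstraction of language "to a binary of component parts" is easily exhibited (the fragment of the International Morse alphabet below is illustrative, and the helper name is ours): every letter reduces to a sequence of just two elements, dot and dash.

```python
MORSE = {  # a small excerpt of the International Morse alphabet
    "C": "-.-.", "D": "-..", "E": ".", "O": "---",
    "R": ".-.", "S": "...", "W": ".--",
}

def to_morse(word):
    """Transliterate a word into dot-dash groups, one group per letter."""
    return " ".join(MORSE[ch] for ch in word.upper())
```

Language is thereby re-encoded, component by component, for electro-mechanical transmission -- the "technology of code, data and speed" Mackey describes.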

[9] In Ondrej Galuska's "EPICURIAN CODES," atomic simulacra (the point, or bit) stand in a particular relation to the "subject" as hypokeimenon -- as the zero of dimension which nevertheless structures and organises the various syncopated pulses of information (simulacra) which populate the world in a type of predetermination of Gibsonian "cyberspace." With reference to Leibnizian monadology and possible worlds theory, Galuska examines the relationship in philosophy and aesthetics between concepts of possibility and ideality, as well as the relationship of optical codes (and retinality) to a materialist understanding of cognition (and in this sense points towards Buckminster Fuller's Synergetics: Explorations in the Geometry of Thinking, and the law of geodetic motion in Einstein's General Relativity (which contradicts Newton's axiom that "a body subject to no external force will move in a straight line with constant speed")).

[10] In the essays by Darren Tofts and McKenzie Wark, the "point-to-point" formulation of vectoral media, data sampling, and the implications of dislocational surveillance, advances analogies to the topographies of industrial and urban surveillance (that bizarre virtual architecture of security monitors) -- encoded here in the relationship of "critical speculation" to censorship, copyright, intellectual and cultural property, and what Guy Debord termed détournement (or, from the point of view of property, "plagiarism"). In OPAQUE MELODIES THAT WOULD BUG MOST PEOPLE, Tofts looks at the "perverse locutions of dislocation" in the work of John Zorn (collage, montage, etc.) and the discursive structures of hypermedia and what Nicholas Zurbrugg terms the "postmodern avantgarde." For Tofts, the experimentality of the avantgarde is intimately tied to technologisation, echoing not so much Benjamin (Das Kunstwerk im Zeitalter seiner technischen Reproduzierbarkeit) as Heidegger (Die Frage nach der Technik) -- in that "technology is a challenging" or Herausfordern: "we now name the challenging claim that gathers man with a view to ordering the self-revealing as standing reserve: Ge-stell," or enframing -- which in more straightforward terms is what defines the relationship between technics and poetics, between the mechanism and possibility of "experimentation."

[11] Zoe Beloff, in her installation "THE INFLUENCING MACHINE OF MISS NATALIA A," explores a different aspect of dislocation and "montage," based upon the psychotopographies of virtual space. Beloff's "influencing machine" refers to a publication by the pioneer psychoanalyst Viktor Tausk, a student of Freud and the earliest exponent of psychoanalytical concepts with regard to clinical psychosis and the personality of the artist (subjects which Freud addressed only by way of literary records, rather than clinical practice). In 1919, after a complicated ménage à trois with Freud and Lou Andreas-Salomé, Tausk committed suicide -- a fact that one of his patients, a Miss Natalia A, attributed to the manipulations of the "influencing machine." In Beloff's installation, the mechanistics of influence are examined through a simulation of the position of the clinical subject, by means of a détournement in the structure of "analysis." The physical installation which constitutes Beloff's "machine" requires the "viewer" to actively participate in the construction of a virtual environment, by being situated within a 3D optical space (whose architecture derives from superimposed 2D linear mappings) organised around a glass screen which acts as a video projection screen. Sensors analyse the viewer's physical interaction with the virtual space, causing various digital video projections to appear on the central screen, accompanied by voices, music, or electronic noise.

[12] What is most interesting about Beloff's installation (recently exhibited at the ZKM Zentrum für Kunst und Medientechnologie, Karlsruhe, as part of the "Future Cinema" programme) is not so much the function of optical synthesis, but of "analysis." In an apparent dislocation of conventional symptomatology or neurotic coding, the "subject" uses his or her body (by means of a pointer or index extension) to "analyse" a virtual or "hallucinated" space, in order to effect the projection of video images upon the real screen. A counterpart of this analysis is revealed when the "subject" removes the stereoscopic glasses and withdraws from the hallucination, under the temporary impression that in dissolving the 3D image the video projection would also be resolved into a series of flat, overlapping outlines, joined only at an illusionistic vanishing point. The "hallucination" persists beyond the disintegration of the visual field, in the reality of the screen as the symbolisation of an optical prosthesis which allows the subject to "see." In this way Beloff's apparatus demonstrates the affective relationship not only between seeing and the gaze (in post-Freudian theory), but the analogous relationship between optical synthesis and analysis, and the structure of the symptom-as-rebus in psychoanalytic practice.

[13] Alan Roughley, in his essay "TEXTUAL SURVEILLANCE: THE DOUBLE EYES (AND I'S) OF GEORGES BATAILLE'S STORY OF THE EYE," links surveillance to the function of censorship and the exercise of power with regard to symbolic meaning (and the curious authorial pseudonym Lord Auch, or Lord Eye). Following from Foucault, Roughley argues that "classical, socio-theological surveillance of discourse," intended to "censor language and purge it of any uncoded naming of sexuality," was founded upon an attempt at expurgation. Treating Bataille's Histoire de l'œil, Roughley states: "The concept of the prison need not concern us here, unless we conceive of the watch kept by the self over the other as a form of imprisonment: the self regarding, keeping guard over, the other." For Roughley, imprisonment functions rather as a trope along the lines of Lacan's stade du miroir, in which the specular subject is determined as "subjection" to the signifier, vis-à-vis a discours de l'Autre (what Roughley terms "discursive surveillance," extending Lacan's ocular metaphor of the regard or gaze).

[14] Codework, then, if not yet or not quite merely one more term in the genealogy of recent academico-critical paradigms, here at least comes to resemble one (is such a destiny today unavoidable?), if for no other reason than by entitling (as it were) a project of anthologisation -- however seemingly haphazard -- of a disparate body of writing on, about or as "codework." And, thereby, authorising it according to the established code of scholarly presentation and of the edited volume in particular (that peculiar genre which, in authorising, establishes a claim upon authority which it retrospectively asserts in a manner which can equally be described as authoritarian). Even if one were to view this in terms of the simulation of authority. An editorial or authorial posture (the one usurping the position of the other according to some sort of internecine oedipal code which is conspicuous in not being brought out into the open -- as a "subject" of scrutiny -- even as its venue constitutes the public domain, while nevertheless entailing, in each of its aspects, the mechanics and indeed ideology of surveillance and of secrecy) which itself is merely the banal expression of a founding complicity within the academy to a form of work or labour code. And this despite (or not, of course) the constant deferral of current discourse to "radical" paradigms as the simulation of a critique (what is in fact a mask of conservatism and the creeping nostalgia of scholastic avantgarde-ism) -- such as the various codices of "postmodernity" which have, since its earliest evocation, been tied to latent forms of neo-conservatism (from Charles Olson to the present).

[15] Not only does this implicate "criticism" in the authoritarian aspects of academic conduct (in whichever guise they appear -- the prohibition of "irresponsible" modes of thought or praxis), but also the synonymous use of the term ethics (as in, ethical codes of conduct) and the various taxonomies of "responsibility" encoded in academic and other institutions which employ this term as a justification and even a means of effecting intellectual surveillance. The institutional subject held to account by virtue of, and by means of, its "instruments." As one says, the "instruments of reason" -- the innocuous and yet "malevolent" techne not only of a "despotic rationalism" (and its historical consequences) but also of the means of its critique (and their increasingly a-historical consequences). (It is not for nothing that Foucault linked Benthamite architectonics to the very faculty of reason itself and to its formal expression in the charter of the renaissance "university.")

[16] In his 1998 essay on Blanchot, Demeure, Derrida poses the question of the possibility of attestation, of "who keeps witness for the witness" (Blanchot), and consequently of "what attests to the absence of attestation." These questions raise the spectre of a fundamental dilemma of ethics, one elsewhere expressed in the "figure" of Bentham's panopticon and Nietzsche's Götzendämmerung, but above all in the "discovery" of the Freudian unconscious and the structure of attestation in the post-Cartesian subject, vis-à-vis what Lacan, following Sartre, came to term the regard or gaze (of the Other) -- the optical metaphor extending the idea of a signifying materiality in the mechanics of subjection or subjectification.

[17] The optical code and its determinations of accountability and attestation according to the assumed stratifications & discontinuities of power (value structures), has exerted a fatal fascination over "western" thought since at least Platonism and its resurgence following the invention of Roman Christianity -- as evinced in the various apparatuses of moral and political enforcement, from the spectacle of the Inquisition to its succession in the Stalinist show trials and McCarthyist purges of the 1930s and 1950s. The Judeo-Christian notion of arbitrary judgement is likewise reflected in the variable relation of value terms within or subject to such apparatuses, whose codes have, from the outset, been confused with the structural rigidity of a mechanism which nonetheless renders all dualistic relations (such as good and evil) contingent upon the right of power and its attendant discourses. The 1971 Stanford Prison Experiment stands as one of a number of (controlled) illustrations of such ethical contingency upon the determinations of power vis-à-vis human behaviour. Difficulties arise in the subsequent attempts to create analogies between controlled experimentation of this kind and "real life" scenarios (whose reality principle is nevertheless what is precisely at stake).

[18] These difficulties do not so much arise from analogical thought, but from a resistance to the idea that responsibility in the face of such scenarios, per se, is generalisable. That is to say, what facilitates unethical behaviour on the one hand, sustaining the very idea of ethics on the other, is indeed inherent to the structure of collective and individual subjectivity. It is no good attempting to distinguish qualitatively such events or institutions as the Rwandan genocide, Stalin's gulags, the Japanese POW camps, the Nazi holocaust or the US concentration camps in the Philippines from 1898-1946, from other less extravagant or less exotic episodes of coded violence, aggressive supervision, censorship, molestation or intimidation. For here it is often more a question of different emotional intensities summoned forth by events in their specificity, which blinds one to a more general culture of complicity and collaboration -- one which makes such events possible in the first place and assists in sustaining the lie of disinvolvement. An acquiescent code of silence effects itself not only in the behaviour of individuals and communities, but in the very nature of subjectivity and its relation to the concept of sovereignty (the codes which bind/blind demos to the concept of arche within the framework of democratic statehood).

[19] The counter-logic of disavowal and (the absence of) attestation operates, beyond the psychopathology hinted at here, in the alienating influences of what Guy Debord terms the "spectacle" -- a condition between the Lacanian stade du miroir and Marx's conception of Entäußerung in the relation of the subject (labour) and its objects (commodity) as outlined in the Ökonomisch-philosophische Manuskripte. The spectacle, for Debord, is the incrementalised return or détournement of the alienation effect which both negates and paradoxically sustains the subject as a labour commodity -- a form, we might say, of code-work. But rather than describing an instrumental relation to an unapparent truth, it marks the dispossession of truth or rather the dispossessing effect of its discourses (dispossessed of the possibility of truth) which become a form of noise or counter-work (Heideggerian idle talk or Gerede).

[20] Code in this sense is a principle of redundancy tied to entropy and death (cryptography -- as in the investigations of Tom McCarthy and the INS). In the mechanisation of labour, as equally in the mechanisation of consumption, the subject becomes a figure of redundancy within the socio-political algorithm. Intersubjectivity cedes to "interpassivity" -- a replacement of so-called "direct experience" by way of mechanical or technical prostheses. The subject's coding of self by means of the imaginary implies a psycho-optical incrementation toward a "utopianism" of pure spectacle. That is, an imaginary "Kapital" upon which the subject stakes its claim to identity in the form of an eternal deferral of the same (just as with the pornotopia of the World Wide Web and its virtual infinity of rhizomatic mirror-sites, meta-engines and topographies of hidden user surveillance, viral emplacement, data theft, information junking, etc.). Code in its relations to the secret (object of ideology?), but also to propriety, the law, and power in all of its manifestations (the law here being a secret publicly divined by those sanctioned or thereby empowered under the code), can be seen as operating, paradoxically, at the edges of the law, of the knowable and of the secret, between public and private, of what is openly transmitted and what is occulted, expurgated, hidden, withdrawn.

[21] The Situationist preoccupation with psychogeographies -- a topographics of the virtual -- returns upon the present spectrality of global events as a topographics of surveillance which is no longer merely the assembled fragments of CCTV images spread across a bank of flickering security monitors, but the data traces which inhere in the subject in the form of an optical blind. That is, as the zero point of perceptibility which ties the subject to the necessary condition of its being perceived. Which is also to say, the zero point of equivalence or adaequation in the being of the subject as the point of interface between the so-called imaginary and the real, that is as their dual symbolic counterpart and point of mutual deferral. The radical transversality of hypermedia, for example, which places remote events in time and space in direct "communication," transforms the apparent totality of surveillance (as panopticon) into a discursus of global events. The counter-intuitive delineations of such geodesic structures (division, taxonomy, genre) open up lines of communication, interfaces, the clinamen of a topological time-space whose geometry is not merely resemblant of a semiotic "surface of sense," but is in fact underwritten by a radicalised parataxis in which macro-medio-micro events operate along an axis of dynamic contiguity. As Buckminster Fuller puts it: "In the inherently endless scenario model of Einstein's Universe, truth is ever approaching a catalogue [or codex] of alternate transformative options of ever more inclusive and refining degrees."

Louis Armand
Prague, May, 2003