HYPER-LEX: A Technographical Dictionary


Paul Harris





The spirit or at least pervasive desire of our age revolves around a sort of transparency: a desire to project ourselves as a surface of permeable traces, to exfoliate, let the inside become the outside, to become fully visible like the meat and bones of a Cronenberg character, while remaining invisible like the little hacker ghost (Turing's Demon?) that tracks text in the Random Access Memory banks of the machine onto whose screen we splash words. In large part, the attractive force that transparency exerts is an effect of media culture; simultaneously, however, transparency marks a limit of im-mediacy - an unmediated, collapsed sensation where we can see the neurophysiology of our brains or the shapes of and linkages among our words.
This is an immediacy of the sensory that never shades into the tactile - it is rather the immediacy of sensing the medium itself, of clicking tracks around the computer screen or dredging up hidden treasures on the Netscape of our lives.

This transparency is embodied - enacted in a disembodying way - most clearly in the VR-user. Jacked into an interface that both joins and separates mind and body, the VR-user gets to enter a proprioceptive universe both contingent on their movements and exhilaratingly alien. The body takes on a prosthetic virtual life of its own, a life then seen from a disembodied spectator's viewpoint; and from this situated spot, the projected body image reveals the visual inside of embodiedness, of living biological infrastructure. From the outside, the VR-user lives an uncanny relation to the body-image; it becomes a double, but with the further twist that this doubling relation itself finds objectified form. The VR-user doesn't so much experience the body as other as experience the very process of othering. Wim Wenders gives an imaginative twist to this mode of transparency in Until the End of the World, where a technology that records the neurophysiological impression of seeing, used by one character to let his blind mother see her children and the world, ends up being used to record and then watch one's own dreams. Characters quickly become walking zombies, narcissistically ensnared in the mirror of their dreams, addicted to the content of the unconscious encoded as visual information.

In order to develop a critical ecology of the culture of transparency, we may turn to Gregory Bateson's writings. At the end of his life, Bateson posed a question whose answer is now beginning to take its amorphous shape: "Onto what surface shall a theory of aesthetics and consciousness be mapped?" The answer on the cutting edge, the leading surface at this juncture, would appear to be the network. While the key terms of Bateson's question may seem nearly nostalgic (aesthetics as nostalgia for high art, consciousness as nostalgia for "presence"), there is a distinctly aesthetic pleasure apparent in the joy that network users find in their work. The aesthetics of the network are crystallized in the features of hypertext - the network as a set of links among lexia, or textual units. The network ultimately provides an image of consciousness, because consciousness is now perceived as essentially digital in nature, as a flow of discontinuous signals occurring in daunting numbers. But "network" is a promiscuous and ubiquitous term, serving many functions in describing our modes of conduct and perception of the world: network serves as structural design principle, modus operandi, technological environment and constraint, textual space and psychological model all in one. We think of social relations as networks, as we do television corporations and business ties; other more mundane, literal coinages also persist in daily parlance.

But once transparency appears on the screen or the net, once transparency is projected onto a network, it becomes curiously opaque. If we become transparent to ourselves in some warped Baudrillardian sense, then it quickly becomes apparent that we are nodes in the network, that the network as such will remain an unknowable system - an invisible territory, maps of which we continually redraw by surfing the net, but a territory which will remain several dimensions beyond our ken. The opacity of the network persists in more visceral ways than its merely implicit invisibility - it comes home to us when we experience the alternating ecstasy and frustration of reading hypertexts, writing with new software, or exploring the web. The limitations of speed, the crashing of lines or programs, or just the cutting edge that a skipping CD sears through the eardrums - all point to an irreducible bluntness, a resistance of the medium, that we like to overlook when our jack-in glasses fit snugly or when we theorize transparency in sweeping terms.

Technophiles will be quick to point out, though, that these sorts of ups and downs are not in the network or tool at hand, but are a function of how we experience networks. This mode of experience is often described with metaphors that attempt to identify network-surfing with writing or graphic practice, but it might better be called the scene of clicking. And users could be thought of as bit-players, in that they play at manipulating bits of information, feeling free while in fact remaining tiny nodes whose choices are prefabricated. The bit-player, in essence a signal skimming along over the surface of the net or the screen-text, is impatient with impediments. The bit-player prefers transcription to translation, wants transactions to be engineered by software with good genes, and likes well-engineered round, red tomatoes devoid of pocks and orange spots.

The topography of the network remains rather flat, though, as if it were a set of switches and junctures along a homogenized, discrete track - the information superhighway seen by clicking train. Missing from this map of the digital are the peaks and valleys, the external image of our ups and downs, the sort of graphics found in representations of digital soundscapes or three-dimensional maps of distributions of things. If we think of the imaginative space of the network as a map onto which aesthetics and consciousness are projected, it would appear that language fans out onto flat surfaces; it is deployed in chains of metonymic units, and becomes a sequence of transcriptions that the user juxtaposes and skims, if not simply skips.

We witness the text receding into the medium that houses it, a new rendition of the well-worn medium=message riff. In a hypertext, for instance, the links are the structural logic of the virtual whole text; but the "links" do not link anything together in any substantive way - they simply mark a transparent passage, an edge of difference that one passes through frictionlessly. Ostensibly, a "link" could point to a relation between two lexia, but the relation is usually either a literal, referential one (i.e., for more info on x, link to y) or a jump from one character-situation to a different scene, and the relation must be reconstructed after the fact. This operation, while it supposedly ensures the text's flexibility in the hands of a commanding reader, actually lays out in advance all the textual loops for the bit-player to click through.
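To make the point concrete, here is a toy sketch in Python (the names Lexia and Link are mine, not those of any actual hypertext system): the link object joins two lexia but carries no relation of its own, so whatever relation holds between them must be reconstructed by the reader after the jump.

from dataclasses import dataclass

@dataclass
class Lexia:
    # A textual unit: what the reader actually sees on a given screen.
    title: str
    text: str

@dataclass
class Link:
    # A bare passage from one lexia to another. Note what is missing:
    # no field names the kind of relation the jump implies - that
    # reconstruction is left entirely to the reader.
    source: Lexia
    target: Lexia

# All the loops the bit-player can click through are laid out in advance:
lexia = [Lexia("x", "for more info on x..."), Lexia("y", "the entry on y...")]
links = [Link(lexia[0], lexia[1])]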

To fabricate a critical ecology in the context of hypertext writing, one seeks to maintain a certain duplicitous relation to the medium. On the one hand, one manufactures the critical ecology according to some of the rules of the hypertext game. (It is especially necessary to simulate the medium when the text is appearing on the Net, of course.) On the other hand, the critical ecology comes designed with an infrastructure that enables it to play itself out of some of the hypertext game's constrictions. The basic idea is to create a "technographical dictionary," to begin generating a working vocabulary for the culture of transparency, in a format that would allow for flexible usage and continual additions. Rather than unfolding in a literal hypertext format, the dictionary lays out an initial set of lexia. Then, rather than creating links that transport from one term to another, the lexicon becomes a combinatoric device that creates hybrid terms. The hybrids must then be given definitions, so that the dictionary becomes a means of making new terms and concepts. As a result, the hybrids need longer entries than the definitions of the initial terms - the entries on hybrids occasion short takes on aspects of transparency culture.

The design principle behind moving from a lexicon of discrete words to hybrid terms is quite basic: hypertext and most modes of experience in the scene of clicking are structured as metonymic chains, as sequences of screens whose order can be changed. Combining lexical entries into new terms is meant to mimic, in a crude way, the different sort of transport accomplished by metaphor - the carrying-across between terms naming each other that constitutes the life of the metaphor, the between-terms that generates an emergent semantic richness. I try to inject a hypertext screenscape of discrete lexia with the permutational dynamics of combinatoric gaming - something the creative writers of hypertext seek out, of course, here done in a more literal manner.
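As a minimal sketch of that permutational move (in Python; the pairing rule and the name hybrids are my own guesses at the procedure, and the actual entries inflect their parts, so that "technography book" becomes "technographical book"), the Level I lexicon can be run through a simple combinatoric device:

from itertools import permutations

# Level I: the initial set of lexia from the diagram that follows.
LEXICON = ["book", "brain", "constraint", "narrative", "technography", "virtual"]

def hybrids(lexicon):
    # Pair every entry with every other entry, in order; each ordered pair
    # is a candidate Level II hybrid still awaiting its definition.
    for first, second in permutations(lexicon, 2):
        yield f"{first} {second}"

for term in hybrids(LEXICON):
    print(term)   # "book brain", ..., "virtual brain", "technography book", ...

Six terms yield thirty ordered pairs, so the combinatoric device generates far more candidate hybrids than initial terms.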

What's the point of such an exercise? The point is that, on the one hand, the sheer speed and immediacy of consumption with which we confront words now must be integrated into how we write. Academic writing should adapt to the fact that we are all glutted with stuff to read all the time, and have become consumers of articles and ideas - we skim and borrow, seeking to profit with the least work, more like the students we complain about than we care to admit. But on the other hand, a sleeker writing should also try to slow up or congeal; the words should merge into one another to take on texture, to be dipped in analog feeling for a moment as they pass through digital scenes. I don't mean here that words should take on texture in the way that hypertext critics think this occurs, which amounts to stimulating the "look at rather than through" response. Rather than utilizing opacity as a source of significance, the basic gesture of combining lexical units into hybrid terms seeks to simulate bottom-up emergence, simple components combining to generate a different level of significance. The lexicon does not organize into a network but into a hierarchical distribution along which meanings slide as potential relations. The network trope blots out the sense of a hierarchical architecture that is so crucial to all living systems. The Hyper-Lex design simulates a critical ecology where emergent terms operate at a "higher" level of complexity, being more than the sum of their parts:

****************************
LEVEL I: LEXICON
 book
  brain
   constraint
    narrative
     technography
      virtual
****************************
LEVEL II: HYBRIDS
 virtual brain
  technographical brain
   technographical book