Has digital media destabilized the meaning of texts?
Tapia would say that texts have always been unstable, nonlinear, and “networked.” I would tend to agree, to an extent. I appreciated how he linked contemporary thought about digital texts to deconstruction, illustrating that semantic instability has a long scholarly pedigree. He even points to medieval and classical texts to undermine the often-heard claim that digital technologies would give rise to “a new kind of man and a new system of thought” (7). Such historicizing work reminds us that digital technology may simply be the same content put into a different form. The Y2K frenzy is a good example of how erroneous conceptions of technology can lead to wild conclusions about its possibilities.
At the same time, I think digital media makes the instability of meaning both more apparent and more widely felt. Literacy no longer means knowledge of the Bible and select other texts; through digital technology it expands across cultures and across history. Tapia, however, draws attention to the fact that some features of digital media, such as user participation and the increased presence of visual signs, are nothing new. He explains how ancient texts such as the I Ching depended on reader participation, and furthermore, that the semiotic structures of spoken and written language predate digital technologies: “in the conscious articulation of syntax and punctuation, in the organization of paragraphs, pauses, silences, and digressions, visual signs and their history play a considerable part” (5). This may be true, but it is significant that digital texts circulate far more widely than most historical texts did in their own time. I’m inclined to think that greater access and greater literacy in general have, in fact, changed the structure of consciousness, if only in that they lead to the realization of polysemia. More and more texts are now consolidated in one place, connected by a network of links and nodes. The sheer diversity and volume of semiotic activities in digital media, arguably more accessible than ever before, is enough to make one a Derridean. Hypertext might serve the same function as “pauses, silences, and digressions,” but it is not the same. Hypertext can be defamiliarizing, emphasizing the “linked” nature of consciousness in a genuinely new way.
As a side note, I think it is incredibly difficult, and requires a lot of intellectual gymnastics, to determine whether texts have always been unstable. First of all, the postmodern notion of “textuality” complicates the question, because it homogenizes and equalizes all forms of discourse. Literacy systems of the past (I hate being so reductive, but bear with me), on the other hand, vehemently insisted on hierarchy. The Bible was not regarded on the same level as a trading post’s record of goods, for instance. So to evaluate whether texts have always lent themselves to interpretation, one must interrogate the notion of the “text” itself and take into account how the interpretive strategies and textual hierarchies of the past differed from our own.
Furthermore, Tapia is critical of deconstruction, but no one can deny that it was an important philosophical movement, one during which theorists became more conscious of their interpretive practices and of textual instability. Becoming conscious of something, I believe, is what makes it “exist” in the first place. So did “interpretation” and textuality exist before we became conscious of them? I’m not convinced that they did. Does the tree falling in the forest make a sound? My solipsistic answer has always been “no.”
Consciousness creates reality.
I think that’s a good way to end my final blog post for this class.