Joca/synopsis-20182401

From XPUB & Lens-Based wiki

Source text

copied Liu, Lydia H. “iSpace: Printed English after Joyce, Shannon, and Derrida.” Critical Inquiry, vol. 32, no. 3, 2006, pp. 516–550. JSTOR, www.jstor.org/stable/10.1086/505377.

Raw notes

What is it saying (thesis)? iSpace: Printed English after Joyce, Shannon and Derrida

Finding an understanding of the function and nature of the phonetic alphabet and alphabetical writing, focusing in the process on the confluences between literature and technoscience, with James Joyce's Finnegans Wake as a starting point.

iSpace

Phonemics

Discusses various writers and engineers from the perspective of Joyce.

Derrida: coined the term archi-writing. The relation between speech and writing changed with the universal Turing machine. Printed words and letters as a starting point.

Theall: Joyce saw the work as a machine, a piece of engineering, especially communication engineering, bringing statistical properties of letter sequences and spaces among words and non-words to light.

Literary experiment two decades before Shannon’s printed English

In contrast to Derrida, Joyce thought that the structures used in FW are meant to result in natural language: any natural language is construed by the reader. Similar to how computers work, in the sense that they use discrete symbols.

Letter sequences not made to be pronounced, but to visualize a certain meaning ('chute': fall).

Irrelevance of the phonetic aspect of text. Having a bigger focus on the ideographic and graphic part.

Developments in computing took these experiments to a new level.

-- Liu discusses Printed English by Shannon

Mathematical theory of communication

Approach of English as a statistical system. Printed English not about reproducing text, but having an interface between natural language and machine language.

27 letter alphabet

Rethinking idea of communication

A different view from Joyce's: not a symbolic view, but a statistical structure. What they have in common: neither assigns these roles any real linguistic presence.

Markov chain, using the probability of certain letter sequences as an information source.
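A minimal sketch of the idea (not Shannon's original procedure, which worked from printed tables of n-gram frequencies): a letter-level Markov chain counts which character follows each short letter sequence, then samples new text from those counts. The sample string is invented for the demonstration.

```python
import random
from collections import Counter, defaultdict

def build_model(text, order=2):
    """Count which character follows each sequence of `order` characters."""
    model = defaultdict(Counter)
    for i in range(len(text) - order):
        model[text[i:i + order]][text[i + order]] += 1
    return model

def generate(model, seed, length=40):
    """Extend the seed by sampling from the observed follow-up frequencies."""
    out = seed
    for _ in range(length):
        counts = model.get(out[-len(seed):])
        if not counts:  # unseen context: stop generating
            break
        letters, weights = zip(*counts.items())
        out += random.choices(letters, weights=weights)[0]
    return out

sample = "the space is a conceptual figure in the english alphabet "
model = build_model(sample, order=2)
print(generate(model, "th"))
```

Note that the space is treated as just another character in the model, which is exactly the move Printed English makes with its 27th letter.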

Introduces concepts such as information entropy and redundancy

FW low redundancy

Basic English high redundancy

Shannon not interested in semantic aspect of English, but in stochastic structure.
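A rough illustration of entropy and redundancy, using only single-letter frequencies over a 27-symbol alphabet (a simplification: Shannon's actual estimates for English also drew on longer-range statistics):

```python
import math
from collections import Counter

def entropy(text):
    """Shannon entropy in bits per symbol, from single-symbol frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def redundancy(text, alphabet_size=27):
    """1 - H/H_max, where H_max assumes all symbols are equally likely."""
    return 1 - entropy(text) / math.log2(alphabet_size)

# A text that reuses few symbols is highly redundant; varied text less so.
print(redundancy("aaaa aaaa aaaa"))
print(redundancy("the quick brown fox jumps over the lazy dog"))
```

In these terms, FW's unusual letter sequences push entropy up (low redundancy), while BASIC's restricted vocabulary pushes it down (high redundancy).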

Ogden. Basic English

Shannon -> making writing a form of counting, generate text based on statistics

The space is a conceptual figure. Not the visible word divider used in some other languages; no linguistic meaning, but an ideographic role.

Predictability of English more dependent on the space than on any of the other letters.

By its focus on probability of letter sequences, each language can be seen as a variant of printed English with just a different set of probabilities.

Shannon was not the first one. Ogden's BASIC English -> statistical basis for a universal and international language.

Language as a tool to build influence, for which this simplified language was more practical. But that was not Ogden's only reason: it also offered a stable basis for scientific research. English was the only language analyzed well enough to simplify it properly.

Ogden saw BASIC as a printed language, foreseeing new ways of communication, but did not foresee how Turing and Shannon would use language.

Turing machine: letters as an interface, but phonemes and words were not required for communication by a computer. Use of printed symbols to allow a machine to read, write, etc.

Ideograph to the machine

Typesetting not important anymore; discrete symbols were used for scanning. Mallarmé 1897: movement of printed words as a visual experience, the reader's eyes following the movement across typographical spaces.

Shannon: a square of Turing's tape is a blank symbol, represented by the 27th letter: iSpace.

Core of alphanumerical system

Unnoticed by majority of linguists and technology historians.

What happened to the phonetic alphabet

McLuhan: phonetic alphabet is a technology.

Problematic: the claim that pictograms are non-alphabetic and represent a less sophisticated technology, while mathematical symbols are examples of symbols with a meaning.

Still a mystery how meaningless phonetic symbols can bear any meaning. Saussure: signifier and signified in the linguistic sign.

Logocentric


What is its conclusion? Information theory is the outcome of crossbreeding ideas from the literary world and scientific experiments.

What is your opinion? The article connects many practitioners in this field. On one hand complicated, because of many references to authors new to me; on the other hand, it invites reading others. Builds a framework, though I am not sure James Joyce should be its starting point.

Finished parts

iSpace: Printed English after Joyce, Shannon and Derrida, by Lydia H. Liu

What is the function of the phonetic alphabet and alphabetical writing in the current age? In this article Lydia H. Liu explores literature and technoscience to offer an understanding of the universal English alphabet since the development of information theory.

As the starting point she takes the work Finnegans Wake (FW) by James Joyce. In the book Joyce experiments with the English language using outrageous letter sequences and signs. He also introduces iSpace, which marks the space between the words in the text.

Joyce as a writing machine

Liu calls Joyce a modernist engineer of cyberspace and states that his use of the alphabet in FW had implications for the use of the alphabet in computer technology that was developed after the publication of FW in 1939. She supports that statement by discussing a variety of writers that were inspired by his work.

One of them is Jacques Derrida, who used the concept of archi-writing to argue that language already has a semi-fixed structure by itself, before we use it in writing and speaking. Writing can then be done by a hypermnesic machine that can anticipate all that is possible to say. Derrida calls Joyce the ultimate version of such a writing machine.

To get an idea of Joyce’s view on this, Liu refers to Donald F. Theall. He argued that James Joyce approached writing as a piece of engineering, bringing statistical properties of letter sequences and spaces among words and non-words to light.

On the use of the alphabet, Liu states that in FW Joyce doesn't use the alphabet to document certain phonemes, but as a way to create ideograms: writing that, besides letters, includes other graphic marks to document a certain thought that is open to construction by the reader. One of the examples is an unpronounceable sequence of 100 letters in FW that, instead of representing phonemes, visualizes the fall of a character.

From literature to information science

The work of Joyce also inspired scholars working in information science. C.K. Ogden was an admirer of Joyce and, in his introduction to BASIC English, compared the 850-word vocabulary of his language to the ultimate vocabulary of Joyce, which Ogden estimated at more than 250,000 words.

Shannon, the creator of information theory, uses BASIC English and Finnegans Wake to illustrate the concept of redundancy. Joyce’s writing is an example of low redundancy, while the limited vocabulary of BASIC often leads to expansion of the text and a high redundancy.

To do his research on the stochastic structure of language, Shannon approached English as a statistical system instead of a language. He called this system Printed English: an alphabet that is post-phonetic and features a 27th letter that marks the space.

Shannon used Printed English to find the statistical structure of the English language, generating random sequences of letters that look similar to Joyce's in FW. He understood Printed English as an ideographic alphabet, in which the sequence of letters was influenced by probability. The space as the 27th symbol was especially useful for that, because the predictability of the English language is more dependent on the space than on any other letter.
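A toy illustration of the space's status as the 27th symbol: in ordinary English text the space is the most frequent symbol of all, which is one reason it carries so much predictive weight. The sample string below is invented for the demonstration.

```python
from collections import Counter

# Printed English: 26 letters plus the space as the 27th symbol.
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

text = ("printed english treats the space as a letter in its own right "
        "the space is the most frequent symbol in ordinary english text")

# Count only symbols belonging to the 27-letter alphabet.
counts = Counter(c for c in text if c in ALPHABET)
print(counts.most_common(3))  # the space tops the list in this sample
```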

In the end Liu concludes that natural language presumes a separation between speech and writing which is not relevant for computers, which use the alphabet for a different purpose: a symbolic use of the alphabet to do computations. Printed English is especially suitable for this purpose because of its well-known statistical properties in comparison to other writing systems. The road towards this is the outcome of crossbreeding ideas from the literary world and scientific experiments.

I find it interesting how Liu shows the connections between literature and technoscience in the development of an alphabet that was not based on phonetics or semantics, but one that has a symbolic meaning. Her writing style is dense, with many references to other authors. On one hand this gives an overview of the field; on the other hand it complicated my reading of the article, because I didn't know certain concepts Liu was referring to.