Synopses – 7 Feb
Abstract (50 words) / Synopses (500 words)
Reading – Tash
The Digital Universal Library and the myth of chaos
by Sanne Koevoets, in Webs of Feminist Knowledge Online
Abstract (50)
In this essay, Sanne Koevoets offers the FRAGEN database as an example of a feminist digital library which, through transparent processes and inclusive interfaces, questions and rejects the biased structures of online knowledge spaces as we know them.
Synopses (500)
This essay is a feminist critique of digital libraries by Sanne Koevoets (NL), a researcher and lecturer in new media cultures and gender studies.
It begins with an excerpt from Jorge Luis Borges’ pivotal work ‘The Library of Babel’ (1941, English translation 1962), a short story which depicts a vast library consisting of an infinite number of hexagonal spaces, holding an unlimited number of books. But the promise of a ‘Universal Library’, which would hold all of human knowledge, has always been a problematic one. Even with the rise of digital technologies, with their capacity for storage and their sophisticated search tools, Koevoets argues that the reality is both more complex and more mundane than the dream. Introducing her first criticism, she explains that “While the fantasy of a (digital) Universal Library may be philosophically or metaphysically compelling, the politics of selection and access – and thus of ordering techniques – are ever present on the Web.”
Beyond the fundamental fact that every library is by definition selective in its collection of texts, Koevoets points out that technology is a social construct and thus not value-neutral. Our interactions with online spaces are governed by algorithms, which often conform to market forces and increasingly define and dominate how information is presented to us. In this way, largely invisible processes like ranking algorithms are becoming co-producers of authority, and to some, “the most pervasive source of bias in the history of research.” She ends this section with the essential question of the essay: “Under such conditions, how can webs of feminist knowledge be represented online?”
With the problems exposed, Koevoets brings forward a case study called FRAGEN: the FRAmes on GENder in Europe project, a digital feminist library constructing an online database of core feminist texts from all 27 EU countries, plus Croatia and Turkey. The first key difference between this project and, say, the Google Books project is that FRAGEN tends towards specificity rather than totalizing inclusivity. The second is the issue of transparency. FRAGEN’s approach to selection does not pretend to be neutral or exhaustive. The library openly shares the identity of its librarians: key feminist figures from each of the 29 states, all chosen by committee. It also shares insights into the criteria by which these key figures were asked to select texts for a “longlist,” and then into how “longlists” were pared down into “shortlists” of ten texts per country. Koevoets argues that “the combination of transparency and the way in which different local views and conceptualizations were used to provide access to the database via multiple routes of entry (for instance by country, author, topic, etc.) lends the database a certain fluidity.”
The last section of the essay focuses on the website of the database, an interface which allows and invites other researchers to reflect and comment on the library texts in a comparative way. This is another way in which the FRAGEN database and website are set up to actively eschew claims to objectivity, and to represent the constellations of feminist knowledge in all their partiality.
In conclusion, Koevoets posits that it is the responsibility of every digital librarian, and feminist researcher, to take seriously the implications and assumptions that are built into the very structure of online knowledge spaces. “In order to make feminist knowledge accessible online, not only the politics of selection, but also the politics of the index must be addressed.”
Opinion / notes
Sanne Koevoets brings together issues of archive politics, bias in technology, and feminist methodology in a clear and concise way. I am especially interested in her critique of ranking algorithms and how it ties into her rejection of the ‘universal’ anything, which is also a key pillar in the understanding of situated knowledge.
The Suspicious Archive
by James T. Hong
Abstract (50)
This piece written by Taiwanese artist James T. Hong questions two main aspects of the modern ‘archive’. The first is its interpretation and subsequent relation to its reader, where Hong focuses on the effect of an archive rather than its content. The second is a critique of English as the dominant language used to deal with the preservation and distribution of knowledge.
Synopses (500)
James T. Hong is a filmmaker and artist based in Taiwan, and in this two-part essay he explores his suspicions about the ‘archive’, a critique built on fundamental questions of interpretation and linguistics.
In part one, he considers a paranodal analysis of the archive as opposed to a forensic one. He asserts that the key to understanding an archive lies not in what it is (“a non-random collection of things”) but in the relationships between its intention, existence and subsequent interpretation by others. Following this, three very basic questions should be asked of any archive:
- Why does this archive exist?
- What is missing from the archive?
- Why does this archive contain this item rather than another?
Hong develops his position by looking at the internet – which he sees as a kind of decentralized archive, one which is nevertheless completely hierarchical in action. He criticizes processes like SEME (Search Engine Manipulation Effect), and questions censorship on a general scale, which may not always be top-down, but is in any case a threat to agonism and diversity. Hong warns against the internet becoming a mirror of things we already know and accept. “Removing the “ugliness” cleanses the archive, and the archive is us for future generations. This cleansing is thus a gross manipulation of the record of our present world.” Tying this back to his core issue of interpretation, he ends this section with a quote by philosopher Cristina Lafont: “to be human is not primarily to be a rational animal, but first and foremost to be a self-interpreting animal.”
The implications of this ‘interpretiveness’ of humanity carry huge political and cultural weight when set against the linguistic nature of hermeneutic knowledge. “In a linguistically articulated world, language is not simply a set of arbitrary signs that refer to objects within the world; language is rather the very means with which the world shows itself to us. Interpretive understanding is always a mediation between the strange and the familiar in some kind of language.”
The power of language over our understanding of ourselves (through the archive or otherwise) is the main focus of Part 2 of the essay, subtitled Every Word Is a Prejudice. References to Nietzsche, Martin Heidegger and Hans-Georg Gadamer explore philosophical concepts of truth, objectivity and morality, while contemporary examples of the news media cycle are used to question the way reality is perceived and believed – in what language and to what audience?
The essay goes on to chart the rise of English as the world’s lingua franca, and its current dominance in literature, science and journalism. Hong then arrives at his critical argument: “I claim very simply and crudely that nothing is really true, that nothing really matters, unless or until it is in English. This could be called a form of “imperialist linguistic idealism,” and it goes hand in hand with the implicit, globalist assumption that nature’s preferred way of being represented is in English—scientific or otherwise.” This legacy of the English language, and its increasingly powerful gatekeepers, must be taken seriously.
The essay concludes with the simple but significant idea that though language in itself holds no intentions, power is invested in it by the people who use and promote it. It is therefore imperative that we ask not just why it is used but how it can be used in a better way.
Opinion / notes
It’s interesting to me that Hong’s criticism of language focuses on the way English is used rather than how it is constructed. He also proposes ways in which we can turn this usage on its head: “English can also be used as language of opposition, as a critique of itself, its assumptions, its users, its attendant ideologies, and its dominance. The world can be made bigger again, if we, at the very least, use different words and diverse concepts.” This view of language as a critical tool within discourse is very relevant to my interests in this project.
Reading – Joca
iSpace: Printed English after Joyce, Shannon and Derrida, by Lydia H. Liu
What is the function of the phonetic alphabet and alphabetical writing in the current age? In this article Lydia H. Liu explores literature and technoscience to offer an understanding of the universal English alphabet since the development of information theory.
As her starting point she takes Finnegans Wake (FW) by James Joyce. In the book Joyce experiments with the English language using outrageous letter sequences and signs. He also introduces iSpace, which marks the space between the words in the text.
Joyce as a writing machine
Liu calls Joyce a modernist engineer of cyberspace and states that his use of the alphabet in FW had implications for the use of the alphabet in computer technology that was developed after the publication of FW in 1939. She supports that statement by discussing a variety of writers that were inspired by his work.
One of them is Jacques Derrida, who used the concept of archi-writing to argue that language already has a semi-fixed structure by itself, before we even use it in writing and speaking. Writing can then be done by a hypermnesic machine that can anticipate everything it is possible to say. Derrida calls Joyce the ultimate version of such a writing machine.
To get an idea of Joyce’s view on this, Liu refers to Donald F. Theall. He argued that James Joyce approached writing as a piece of engineering, bringing statistical properties of letter sequences and spaces among words and non-words to light.
Regarding the use of the alphabet, Liu states that in FW Joyce doesn’t use the alphabet to document certain phonemes, but as a way to create ideograms: writing that, besides letters, includes other graphic marks to document a certain thought that is open to construction by the reader. One of the examples is an unpronounceable sequence of 100 letters in FW that, instead of representing a phoneme, visualizes the fall of a character.
From literature to information science
The work of Joyce also inspired scholars working in information science. C.K. Ogden was an admirer of Joyce and, in his introduction to BASIC English, compared the 850-word vocabulary of his language to the ultimate vocabulary of Joyce, which Ogden estimated at more than 250,000 words.
Shannon, the creator of information theory, uses BASIC English and Finnegans Wake to illustrate the concept of redundancy. Joyce’s writing is an example of low redundancy, while the limited vocabulary of BASIC often leads to expansion of the text and a high redundancy.
To do his research on the stochastic structure of language, Shannon approached English as a statistical system rather than as a language. He called this system Printed English: an alphabet that is post-phonetic and features a 27th letter that marks the space.
Shannon used Printed English to find the statistical structure of the English language, generating random sequences of letters that look similar to Joyce’s in FW. He understood Printed English as an ideographic alphabet, in which the sequence of letters was governed by probability. The space as the 27th symbol was especially useful for that, because the predictability of the English language is more dependent on the space than on any other letter.
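To make the idea of Printed English as a statistical system more concrete, here is a minimal sketch in Python (my own illustration, not Shannon’s actual procedure, tables or data): it reduces a text to the 27-symbol alphabet and then generates a sequence in which each symbol is drawn from the distribution of symbols that follow the previous one, roughly in the spirit of Shannon’s digram approximations of English.

```python
import random
from collections import Counter, defaultdict

# "Printed English": the 26 letters plus the space as a 27th symbol.
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def normalize(text):
    """Map arbitrary text onto the 27-symbol alphabet, collapsing runs of spaces."""
    cleaned = "".join(c if c in ALPHABET else " " for c in text.lower())
    return " ".join(cleaned.split())

def digram_sequence(sample, length=120):
    """Generate symbols, each drawn from the observed distribution of
    symbols that follow the previous one in the sample text."""
    sample = normalize(sample)
    followers = defaultdict(Counter)
    for a, b in zip(sample, sample[1:]):
        followers[a][b] += 1

    current = random.choice(sample)
    out = [current]
    for _ in range(length - 1):
        counts = followers[current]
        if not counts:                      # dead end: restart anywhere in the sample
            current = random.choice(sample)
        else:
            symbols, weights = zip(*counts.items())
            current = random.choices(symbols, weights=weights)[0]
        out.append(current)
    return "".join(out)

if __name__ == "__main__":
    # A placeholder sample; any longer English text gives better statistics.
    sample = "the predictability of printed english depends more on the space than on any other symbol"
    print(digram_sequence(sample))
```

Because the space is treated as a symbol in its own right, the generated stream already breaks itself into word-like chunks, which is the statistical role of the 27th letter that Liu highlights.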
In the end Liu concludes that natural language presumes a separation between speech and writing which is not relevant for computers, which use the alphabet for a different purpose: a symbolic use of the alphabet to do computations. Printed English is especially suitable for this purpose because of its well-known statistical properties in comparison to other writing systems. This development is the outcome of crossbreeding between ideas from the literary world and scientific experiments.
I find it interesting how Liu shows the connections between literature and technoscience in the development of an alphabet that was not based on phonetics or semantics, but one that has a symbolic meaning. Her writing style is dense, with many references to other authors. On the one hand this gives an overview of the field, on the other hand it complicated my reading of the article because I didn’t know certain concepts Liu was referring to.
Ilett, Rosemary Catherine (2003) Outstanding issues: gender, feminisms and librarianship. Chapter 4, section 3: Gender and librarianship: revisiting trait theory (pages 97–116)
Abstract (50)
In this chapter Rosemary Ilett uses feminist critique to show that the use of trait theory to define an ideal type of profession ignores the relationship between professionalization, gender and power. She illustrates this by assessing librarianship through trait theory.
Synopsis (500)
In Outstanding issues: gender, feminisms and librarianship Rosemary Ilett analyzes the profession from a gendered position. For this synopsis I focus on chapter four, section three, where Ilett shows why librarianship was considered not a real profession according to trait theory: a way of analyzing a phenomenon by identifying aspects of a personality, in this case ‘the professional’. The author uses feminist critique to show that the use of traits alone to define an ideal type of profession does not offer a complete view; doing so ignores the relationship between professionalization, gender and power.
She illustrates this by assessing librarianship against four components of the ideal profession according to trait theory. Negative stereotypes about working women have a big influence on librarianship being assessed as semi-professional according to trait theory.
The first component she analyzes is that a profession has to have a specialist knowledge base, of which the output is original. According to Ilett, this is problematic for a female-dominated job like librarianship for a number of reasons. Stereotypically, women are seen as less likely to have specialist knowledge. In addition, women are typically given work of lesser importance, which has a secondary relationship to knowledge production.
This same mechanism is visible in the image of the librarian, where one may see activities such as cataloging and classification as secondary to knowledge production. There is, however, criticism of this view, which Ilett shows for example by referring to Gwinup (1974): “librarianship contained independent thought and ‘intellectual functions’ beyond the clerical”.
A second component is that according to trait theory a profession needs to be open to unions. Librarians are less engaged with unions than other workers. Ilett argues that women librarians are less likely to be active in a union, because challenging male management structures is associated with unfeminine behavior. Another problem is that many women have to balance home and work roles, leaving less time for engagement in a union. The professional associations set up for librarians were not established with the needs and challenges of women in mind.
The third component is that essential parts of librarianship are gendered. Service is an important part of librarianship, but it is strongly tied to gender assumptions. Trait theory positions professional services as an expert-consumer relationship. The equal and democratic relationship between user and librarian doesn’t fit the male view of professionalism.
To conclude, Ilett analyzes the need for autonomy and independence as essential traits for a profession. These aspects are negatively affected by the negative image of librarianship. One cause of this is the view of librarianship as a secondary job. The author gives as an example that in the National Health Service librarians are paid within administrative and clerical grades, as there is no separate pay grade for them. Another reason for this negative image is the sexist archetypes that devalue the profession for both men and women.
Masters, C., 2015. Women's Ways of Structuring Data. Ada New Media. Available from: http://adanewmedia.org/blog/2015/11/01/issue8-masters/
Abstract (50)
Christine Masters addresses the gender bias in the organizational schemes of cataloging systems, caused by a hierarchical structure and a limited view of the diversity of the users of these systems. Using examples of alternative systems, she proposes a feminist approach to data organization.
Synopsis (500)
Christine Masters addresses the gender bias in the organizational schemes of data structures like databases in Women’s ways of structuring data. These data structures are especially relevant online, because they form the foundation of many popular web applications.
Some of the early classification systems of libraries form the basis of a big part of the practice of data structuring today. Masters refers to an article by Hope Olson (2001) which shows that the early classification systems already had gender problems. Causes of these problems were the belief in a universal type of user for the system and the goal of the creators to create hierarchical relationships between categories.
An issue within these systems is that subcategories are not evenly distributed across topics and that a male-centric world view is enforced. Another problem caused by these systems is that some items don’t fit neatly into a single subcategory. This results, for example, in all works by women being lumped together, or in items about Afro-American adolescents not ending up in the category adolescents, but being pushed to the margins of the collection by being put in a subcategory of the category Afro-American.
Masters then lists a number of alternative systems for organizing data that avoid these problems. First she highlights the recommendations made in Olson’s article. Cataloging systems should stop assuming there is a singular audience using the system, and part of the power of structuring the system should be given to all users, instead of a small group. She remarks that free-text searching could help in finding topics that are not represented in the controlled vocabulary of classification categories of a data structure.
Another example of an alternative cataloging system is the Orlando database. This system provides information on women writers and their works. The data structure is not organized hierarchically, but uses a system of tagging. This makes a less constrained organization of data possible, but the process of adding tags is highly interpretative. It is difficult to achieve consistency in tagging with many different people working on the task. For the Orlando database the creators dealt with this issue by connecting the tags to three specific aspects: biography, writing and events. In addition, tags are cleaned up by algorithms and taggers are trained to follow specific protocols. By using tags that are familiar to literary scholars, Orlando tries to fit women’s history into the existing male-centered knowledge framework and make their work visible.
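To contrast this with the hierarchical schemes above, here is a minimal Python sketch of tag-based organization (hypothetical records and tags; Orlando’s real encoding scheme is far richer and only loosely echoed here): each item can carry several tags at once and be reached along any of them, so no single shelf location has to be chosen.

```python
# Hypothetical tagged records; the three Orlando-style aspects
# (biography, writing, events) are used here purely as example tags.
records = [
    {"title": "Letters, 1890-1900", "tags": {"biography", "correspondence"}},
    {"title": "Collected poems",    "tags": {"writing", "poetry"}},
    {"title": "Diary of a tour",    "tags": {"biography", "writing", "events"}},
]

def find(*wanted):
    """Return every record carrying all of the requested tags."""
    wanted = set(wanted)
    return [r["title"] for r in records if wanted <= r["tags"]]

print(find("biography"))             # an item is reachable via more than one route
print(find("biography", "writing"))  # combinations need no predefined category
```

The flexibility comes at the cost Masters describes: without protocols and clean-up, two taggers may describe the same item with different, inconsistent tags.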
Reflecting on these examples, Masters wonders if there are specifically feminist ways to structure data. She advocates conscious structuring, where the system shows its motivations. A feminist data structure would use classification categories that are consciously shown and defined. The selection would be made as fairly as possible, by the users of the system. The process of creating such a system would require a lot of reflection and collaboration, but could be an effective strategy to address the marginalization of groups within data structures.
Gilley, J., 2006. Information Science: Not Just for Boys Anymore. American Libraries 37, 50–51.
Abstract
Jennifer Gilley compares the male/female ratio in different information science programs. She finds that information science courses offered by library schools have more female students than the average for the field.
Synopsis
In this article Jennifer Gilley compares the male/female ratio in the student bodies of different information science courses in the United States. Although this discipline traditionally has a higher percentage of male students, some courses have a 50/50 split. The differences in student population are, however, significant when comparing master’s programmes that are accredited by the American Library Association (ALA) with non-ALA-accredited programmes. The ALA-accredited programmes have a majority of female students, while the opposite is true for information science courses that have no affiliation with any library science course.
Gilley dives further into this phenomenon and finds that it is a side effect of the growing convergence of library science and information science. Schools are merging both disciplines to adapt to this development. The result is that the label of librarian influences the image that people have of a specific information science master’s. A course inside a library school, with an ALA accreditation, emphasizes how information technology can help people. By focusing on the service ethic, it appeals to aspects which are important to women when choosing a field for their career.
To illustrate this redefinition of library science and information science, Gilley refers to the Kellogg-ALISE Information Professions and Education Renewal Project (KALIPER). This project enabled four library schools to create courses that include an education in information science, starting in 1994-1996. In 1998 all curricula of schools in this discipline were analyzed, to examine the state of library science education at that moment. The author then makes a historical analogy, referring to the Williamson Report. This research from 1923 had the aim of creating standards in library education. The outcomes were a division between skilled work and clerical work, and a recommendation to professionalize the discipline through obligatory higher education and a bigger influx of men.
Feminist librarians like Suzanne Hildenbrand and Sarah Pritchard warn that KALIPER could have the same result, because of its focus on technology, which could create male-dominated hierarchies. Gilley argues, however, that the current number of female students in information science programmes proves the opposite. She thinks that differences between librarians and information scientists in terms of gender division, public image and salary will disappear in the future.
Ferreira, M.M., 2012. O profissional da informação no mundo do trabalho e as relações de gênero. Transinformação 15.
Abstract
Maria Mary Ferreira examines the conditions which make people choose to pursue a career in library science, focusing on ‘women’s work’ and the position of Brazilian women in the workplace.
Synopsis
In this article Maria Mary Ferreira studies the field of library science in Brazil from the perspective of gender. She discusses three aspects in her paper: gender as a theme of research in Brazil, reflections on the position of women in the workplace with a focus on librarians, and an analysis of the current situation with various propositions for new research and debate.
Ferreira defines gender studies as a way to analyze and refer to the social organization of the relation between the different sexes. In the past, research on gender in Brazil focused on the perspective of inequality between groups, and individuals. Two perspectives were leading in these studies: The public, concerning politics, and the private, concerning the domestic situation.
At first, universities were reluctant about the use of methods from gender studies in research. Critics thought that these studies couldn't be reconciled with scientific values like objectivity and neutrality. Ferreira counters this argument by pointing out that the universities weren't neutral in the first place: Brazilian universities had a masculine culture that found its way into the research done.
Later, universities were more open to studies performed from a gender-critical perspective. Ferreira then gives an example of a study done with these methods. In the past, female rural workers were not recognized as part of the workforce. Agriculture was defined as a strictly male field. The same was the case for activities related to housekeeping. Taking a critical standpoint from a gender perspective forced researchers to 'see' these activities as real work and part of the discourse.
Concerning women that are active on the Brazilian job market, Ferreira refers to a report that shows that 65% of the women are working as servants or cleaners. The high status jobs are generally male-dominated and women are excluded. On the other hand, women are nowadays the majority of the student population in Brazil, where most pursue an education in education, health, or social sciences.
Zooming in on the field of Library Science in Brazil, Ferreira points out that it is one of the most feminized professions. Women entered this discipline in large numbers during the 1950s and 1960s. The author cites Castro (1997) to show that this development started in the 1920s, as the 'institutionalization of the profession coincides with the feminization of it.' Education in Library Science was based on courses in the United States and France.
In these courses the profession was shaped into the form of a 'women's job'. The courses focused, for example, on educating librarians to offer service to people. This made the public image of Library Science similar to that of Pedagogy.
Ferreira draws a number of conclusions from her literature research. Analyzing librarianship from a gender perspective is useful for understanding the public image of the librarian and its low social status. On the other hand she believes that other analytical factors like social class should be included for a more complete view. Still, 'discussing the gender issues, (...) means to revaluate the work of the female librarian'. And this might be the starting point for a new and improved image of the librarian in Brazil.
Olson, H.A., 2001. The Power to Name: Representation in Library Catalogs. Signs 26, 639–668.
Abstract
In this article Hope Olson examines the causes of biases in cataloging systems like the LCSH and DDC. Olson proposes several strategies to solve these problems, criticising the closed vocabulary and hierarchical structure present in the most popular systems.
Synopsis
A large body of literature shows that cataloging systems used by libraries feature biases of, for example, gender, race and sexuality. These biases make it more difficult for people to find information outside of the mainstream. In this article Hope Olson examines their causes: the assumption that a universal language is required, and that a data structure should be hierarchical.
The notion of a controlled vocabulary to gather similar items in a catalog is already present in early systems. A problematic aspect according to Olson is that these systems claim to be unbiased and universal, but hide exclusions under the guise of neutrality. She refers to Charles Cutter's Rules for a Printed Dictionary Catalog (1876), where the author states that the creation of categories should be done with the convenience of the users in mind. Cutter takes a singular view of the users, however, so that the needs of the majority become leading for the system. For his controlled language Cutter prefers a hierarchical structure based on logic, going from broader terms to narrower terms, e.g. Animals > Carnivores > Dogs. Olson concludes that the controlled vocabulary and this particular structure limit the terms available for naming and interpreting information.
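As a toy illustration of the structure Olson criticizes (a hypothetical fragment in Python, not Cutter's rules or any real catalog's data), the sketch below models a controlled vocabulary as a tree in which every term has exactly one broader term, so every item ends up with exactly one 'logical' place:

```python
# Hypothetical fragment of a hierarchical controlled vocabulary:
# each narrower term points to its single broader term.
BROADER = {
    "Animals":    None,          # top term
    "Carnivores": "Animals",
    "Dogs":       "Carnivores",
    "Herbivores": "Animals",
}

def path(term):
    """Walk from a narrow term up its single chain of broader terms."""
    chain = [term]
    while BROADER.get(term) is not None:
        term = BROADER[term]
        chain.append(term)
    return " > ".join(reversed(chain))

# An item is catalogued under exactly one term, and therefore one path.
catalog = {"A field guide to working dogs": "Dogs"}

for title, term in catalog.items():
    print(f"{title}: {path(term)}")   # Animals > Carnivores > Dogs
```

The constraint, rather than the code, is the point: a term sits in only one place in the tree, so any item or topic that cuts across the chosen divisions has to be squeezed into somebody else's logic.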
These principles still form the basis of the Library of Congress Subject Headings (LCSH) system. By comparing the broad terms Women and Men, Olson shows the bias in the structure. Many narrower terms that are indexed under Women draw attention to women as exceptions to the norm: gifted women, for example, are classified under the narrower term Gifted women, part of the category Women, while there is no specific category for gifted men. In contrast to Cutter, the LCSH bases its categories on the language of published authors. Although the LCSH claims universality, it reflects the biases in its collection.
Another cataloging system, the DDC by Melvil Dewey, used universal language with the goal of simplifying communication. His focus was not on the content of the subjects, but rather on the boxes they fit in. In his structure he connected categories to specific numbers in a decimal system: ten general terms would be divided into a hundred narrower terms. These basic terms were defined by specialists on the topic, for example botanists for botanical topics. Although efficient, the system is not flexible. When there are more than ten general terms, topics outside of the mainstream are lumped together in a single category. Another problem is the marginalization of topics. The DDC uses specific rules about priority in categorization. The result is that some related topics are spread through the whole catalog in numerous marginalized categories.
As a solution to these problems Olson doesn't recommend a new standard for information management. She proposes search tools that work around the constrained categories. Examples are free text searching, multilingual catalogs that allow more headings for one specific topic and encouraging users to add their own relations between catalog items.
Reading - Alice
After the future: n Hypotheses of Post-Cyber Feminism by Helen Hester
Abstract
Helen Hester, who is a member of the feminist collective Laboria Cuboniks, provides a critical approach to early cyberfeminist theories, while attempting to provide new methods of applying them to today's context.
Synopsis
Hester explores the practices and debates that appeared around the relationship between gender and technology, arguing that they are still necessary and relevant today, even though taking them for granted is not advisable. Her first criticism of cyberfeminism comes from semantics. Quoting Susanna Paasonen, Hester makes the point that the word 'cyber' is too outdated and unappealing, and cannot be applied to today's 'domestication' of digital use. The term 'cyber' seems to describe a future in the world of technology as it was seen three decades ago, and does not have much significance for contemporary technology users.
The work of Sadie Plant, 'Zeros + Ones' in particular, is at the base of this debate. Plant links the development of technology to the introduction of women into the workplace. The typewriter was viewed as a disruptive element in a work field dominated by a monolith of white, cis males, one which allowed women the power to manipulate information through tactile intervention. The women behind the typewriters are compared to a legion of zeros, against the male 'one'. Hester is slightly critical of Plant's insistence on blurring the boundaries between psychoanalysis, labor theories and gender as disruption, while being unable to provide a clear strategy for political action. As Alberto Toscano notes, this work promotes the idea that change can come from individual users of technology, on a micro level, rather than from organizing together against the power structures.
A new approach to cyberfeminism includes the desire to create self-organized networks, online and offline. There is also a tendency towards decentralization and inclusion, which can be seen in cyberfeminists' reluctance to agree on a set definition for the term cyberfeminism. In recent years, a large number of variations on feminist groups have appeared, for which one's identification as feminist or as woman is not enough of a link between individuals.
Therefore, the concept of disidentification has been put forward. One of the most significant examples from cyberfeminist approaches is the Old Boys Network's '100 anti-theses of cyberfeminism'. Represented by a list of 100 anti-definitions, the work maps the concept of cyberfeminism through what it is not, rather than what it is. While these anti-definitions are an attempt to break the limits and remove any labels from the movement, they do not offer many clues regarding its direction or purpose, thus preventing a collaborative action that could arise from commonalities between individuals. Refusing to settle for an identity can have a double impact: first, it can provide anyone an entry point into the culture, without implying the need for previous affinities or knowledge; second, it restricts the potential for association based on common interests and values, since it does not offer any indication of these ideas or values in the first place.
To conclude, Hester suggests ways to bring these concepts into a more relevant approach. She claims that, since the political context has largely changed since the end of the 20th century, the approaches also need to change. A suggested method of 'reanimating cyberfeminism for the 21st century' is to take the risk of moving beyond disidentification and creating instead a definition that does not restrict, but is fluid and invites collaborative practices. She further suggests the creation of 'n hypotheses' as a reconfiguration of the '100 anti-theses', an unlimited series of positive statements that can invite further exploration and political action. She concludes by offering her own version of a definition: 'Xenofeminism is a gender abolitionist, anti-naturalist, technomaterialist form of post-humanism, building upon the insights of cyberfeminism. Its future is unmanned.'
Scent of a Woman's Ink - Are women writers really inferior? by Francine Prose
Abstract
In an article that appeared in Harper's Magazine in 1998, Francine Prose takes a critical approach to the alleged differences between the writing styles of female and male writers, based on the preconception that the reader can identify the gender of the author.
Synopsis
In an age when whatever Oprah Winfrey reads becomes an instant success, Prose looks at the gender imbalance that still, somehow, exists in the literary field. Where literary awards are concerned, the battles are still being fought by a majority of men, even though every year more and more high-quality books written by women appear. In the area of non-commercial fiction, critical reviews are essential for a writer to be recognized and, ultimately, sold, and women often suffer from "critical neglect".
Since statistics show that works by women are less represented than works by men, Prose asks a few questions: "Is fiction by women really worse? Perhaps we simply haven't learned how to read what women write?". Author Diane Johnson argues that men are not accustomed to the topics that women might write about. Do women and men approach writing differently, or do we, as readers, have different expectations of a written work based on the gender of its author? Some readers might assume that a work by a female author might not deal with "serious" topics, a claim that Virginia Woolf also refers to in "A Room of One's Own".
Even though today most people will deny that they have any gender preconceptions regarding writing, past critics have not been shy to express their opinions on female writers. One such example is Norman Mailer, who famously claimed that he doubts "if there will be a really exciting woman writer until the first whore becomes a call girl and tells her tale." Thus, the only stories worth telling by women, in his view, would still be about their experiences with men. Other examples of men generalizing women's writing based mostly on their gender are presented, from Edmund Wilson claiming women writers complain constantly to Bernard Bergonzi's opinion that women novelists "like to keep their focus narrow". Therefore, it is not much of a surprise that female authors throughout history have hidden their true selves under male pseudonyms, such as George Eliot or the Brontë sisters.
Following a quote from Cynthia Ozick's essay "Previsions of the Demise of the Dancing Dog", where she presents some texts to students who later claim they can identify a female writer by the style of her "sentimental" writing, Prose gives us her own examples of male- and female-authored texts, in a blind-tasting attempt. She offers examples that turn the assumptions on their head, from male authors with a narrow focus and sentences filled with emotion to cold, rational paragraphs conceived by women. Another characteristic believed to be specific to female writers is their focus on interiors, family stories and, in general, private topics that are less general than those of the men who write adventurous works of fiction. Prose quickly destroys this assumption with examples that state the opposite.
Male writers are rarely criticized in the same terms as women. Male authors are often praised for expressing anger or other emotions, for tackling complex plots, for focusing their stories in a single location, or for spreading their writing over hundreds of pages, while women get criticized for doing the same.
To conclude, Prose imagines a future in which gender will be forgotten, at least in the literary world. Since there is little truth in the assumption that women and men write differently, once authors are no longer put into gendered categories, writing will simply be good, or bad, regardless of who is behind the words.
Reading – Alex
Lev Manovich: Database as Symbolic Form
In the article »Database as Symbolic Form« Lev Manovich describes the attributes of databases as a new way to perceive and structure the world, and he explains their relation to the narrative.
He begins his article with a definition of what a database is. According to him it is a structured collection of data. There are many different types, for instance hierarchical, network, relational and object-oriented databases, and they vary in how data is stored in them. To point out the current state of databases he brings forward the example of a virtual museum that would have a database as a backend. You would be able to view, search or sort this data, for example by date, by artist, by country or by any other conceivable metadata. This leads him to the conclusion that a database can be read in multiple ways. Manovich elaborates further on this by comparing the database with the narrative as we know it from books or the cinema: »Many new media objects do not tell stories; they don’t have a beginning or an end; in fact, they don’t have any development, thematically, formally or otherwise, which would organize their elements into a sequence. Instead, they are collections of individual items, where every item has the same significance as any other.« A database is a new way to structure the world and our experience of it. He also names webpages, which are hierarchically structured via tags, as a form of database, containing links, images, text or video. With this example the dynamic aspect of a database becomes visible, meaning that you can add, edit or delete any element at any time. This also means that a website is never complete.
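A minimal sketch of that claim in Python, using a hypothetical handful of records rather than any real museum's data: the same collection yields different 'readings' depending on which metadata field you sort or filter on, and none of them is the canonical one.

```python
# Hypothetical records standing in for a virtual museum's database backend.
artworks = [
    {"title": "Untitled I",    "artist": "A. Artist",  "country": "NL", "year": 1921},
    {"title": "Composition",   "artist": "B. Maker",   "country": "FR", "year": 1917},
    {"title": "Study in Blue", "artist": "C. Painter", "country": "ES", "year": 1937},
]

# Three readings of the same collection, none of them the "true" sequence.
by_year    = sorted(artworks, key=lambda a: a["year"])
by_artist  = sorted(artworks, key=lambda a: a["artist"])
dutch_only = [a for a in artworks if a["country"] == "NL"]

for view in (by_year, by_artist, dutch_only):
    print([a["title"] for a in view])
```

Adding, editing or deleting a record changes every one of these views at once, which is the dynamic, never-complete quality Manovich points to.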
Furthermore, Manovich examines the differences between narrative and database. The database »represents the world as a list of items«, while the narrative »creates a cause-and-effect trajectory of seemingly unordered items«. He sees the database as the enemy of the narrative: as both claim the exclusive right to explain the world, they are not able to co-exist. But he also sees some similarities and intersections between the two. In computer games, for instance, you follow a certain narrative, although it is based on a database. This is what he later names the interactive narrative. He also shows examples of databases from before the time of new media, for instance photo albums or encyclopedias.
Manovich also takes into account the interface through which we perceive the data that lies beneath. This interface can reveal the database in different forms, creating unique narratives for each user. In the last part of the text Manovich makes connections between art, the cinema and database logic. A movie editor, for instance, selects his material like a computer user would from a database and creates one fixed narrative with it.
In conclusion, Manovich discusses two very significant aspects of how the world is represented in new media: first, the interface that lies on top, and second, the database that is behind it. His comparison with the narrative is crucial in order to understand how new media change our perception of information, which eventually creates the world as we know it.
William S. Burroughs: The Invisible Generation
Burroughs argues that what we see is determined to a large extent by what we hear. He supports that thesis with a series of experiments and questions that address the function and impact of the spoken and written word.
In the article The Invisible Generation Burroughs describes various experiments with tape recorders in order to examine the impact of text and especially speech on humans. These experiments were initially described by Ian Sommerville from London, for whom Burroughs is writing as a ghost.
One already notices at first glance the specific layout of the text. It is written without punctuation and with spaces between the sentences, which suggests that the text may have been cut up and reassembled. He starts the text with his main thesis: what we see is determined to a large extent by what we hear. As an example of this assumption he suggests the following experiment: replace the soundtrack of a movie and watch it. What can be seen from this is that the new soundtrack may seem appropriate and therefore changes the interpretation of the images.
Burroughs seems to be fascinated by the possibilities and creative resources a tape recorder provides. A recording can be replayed an infinite number of times, or it can be edited. He proposes different »exercises« to liberate words from their expected meaning. You can speed up or reverse a recording and then, by trying to imitate the resulting sounds, find new meanings. Furthermore, Burroughs proposes to put tape recorders on the streets, at parties, everywhere; rerecord, and then rerecord the rerecorded; take parts of the rerecording and record again; press stop and record at certain intervals, and so on… there are endless possibilities to abstract the spoken word and make new meaning out of it. He wants people to gather new material to come to an unexpected output, to take away the trained and restricting understanding of language. Another method is called the irrelevant response, in which Burroughs makes two tape recorders speak to each other. The dialog is random but our brain empowers us to make meaning out of it. It's a method to break the expected. He even states that a tape recorder is an externalized section of the human nervous system. It gives us more control and makes us learn more about the nervous system.
Towards the end of the article he also turns his attention to the possibly negative effects of tape recordings and their influence. If you record only negative voices and filter out positive ones, you can create a very negative image and influence someone's opinion. With the example of advertising agencies he shows how sound recordings may be used to manipulate people. To break this downward spiral of negative recordings resulting in negative voices that are recorded again, he once more suggests using cut-ups to cut association lines.
Reading – Zalán
Lawrence Liang – Shadow Libraries
Abstract
In this article Lawrence Liang builds a metaphor that runs from the ancient library of Alexandria, through shadow libraries, to Michel Foucault’s heterotopia.
Synopsis
What do the monumental ancient library of Alexandria, the New York Public Library, and a collective enterprise like library.nu have in common, if not the word library? How can shadow libraries be described as heterotopian environments? In the following synopsis I will elaborate on these questions and try to underline the most essential points of Lawrence Liang’s text Shadow Libraries, published on the e-flux online journal platform in September 2012.
At the beginning of the article the author describes how, during a rainy night, his home library flooded as water leaked from the roof and through the walls. From this accident he elaborates comparisons about the fragile history of books, from the library of Alexandria to the great Florence flood of 1966. The popular book-linking website Library.nu suddenly made the idea of the universal library seem like reality. Unfortunately, due to copyright law issues the site was shut down; if it were ever possible to experience what the burning of the ancient library of Alexandria must have felt like, this was it.
Referring to the first question – what do the monumental ancient library of Alexandria, the New York Public Library, and a collective enterprise like library.nu have in common, if not the word library? – I would answer with Lawrence Liang’s words: “As spaces they may have little in common but as virtual spaces they speak as equals even if the scale of their imagination may differ. All of them partake of their share in the world of logotopias.” The curator Sascha Hastings described these places as “word places” – a happy coincidence of architecture and language. The burning of the library of Alexandria became a myth of all libraries. No one knows what the library looked like, what its contents were, or what exactly was lost; we could speak of the loss of all the forms of knowledge in the world at a particular time. Diodorus Siculus, the Sicilian historian, describes in the first century BC a shadow library that survived the fire which destroyed the primary library of Alexandria, but which has since been eclipsed by the latter’s myth.
Alberto Manguel states that “The Tower of Babel in space and the Library of Alexandria in time are the twin symbols of these ambitions. In their shadow, my small library is a reminder of both impossible yearnings—the desire to contain all the tongues of Babel and the longing to possess all the volumes of Alexandria.” (“My Library”, in Hastings and Shipman, eds., Logotopia: The Library in Art and Architecture and the Imagination, Cambridge Galleries / ABC Art Books Canada, 2008.)
Liang then moves from the ancient library of Alexandria to Borges’s statement of the library as paradise, described not as a spatial idea but a temporal one: it was only within the confines of infinity that one could imagine finishing reading one’s library. He proposes the shadow library as a way of thinking about what it means to dwell in knowledge.
At the end of the article Lawrence Liang compares the shadow library with heterotopia – a term popularised by Michel Foucault – both in terms of language and as a spatial metaphor, stating that “Heterotopias destabilize the ground from which we build order and in doing so reframe the very epistemic basis of how we know.” The article concludes with the heterotopic pleasure of our finite selves reaching towards infinity.
Opinion
In the article Lawrence Liang expresses a very inspiring metaphor of the library of Alexandria as a new idea of the human itself, while also creating a new comparison between shadow libraries and heterotopias. It’s interesting to me that the author uses philosophical elements in describing the history of shadow libraries.
Reading – Angeliki
(reading practices and shared knowledge) Haraway, Donna. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies 14, no. 3 (1988)
(Bazzichelli, Tatiana. Extra Gender from Networking: The Net as Artwork. BoD – Books on Demand, 2009.)
Ong, Walter J. Orality and Literacy. 2nd edition. London: Routledge, 2002.
LAST TIME
Reading – Tash
The Electronic Revolution, by William Burroughs
What is it saying (thesis)? (800)
The Electronic Revolution is an essay by William S. Burroughs, first published in 1970. It follows his experimental period of writing, in which he became fascinated by the ‘cut-up’ technique, and the subversive power of the written and recorded word. The piece is divided into two parts, and is written in a variety of styles, from the more formal and scientific, to streams of consciousness and even poetry.
Part one, entitled “The Feedback from Watergate to the Garden of Eden” introduces us to Burroughs’ theory that the written word is, “literally a virus that made spoken word possible.” He posits that the ability to write and convey information across generations is the distinguishing feature of human beings. It is the thing which separates us from other animals, and makes us into “time-binding machines”. However, while other writers and scholars of grammatology often extol the virtues of the written word, Burroughs is suspicious, resistant. He sees human language, and especially the alphabetic, non-pictorial kind, as an “unrecognized virus” which has attained a “state of wholly benign equilibrium with its host.”
Burroughs continues on to describe the ‘word virus’ through metaphors, and also literally as a biological mutation. He brings us into the realm of science by putting forward a theory that apes evolved into humans as a consequence of a virus which, when it didn’t kill them, physically altered the shape of their throats and skulls. This alteration, he says, is what allowed the first humans to speak. Burroughs then makes comparisons and connections to the biblical Garden of Eden and man’s original sin. For him, the ‘unit of word and image’ is as dangerous, and potentially fatal as Adam and Eve’s forbidden fruit. Returning to the contemporary era, Burroughs warns: “So now with the tape recorders of Watergate and the fall out from atomic testing, the virus stirs uneasy in all your white throats.”
The second part of the essay deals further with the idea of the human voice as a weapon, and the power of communications technologies to control man’s thoughts and actions. It focuses on the potential uses of tape recording technology, and especially on the effect that playback has on the human psyche. “Some of the power in the word is released by simple playback, as anyone can verify who will take the time to experiment.” Following his hypothesis that the word is a virus, then playback becomes a weapon of mass infection. He puts forward several examples of the volatile relationship between reality, recording and playback, including one about how to incite a riot in a crowd, using spliced up tapes of previous riots. Here he infers that there is a fundamental connection between human psychology and language technologies, and that the disease/control/breakdown of one is just as impactful on the other.
Following this, Burroughs’ mistrust of mass media is one of the major themes of this text. Referring to The Invisible Generation, an earlier piece of writing in which he uses his famous ‘cut-up’ method, he talks about the “potential of thousands of people with recorders, portable and stationary, messages passed along like signal drums, of the President’s speech up and down the balconies, in and out open windows, through walls…” His tone is conspiratorial and energetic as he continues to stress the political function of recorded messages. “You can cut the mutter line of mass media and put the altered mutter line out in the streets with a tape recorder.”
Moving on from cut-ups, Burroughs starts to talk about voice and video scramblings. He compares scrambles to viruses, demonstrates them through a series of wildly interjecting pieces of text, and contemplates their uses. Though his examples are often about creating fear and anxiety, he also wonders if these techniques could be used for good. “Is it possible to create a virus which will communicate calm and sweet reasonableness?”
The piece ends with a suggestion for resisting the potential dangers of the ‘word virus’. Burroughs wants to change the system at its root, proposing a new way of writing language, and therefore also the thinking and speaking of it: “A far-reaching biologic weapon can be forged from a new language… The aim of this project is to build up a language in which certain falsifications inherent in all existing western languages will be made incapable of formulation.”
What is its conclusion? (100)
William Burroughs’ basic theory that alphabets and languages contain a ‘virus’, leads him to devise a series of experiments, in an attempt to hack or even replicate the virus. In this way, his famous cut-up method is not just about formal or creative investigation, it is a means of subverting language, which he sees as an anonymous force of social control. The essay argues that we must not take the written word for granted – not its origins, nor its consequences. Ultimately, this piece is a call to action, to be more critical of mass media and how language can be used to influence and create events (real or imaginary).
What is your opinion? (100)
Burroughs’ erratic writing style is sometimes difficult to follow. But I find his unique point of view refreshing. Unlike Otto Neurath, for example, who sees language and the written word as a system of order and democracy, Burroughs’ stance is much more dystopian. He seems to be fascinated by the entropy of information, and obsessed by how to subvert and resist the ‘negentropy’ that Norbert Wiener posited. His distrust of language technologies is interesting. I would love to research how the current field of cybernetics deals with power and politics. What are the inherent biases in the English language? In other languages? In software? How can a book scanner reveal some of these inner workings, or subvert them? What other fields of 'language subversion' are there? Steganography?
Reading – Zalán
The Medium is the Message, by Marshall McLuhan
What is it saying (thesis)? (800)
In his book Understanding Media, published in 1964, Marshall McLuhan states that ‘the medium is the message’. What does this challenging and radical idea mean? How does it affect us in the age of mass communication and the internet? In the following synopsis I will elaborate on these questions and make a bridge to current social media behaviours on common platforms such as Facebook, Instagram and WhatsApp.
‘This is merely to say that the personal and social consequences of any medium–that is, of any extension of ourselves–result from the new scale that is introduced into our affairs by each extension of ourselves…’ (McLuhan, 1964, p.7)
New technologies and automation have both positive and negative effects on humankind. On the positive side, McLuhan (1964, p.7) notes ‘a depth of involvement in their work and human associations that our preceding mechanical technology had destroyed.’ On the other hand, with new technology ‘new patterns of human association tend to eliminate jobs’, the author argues (McLuhan, 1964, p.7). The medium has a more important impact on the fundamental shape of society than any message that is delivered through that medium.
McLuhan takes the electric light as his first example, stating that it is pure information: a medium without a message (1964, p.8). Activities such as brain surgery and night baseball become in some way the “content” of the electric light, since they would be impossible without it. Day and night activities were reconstructed by the electric light. One could argue that the electric light cannot be categorised as a communication medium because it has no “content”; this lasted until electric light started to be associated with brand names, which established it as a medium. Its message communicated electric power in industry: totally radical, pervasive and decentralised. The uses of electric light and power are separate, but both eliminate time and space factors in human association, creating involvement in depth.
If we take it that the content of writing is speech, it is important to add that the content of speech is in turn a nonverbal process of thought. McLuhan argues that this is ‘characteristic of all media’: ‘the “content” of any medium is always another medium.’ (1964, p.8)
Cities, work, leisure and transportation were totally reshaped by the introduction of the railway. Since the Industrial Revolution it has been a very important medium, modifying the way we commute in our everyday lives. The airplane and the car, in turn, caused the railway form of the city, of politics, and of association to dissolve, creating new mediums of mobility in urban and rural environments.
Taking the radio, the telephone and the television as further examples, it is essential to underline how those mediums transformed our division of time and completely changed our daily habits.
Looking at the field of cinematography, the author states that ‘mechanisation was never so vividly fragmented or sequential as in the birth of the movies, the moment that translated us beyond mechanism into the world of growth and organic interrelation’ (1964, p.12). The movie becomes a creative configuration and structure through the sheer speeding up of the mechanical world of sequences and connections. McLuhan then turns to cubism, quoting E. H. Gombrich (Art and Illusion), who calls it ‘the most radical attempt to stamp out ambiguity and to enforce one reading of the picture – that of a man-made construction, a colored canvas.’ Cubism drops the perspective illusion in favour of an interplay between planes, dimensions and textures that “drives home the message” by involvement. With the enormous development of technology, the understanding of art changed as well: observers were able to grasp the totality of a cubist artwork at once, which leads to the statement that the medium is the message. Before that, the message had been the “content”, as visitors used to ask what an artwork meant.
Moving from cubism to the nineteenth century, McLuhan presents Alexis de Tocqueville as a master of the grammar of print and typography. This gave him the ability to read off the message of coming change in France and America as if he were reading aloud from a text that had been handed to him, and to know when that grammar did not apply. De Tocqueville knew and admired England, and so was asked to write a book about it. His answer was: ‘One would have to have an unusual degree of philosophical folly to believe oneself able to judge England in six months. A year always seemed to me too short a time in which to appreciate the United States properly, and it is much easier to acquire clear and precise notions about the American Union than about Great Britain. In America all laws derive in a sense from the same line of thought. The whole of society, so to speak, is founded upon a single fact; everything springs from a simple principle. One could compare America to a forest pierced by a multitude of straight roads all converging on the same point. One has only to find the centre and everything is revealed at a glance. But in England the paths run criss-cross, and it is only by travelling down each one of them that one can build up a picture of the whole.’ De Tocqueville understood the contrast between print culture in America and England, and argued that the most important event in English history has never taken place: namely, the English Revolution.
What is its conclusion? (100)
Concluding the essay ‘The Medium is the Message’, McLuhan’s idea was far ahead of his time. His statement can be applied to everything from the electric light through cubism to the social media of our time. One of his important sentences is worth underlining: ‘Many people would be disposed to say that it was not the machine, but what one did with the machine, that was its meaning or message.’ (McLuhan, 1964, p.7)
He grasped the interconnected relations of the global village, an idea popularised in his books The Gutenberg Galaxy: The Making of Typographic Man (1962) and Understanding Media (1964). McLuhan described how organisms are organised within an ecosystem and how everything is connected to everything else.
What is your opinion? (100)
In my opinion his statement is applicable to analysing the internet and social media platforms such as Facebook, Instagram and WhatsApp. These have reshaped our personal habits and the rituals of social gatherings. Status updates, comments, followers, locations, likes, dislikes and emojis take on ever more importance in our lives. Hashtag quotes influence our communication methods on mediums such as Instagram and Facebook. Being offline has become almost impossible nowadays, although only a few years ago this would have seemed like a utopian dream from a cyber movie.
The internet becomes an extension of our reality: it finds out what we like and teleports us into a realm of hyperreality.
Reading – Alex
Uncreative Writing – Kenneth Goldsmith
In his book »Uncreative Writing: Managing Language in the Digital Age«, published in 2011, Kenneth Goldsmith, poet and author, presents the concept of uncreative writing in order to »update« writing and make it suitable for the 21st century. He is a great advocate of appropriation and teaches his ideas about writing at the University of Pennsylvania.
“The world is full of texts, more or less interesting; I do not wish to add any more.” Instead of adding more to the world, Goldsmith proposes to alter and handle the material that is already there. In his very first sentences you can already recognise his attitude towards plagiarism. Goldsmith presents new techniques and methodologies like appropriation, copy & paste and remixing to create more literature, always aiming to reinvent the writing of today and broaden the boundaries of language. What has often been ignored by authors, says Goldsmith, are the possibilities and circumstances of the digital world, which provides us with many tools for creating, editing and easily distributing text. “Most writing proceeds as if the internet had never happened.” Goldsmith also takes ways of writing »outside the scope of literary practice« into account, such as »word processing, databasing, recycling, appropriation, intentional plagiarism, identity ciphering, and intensive programming«. He thinks that, unlike art, modern literature has not put enough effort into implementing these changes in its own practice. This is the reason for his book: opening doors to new fields of literature.
The book begins with a few examples of uncreative writing that he examines over the following chapters. With the example of Jonathan Lethem, who wrote an essay on plagiarism assembled entirely from excerpts of other people's work, he points out the current negative view of appropriation. In contrast to his theory of appropriation, plagiarism is mostly not accepted: although such a work may represent a great piece of art, academia dismisses its creativity and therefore rejects it. But even the process of copying something that already exists can generate creativity — in fact it is impossible to suppress it.
In the first chapter of the book Goldsmith wants to alter our understanding of what language means. Behind every image there is code that consists of words. We can edit that cryptic language and save it as an image, and as a result we end up with a new picture. Everything that can be seen on the digital screen lies on top of a layer of language, a language we should use for writing practice. Goldsmith slowly reveals the text behind this interface. What can changes to the code mean? How does it affect writing? Furthermore, he describes materiality as crucial to the understanding of text. One also needs to take the whitespace, the position and the colour of a text into consideration for writing practice. This means drawing attention not to what is said, but to how things can be said, just as concrete poetry used to do.
Language is a highly unstable construct. Goldsmith explains this with a simple example: »a red circle«. Everyone has a different notion of what a red circle is; not only the colour but also the shape and the environment may be totally different in everybody's imagination. Things become even more complex when a computer is asked for a red circle, e.g. via a Google search. This again expands the scope of language by including the internet and its algorithms. The meaning of a text may change at any time and is not stable or fixed at all: it can be reinterpreted, translated or adapted. Later, in chapter six, Goldsmith returns to this point when talking about Sol LeWitt’s work. LeWitt wrote down his art as recipes and let others do the painting for him. When reading the instructions for how a picture should look, the flexibility in interpreting such texts becomes even more obvious.
In the following chapters Goldsmith explains why appropriation should be accepted and why it is necessary. He shows how it can be used as a writing tool, saying that a lot of creativity emerges from copying or retyping. This also leads him to the claim that it is not the originality or synthetic ability of a writer that should indicate the quality of a work, but rather the author’s choice and taste – how one reassembles existing pieces. He also connects this to pre-digital examples like Walter Benjamin's “The Arcades Project”. He continues with examples from the field of visual arts and what writing can learn from them, considering Duchamp and his ready-mades. Goldsmith labels that kind of art as work to be thought about, not to be seen – a statement that accompanies the whole book. This is part of his vision for writing, too: it should not just be read, it should make you think.
At the end of the book, Goldsmith turns towards computer-made literature, how computation can help handle massive datasets, and how humans have already adapted their reading habits to digital media (whereas writing practice has not). He also describes his teaching at the university and how he enables his students to become creative by forcing them to be as uncreative as possible. He wants his students to become “unoriginal geniuses”, and he motivates everyone else to become the same.
Overall, Goldsmith wants to encourage us to take a different approach towards writing. He wants to widen the notion of writing and the understanding of language, using techniques like appropriation and retyping. Ultimately the book wants to »update« writing and adapt it to the digital, post-modern world, always accompanied by Goldsmith’s radical ideas about appropriation.
I think this book is great for opening your eyes to what is possible with text, and it offers ideas for creating new writing in a digital environment. Goldsmith illustrates his arguments with great examples and references, especially to art. His theory and his great enthusiasm for the digital medium are refreshing. Furthermore, I like his ideas about appropriation: we rarely do anything original. Even when I compare it to the field of design, I think he is right when he says that everything is somehow a remix of what already exists. Nevertheless, his view often seems rather radical, ignoring the complexity of copyright and legal issues. Moreover, the book does not make clear how creativity is defined; at some points Goldsmith attaches the word »creative« to things one could also call a rational decision. In conclusion, in my opinion it is a great and profound read for expanding one's perspective on new techniques for writing and for giving a new impulse to creating text in a digital world.
Reading - Alice
Cybernetics and Ghosts, by Italo Calvino
Synopsis
Italo Calvino begins his essay from 1967 by describing the beginnings of language. Humans in prehistoric times started using speech to describe their daily activities, establish rules for the community, create relationships within and outside their clan. He imagines the archetype of 'the first storyteller' as the creator of complex language, experimenting with combinations of words, associating words with real objects and beings. From a very limited set of notions available to the prehistoric human, a form of language was created by repeating the same sounds and gestures and adapting them to various contexts. Stories began to emerge through various permutations between the limited activities and characters known to humans.
A more complex version of these proto-stories is represented by folk tales, which every culture has developed in a desire to explain the world and the phenomena that surround it. The Russian philologist Vladimir Propp proposed, in his work 'Morphology of the Folktale', the idea that all folk tales follow the same structure, claiming that 'all such tales were like variants of a single tale, and could be broken down into a number of narrative functions.' This does not mean that international folklore is not incredibly complex, but merely introduces the idea that it is the result of infinite permutations of a finite set of elements passed down through oral history. Claude Levi-Strauss took this idea one step further when working with the folk tales of Brazil, treating these elements through the lens of mathematical processes. The idea that literature can be broken down into functional segments has also been debated within linguistic and literary groups such as the Russian formalists, the semiological school of Roland Barthes, and Oulipo. This group of French writers and linguists looked at literature as a structure that can be deconstructed into pieces, analyzed in relation to the socio-cultural environment in which the work was created, but also reconstructed almost to infinity by rearranging the pieces.
Since literature can thus be approached mathematically, as a finite number of elements and their permutations, Calvino starts drawing a connection to computer 'brains', which could potentially do the work of producing literature and be successful at it.
He proposes the idea that writers do nothing more than follow the set of rules that they have already tested empirically, which is inevitably based on paths established by previous writers. Writing, in his view, is not a product of divine inspiration, or a talent that cannot be described in terms of logic. This idea dismantles the concept of the author as superior creature gifted with attributes unavailable to the average man. And since the process of writing is no longer idealized, he poses the question of having machines replace poets and authors. This hypothetical machine would be fed information, literature, and produce, in turn, literature. In his view, a perfect literary machine, after being forced to produce classical literature as output, would eventually reprogram itself to produce a much needed human disorder, expressed through avant-garde literature situated at the intersection between culture, language and probability.
In order to further strengthen his arguments, Calvino proposes an opposite view. Literature, instead of conforming to norms of language, is actually striving to free itself from them, in order to express what has previously been hidden, obstructed, concealed. Literature is an expression of the social or individual unconscious. In literary history, every literary current is born from the 'ghosts' of previous conceptions, revealing what was previously hidden, and discovering surprising new concepts in the process, far from our previous ideas of logic and rationality.
The topic of artistic creation as a set of permutations is then developed further, with the example of puns and word-play. Freud seems to have had a particular interest in this area, which only strengthens Calvino's association with the unconscious. The main idea is that word-play is simply a game of combining words in new, unexpected ways, until we reach a funny combination. In this sense, poetry or painting are also the result of countless permutations between words and rhymes, or between colors and brush strokes. Writing, as well, is a process of combining existing syntactical structures and notions, until a deeper meaning can be extracted from the right combination. This meaning could not have been revealed through traditional, rational practices, but is simply a meaning produced by the unconscious. Here, Calvino's two apparently opposing arguments come together to better prove his point. Writing is a game of combinations, a set of rules that is at the base of the work, but the combinations and rules, when arranged in the right order, reveal an unexpected meaning, extracted deep from the unconscious.
Thesis
The thesis is that a set of rules, rather than divine inspiration, is required in order to write. This concept brings a sense of relief to the author, who knows that writing does not rely on something as arbitrary and vague as talent or inspiration. Literature can thus be reduced to a basic structure, a concept he expresses by saying 'the endless variety of living forms can be reduced to the combination of certain finite quantities'.
Opinion
I appreciate his view on the creation of literature, which comes from a realistic approach. He demystifies the author as a superior figure and puts forward the idea that literature is simply the result of permutations between elements of structure, a sort of word-play from which an unexpected meaning can be extracted. Thus, literature is a combination of 'cybernetics', in the sense of mathematical permutation, and 'ghosts', which reveal parts of our collective unconscious. I believe this was quite a refreshing position on literature for his time, foreshadowing the predictive text bots that can imitate the style of any writer.
Video: Kurt Vonnegut, The Shapes of Stories
Reading – Angeliki
"Six selection by the Oulipo" in New Media Reader
Thesis of the text
The text is a selection of six writings by the Oulipo. Some of them are descriptions of, and others examples of, the methods and purpose of the Oulipo meetings. The Oulipian writers explored potential literature by analyzing and synthesizing constraints. Some of their techniques were connected to algorithmic methods and other systems related to phonetics or the alphabet. They aimed to discover new tools that would enhance the involvement of the reader in literary creation with the aid of the computer.
The first text, by Raymond Queneau, offers the reader the possibility of combining different lines to create multiple poems, in the form of sonnets. His method is based on the materiality of the text: the page is cut into strips, one per line, creating a kind of fan in which the reader chooses which line will be combined with which. The second piece by the same writer gives the storyline a numeric order: every number contains parts and alternatives of the story, and each time the reader chooses between two numbers that hide the next part of the story. These parts are interspersed, like footnotes, among the subsequent texts of the other Oulipians.
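To make the combinatorial principle concrete, here is a minimal sketch (my own illustration, not taken from the text): each of the 14 line positions of a sonnet can be filled from any of 10 source sonnets, so the number of possible poems is 10^14. The verse strings are hypothetical placeholders standing in for Queneau's actual lines.

```python
import random

# Illustration of Queneau-style recombination: 10 source sonnets supply one
# strip per line position, so a new sonnet is one choice per position,
# giving 10**14 ("a hundred thousand billion") possible poems.
NUM_SONNETS = 10   # source sonnets, i.e. strips available per line position
NUM_LINES = 14     # line positions in a sonnet

# placeholder verses, purely for illustration
lines = [[f"sonnet {s + 1}, line {l + 1}" for s in range(NUM_SONNETS)]
         for l in range(NUM_LINES)]

def compose_sonnet(rng=random):
    """Pick one strip for each of the 14 line positions."""
    return [rng.choice(variants) for variants in lines]

if __name__ == "__main__":
    print(f"possible sonnets: {NUM_SONNETS ** NUM_LINES:,}")
    print("\n".join(compose_sonnet()))
```

The point of the sketch is only the arithmetic of the fan: the reader, not the machine, performs exactly this choice when flipping the paper strips.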
Subsequently, Jean Lescure gives a brief history of the Oulipo. He describes how the group first met at a conference related to the College of Pataphysics, and how the initial goal was not to compose poems. The name of the group was quite an issue for them, as they wanted to inspire themselves and differ from past writers, to be experimental and open to potentials. The name that prevailed, Oulipo, stands for a workshop for potential literature. They claimed that the source of inspiration should be a series of constraints and structures, as opposed to subjectivity; literature could thus be explored like other sciences, such as mathematics. One of their first explorations concerned language and its original meaning, as both an abstract and a concrete object. As an example the writer refers to the language of the Chinook, in which the subject is an abstract word and not a specific noun. He then connects this with literature: in language, many potential interpretations and meanings can exist that were never even conceived by the writer, and the Oulipians wanted to find out these potentials. One of their main concerns was to provide tools and relevant examples for future writers, beyond their own affectivity, and this is why they divided their process into analytic and synthetic branches. Finally, obstacles could be treated as a tool for the creation of a new, combinatory literature.
In the next text, Claude Berge presents the potentials of a combinatory literature. According to him, the idea of combinations was already present in mathematics, the plastic arts and other fields, but not in literature, where it was introduced in 1961. To define this term further in writing, he talks about configurations and constraints, giving several examples of types of linguistic constraint: alphabetical, phonetic, syntactic, numerical and semantic. The transposition of concepts is the core of combinatory literature, and one example of it is factorial poetry, in which some elements of the text can be rearranged. To achieve these results they used equations and presented the sonnets as graphs. In order to involve the writer and the reader in a more interesting process, they also created graphs without cocircuits, in which the end of the story is determined in advance. Some of the poems ask for the reader's participation and were inspired by the instructions given to computers. Another form is the episodic story, in which related stories are embedded as parentheses on the basis of a mathematical theory. At the end he gives the example of the Latin bi-square, in which each character in the story acquires its behaviour from a table of numbers and letters that provides the choice of a vast number of combinations.
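As an illustration of the Latin bi-square idea, here is a small sketch of my own (not Berge's example): a Graeco-Latin square of odd order n pairs each "character" with a "behaviour" so that every row and every column contains each character and each behaviour exactly once, and no pairing ever repeats. The character and behaviour names are hypothetical.

```python
# Sketch of a "Latin bi-square" (Graeco-Latin square) of odd order n:
# cell (i, j) pairs a character with a behaviour; rows and columns each
# contain every character and every behaviour once, and no pair repeats.
def latin_bisquare(n, characters, behaviours):
    assert n % 2 == 1 and len(characters) == n == len(behaviours)
    return [[(characters[(i + j) % n], behaviours[(i + 2 * j) % n])
             for j in range(n)]
            for i in range(n)]

# hypothetical story elements, purely for illustration
grid = latin_bisquare(3, ["detective", "widow", "butler"],
                      ["lies", "flees", "confesses"])
for row in grid:
    print(row)
```

Read as a plot grid, each row could stand for a chapter in which every character appears once and every behaviour occurs once, which is the kind of exhaustive yet non-repeating distribution Berge describes.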
In the fifth text, Paul Fournel specifies further the involvement of the computer in literary creation. He divides the processes into Aided Reading and Aided Creation. The first refers to combinatory and algorithmic literature: in the former the computer executes and presents to the reader combinations of a finite set of tales, while the latter asks the reader, through a series of binary questions, what the next part of the story should be. For Aided Creation he defines three types of relation between the writer, the reader, the work and the computer. In the first type the writer decides on the story with the help of the machine. In the second, the reader uses the machine to solve enigmas and gather clues about a tale created in the first way. In the last type, the reader chooses with the machine which attributes the story should have, such as its length or theme, among stories already held in the computer.
In relation to these mechanistic processes, which produce multiple and virtually unlimited combinations of stories, Italo Calvino introduces 'anticombinatorics'. The idea is that the computer connects all the possible plot nodes, but under certain constraints. He talks specifically about the detective mystery genre, where a character can be followed by a number of possible actions, but the writer eliminates some of them by setting more realistic parameters: for example, a character cannot be killed twice, even if there are several acts of killing. He gives three types of constraint as examples of structure. Objective constraints ensure a logical order of sequences, in which one element does not contradict another, more important one. Subjective constraints provide further combinations and changes to the attributes of the characters. Aesthetic constraints make sure that the story follows a logically and psychologically acceptable flow. Finally, he claims that the aid of the computer does not replace the creativity of the author, but rather liberates the author from the shackles of the combinatory search.
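A minimal sketch of this constraint-filtering idea (my own illustration, not Calvino's program): enumerate candidate plot sequences and discard those that violate an objective constraint, here the hypothetical rule that no victim can be killed more than once.

```python
from itertools import product

# Sketch of the anticombinatoric filter: enumerate candidate plots,
# then discard those violating an objective constraint
# (hypothetical rule: no victim can be killed more than once).
characters = ["A", "B", "C"]
events = [(killer, victim) for killer in characters
          for victim in characters if killer != victim]

def violates_constraint(plot):
    victims = [victim for _, victim in plot]
    return len(victims) != len(set(victims))  # someone dies twice

all_plots = list(product(events, repeat=3))          # every 3-event sequence
admissible = [p for p in all_plots if not violates_constraint(p)]
print(len(all_plots), "candidate plots,", len(admissible), "admissible")
```

The computer's role here is exactly the one Calvino assigns it: not inventing the story, but quickly pruning the combinations the writer has already ruled out.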
Conclusion
The core intention of the Oulipians was to differ from their romantic, arbitrarily inspired ancestors. By appropriating methods from scientific fields and using them in literary creation, they aimed to furnish the coming generations with new, intelligent tools, as opposed to mere affectivity. The conclusion is that the 'clinamen', or the error in the system, is actually what makes an action a work of art. The computer only helps the creator to eliminate quickly the combinatory possibilities at the level of the desired outcome.
Opinion
My opinion is that the idea of constraints in the Oulipo movement also strictly defined and structured the character and complexity of the group itself. Like the computer of that time, the distance they kept from their contemporary political, physical, cultural and social sphere excluded other possibilities and potentials of the movement. The writers of these texts also often address a male audience through their choice of articles and pronouns. It is my belief that today the decentralized idea of connecting different layers of our perception of the world and of ourselves can provide more interesting and complex systems of creation, whether in literature or in other forms of art. Finally, the appropriation of methods and of previous works of art is a question that should arise in the creative act.
Douglas Huebler, "Variables", etc.
Notes Steve - Some context
"The early 1960s saw the publication of Eric Havelock’s Preface to Plato, Claude Levi-Strauss’ The Savage Mind, Marshall McLuhan’s The GutenbergGalaxy and Understanding Media, and Jack Goody and Ian Watt's The Consequences of Literacy (1963). For the most part media theory emerged in the 1960s, arising from debate around the subjects which were central to these books: orality and literacy. All were produced at a time when the waining of print-based media was apparent as the new medium of television was disrupting traditional media. It was also a time when the implications of a new computer-mediated discourse network, ushered in by cybernetics and information theory, was emerging. These publications described a trajectory that will take us to Walter Ong’s Orality and Literacy (1988) and Friedrich Kittler’s Discourse Networks (198*) and Gramophone, Film, Typewriter (19**).
Steve's notes
Orality & Literacy by Walter Ong
In Walter Ong’s reading (building on Havelock and others), literacy becomes a technology which renders the previous medial technology (the mnemonic system of orality) inconceivable and opaque to literate culture. This is because once language enters the field of vision in the form of written symbols (which allow the reader to visualise abstract concepts) it is impossible to un-think it or reverse its mediation. For Walter Ong: “Literacy consumes its own antecedents [it] consumes their memory.” (15). The literate’s relation to language is inextricably associated with its reproducibility in encoded form. Central to structuralists, post/structuralists and cybernetics alike, therefore, are various forms of software (the alphabet included) which store, mediate and structure memory. Oral cultures survived millennia without need for such a visual-symbolic translation of natural language and never arrived at the necessity of outsourcing their memory to external storage systems such as the scroll or codex (15).
Oral cultures generated patterns, clustering subjects together, cycling rhythmic phrases in mnemonic loops. These technologies fashioned the subjects of orality to an equal degree as their literate counterparts were fashioned by their own technology of memory. For the writers of the early 1960s the cybernetic explanation became the means by which to deal with the issue of orality. “All the [smart] things to be said about computers can be spelled out in Plato’s Phaedrus.” (in Ong?) If the software of the alphabet engendered a crisis in subjectivity, over the centuries the human subject managed to naturalise the code, sublating it and making it equivalent to natural language, engendering an illusion that writing was in some way a direct translation of nature. This ideology of the Romantics led to the veneration of the poet, who approached the status of God in his generative, originary powers (Ong).
Steve's Notes
The Consequences of Literacy (1963) by Jack Goody and Ian Watt
In The Consequences of Literacy, Jack Goody and Ian Watt note that “man as talking animal [is studied] primarily by the anthropologist, and man as talking and writing animal primarily by the sociologist.” [1] For Goody and Watt history proper begins with writing; “history” as a category of knowledge transmission is produced with and by the mnemonic technology of writing. At the beginning of Goody & Watt's text we have three elements which bear a relation to Burroughs’ argument [in The Electronic Revolution – see above] and which are worth highlighting:
a) writing as technology;
b) writing's relation to orality and
c) writing's relation to memory.
These three figures, given a perverse inflection, would also propel Burroughs' argument in The Electronic Revolution.
For Goody & Watt, natural human language allows forms of social organisation to be passed down through generations. Although this resonates with Korzybski’s notion of “accumulation”, Goody & Watt borrow Durkheim’s term “intellectual capital”. Because writing engenders a fundamentally different relation between symbol and referent, wholly different systems of communication and technologies of memory are produced, which in turn engender a different conception of time. In oral cultures “each word is ratified in a succession of concrete situations”. [2] The transmission of cultural tradition (cultural memory) in oral societies is regarded as “homeostatic”. [3] It is a way in which the past can be folded into the present to achieve social cohesion; it is in this way that “the tribal past is digested into the communal orientation of the present”. In literate cultures there is a different relation to the line of time, since “literate society cannot but enforce a more objective recognition of the distinction between what was and what is.” [4]
Reading – Julia Kul
Preface to Plato, by Eric Havelock – introducing the growth of the early Greek mind
[this needs editing, but it raises some interesting issues for us]
Let’s assume that history is like putting cultural information into collective storage. In Greece before Homer, the cultural "book" had been stored entirely in oral memory. The ear was the chief organ of communication, and at that time it shaped a dispersed collective memory (c. 700 B.C.). With the development of signs (written language), communication started to move slowly toward the dominance of the eye. It was a REVOLUTION that completely changed the entire linguistic system: the vocabulary and syntax of words and sentences.
The first philosophers were living and speaking in a period which was still adjusting to the condition of a possible future LITERACY. The meaning of particular terms wasn’t defined historically and wasn’t fixed to any previous semiotic links or areas. Philosophical terminology in particular presented itself as rather unstable and blurry; it tended to shift according to the context in which it occurred, and with each new generation of orators and thinkers it underwent reinterpretation and redefinition. Philosophical investigation wasn't rooted in the language, and consequently its outcome (the philosophical thesis) lacked a crucial element: exactness. The lack of shared terminology seemed to be "analytically creative", yet the fact that it wasn’t historically coined made early philosophers strongly oriented toward metaphysical problems, formulating abstract, largely intuitive solutions by reference to their own semiotic systems.
The alphabet was not an addition to the abilities of the tongue but the factor which started the process of remodelling the pre-Platonic oral system into an abstract vocabulary.
The Pre-Socratics themselves were essentially oral prophets, linked to the repeatable past and yet the first organisers of a new syntax of the future. They were devoted to the task of inventing a language by establishing the precise syntactic categories necessary for making abstract statements. It was simply the process of building up a grid of coordinates for future thinkers and assigning particular coordinates (phenomena) to unchangeable values. This assignation became fundamental to any kind of speculative activity. The terminology that Plato and Aristotle sought to define had to pass through a long period of development before reaching such precision. Therefore Aristotle adopted the terminology of many other philosophers before him; these previous thinkers can claim an authority no greater than his.
Problems with the intellectual authorship of particular linguistic forms caused a general distortion of oral theories.
PLATO ON POETRY
The role of poetry in early Greece was fundamental and powerful: it provided moral and spiritual guidelines and intellectual training for citizens. It functioned as a divine book, an encyclopaedia containing all possible information, and consequently it presented itself as a powerful didactic instrument. Plato rebelled against it since, in his understanding, poetry produced a false version of experience, twice removed from reality: a knowledge produced by a fictitious narration which is afterwards re-enacted on stage. Poetry’s impact on the public was oral and indirect: authors never spoke themselves; the knowledge was transmitted by actors reciting the poem. Poetry produces a "double mimesis" by using the empirical realm for purposes of imitation, theatrical impersonation and over-dramatisation of everyday communication in a fashion of extreme realism. Poetry is therefore dangerous both for morality and for science: it is a “crippling of the mind”, a psychic poison confusing our intelligence, a prostitute that seduces reasoning, the enemy of TRUTH. It confuses man’s values and renders him characterless. It creates a potential threat to decent Greek citizens, who should be professional in their social duties (guardians, workers/food providers). It should be removed from the educational system.
The school system was based on a similar oral method, opposed to the technology of writing, until the end of the 5th century B.C.
Literacy wasn’t widespread (the Iliad wasn't always available in bookstores), so people could only listen to a performance.
The Greek oral state of mind was, for Plato, the main enemy. Why? Because poetry was the only means of controlling the verbal, person-to-person transmission of information.
It was a rhythmic, metric pattern that discreetly invaded the soul of citizens.
POETRY WAS A PRESERVED COMMUNICATION: a living memory, a linguistic statement, a paradigm telling people what they are and how they should behave.
Plato’s Republic shouldn’t be read as a utopian and strictly political programme. It is mostly an educational proposal.
Once the Republic is viewed as an attack on the existing educational apparatus of Greece, the logic of its organisation becomes clear. In that respect, it is not a utopian proposal.
Plato claimed that instead of an over-dramatisation of reality, the educational system should provide a clear description of reality. According to Plato, that is precisely what philosophy does.
Who, therefore, should rule people’s hearts and minds? Philosophers or poets?
It was a power struggle between philosophers and poets. Poetry stood in the way of Plato's propagation of Platonism.