User:Joca/synopses
Drucker, J. (2011). Humanities approaches to interface theory. Culture Machine, 12.
In this paper, Johanna Drucker formulates a number of principles for a theory of interface from the perspective of the humanities. Her motivation is the move of academic authoring and reading environments from print to digital media, which creates the need to reflect on and formulate the theory of interface behind the interfaces academics use daily.
Traditionally, interface has been the domain of engineering, and it developed into a field of its own: Human-Computer Interaction. This heritage is embodied in the task-oriented, efficiency-driven approach to interface design. Drucker finds this mechanistic approach problematic because it treats people as users of an interface rather than as subjects whose engagement with a digital environment can be analyzed with insights from critical studies.
Building on that, she looks for a definition of the interface that fits a humanities approach. She cites several authors: Chartier (2004) and his concept of embodiment; Nunberg (1993), who applied embodiment to the context of digital interfaces and related it to how an interface mediates cognitive and intellectual activities. Drucker then turns to authors from the field of HCI, notably Laurel (1990), for whom the interface is the necessary contact between interactors and tasks that allows certain functions to be executed. She concludes with Long (1989), a sociologist who defined an interface as 'a critical point of interaction between life worlds.'
Having shown these definitions from different fields, she states that, diverse as they are, they give much reason to jettison the idea of the interface as a thing. Drucker instead sees an interface as the combination of what we read and how we read, brought together by engagement. It is thus a provocation to cognitive experience.
By seeing the interface as a collection of protocols and activities, mediated by graphical cues, content and the viewer, Drucker looks for connections to practices from media studies. One example she draws on is graphical reading practices: McCloud (1993) looks into the ways we connect different frames in a comic. The category of connection that relates to a digital interface is the non-sequitur: we shift from text to advertisement to video without a visible connection. But unlike in comics, there is no existing narrative to organize these tasks of correlation. As an alternative to narrative, Drucker brings up Frame Analysis by Goffman (1974), a schematic outline of basic principles for translating information into cognitive value. It analyzes the combination of information, viewer and graphical interface in terms of relations rather than entities. As there is an interaction between the environment and the viewer, which differs at each reading of a text, there are n-dimensional ways to process the interfaced information.
Drucker concludes that her sketch of an interface theory from a humanities perspective involves a synthesis of multiple approaches: graphical reading, frame analysis, and constructivist approaches to the subject. She states that these are fundamental to understanding the dynamics of the codependent relations between (digital) environments and cognitive events.
Alongside the mechanistic view of the interface, Drucker argues, a different perspective should be accepted: one in which ambiguity and uncertainty have a place, accommodating a reading experience that is probabilistic.
Drucker, J. (2013). Performative materiality and theoretical approaches to interface. Digital Humanities Quarterly, 7(1). http://www.digitalhumanities.org/dhq/vol/7/1/000143/000143.html
In this article, Drucker describes a theory of performative materiality and its application to interface design. Using mainstream principles of critical theory, the author wants to extend the analysis of the interface, suggesting that it is not just about understanding how an interface is structured, but about what it does within different domains. After showing how she constructed the theory, she analyzes a number of websites to show possible design implications.
Drucker starts by describing how materiality has been discussed in the context of the digital, distinguishing different models of materiality.
Models of materiality
Literal materiality
In the 1990s, e-humanities was a new field and its discourse was based on binaries: old versus new media, and, as a byproduct, the idea that digital objects exist only as an abstract binary flux. Matt Kirschenbaum argued that digital formats do have a certain materiality, which Drucker describes in her literature review:
Formal materiality
This perspective looks at the codes and structures of human expression that are embedded in a certain artifact. Here, an interface is a set of instructions for reading, listening, experiencing, and so on.
Forensic materiality
Building upon descriptive and analytic bibliography, and on methods used in investigations, the forensic view sees materiality and how it is experienced as part of a wide web of cultural assumptions, events, and practices. This perspective analyzes the position of materiality within that web, for example by tracking the resources used in production.
Distributed materiality
To exist, digital artifacts depend on other objects. Distributed materiality looks into the complex interdependencies between the digital and objects such as servers and networks. It is an extension of forensic materiality; the big difference lies in seeing materiality as a network of things instead of a single node within a network.
Performative materiality
This model emphasizes the interpretative event in which a work is produced. It sees materiality as a set of characteristics that work as cues and triggers provoking an individual experience.
Connecting the performative to critical theory
Drucker relates the performative concept to different concepts in critical theory that examine the acts of reading and cognition.
Non-representational approaches
This framework holds that symbolic codes, visual or graphical, cannot represent a preexisting entity. They are entities in their own right that perform a certain transfer of meaning.
Theories of enunciation
Systems of enunciation, drawn from linguistics, describe the spoken and speaking subject of language. These theories distinguish types of discourse by looking at markers in the language used: who speaks, and who is spoken?
Interface theory
An approach to analyzing interfaces from the position of visual semantics, using methods from fields like behavioral cognition, ergonomics, psychology, and visual and interaction design.
Systems ecology and new materialisms
The notion that things are part of bigger systems, and that traditional approaches to characterizing matter are limited in expressing that. Systems ecology assumes that all apparent things are events.
Media archaeology
This field studies inscriptional technologies based on the frame of the 'real'. It analyzes the computational, mathematical and process-driven aspects of digital media, acknowledging the dimension of time in these activities and their outcomes.
Winograd, T., & Flores, F. (2008). Understanding computers and cognition: A new foundation for design. Boston: Addison-Wesley.
In their book, Winograd and Flores want to introduce a different way of understanding computers and human cognition. They start by describing the practice of seeing computers as artificial brains on steroids. Although interesting as a starting point for utopian visions and as a driver of technical progress, this vision remains abstract about what these machines should actually do in the context of human practice.
To answer that question, the authors argue that one should look at language. At the time of writing, computers were mainly in use as word processors. But on a more abstract level, too, computers are tools for communication and action in the domain of language. Besides that, the authors state that practice shapes language, and language in turn shapes the space for action.
Rationalistic tradition
Winograd and Flores describe the current practice in software and hardware development as rationalistic: an approach that emphasizes particular styles of rational thought, like the notions of information and decision making. Though logical at the micro level, this approach, they argue, offers a narrow view of computing at the macro level.
Referring to the work of Heidegger and Maturana, they identify a number of elements of rationalistic thinking and their implications for the design of formal systems.
One of the problematic elements is how cognition is defined in rationalism: it is seen as a separate activity that consists of thinking and making decisions. In this model, knowledge is a collection of representations of reality, and thinking is the process of manipulating these representations.
Both Heidegger and Maturana question this framework, though from different backgrounds: hermeneutics and biology. What they have in common is the notion that knowledge and cognition cannot be separated from the other activities that make up a human being. In that sense, knowledge lies in being situated in the world, not in collecting representations of that world.
Building on that idea, the authors point out that experience is grounded in pre-understanding of a situation, based on one's background. Knowledge is based on an interpretation that depends on this background.
Artificial intelligence is an attempt to construct human cognition in a formal model, but the background of this knowledge exists only to the extent that it is embodied in the software. To what extent is it possible to include the ever-continuing development and articulation of this preexisting experience?
Winograd and Flores point out that it might be possible to include all the knowledge required for specific tasks in a model, but that such a system breaks down when applied to more complex domains, like chatbots, because it fails to live up to our unspoken expectations of such a system.
One of the causes is how rationalistic thinking understands language: as the transmission of information. The authors support a different view, seeing language as a form of human social action. Language does not work as a representation of the world, but is a space for interlinked patterns of activity: humans do not just describe, they can commit to things being said and listened to. Computers can manipulate and structure information, but they cannot commit to language.
Following that model, computers are tools for human action. This makes the concept of an expert system problematic, because a system cannot be an expert that acts on its experience. It is rather a highly advanced method of communication among experts.
Breakdowns
An important concept for the authors is the breakdown of a system. One of the standard ways of approaching design is to talk about problems and how to solve them. A problem, the authors argue, depends on the background of the human being experiencing it.
Following Heidegger, the authors prefer to use breakdowns instead of problems to describe these situations. By breakdown, they mean the moment when our comfortable being in the world is interrupted. Such moments reveal the nature of our practices and tools. Based on that idea, a design is an interpretation of a breakdown, made to anticipate future breakdowns.
The authors conclude that rationalistic thinking sees knowledge as an individual possession, whereas it should be understood as a social activity. By creating computers as systems that treat cognition rationally, collecting facts and representations, the field of computer technology is limiting itself.
The question, then, is how to design computers using this new discourse on language and cognition.
A direction for design
As an answer to that question, Winograd and Flores propose an ontological approach to design. It approaches the creation of new artifacts not just by asking what can be built, but sees the design process as an intervention in our background. Such a process can be reflective, political and philosophical.
They continue by concretizing this vision in a number of design guidelines. One of those is a specification of what is meant by 'user-friendly' design. The popular vision is that computers will become easier to use when they become more like people. The authors support that statement only partly, pointing out that what makes a system understandable is transparent interaction.
For an interface, this means that the design disappears during operation: the user is driving, not commanding. Interface elements that support this are function keys, menus, and pointing devices. Within this vision, bad design is what forces users to deal with complexities that lie outside their domain of operation.
In the process of designing, breakdowns are unavoidable. A failure in a system is actually a positive phenomenon because it shows the limits of a specific tool or practice. It is a way of identifying the domain of operation of a system.
Related to that, it is important to acknowledge the implications of blindness in design: the choice of certain menu labels, for example, eliminates other possibilities.
In the end, the authors conclude that language is shaped by the world we live in, but that world is also influenced by the language we use. As computers are devices of linguistic operation, the same mechanism works here.
This poses a challenge. The rationalistic interpretation of computing can reinforce that same view on how we function as human beings. On the other hand, looking differently at computer systems can open up a better understanding of our surroundings and ourselves.
Beard, D. (2018). Why paying attention to the homepage will pay off. Poynter. Retrieved October 3, 2018, from https://www.poynter.org/news/why-paying-attention-homepage-will-pay
David Beard of Poynter, a knowledge center for journalism, interviews S. Mitra Kalita, vice president of programming at CNN Digital. She shares her vision of the 'why' of CNN, and how that is reflected in the design of its digital platforms.
Over the past ten years, news media have shifted their focus towards publishing on external social platforms like Facebook and Twitter. There is now a countermovement, caused by phenomena like fake news, toxic comments on social media, and a dependency on feed algorithms beyond the media's own control.
Or, as Kalita formulates it: "The homepage is not dead." Although articles are distributed on other platforms, and people might not need to visit the homepage of a news medium, it pays off to focus on the homepage: it is an interface over which you have full control, and one used by the most loyal readers.
Identifying the intended audience for an article, and how to reach it, was already part of CNN's strategy. Kalita took a step back: before making a news item, the question is why CNN is doing something, and for whom.
Combined with exercises and prototyping sessions, this led to a new way of thinking at CNN. The mobile homepage is designed around the daily routine of readers, from urgent news in the morning to deep reads for the commute home. Newsletters are a way to keep people up to date on specific niches.
Kalita's aim with this strategy is to preserve a loyal core audience on CNN's own website and app, while offering hooks to connect with new users who might find CNN's news via other platforms.
Parayil, G. (2005). Political economy and information capitalism in India. Introduction on information capitalism.
Information capitalism is the economic logic of post-industrialism. It distances itself from an economy based on social labour, capital, manufacturing and bulk production, and rather presents a new form of economic dynamics based on the increasing significance of knowledge and its relationship with capital and labour.
Where industrial capitalism was mostly beneficial to Western Europe, North America and Japan (and, of late, China), other countries made less progress; for the latter, Parayil refers mostly to India.
The rise of ICT and knowledge capitalism opened up new opportunities for the Indian economy, for example by making it the place to outsource software projects to, thanks to its large, affordable and highly educated workforce. Parayil then draws a connection between this euphoria and cyber-libertarian thinking. The dynamics of the knowledge economy value self-help, micro-enterprises and empowerment, a step away from the welfare state and social democracy that are part of societal thinking in an industrial capitalist society.
A problematic aspect of the more individualistic worldview of cyber-libertarianism, according to the author, is the phenomenon of the digital divide. In the case of India, the population of the big cities benefits most from the knowledge economy, while the rural areas lag behind.
However, Parayil argues that the solution is not just providing the technology to the people who do not have it yet. The question, according to him, is rather what kind of economy we want to live in. Does the export of software services benefit the country, and to what extent is it different from manufacturing-based industrial capitalism?
In the essays in the book, Parayil wants to investigate these questions. Using India as a case study, the goal is to find answers that could apply to other developing countries aiming to grow their economies through information technology, and, in the end, to bring the immense benefits of ICT to all Indians, especially the rural poor.