User:Manetta/self-initiated-shift/v2


shifts

shift to a new set of tools

[image: With-power-comes-great-responsibility.png]

Having been educated to use the Adobe software master package in the environment of Mac OS X, my former work has been produced and transmitted through the channels of InDesign, Photoshop, After Effects, Dreamweaver and, from time to time, a little bit of Illustrator. Next to these five giants, my toolset was supplemented with a basic set of HTML+CSS lines, and with FontLab, a piece of software for making typefaces and (for example) 'dynamic' ligatures.

In the last six months, this set of tools has largely been replaced by a new set of open-source tools (Scribus, GIMP, ImageMagick, Kdenlive) and supplemented with the programming language Python, while working in the new environment of Debian (a distribution of the open-source operating system Linux), which involves using the terminal. Working within a very visual environment has shifted towards working within text-based interfaces. The routines and experience I built up in the Mac environment haven't disappeared (obviously); they are still present in the routes I take when searching for the options and tools I relied on before. Still, the new environment offers me a very different way of approaching design, based on thinking in structures rather than in visual compositions and their connotations.


There are different arguments for this shift of operating system. The main reason was a rather practical one, related to a shift in working method. Linux was developed in the early 1990s by Linus Torvalds and comes out of a community of coders. The system is therefore built to be friendly to code-related working methods, but it also asks for a bit more involvement when installing new programs or components on your operating system. Features like the 'split screen' also make it very comfortable to work back and forth between a text editor and a browser.

Secondly, there is a certain set of cultural principles present in the open-source community that I feel attracted to: the importance of distribution instead of centralization, the aim of making information (also called knowledge) available to everyone (in the sense that it should not only be available but also legible), and the openness of the software packages, which makes it possible to dig into the files a piece of software uses to function.

Using Linux is a way of not being fully dependent on the mega-corporations. Although my private email address is still stored on servers owned by Google, my Facebook account is still there, and I follow a set of people on Twitter, my professional toolset no longer depends on the decisions that Apple and Adobe will make in the future. That is a relief.

shift in design perspective

[image: Uta Eisenreich-a-not-b.jpg]
from the book 'A not B', by Uta Eisenreich


In the last six months, I have been looking at different tools that contain linguistic systems, as I am interested in looking more closely at how computers and algorithms understand human speech.

From speech-to-text software to text-mining tools, they all systemize language in various ways in order to understand natural language, as human language is called in this field. These tools fall under the term 'Natural Language Processing' (NLP), a field of computer science that is closely related to Artificial Intelligence (AI).

Examples of such linguistic tools are the following:

  • alternative counting systems to represent color (Hexadecimal - a numeral system with a base of 16: 0,1,2,3,4,5,6,7,8,9,A,B,C,D,E,F; see the sketch after this list)
  • transcribing the human voice into written language (CMU Sphinx - speech-to-text software)
  • mimicking the human voice (text-to-speech software)
  • interpreting digital written documents (Tesseract - a piece of character-recognition software)
  • using semantic structures as a linking system (the idea of the Semantic Web - a metadata method of linking information)
  • using subtitle files as an editing resource (Videogrep - a video editor based on different types of search queries)
  • recognizing patterns in written documents (Pattern - a text-mining tool)
  • representing language structures in a dataset (WordNet - a lexical database)
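
To make two of these systemizations a bit more concrete, here is a minimal sketch in Python (the language I have been picking up this year). It assumes that NLTK is installed and that its WordNet corpus has been downloaded with nltk.download('wordnet'); the color value is just an arbitrary example.

 from nltk.corpus import wordnet as wn   # assumes NLTK + the WordNet corpus
 
 # hexadecimal: the same color expressed in base 10 and base 16
 r, g, b = 255, 102, 0
 print('#{:02X}{:02X}{:02X}'.format(r, g, b))   # prints '#FF6600'
 print(int('FF', 16))                           # prints 255
 
 # WordNet: a word receives its meaning from its place in the database
 for synset in wn.synsets('tree')[:3]:
     print(synset.name(), '-', synset.definition())
     print('  broader terms:', [h.name() for h in synset.hypernyms()])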

As the tools I looked at were mainly open source, I was able to dig into the software files and observe the formats that are used to systemize a language. When looking into the speech-to-text tool Sphinx, for example, I explored which dictionary format is used and how these dictionaries are constructed.
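
The dictionaries Sphinx relies on follow the format of the CMU Pronouncing Dictionary: one word per line, followed by its pronunciation as a sequence of phonemes. As a small sketch of how such a file could be read in Python (the file name and the example word are placeholders for whichever dictionary ships with the model at hand):

 # read a CMU-dict style pronunciation dictionary into a Python dictionary;
 # 'cmudict.dict' is a placeholder file name
 pronunciations = {}
 with open('cmudict.dict') as dictionary_file:
     for line in dictionary_file:
         line = line.strip()
         if not line or line.startswith(';;;'):   # skip comment lines
             continue
         word, *phonemes = line.split()
         pronunciations[word] = phonemes
 
 print(pronunciations.get('HELLO'))   # e.g. ['HH', 'AH0', 'L', 'OW1']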

This has been quite a formal way of approaching them, in the sense that it was an act of observing the different language systemizations.


In a certain sense, human-to-hardware interface gestures like clicking and swiping could also be considered language systemizations. Together with three classmates, I have been considering the ideologies that relate to these two gestures and how they could be visualized. This resulted in a browser-based game called 'fingerkrieg', in which the player is challenged to click or swipe as many times as possible within 15 seconds. Where clicking could be considered a binary gesture, swiping is a rather new gesture, designed to naturalize the interaction a user has with a device. In that sense these two interfaces contain certain language systemizations, as they limit or offer certain possibilities that are fixed within the designed system.


It was by a kind of coincidence that I stumbled upon a device called Amazon Echo, an artificial-intelligence machine that one could put in one's living room. It uses speech-to-text recognition to receive voice commands, linked data to connect the requested information sources, and text-to-speech to reply to the user's requests. Echo seemed to be a device wherein different language systemizations come together.
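
Amazon does not publish Echo's internals, but the three stages can be sketched schematically; every function below is a hypothetical stand-in, only meant to show how the language systemizations are chained together:

 # a schematic sketch of the pipeline described above;
 # every function is a hypothetical stand-in, not an actual Echo component
 def speech_to_text(recording):
     # a real system would run a speech recognizer (such as Sphinx) here
     return "what time is it"
 
 def query_linked_data(command):
     # a real system would query linked or otherwise structured data here
     return "it is three o'clock"
 
 def text_to_speech(text):
     # a real system would synthesize an audio reply here
     return "[spoken] " + text
 
 def answer(recording):
     command = speech_to_text(recording)        # 1. interpret the voice command
     information = query_linked_data(command)   # 2. look up the requested information
     return text_to_speech(information)         # 3. reply with a synthetic voice
 
 print(answer("(a recording of the user's voice)"))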

By approaching Echo as an interface, a survey into AI started. I looked at five iconic science-fiction films in which an AI machine is a main character. My selection of 'popular' AI interfaces consisted of HAL (from 2001: A Space Odyssey, 1968), GERTY (from Moon, 2009), Samantha (from Her, 2013), Ash (from Be Right Back, Black Mirror, 2013) and David (from Steven Spielberg's A.I. Artificial Intelligence, 2001). Stating the years of the films seems important, as it shows in which context they were made and at what moment people were already imagining AI machines in film. HAL was made in 1968, one year before the first man landed on the moon.

An AI interface mimics a human being, but how does that influence the way you communicate with it? And how does the user actually adapt their own system to the machine's system? How is this relation between computer and user shaped by the fact that the interface is a human simulation?

When communicating with a machine through an interface that simulates a human being, it seems that invisibility is a sought-after ideal. The machine is being naturalized, in the sense that it comes closer while becoming less visible; so close that it is easy to forget about it altogether. That doesn't mean the machine is less present; rather the opposite. The AIs try to become part of the human's natural surroundings, through speech, 'realtime' information delivery, and an increasing amount of knowledge about one's contacts and behavior. They become part of the user's systems.


[motivation to work on systemized languages]
The presence of machines nowadays, and the difficulties that arise when positioning them in a human-centered world.

The naturalization of machines, and the invisibility of their interfaces.

Linguistic systems are built to let the computer understand human language, which is in itself an act of looking for correlations in data. Hence they will always be uncertain and inaccurate.

The fact that semantic meaning is constructed through these language structures and systems (examples: n-grams, valued dictionary files, JSGF files, machine-learning algorithms, and most recently deep learning). Signification isn't derived from the objects themselves, but rather from the system they are put in.
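
As a small illustration of the first of these examples: in Python, bigrams (n-grams with n = 2) can be taken from a sentence like this, so that meaning is located in co-occurrences within the system rather than in the single words themselves:

 # a minimal n-gram example: the same word ('the') ends up in different
 # bigrams depending on its position in the sentence
 def ngrams(words, n=2):
     return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]
 
 sentence = "the machine understands the human".split()
 print(ngrams(sentence))
 # [('the', 'machine'), ('machine', 'understands'),
 #  ('understands', 'the'), ('the', 'human')]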


[subjects related to systemized languages]

  • universal language
  • classification & taxonomies/ontologies
  • problem of a one-sided, fixed truth that represents 'common sense'


[relation to design background]
Working from the perspective of my graphic-design background and combining it with my interest in systemized languages could be a step that brings me closer to developing a more specific research topic.

Graphic design is a multi-layered methodology for communication, considering 'order', 'legibility', 'composition' and 'connotation', among many other elements that could be part of this list.


(some notes on design from 2009: http://manettaberends.nl/ThoughtsAboutDesign.html)