SI16


First approaches to vernacular and language processing

Special Issue #16

Texts to discuss *

My translation of the lyrics of a Greek partisan song and the lyrics of a queer take on the same song
Experiential narrations of 5 occupants of the Peiraiki-Patraiki factory in 1990*
Experiential map of Panayotis' narratives*

Prototyping

First steps into learning Python

  • Exploratory Programming for the Arts and Humanities: notebooks for chapters 4, 5, 6, 7, 8 and 15
  • Natural Language Processing with Python: notebooks for chapters 1 and 2

Reading, Writing and Research Methodologies

Annotating the introduction of Queer Phenomenology

with Carmen, Chae and Miriam in Steve's Class

Discussing the text with Chae through an experiential approach, having just arrived in Rotterdam and trying to orient ourselves in a very different context.
Trying to read the text through our experiences.

queer phenomenology annotation pad

Glossary of Interconnected Keywords

with Chae, Kimberley and Jian

Collective experiment on giving our own definitions to keywords that are interconnected with each other.

glossary of interconnected keywords

Approaching the vernacular through the theme of rejection

https://pad.xpub.nl/p/Rejection_Glossary

Collaboration, Conflict & Consent workshop

https://pzwiki.wdka.nl/mediadesign/Mitsa_selfportrait

The forced poetics and the making of Special Issue 16

really inspired by Clara's lecture on vernacular design and forced poetics

publishing is a validation

conversation with Carmen and Erica

conversation and diagrams with Erica about forced poetics and empowerment

Prototyping with the replace function

First prototype with Erica

    # text is a string holding the source text; wrap selected words in emoji
    new_text = text.replace('reason',' 👮reason👮').replace('Reason','👮Reason👮').replace('High','👨‍⚖️High👨‍⚖️').replace('normal','🔫normal🔫').replace('Objective','(⊙▃⊙)Objective(⊙▃⊙)').replace('objective','(⊙▃⊙)objective').replace('planet','🇺🇸planet🇺🇸').replace('foreign','🪖foreign🪖')
    print(new_text)

Applying the replace function to Bataille's Solar Anus

pad of text analysis
https://pad.xpub.nl/p/solar_anus
notebook with experiment

    # anus is a string containing the text of Bataille's 'The Solar Anus'
    new = anus.replace('sun','moon').replace('SUN','MOON').replace('solar','lunar').replace('phalluses','rabbits').replace('shafts','strapons').replace('phalloid','dilidoid')

The idea for a thematic library of words replaced by emojis

screenshot

SI #16 Manifesto


https://hub.xpub.nl/soupboat/si16/intro/

Dear friend and online scroller,
Beloved internet user,
Dearest binge watcher and human being IRL,

XPUB1 (The Experimental Publishing Master from Piet Zwart Institute) welcomes you to the Special Issue 16 on vernacular language processing: "Learning How to Walk while Catwalking."
Huh? How do you learn how to walk while catwalking?

Be confident, be ambitious and be ready to fail a lot. Our Special Issue is a toolkit to mess around with language: from its standard taxonomies and tags, to its modes of organizing information and of shaping knowledge. With these tools we want to legitimize failures and amatorial practices by proposing a more vernacular understanding of language.

We decided to release the Special Issue 16 toolkit in the form of an API (Application Programming Interface). APIs often organise and serve data and knowledge. What is not always evident is that they facilitate the exchange of information between different software programs and systems according to mainly commercial standards and purposes. We chose instead to build a process that responds to the topics we are working with. Our API is an attempt at a more critical and vernacular approach to such a model of distribution.

You didn't get a thing yet? Don't worry! We are also on our way and that's the whole point of this experimental enquiry. We will be happy to guide you through the API and the different functions included in it, and to share our technical struggles and findings.

This project is characterized by the elaboration of vernacular methods of processing. The material we process comes from various sources. For some of it, we appropriated existing texts and compiled them into corpora. For others, the activation of certain functions calls for an audience’s input. The participatory aspect of the functions is an important factor that unites them.

Since we are working with filters, we realized how every cultural object rejects and filters its public. We want to question these limitations by focusing on accessibility and proposing several entry points to our project.

The API? There’s this very good meme that, I think, explains it in a rather good way. Imagine a bar with different staff in it: the cooks working in the kitchen would be the ‘backend’, the ones behind the bar the ‘frontend’, andddd the waiters running from the bar to the tables are the API!

Something that feels informal, approachable, "ours", and not an imposed standardized form. Organic, with the Spanish opening times, etc. Approachable.

wtf I don't really understand but I like it

In other words, a toolkit for processing language with a vernacular attitude. This toolkit does not only consist of a set of tools but also of a world we are building around them: how do we want these tools to affect reality? The toolkit can be expanded, as new tools can be added to it and the world around them stretched. There is a strong focus on the way we are working on it: a decentralized approach that builds from the ground up. Ambitious! Political! Unstable! ...but at some point embracing this instability, trying to learn and care for each other, while learning Python and caring for the API.

I think of it as a personification of something that's intended to be functional, in that we assign the API a particular behavior so that it does the unexpected.

Opening times?

We are confident, we are ambitious and we are failing a lot while Learning How To Walk While Catwalking. We want to legitimize failures and amateur practices outside the hierarchy of experience. We want to take care of each other in the process of learning, now between us, and then with you. We approach the text as a texture, a malleable clay tablet, a space for foreign input and extensive modifications, for cut-up and for collage, for collective agency and participation. Not a surface but a volume, in which the text is not only text, but a shared space. We work to sort out several meanings from the same word. We intend to blur our roles as authors, users and public because this is an act of collective world building.

I was invited to get onto an online platform for something called the Special Issue 16(?) The front, or first page, has an index with some descriptions. Overall it had to do with texts, and ways of modifying them, I think.

Landing on the Special Issue 16 page, reading the 'about' page. Finding out about several projects, triggered by the different showcases. As I gain interest in one of the examples, I click on a link and read about the intention from which this tool departed.

So well, I am in front of the screen, I click on the first link and get a description and a sort of instruction of how to use their tools. Fine, I'll use the tool. It seems like I'm not the only one who has been invited here, the layout is unfamiliar, but I see how I could partake in it. And if I do, well, the next person will also have something else to deal with, I'm into that. What could I write? I write. Oh, wasn't that hard.

Unapologetic. Fearless. Eager. Playful. Brave. Persistent. Experimental. Bvvvrruummm

I am curious to know even more, I click on the shared folder link and am redirected to a library of tools. Finding out I can use this tool for my own purposes. I start scrolling through the multiple tools offered there. From the tool I was initially interested in, I drift to another snippet that calls my attention. From there, I click on the link offering a 'showcase view'. I get acquainted with the example. Zooming out, I land on the 'about' page to which that project belongs. More tools are offered but I am not interested in more tools. I zoom out more, and I find the 'introduction', which informs me about the general purpose of this page.

OMG THIS IS. HERE?

wow these people are so meta

can you buy me a coke?

I clicked on the link to access this website: oh what happens nice! nice colours, I have icons to click on but I clicked on one and I've seen many things I don't understand so it's better to know what this is about first because I don't understand so I click on the about page. Ok let's move back to the homepage. I can choose between projects and functions (again, what's this????) ok maybe by looking at the projects I will understand better...

Embracing the chaos that comes with the learning curve.

I click on a link and am brought to a page and keep clicking, can't stop clicking, super curious what's happening here, not sure what exactly yet, but it doesn't matter. I see the about page but I'd rather not read it because I like surprises. Looks like they're making a lot of experimental tools I've never heard of before but always wanted to try.

What is this all about? Shall we open the window for some fresh air?

Yes that was a bit conceptual but basically, our project is meant to give a bunch of users several tools such as : ✂️ scissors, 📃 sticky notes, ✏️ pencils, erasers, and printed paper. ✂️🖊📝✏️📃 And let them have fun. Cutting it and putting it together, making notes and writing jokes… But everything in a digital format. Q&A link link link Special Issue 16—Learning How to Walk while Catwalking

...and I wish that your question has been answered

with Carmen, Erica and Miriam

...And I wish that your question has been answered interface

Research

Text

This is an act of persistent resistance. We created three functions to facilitate an iterative process of refusal towards PM Kyriakos Mitsotakis' and PM Mark Rutte's answers during a press conference and any of their possible versions. We invite you to play as much as you want with these functions and create your own answers as a counter-reaction to Mark Rutte's final sentence: "So this is my answer and I wish that your question has been answered". Every new answer, every new iteration, can be submitted to our Archive of Repetitive Answers. Although they will never be good enough, nor shall they be accepted as exhaustive, we consider the modified answers as a trigger for a never-ending dialogue.

Our tool is a filter to process and alter texts. By targeting specific words and replacing them, either with another word, with specific characters or with blank spaces, the reader or user of the tool can change the text in many ways. The tool includes three functions. The function “respell” receives a text (string type) as input and substitutes all occurrences of a target word with a replacement chosen by the user. The function “stitch” is very similar to the previous one but replaces all occurrences of a target word with a single character (it can also be a blank space) that is repeated as many times as the length of the target. The third function, “reveal”, works in a similar way but deletes all of the input text except the target word(s), replacing the deleted text with blank spaces.

Functions

Respell

    Respell receives a text (string type) as input and substitutes all occurrences of a target word with a replacement (also a string) chosen by the user.
    from nltk.tokenize import word_tokenize
    # text, target, and replacement are string types
    def respell(text, target, replacement):
        target = target.lower()
        txt = word_tokenize(text)
        new = []
        for w in txt:
            if w == target:
                # lowercase occurrence
                w = replacement
                new = new + [w]
            elif w == target[0:1].upper() + target[1:]:
                # Capitalized occurrence
                w = replacement[0:1].upper() + replacement[1:]
                new = new + [w]
            elif w == target.upper():
                # UPPERCASE occurrence
                w = replacement.upper()
                new = new + [w]
            else:
                new = new + [w]
        text = ' '.join(new)
        # re-attach the punctuation that word_tokenize split off
        final = text.replace(' .','.').replace(' ,',',').replace(' :',':').replace(' ;',';').replace('< ','<').replace(' >','>').replace(' / ','/').replace('& ','&')
        return final
    This function in itself could be understood as a filter to process and alter texts. By targeting specific words and replacing them, either with another word, with specific characters or with blank spaces, the user of the tool can intervene inside a text. One could break down the meaning of a text or create new narrative meanings by exposing its structure, taking out or highlighting specific and meaningful words and detaching the text from its original context. This tool offers a broad spectrum of possible uses, from a very political and subversive one to a more playful and poetic one.
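
A minimal usage sketch of respell (the sample sentence and replacement word below are invented for illustration and are not part of the toolkit):

    # hypothetical example: respell "answer" as "refusal" in a sample sentence
    sample = "So this is my answer and I wish that your question has been answered."
    print(respell(sample, "answer", "refusal"))
    # prints: So this is my refusal and I wish that your question has been answered.

Because respell works on word_tokenize tokens, only the standalone word "answer" is changed; "answered" stays untouched.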


Stitch

    Stitch receives a text (string type) as input and replaces all occurrences of a target word with a character or a word that is repeated as many times as the length of the target.
    from nltk.tokenize import word_tokenize
    # text, target, and replacement are string types
    def stitch(text, target, replacement):
        target = target.lower()
        txt = word_tokenize(text)
        new = []
        for w in txt:
            if w == target:
                # lowercase occurrence
                w = len(w)*replacement
                new = new + [w]
            elif w == target[0].upper() + target[1:]:
                # Capitalized occurrence
                w = len(w)*replacement
                new = new + [w]
            elif w == target.upper():
                # UPPERCASE occurrence
                w = len(w)*replacement
                new = new + [w]
            else:
                new = new + [w]
        text = ' '.join(new)
        # re-attach the punctuation that word_tokenize split off
        final = text.replace(' .','.').replace(' ,',',').replace(' :',':').replace(' ;',';').replace('< ','<').replace(' >','>').replace(' / ','/').replace('& ','&')
        return final
    This function in itself could be understood as a filter to process and alter texts. By targeting specific words and stitching over them with a character or a word that is repeated as many times as the length of the target, the user of the tool can intervene inside a text. One could break down the meaning of a text or create new narrative meanings by exposing its structure, taking out or highlighting specific and meaningful words and detaching the text from its original context. This tool offers a broad spectrum of possible uses, from a very political and subversive one to a more playful and poetic one.
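
A minimal usage sketch of stitch (the sample sentence and the '*' character are invented for illustration):

    # hypothetical example: stitch over every occurrence of "answer" with '*'
    sample = "So this is my answer."
    print(stitch(sample, "answer", "*"))
    # prints: So this is my ******.

Because the replacement is repeated len(w) times, the stitched patch has the same length as the word it covers, so the rhythm of the sentence is preserved.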

Reveal

    Reveal takes a text (string type) and a list of words as input, and blanks out everything in the text except the words in that list.
    from nltk.tokenize import word_tokenize
    # text is a string; group is a list of the words to keep
    def reveal(text, group):
        txt = word_tokenize(text)
        txt_linebr = []
        for token in txt:
            # turn '<br/>' tags (tokenized as '<', 'br/', '>') into line breaks
            if token == '<':
                continue
            elif token == 'br/':
                token = '\n'
                txt_linebr.append(token)
            elif token == '>':
                continue
            else:
                txt_linebr.append(token)
        new = []
        for w in txt_linebr:
            if w == '\n':
                # keep line breaks as they are
                new = new + [w]
            elif w not in group:
                # blank out the word, preserving its length
                w = len(w) * ' '
                new = new + [w]
            elif w in group:
                new = new + [w]
        text = ' '.join(new)
        # re-attach the punctuation that word_tokenize split off
        final = text.replace(' .','.').replace(' ,',',').replace(' :',':').replace(' ;',';').replace('< ','<').replace(' >','>').replace(' / ','/').replace('& ','&')
        return final
    This function in itself could be understood as a filter to process and alter texts. By choosing to keep specific words of a text and deleting all the others, the user of the tool can intervene inside a text. One could break down the meaning of a text or create new narrative meanings by exposing its structure, taking out or highlighting specific and meaningful words and detaching the text from its original context. This tool offers a broad spectrum of possible uses, from a very political and subversive one to a more playful and poetic one.
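
A minimal usage sketch of reveal (the sample sentence and word list are invented for illustration):

    # hypothetical example: keep only "question" and "answer", blank out the rest
    sample = "So this is my answer and I wish that your question has been answered."
    print(reveal(sample, ["question", "answer"]))
    # every other word is replaced by blank spaces of the same length

Because each hidden word becomes a run of spaces of the same length, the revealed words keep their original position in the line.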