User:Alessia/special issue xxiii


Notes on Leigh Star


Misplaced Concretism and Concrete Situations: Feminism, Method, and Information Technology (S. Leigh Star)

misplaced concretism: a concept from John Dewey
"A Manifesto for Cyborgs" (Haraway)

grounded theory = interpretive paradigm

The marginal person who doesn't fit a social/racial category falls as an outsider into a so-called "residual category". Residual categories are omnipresent in all working classification systems (none of the above, not otherwise specified, other...) and they show how, within the descriptive nature of reality, something always escapes formal description (Gödel, Wittgenstein, Bateson, Dewey...).
The existence of residual categories adds a new point of view, where an individual or group classed as "other" becomes a new category, a new lived residual category.
As the world expands it gets more difficult to define universal ideas about representation or information, while acknowledging how blurred the boundaries are becoming.
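To make the residual category concrete, here is a toy sketch in Python (the categories are hypothetical, my own example, not from Star's text): any classification that has to accept everything can only stay complete by carrying an "other" bucket, and whatever lands there is defined only by what it is not.

```python
# A toy classification scheme: like every working classification
# system, it stays total only by carrying a residual category.
CATEGORIES = {"employee", "student", "visitor"}

def classify(role: str) -> str:
    """Return a recognised category, or the residual one."""
    return role if role in CATEGORIES else "other / not otherwise specified"

for role in ["student", "artist", "employee", "flâneur"]:
    print(role, "->", classify(role))

# "artist" and "flâneur" become formally invisible: the residual
# category records only that they escaped the formal description.
```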

All things inhabit someone's residual category in some category system

"just don’t kill us"

Among other things, we took the misplaced concretism of sex and re-situated it within the concrete experience of gender and relationships.

Our "sons of lesbian mothers" was only one of dozens of contradictions and complexities that we have articulated and survived: the deconstruction of gender, the centering of gender/sexual ambiguity and multiplicity, the fight for erasure of gender differences under some circumstances, the interlocking nature of race, class, and gender oppression, and the honoring of historical and cultural traditions of masculinity and femininity in various ethnic cultures, all at once.
method vs methodology

Method not as position, system, or artifact: its nature is complex, it is a way to survive experience.

Any transmission of information involves encoding and decoding. Information becomes valuable when it exists in multiple contexts. To make sense, different contexts must be connected through some form of comparison. Information is only information when there are multiple interpretations.
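A two-line illustration of this in Python (my own example, not from the text): the same bytes only become information once a decoding context is chosen, and different contexts yield different, equally "legal" readings.

```python
# One byte sequence, two decoding contexts, two readings.
payload = "è".encode("utf-8")     # the two bytes 0xC3 0xA8

print(payload.decode("utf-8"))    # è   (the intended interpretation)
print(payload.decode("latin-1"))  # Ã¨  (a perfectly valid misreading)
```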

Computer-supported cooperative work (CSCW) is a research field devoted to understanding cooperative work practices in order to improve the development of collaborative computing, that is, computing technologies that mediate people's interdependent activities.

We lack good relational language here. There is a permanent tension between the formal and the empirical, the local/situated and attempts to represent information across localities. It is this tension itself which is underexplored and undertheorized; it is not just a set of interesting metaphysical observations, but also a pragmatic unit of analysis. How can something be simultaneously concrete and abstract? The same and yet different? We are not used to thinking in this fashion in science, although it is more common in art and literature, especially in surrealist art and Bakhtinian aspects of the novel—and in feminism.

The medium of an information system is not just wires and plugs, bits and bytes, but also conventions of representation, information both formal and empirical. A system becomes a system in design and use, not the one without the other. The medium is the message, certainly, and it is also the case that the medium is a political creation.

This reminded me of a nice conversation I had with Thijs, where we were analysing Wes Anderson's obsession with a medium-driven/focused style.
When we talk about people and things existing in different situations, and when information systems try to share information across these situations, we need a way to represent everything involved. This includes people, objects, how we show them, and details about how the system is set up. One solution has always been standardization (interfaces, formats...), but standards are not monoliths: they can change.
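A minimal sketch of this in Python, with hypothetical field names and values: a shared record format is what lets two local systems exchange information, revising the set of legal values is exactly how a standard changes, and anything outside it falls back into a residual category.

```python
from dataclasses import dataclass

# A hypothetical shared record format: the "standard" two local
# systems agree on in order to exchange information about a person.
@dataclass
class PersonRecord:
    name: str
    gender: str

# The standard fixes which values are legal. Standards are not
# monoliths: this set can be revised, e.g. by adding "X" later.
ALLOWED_GENDERS = {"F", "M", "X"}

def validate(record: PersonRecord) -> bool:
    """A receiving system checks a record against the shared standard."""
    return record.gender in ALLOWED_GENDERS

print(validate(PersonRecord("Alessia", "F")))   # True: fits the standard
print(validate(PersonRecord("chopchop", "?")))  # False: escapes the format
```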

ecological relations: Social ecology is the study of how individuals interact with and respond to the environment around them, and how these interactions affect society and the environment as a whole.

Lave and Wenger (1992) have called such contexts "communities of practice," a term which I like because it emphasizes the ways in which people work together and act together to form communities, not just traditional organizational forms and boundaries.
An "object" here is stuff, thing, tool and technique, story and memory: parts that are treated as things by community members and used in the service of an action. "Naturalization" is what happens when an object becomes a seamless part of the community. Naturalized objects lose the sense of being unique or strange; they become so integrated that community members forget their local and specific meanings and no longer question their existence (electricity, for example, we take for granted). This way the object sinks into the community's infrastructure.

In a community of practice, people come together based on shared interests and activities. The key element is the shared use of certain things or tools, as all activities involve using some objects. When someone new joins, their connection to the community is mainly about how they engage with these shared objects, not necessarily about direct interactions with people. Being accepted or considered legitimate in the community comes from how familiar the newcomer becomes with the actions involving these shared objects.

Membership can be described individually as the experience of encountering objects, and increasingly being in a naturalized relationship with them.



Humanisation of technology + biases (draft)


Let’s try to make the inhuman more human

Humanisation, within the context of technology, is a really interesting concept tied to tech humanism, a philosophical and ethical approach to technology's role in our human-centred society.
"Humanising" is a word that strikes directly at that insecurity: the fear of losing our humanity in the face of rapid technological advancement. So we must do something, something extremely powerful, to ensure that humanity will be saved, and the best thing is to try to change our own perception of technology.
There's a desire to make technology more relatable and understandable, as if giving it a human touch would help us maintain control over it.

Women are objects

Do you want to argue it?
Why do most virtual assistants have female voices and names?

There's a biological aspect to it: humans seem to prefer the soothing and calming tones of female voices. We're familiar with them from the womb, with our mother's voice often being the first that we perceive; that's the first human connection there.

Female voices are often considered easier to hear and distinguish due to their higher pitch. That's a myth, but a myth that still influences our perception of the matter.
Historically, female voices have been prevalent in technological applications: from Emma Nutt, the famous telephone operator whose voice became the standard in the late 1800s, to "Sexy Sally", a voice tape-recorded by singer Joan Elms and used in aeroplane cockpits during World War II, where it was believed to be more attention-grabbing to young men.
In the 1980s, Nissan introduced a voice warning system for their cars. This system used a female voice to alert drivers about various issues, such as lights being left on or the left door being open. Nissan named this system the "Talking Lady."
Over time, text-to-speech systems have mainly been trained on female voices due to the availability of extensive data. Why even care about collecting male voice recordings if you have so many ready-made female voice recordings? Siri, Alexa, Cortana, Sophia... I feel there might be gender bias in the house of virtual assistants.
Let's take Alexa: the developers had to activate a "disengage mode" after the AI was constantly sexually harassed and kept responding to the offences in the flirtiest way ever. Movements had to spring up asking to reprogram the assistants so they could push back against sexual harassment. So now Alexa will not respond flirtatiously after being insulted; she will just say "I don't know how to answer that". Surely my favourite one is Cortana, who after being sexually harassed will search for the "Pussy song" video directly.
This brings to the surface the problem with programmed passivity: when we apply human voices to assistants, people (I don't wanna say men, I don't wanna say men), men but not just men, will get the feeling they have a right to abuse AI; then it will be robots, then human-like robots, and then it will be someone's daughter, mother, sister (that already happens, the circle closes!). I can already imagine the time when a woman will be beaten and the abuser will just say "oh, I thought it was a robot". It feels like a future that is approaching fast, and it is too much to bear.

measures of perceived humanness

Femininity is injected into our very perception of caring, hosting, serving.
Femininity as warmth, as emotions, nice caresses, cute politeness: what great benevolent sexism.
So women are considered more human than men.
That’s nothing new.

When feminine traits are used to make objects seem more human, maybe we should ask ourselves whether, by portraying women as objects or tools designed solely to fulfil others' needs, technology is reinforcing the harmful notion that women are mere objects rather than individuals with their own agency. This could contribute to further objectification and dehumanisation of women in real life. At the same time, making AI objects seem more human makes them more acceptable to the public.

Is chopchop a woman??

It was very interesting to talk all together about this cultural tendency to directly associate assistive technology with femininity. So if serving is a female characteristic, then I guess chopchop could be female? If we are getting to the humanisation of chopchop, then we MUST find a way to give it a gender. OR NOT?
I guess not, and the discussion in class seemed to be focused on the idea that a server should be nonbinary. 

Mind you, as long as it is a virtual assistant this can even work, but if it is a human being, a woman let's say, you can expect experience and speed from her, but not expertise; it will never be enough. And this remains so deeply embedded in our brains that it is still complex to eradicate the germ of this toxic influence, a concept that unconsciously still permeates our lives.
This conversation raises several questions, including linguistic ones, that animate the fervent global discussion, such as new forms of grammatical gender neutrality and other forms of linguistically inclusive expression.
In Italian, a server is male, the artificial assistant female, a rock male, the moon female; "meme" is quite problematic, because I say it as feminine and people bully me because it should be masculine, they say. This is to ask: are these linguistic issues grounded in gender prejudices as well? How deeply should we dig to get to the roots here? It feels like a long story, but it should bother everyone: even if English is the leading language, at least here at this moment, everyone probably has different main languages in their own brain, and it is interesting to know more about all the linguistic differences we may encounter while analysing issues that may also be rooted dramatically in cultural differences.

How to build an ethical AI that doesn’t reinforce stereotypes?

AI and voice assistants, like other technologies, are increasingly embedded in our daily lives. However, there are currently no guidelines focused on how to humanise AI the right way, despite its growing importance in shaping people's everyday interactions with technology.
It's not just about women, it's about any community. Take the Midjourney case: if you ask it to generate a terrorist, it will create a Middle Eastern terrorist, who knows why. AIs are still taking gender and racial disparities to the extremes.
This is exactly what is happening with most of the companies that have been developing voice assistants: they still rely on female voices and/or female names, which may reinforce the general gender cliché that women are here to serve others. If a human can be biased, an AI could be even worse.
There are some experiments going on, after some nice media pressure. Google tested the default voice of its Google Assistant with both male and female voices included. Female voices still performed better, as the algorithm was better trained on them. Google decided not to really engage much in the creation of a male voice assistant, as it seemed much more difficult to gather data for it, and users still prefer the female one so much. In 2017 the damn tech giant worked on WaveNet, an algorithm that helped develop more natural female and male voices to add to its assistants. Let's acknowledge at least that Google Assistant now comes programmed with 11 different voices, even with different accents, which will surely help make the product as inclusive as possible and pave the way for a future of more inclusive virtual assistants.
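The voice, in other words, is now a parameter a developer actively chooses. A minimal sketch using Google's Cloud Text-to-Speech Python client (see the voice-types link in the references; "en-GB-Wavenet-B" is one of the male WaveNet voices listed there, and the spoken text is just an example):

```python
# Requires: pip install google-cloud-texttospeech, plus GCP credentials.
from google.cloud import texttospeech

client = texttospeech.TextToSpeechClient()

response = client.synthesize_speech(
    input=texttospeech.SynthesisInput(text="I don't know how to answer that."),
    voice=texttospeech.VoiceSelectionParams(
        language_code="en-GB",
        name="en-GB-Wavenet-B",  # explicitly pick a voice, here a male one,
    ),                           # instead of accepting the platform default
    audio_config=texttospeech.AudioConfig(
        audio_encoding=texttospeech.AudioEncoding.MP3
    ),
)

with open("answer.mp3", "wb") as out:
    out.write(response.audio_content)
```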
Still, I feel it's not a product problem we are talking about now; a much-naturalised human society issue has simply moved into being a new technology problem. A problem that feels ancient and new at the same time.
We should talk more about how gender is portrayed in AI, and about creating new industry standards.
Some research is already going on in which the sounds of "female," "male," "neutral," and "nonbinary" human voices are being analysed; research that is not free of new biases and prejudices.
It is crucial to examine who shapes the algorithms and guidance of artificial intelligence, as these technologies are often a product of the biases and opinions of their creators. It is important to address misrepresentation as well, as the field of artificial intelligence lacks diversity, especially in terms of the involvement of different communities.

references

https://review42.com/resources/voice-search-stats/
https://time.com/4011936/emma-nutt/
https://www.warhistoryonline.com/war-articles/sexy-sally-aircraft-voice-based-warning-systems-history.html
https://cloud.google.com/text-to-speech/docs/voice-types
https://www.bloomberg.com/graphics/2023-generative-ai-bias/