User:Annasandri/ref synopsis
Revision as of 15:37, 14 October 2020
→ ►█ "Feels good man" [1]
Movie, 2020
→ ✒ "Meme magic is real, you guys"[2]
Article, 2016
→ ✒ "This Is Not a Game - Conspiracy theorizing as alternate-reality game"[3]
Article, 2020
♫ "Rabbit hole"
Julia Longoria, Sindhu Gnanasambandan, Larissa Anderson, Wendy Dorr, Brad Fisher, Dan Powell, Kevin Roose, and Andy Mills, The New York Times, 2020 [https://www.nytimes.com/2020/04/16/podcasts/rabbit-hole-internet-youtube-virus.html]
"We shape our tools and afterwards our tools shape us" Marshall McLuhan
"Rabbit Hole" tells the story of the people who were shaping and being shaped by the internet.
"Which of my tastes, thoughts, and habits are really mine, and which were put there by an algorithm?"
Kevin Roose calls this sensation "machine drift": a phenomenon that occurs when we have the feeling that our decisions, tastes and habits are being driven by the platforms we use and by the algorithms that make them work.
Guillaume Chaslot
Guillaume Chaslot is a French programmer: he holds a Ph.D. in artificial intelligence and studied machine learning. In 2010 he is hired by Google, where he works on YouTube's AI: a project concerning the recommendations sidebar. At first he is excited by the idea that his work is going to affect so many people.
At the beginning, the YouTube algorithm relied on clicks: the more people clicked on a video, the better the platform assumed it was. Then the team realised there was too much clickbait: people would click on a title, realise the video was not about that at all, and immediately leave the platform. So they switched direction and started working on a system that would maximise watch time instead, a decision that produced viewing numbers no one had ever seen before.
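The shift described above, from optimising clicks to optimising watch time, can be sketched in a few lines of Python. Everything here (titles, probabilities, field names) is invented for illustration; this is not YouTube's actual code or data.

```python
# Hypothetical sketch of the two ranking objectives described above.
# All titles and numbers are made up to show how the metric changes the winner.

videos = [
    {"title": "SHOCKING clickbait",    "p_click": 0.30, "exp_watch_min": 0.2},
    {"title": "Honest documentary",    "p_click": 0.05, "exp_watch_min": 4.0},
    {"title": "Cute cats compilation", "p_click": 0.12, "exp_watch_min": 1.5},
]

def rank_by_clicks(vs):
    """Early objective: surface whatever gets clicked the most."""
    return sorted(vs, key=lambda v: v["p_click"], reverse=True)

def rank_by_watch_time(vs):
    """Later objective: surface whatever keeps people watching longest."""
    return sorted(vs, key=lambda v: v["p_click"] * v["exp_watch_min"], reverse=True)

print(rank_by_clicks(videos)[0]["title"])      # → SHOCKING clickbait
print(rank_by_watch_time(videos)[0]["title"])  # → Honest documentary
```

The clickbait video wins under the first objective but loses under the second, since people abandon it almost immediately after clicking.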
Guillaume does not question the results of his work much until he realises that maximising watch time is creating filter bubbles. Once you select a type of content, the YouTube engine gives you more of the same: a system that seems harmless when it concerns cute cat videos, but that gets much shadier with political content. The engine pushes people to see only one side of reality, creating a sort of algorithmic echo chamber in which different realities are constructed.
Guillaume eventually voices his concerns within the team and focuses on side projects. He creates an algorithm with the opposite effect: it tries to get people out of the filter bubble.
The prototype is never tested with real users, and Guillaume's bosses try to move him away from these side projects until they fire him for poor performance. He leaves Silicon Valley and returns to France.
Google Brain
Google Brain is an award-winning team of AI developers that in 2015 starts working on a new implementation of the YouTube recommendation algorithm. They settle on a new strategy: for the platform to keep growing, it needs a way to recommend new content, new subjects and new topics to its users. Google Brain works with a technique called a deep neural network [https://wiki.pathmind.com/neural-network], which is supposed to mimic the way the human brain processes information. To work, the system needs to be fed a lot of data - in this case data about YouTube users - which trains the AI to find patterns and connections that no human could ever find.
This means the AI is able to draw people into new subjects and get them interested in new things, dramatically increasing the quantity of recommended material.
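A minimal sketch of the general idea behind such recommenders (not Google Brain's actual model): users and videos are represented as vectors, and the system suggests unseen videos whose vectors lie close to the user's, which is how it can pull viewers toward adjacent topics they never searched for. The titles and 2-D vectors below are entirely made up; real systems learn hundreds of dimensions from massive user data.

```python
import math

# Toy "taste" vectors standing in for learned embeddings (all values invented).
video_vecs = {
    "jogging tips":       (1.0, 0.1),
    "marathon training":  (0.9, 0.3),
    "ultra-running docs": (0.8, 0.6),
    "extreme fasting":    (0.6, 0.9),  # adjacent topic the user never searched for
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recommend(user_vec, seen, k=2):
    """Return the k unseen videos closest to the user's taste vector."""
    candidates = [(cosine(user_vec, v), title)
                  for title, v in video_vecs.items() if title not in seen]
    return [title for _, title in sorted(candidates, reverse=True)[:k]]

# A user who only ever watched jogging videos gets nudged toward nearby topics.
print(recommend((0.95, 0.2), seen={"jogging tips"}))
# → ['marathon training', 'ultra-running docs']
```

The recommendations drift outward from what the user chose: each suggested video sits a little further from the starting point, which is the dynamic the episode describes.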
Susan Wojcicki
Susan Wojcicki is YouTube's C.E.O.
"I mean, it’s interesting that you say [Caleb's story]that because, I guess I want to say, going back from my early initial days, which is that we couldn’t get people interested in news or get people interested in politics. We had no indication that this was something that people were interested in. People were interested in gaming and music, entertainment. They came to laugh. They came to escape, in many ways. [...] And so I tried to figure out, how do you reconcile that with a company that is a more entertainment-based company, right? So what does that mean to have quality if the main thing that you’re doing is gaming videos, cat videos, and music videos? Then what does that really mean for quality?"
In 2016, after the terrorist attack in Nice, YouTube decides to push and prioritize content about the attack for its French users - until the team finds out that users don't want to see it.
"And so what do you do as a platform? Do you show it to them anyway? And so that’s actually — I remember that very clearly, because that was the first time I said to them, You know, it doesn’t matter. We have a responsibility. Something happened in the world, and it’s important for our users to know."
But they decide to show it anyway, marking the first time that YouTube takes this idea of responsibility and prioritizes it over watch time.
"You know, one of the biggest changes that we made from our recommendation system, and it was probably one of the later changes we made, and I think it’s because it’s a harder one to grapple, which is that we realized that there was a set of content that, even if people were repeatedly engaging with it, we thought it was important to make the recommendations more diversified and to show them more quality content along side. "[..]
"We began to identify content that we called borderline content. And if users were repeatedly engaging with this borderline content, we will show them other content alongside that we’ve ranked as more high quality content."
But then a question arises: how do you define borderline content? And are these still interventions happening within an automated system?
"Yeah. I mean, it is a complicated and nuanced area. I mean we have a whole process, an algorithm, to find it."
For its entire existence, YouTube has been defined as an alternative media space, and the people who were there were like insurgents and upstarts, these kinds of rebels and misfits. And now YouTube is basically saying that, when it comes to certain things, it is going to let the establishment win - which is tricky, because the establishment isn't always right.
PewDiePie
PewDiePie is a Swedish YouTuber who started out making Let's Play videos. He uses a sort of "humor between male friends": a funny, over-the-top approach to Let's Play videos, often with a catchy dirty tone.
In a relatively short span of time he manages to build the most subscribed channel in YouTube's history, gathering his followers under the name "Bro's army".
In 2016 he starts to diversify his content, including reactions to trending YouTube topics and criticism of YouTube's behaviour and of changes to the platform, such as its attempts to promote content from more diverse creators.
"YouTube wants my channel gone. They want someone else on top.[...]I’m white. Can I make that comment? But I do think that’s a problem. [...]PewDiePie says YouTube is killing his channel because he’s white."
The media start to report on his behaviour, and he accuses them of building this conversation to gain attention from his name.
"This clickbait journalism is just the purest form of cancer."
"PewDiePie racist, question mark. What the [EXPLETIVE]?"
A few reporters from the Wall Street Journal start to pay attention and run an investigation, finding nine videos containing anti-semitic jokes and Nazi imagery.
"I think what this article shows more than anything, old-school media does not like internet personalities because they’re scared of us.[...]It was an attack towards me. It was an attack by the media to try and discredit me, to try and decrease my influence and my economic worth. That’s what this was.
I’m still here. I’m still making videos. Nice try, Wall Street Journal. Try again,"
He starts to make fun of political correctness and the mainstream media. He has become the symbol of this culture war.
In 2018 another YouTube channel, T-Series, starts to approach PewDiePie in subscriber numbers. T-Series is an Indian media conglomerate that is growing fast as more and more people in India get online. So PewDiePie creates videos attacking the channel that wants to dethrone him.
"(RAPPING) You’re trying to dethrone me from spot on number one? But you’re India you lose, —"
His fans start a subscription campaign to preserve PewDiePie's sovereignty on YouTube. The campaign goes viral and spills out of cyberspace: "Bitch lasagna", PewDiePie's offensive hit, eventually plays at a rally in Russia and in many other places.
PewDiePie's channel doubles its subscriptions.
A mass shooting happens at a mosque in Christchurch, New Zealand: 49 people die, and the shooter, who is livestreaming the events on social media, says he wants people to rip into each other and create chaos - and to subscribe to PewDiePie.
The shooter had been following conspiracy theories about an attempt to replace the white race, as well as internet figures like PewDiePie.
Some months later, PewDiePie makes an announcement:
"Something happened that I don’t think anyone would have predicted. The Christchurch shooter said, “subscribe to PewDiePie.” [..] I think it’s time to end the “subscribe to PewDiePie” movement or meme. [..]
To have my name associated with something so unspeakably vile has affected me in more ways than I let show. I just didn’t want to address it right away, and I didn’t want to give the terrorist any more attention. I didn’t want to make it about me because I don’t think it has anything to do with me."
Q-anon
“Where we go one, we go all.”
Q-anon supporters, in their constant, collaborative search for clues, are not only using the internet the way it was originally envisioned; they are also using it in exactly the way that the powerful algorithms currently running our biggest social media platforms are designed to encourage.
Facebook
After the 2016 elections, Facebook decided to modify its feed algorithm so that users would see less recommended and clickbait content and more posts from actual friends, family and groups.
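The change can be pictured as a re-weighting of the same candidate posts by their source. The weights, scores and names below are invented for illustration; Facebook's real ranking system is vastly more complex.

```python
# Hypothetical sketch: the same posts scored under two sets of source weights,
# before and after a change that down-weights recommendations and clickbait
# and up-weights friends, family and groups. All numbers are made up.

posts = [
    {"source": "page_recommendation", "engagement": 0.9},
    {"source": "friend",              "engagement": 0.4},
    {"source": "group",               "engagement": 0.5},
]

BEFORE_WEIGHTS = {"page_recommendation": 1.0, "friend": 1.0, "group": 1.0}
AFTER_WEIGHTS  = {"page_recommendation": 0.2, "friend": 1.5, "group": 1.3}

def rank_feed(posts, weights):
    """Order posts by engagement score scaled by a per-source weight."""
    return sorted(posts,
                  key=lambda p: p["engagement"] * weights[p["source"]],
                  reverse=True)

print(rank_feed(posts, BEFORE_WEIGHTS)[0]["source"])  # → page_recommendation
print(rank_feed(posts, AFTER_WEIGHTS)[0]["source"])   # → group
```

Under the old weights the highly engaging recommended post tops the feed; under the new ones, the friend and group posts climb above it despite lower raw engagement.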
✒ "The Apophenic Machine"
Molly Sauter, Real Life Magazine, 2017 [6]
✑ Apophenia: "unmotivated seeing of connections [accompanied by] a specific feeling of abnormal meaningfulness".
"A medium’s materiality affects the way we can think with it."
The invention of the printing press created a way to read and experience information linearly, through a sequence of indexes, numbered pages and tables of contents.
"The modern condition is networked, and thus modern thought is networked too — or it at least tries to be."
When John Perry Barlow wrote his "Declaration of the Independence of Cyberspace" in 1996, he was looking at the nascent internet as the medium that would bring about a new era driven by "ethics, enlightened self-interest, and the commonweal." Needless to say, we find ourselves in a very different situation now: the network has become a place where we no longer feel at the centre, except insofar as we are being played by it.
"We are not in the network; we are on it".
The internet that we know and experience is the one built upon hyperlinks and "deep linking", a practice that creates a web of individualised thoughts and unique connections: every index is personal, built by an individual in real time.
"Humans are storytellers, pattern-spotters, metaphor-makers. When these instincts run away with us, when we impose patterns or relationships on otherwise unrelated things, we call it apophenia. When we create these connections online, we call it the internet, the web circling back to itself again and again. The internet is an apophenic machine."
Conspiracy theories might be seen as a side effect, a mere product of psychological biases: they are, in fact, a human product.
But what happens when we create a technology with the power to encourage this behaviour?
The anthropologist Kathleen Stewart tries to answer this question in "Conspiracy Theory's Worlds":
“the internet was made for conspiracy theory: it is a conspiracy theory: one thing leads to another, always another link leading you deeper into no thing and no place, floating through self-dividing and transmogrifying sites until you are awash in the sheer evidence that the internet exists.”
Conspiracy theories claim that the structure of power and the events surrounding us are traceable and can easily be ordered and mapped: a tidying practice that gives its practitioners the feeling of retaining agency over the world while disconnecting themselves from responsibility.
They might be the last true believers in an ordered universe.
Considering this can help us understand that conspiracy theorising is not only a delusional activity: it can be traced back to a mode of thinking that is not completely irrational. Attributing an event to the plan of a cabal means that the events around us can still be controlled. Conspiracy thinking does not seem unreasonable in a "complex global networked capitalism — where actors like Maersk, Walmart, or ExxonMobil organise world-spanning feats of logistics, extraction, and finance-backed violence, or where the Catholic Church priest abuse scandal was front-page news across the globe for years". "It finds intentionality and a purposeful human hand where other epistemologies might see, as Keeley puts it, only the 'absurdism of an irrational and essentially meaningless world'."
There must be a reason - and sometimes there is.
Things get even more complex if we consider that humanity is now facing global catastrophes for which responsibility lies with no one and everyone at the same time. When analysing climate change or global capitalism, we can see how the plot has become too complex to trace its heroes and villains: what we are seeing in conspiracy thinking is the resilience of people who are still trying to put it into a narrative with whatever tools they can find.
It is easier to believe that an obscure sect is ruling the world than to accept our own slice of responsibility.
"[...]networks (like conspiracy theories) excel at creating the illusion of the world as graspable, strung together with links even as the socially contingent markers of importance, trust, and validity are increasingly on the fritz"
When it comes to building a conspiracy theory, meticulousness is an imperative requirement, and the internet is an exquisite environment for it: it is embedded with such a broad collection of irrelevant data that any piece of information might be chosen to become the crucial tile of the mosaic.
According to Sauter, "In the conspiratorial mode, sheer availability is the primary criterion for significance."
"As the complexities of the hyper-networked world exponentially compound beyond the ability of anyone to fully grasp, it’s unsurprising that the broader political culture should tumble backward into familiar moral narratives, familiar villains, all bolstered by the mediated opportunity to link, link, link, link"
Conspiracies are not an irrational response to the complex networked environment we have created: they emerge from the very structure they are striving to understand and place into a narrative, "link by link by link". What they do not do is create a response that goes beyond the search for the ultimate villain. They comfort themselves with a handcrafted answer, placing the rest of the non-linked world under their hypervigilant sight.