ESSAY: You Are the Product by John Lanchester


Summary

The article is about Facebook changing its ‘mission statement’, which used to be ‘making the world more open and connected’, and the writer analyses the company’s strategies over the years. I thought it was fascinating to read that Zuckerberg had studied psychology alongside computer science, and that Facebook’s first external investor, Peter Thiel, studied philosophy with a great interest in René Girard, whose big idea was something he called ‘mimetic desire’. Human beings are born with a need for food and shelter. Once these fundamental necessities of life have been acquired, we look around us at what other people are doing and wanting, and we copy them. In Thiel’s summary, the idea is ‘that imitation is at the root of all behaviour’. Over the years Facebook developed several strategies to manipulate people:

  • The filter bubble: only connect with people who agree with each other.
  • If the product is free, you are the product. Facebook’s customers aren’t the people who are on the site: its customers are the advertisers who use its network and who relish its ability to direct ads to receptive audiences. Therefore it doesn’t matter to Facebook whether the content on the site is true.
  • To take the huge amount of information Facebook has about its ‘community’ and use it to let advertisers target ads with a specificity never known before, in any medium. Martínez: ‘It can be demographic in nature (e.g. 30-to-40-year-old females), geographic (people within five miles of Sarasota, Florida), or even based on Facebook profile data (do you have children; i.e. are you in the mommy segment?).’
  • Get friends of these women to post a ‘sponsored story’ on a targeted consumer’s news feed, so it doesn’t feel like an ad. As Zuckerberg said when he introduced Facebook Ads, ‘Nothing influences people more than a recommendation from a trusted friend.’
  • Connect a phone ID to the Facebook ID and put it together with the rest of the online activity: not just every site that is visited, but every click that has been made. The Facebook button tracks every Facebook user, whether they click on it or not; since the button is pretty much ubiquitous on the net, Facebook sees you everywhere.
  • Partner with the old-school credit firms to know who everybody is, where they live, and everything they’ve ever bought with plastic in a real-world offline shop.
  • Have social scientists at the company manipulate some people’s news feeds to see what effect, if any, it has on their emotions.
  • Offer internet connectivity to remote villages in India, with the proviso that the range of sites on offer should be controlled by Facebook.
  • A project involving a solar-powered drone called the Aquila, which has the wingspan of a commercial airliner, weighs less than a car, and when cruising uses less energy than a microwave oven. The idea is that it will circle remote, currently unconnected areas of the planet, for flights that last as long as three months at a time. It connects users via laser and was developed in Bridgwater, Somerset.
  • Perhaps the new mission is that Zuckerberg will run for president; the early signals Zuck has sent out include the fifty-state pretending-to-give-a-shit tour and the thoughtful-listening pose he’s photographed in while sharing milkshakes in an Iowa diner.

Notes:

At the end of June 2017, Zuckerberg announced that the company was changing its ‘mission statement’, its version of the canting pieties beloved of corporate America. Facebook’s mission used to be ‘making the world more open and connected’.

Facebook is generally agreed to have played a big, perhaps even a crucial, role in the election of Donald Trump. The benefit to humanity is not clear. This thought, or something like it, seems to have occurred to Zuckerberg because the new mission statement spells out a reason for all this connectedness. It says that the new mission is to ‘give people the power to build community and bring the world closer together’.

Internet companies are working in a field that is poorly understood (if understood at all) by customers and regulators. The stuff they’re doing, if they’re any good at all, is by definition new.


Antonio García Martínez, a former Facebook manager, argues in Chaos Monkeys: Zuckerberg was studying for a degree with a double concentration in computer science and – this is the part people tend to forget – psychology. He is very well aware of how people’s minds work and in particular of the social dynamics of popularity and status. The idea was that people wanted to look at what other people like them were doing, to see their social networks, to compare, to boast and show off, to give full rein to every moment of longing and envy, to keep their noses pressed against the sweet-shop window of others’ lives.

Facebook’s first external investor was the now notorious Silicon Valley billionaire Peter Thiel. Thiel’s $500,000 investment in 2004 was crucial to the success of the company. But there was a particular reason Facebook caught Thiel’s eye, rooted in a byway of intellectual history. In the course of his studies at Stanford – he majored in philosophy – Thiel became interested in the ideas of the US-based French philosopher René Girard, as advocated in his most influential book, Things Hidden since the Foundation of the World. Girard’s big idea was something he called ‘mimetic desire’. Human beings are born with a need for food and shelter. Once these fundamental necessities of life have been acquired, we look around us at what other people are doing, and wanting, and we copy them. In Thiel’s summary, the idea is ‘that imitation is at the root of all behaviour’.

Girard was a Christian, and his view of human nature is that it is fallen. We don’t know what we want or who we are; we don’t really have values and beliefs of our own; what we have instead is an instinct to copy and compare. We are homo mimeticus. ‘Man is the creature who does not know what to desire, and who turns to others in order to make up his mind. We desire what others desire because we imitate their desires.’ Look around, ye petty, and compare.

The reason Thiel latched onto Facebook with such alacrity was that he saw in it for the first time a business that was Girardian to its core: built on people’s deep need to copy. ‘Facebook first spread by word of mouth, and it’s about word of mouth, so it’s doubly mimetic,’ Thiel said. ‘Social media proved to be more important than it looked, because it’s about our natures.’ We are keen to be seen as we want to be seen, and Facebook is the most popular tool humanity has ever had with which to do that. The view of human nature implied by these ideas is pretty dark. If all people want to do is go and look at other people so that they can compare themselves to them and copy what they want – if that is the final, deepest truth about humanity and its motivations – then Facebook doesn’t really have to take too much trouble over humanity’s welfare, since all the bad things that happen to us are things we are doing to ourselves.

The highest-profile recent criticisms of the company stem from its role in Trump’s election. There are two components to this:

  • One component is implicit in the nature of the site, which has an inherent tendency to fragment and atomise its users into like-minded groups. The mission to ‘connect’ turns out to mean, in practice, connect with people who agree with you.
  • The other is that no company better exemplifies the internet-age dictum that if the product is free, you are the product. Facebook’s customers aren’t the people who are on the site: its customers are the advertisers who use its network and who relish its ability to direct ads to receptive audiences. Why would Facebook care if the news streaming over the site is fake? Its interest is in the targeting, not in the content. This is probably one reason for the change in the company’s mission statement. If your only interest is in connecting people, why would you care about falsehoods?


Fake news is not, as Facebook has acknowledged, the only way it was used to influence the outcome of the 2016 presidential election. At the end of April, Facebook got around to admitting this (by then) fairly obvious truth, in an interesting paper published by its internal security division. ‘Fake news’, they argue, is an unhelpful, catch-all term because misinformation is in fact spread in a variety of ways:

Information (or Influence) Operations – Actions taken by governments or organised non-state actors to distort domestic or foreign political sentiment.

False News – News articles that purport to be factual, but which contain intentional misstatements of fact with the intention to arouse passions, attract viewership, or deceive.

False Amplifiers – Co-ordinated activity by inauthentic accounts with the intent of manipulating political discussion (e.g. by discouraging specific parties from participating in discussion, or amplifying sensationalistic voices over others).

Disinformation – Inaccurate or manipulated information/content that is spread intentionally. This can include false news, or it can involve more subtle methods, such as false flag operations, feeding inaccurate quotes or stories to innocent intermediaries, or knowingly amplifying biased or misleading information.

One man’s fake news is another’s truth-telling, and Facebook works hard at avoiding responsibility for the content on its site.

The key to understanding this is to think about what advertisers want: they don’t want to appear next to pictures of breasts because it might damage their brands, but they don’t mind appearing alongside lies because the lies might be helping them find the consumers they’re trying to target.

Much of the video content on the site is stolen from the people who created it. An illuminating YouTube video from Kurzgesagt, a German outfit that makes high-quality short explanatory films, notes that in 2015, 725 of Facebook’s top one thousand most viewed videos were stolen. http://kurzgesagt.org/work/how-facebook-is-stealing-billions-of-views/

Facebook has two priorities, as Martínez explains in Chaos Monkeys: growth and monetisation. Martínez gives the clearest account both of how it ended up like that and of how Facebook advertising works: the point is to take the huge amount of information Facebook has about its ‘community’ and use it to let advertisers target ads with a specificity never known before, in any medium. Martínez: ‘It can be demographic in nature (e.g. 30-to-40-year-old females), geographic (people within five miles of Sarasota, Florida), or even based on Facebook profile data (do you have children; i.e. are you in the mommy segment?).’

Taplin makes the same point: if I want to reach women between the ages of 25 and 30 in zip code 37206 who like country music and drink bourbon, Facebook can do that. Moreover, Facebook can often get friends of these women to post a ‘sponsored story’ on a targeted consumer’s news feed, so it doesn’t feel like an ad. As Zuckerberg said when he introduced Facebook Ads, ‘Nothing influences people more than a recommendation from a trusted friend. A trusted referral is the Holy Grail of advertising.’
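The targeting described here is, mechanically, just a filter over stored user attributes. A minimal sketch in Python of what such an audience query might look like, with invented field names, example users and thresholds (purely illustrative, not Facebook’s actual system):

```python
# Illustrative sketch of attribute-based ad targeting.
# All field names, users and values below are invented for demonstration.

users = [
    {"id": 1, "age": 27, "zip": "37206", "likes": {"country music", "bourbon"}},
    {"id": 2, "age": 34, "zip": "94107", "likes": {"cycling"}},
    {"id": 3, "age": 29, "zip": "37206", "likes": {"bourbon"}},
]

def in_segment(user, min_age, max_age, zip_code, required_likes):
    """True if the user falls inside the advertiser's target segment."""
    return (
        min_age <= user["age"] <= max_age
        and user["zip"] == zip_code
        and required_likes <= user["likes"]   # set containment: all interests present
    )

audience = [u["id"] for u in users
            if in_segment(u, 25, 30, "37206", {"country music", "bourbon"})]
print(audience)  # -> [1]
```

The commercial value lies less in the filtering logic, which is trivial, than in the breadth and accuracy of the attributes being filtered.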


Facebook also uses a technique called ‘onboarding’. As Martínez explains it, the best way to think about this is to consider our various kinds of name and address.

For example, if Bed, Bath and Beyond wants to get my attention with one of its wonderful 20 per cent off coupons, it calls out:

Antonio García Martínez 1 Clarence Place #13 San Francisco, CA 94107

If it wants to reach me on my mobile device, my name there is:

38400000-8cf0-11bd-b23e-10b96e40000d

That’s my quasi-immutable device ID, broadcast hundreds of times a day on mobile ad exchanges.

On my laptop, my name is this:

07J6yJPMB9juTowar.AWXGQnGPA1MCmThgb9wN4vLoUpg.BUUtWg.rg.FTN.0.AWUxZtUf

This is the content of the Facebook re-targeting cookie, which is used to target ads at you based on your mobile browsing.

Though it may not be obvious, each of these keys is associated with a wealth of our personal behaviour data: every website we’ve been to, many things we’ve bought in physical stores, and every app we’ve used and what we did there … The biggest thing going on in marketing right now, what is generating tens of billions of dollars in investment and endless scheming inside the bowels of Facebook, Google, Amazon and Apple, is how to tie these different sets of names together, and who controls the links. That’s it.
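What Martínez calls tying these different sets of names together is, in data terms, a join across otherwise separate keyspaces. A minimal sketch, with invented identifiers and records (not any company’s real schema), of what such an identity graph amounts to:

```python
# Illustrative sketch of identity "onboarding": merging records held under
# different kinds of name (postal address, mobile device ID, browser cookie)
# into a single profile. All identifiers and data below are invented.

postal_records = {("A. Example", "94107"): {"store_purchases": ["coffee maker"]}}
device_records = {"device-0000-aaaa": {"apps_used": ["news", "shopping"]}}
cookie_records = {"cookie-xyz-123": {"sites_visited": ["example-shop.com"]}}

# The commercially contested part: who holds the links between the keys.
identity_graph = {
    "person_001": {
        "postal": ("A. Example", "94107"),
        "device": "device-0000-aaaa",
        "cookie": "cookie-xyz-123",
    }
}

def unified_profile(person_id):
    """Merge everything known under each of this person's identifiers."""
    keys = identity_graph[person_id]
    profile = {}
    profile.update(postal_records.get(keys["postal"], {}))
    profile.update(device_records.get(keys["device"], {}))
    profile.update(cookie_records.get(keys["cookie"], {}))
    return profile

print(unified_profile("person_001"))
# -> offline purchases, app usage and web browsing, now attached to one person
```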

Facebook already had a huge amount of information about people and their social networks and their professed likes and dislikes. After waking up to the importance of monetisation, they added to their own data a huge new store of data about offline, real-world behaviour, acquired through partnerships with big companies such as Experian, which have been monitoring consumer purchases for decades via their relationships with direct marketing firms, credit card companies, and retailers. There doesn’t seem to be a one-word description of these firms; ‘consumer credit agencies’ is about the closest.

These firms know all there is to know about your name and address, your income and level of education, your relationship status, plus everywhere you’ve ever paid for anything with a card. Facebook could now put your identity together with the unique device identifier on your phone.

So Facebook knows your phone ID and can add it to your Facebook ID. It puts that together with the rest of your online activity: not just every site you’ve ever visited, but every click you’ve ever made – the Facebook button tracks every Facebook user, whether they click on it or not. Since the Facebook button is pretty much ubiquitous on the net, this means that Facebook sees you, everywhere. Now, thanks to its partnerships with the old-school credit firms, Facebook knew who everybody was, where they lived, and everything they’d ever bought with plastic in a real-world offline shop. All this information is used for a purpose which is, in the final analysis, profoundly bathetic. It is to sell you things via online ads.
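The reason the button sees you whether or not you click is structural: embedding it means the host page loads the button from Facebook’s servers, and that request carries Facebook’s own cookie plus the address of the page you are on. A conceptual sketch of that pattern, assuming Flask and using invented endpoint, cookie and storage names (not Facebook’s actual implementation):

```python
# Conceptual sketch of third-party widget tracking: serving an embedded
# "Like" button lets the widget's operator log every page visit, because the
# browser's request includes the operator's cookie and the referring page.
# Endpoint, cookie name and storage are invented; requires Flask.

from flask import Flask, request

app = Flask(__name__)
visit_log = []   # stand-in for a profile store keyed by the user's cookie

@app.route("/widgets/like-button")
def like_button():
    user_id = request.cookies.get("network_session")  # third-party cookie, if set
    visited_page = request.headers.get("Referer")     # the page embedding the widget
    visit_log.append((user_id, visited_page))         # logged on every load, click or not
    return "<button>Like</button>"                    # the visible widget itself

if __name__ == "__main__":
    app.run()
```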

What this means is that even more than it is in the advertising business, Facebook is in the surveillance business. Facebook, in fact, is the biggest surveillance-based enterprise in the history of mankind. It knows far, far more about you than the most intrusive government has ever known about its citizens. It’s amazing that people haven’t really understood this about the company. I’ve spent time thinking about Facebook, and the thing I keep coming back to is that its users don’t realise what it is the company does. What Facebook does is watch you, and then use what it knows about you and your behaviour to sell ads. I’m not sure there has ever been a more complete disconnect between what a company says it does – ‘connect’, ‘build communities’ – and the commercial reality.

There was a scandal in 2014 when it turned out that social scientists at the company had deliberately manipulated some people’s news feeds to see what effect, if any, it had on their emotions. The resulting paper, published in the Proceedings of the National Academy of Sciences, was a study of ‘social contagion’, or the transfer of emotion among groups of people, as a result of a change in the nature of the stories seen by 689,003 users of Facebook. ‘When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.’ The scientists seem not to have considered how this information would be received, and the story played quite big for a while.
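The design of the experiment is simple to state: silently omit a share of posts of one emotional tone from a user’s feed, then measure the tone of what the user subsequently posts. A schematic sketch of the manipulation step, with invented data, field names and omission rate (not the researchers’ code):

```python
# Schematic sketch of the 'emotional contagion' manipulation: withhold some
# positive (or negative) posts from a feed; the study then compared the users'
# own subsequent posts against an unmanipulated control group.
# Sentiment labels, field names and the omission rate are invented.
import random

def manipulated_feed(posts, suppressed_sentiment, omission_rate=0.3):
    """Return a feed with a share of posts of one sentiment silently dropped."""
    return [p for p in posts
            if p["sentiment"] != suppressed_sentiment
            or random.random() >= omission_rate]

feed = [{"text": "great day!", "sentiment": "positive"},
        {"text": "terrible news", "sentiment": "negative"},
        {"text": "love this", "sentiment": "positive"}]

print(manipulated_feed(feed, suppressed_sentiment="positive"))
# The reported outcome: with positive expressions reduced, users went on to
# write fewer positive and more negative posts themselves (and vice versa).
```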

Martínez compares Zuckerberg to Alexander the Great, weeping because he has no more worlds to conquer. Perhaps this is one reason for the early signals Zuck has sent about running for president – the fifty-state pretending-to-give-a-shit tour, the thoughtful-listening pose he’s photographed in while sharing milkshakes in (Presidential Ambitions klaxon!) an Iowa diner.

An early experiment came in the form of Free Basics, a program offering internet connectivity to remote villages in India, with the proviso that the range of sites on offer should be controlled by Facebook. ‘Who could possibly be against this?’ Zuckerberg wrote in the Times of India. The answer: lots and lots of angry Indians. The government ruled that Facebook shouldn’t be able to ‘shape users’ internet experience’ by restricting access to the broader internet. A Facebook board member tweeted that ‘anti-colonialism has been economically catastrophic for the Indian people for decades. Why stop now?’ As Taplin points out, that remark ‘unwittingly revealed a previously unspoken truth: Facebook and Google are the new colonial powers.’

Facebook is working on a project involving a solar-powered drone called the Aquila, which has the wingspan of a commercial airliner, weighs less than a car, and when cruising uses less energy than a microwave oven. The idea is that it will circle remote, currently unconnected areas of the planet, for flights that last as long as three months at a time. It connects users via laser and was developed in Bridgwater, Somerset.