User:ThomasW/Notes you are not a gadget

From Media Design: Networked & Lens-Based wiki

Lanier, Jaron. You Are Not a Gadget: A Manifesto (2010). New York: Borzoi Books.


SOFTWARE EXPRESSES IDEAS about everything from the nature of a musical note to the nature of personhood. Software is also subject to an exceptionally rigid process of “lock-in.” Therefore, ideas (in the present era, when human affairs are increasingly software driven) have become more subject to lock-in than in previous eras. Most of the ideas that have been locked in so far are not so bad, but some of the so-called web 2.0 ideas are stinkers, so we ought to reject them while we still can. P6

“Speech is the mirror of the soul; as a man speaks, so is he.” —Publilius Syrus P6

When they design an internet service that is edited by a vast anonymous crowd, they are suggesting that a random crowd of humans is an organism with a legitimate point of view. P7

It would be hard for anyone, let alone a technologist, to get up in the morning without the faith that the future can be better than the past. Back in the 1980s, when the internet was only available to a small number of pioneers, I was often confronted by people who feared that the strange technologies I was working on, like virtual reality, might unleash the demons of human nature. P7

A single person, Tim Berners-Lee, came to invent the particular design of today's web. The web as it was introduced was minimalist, in that it assumed just about as little as possible about what a web page would be like. It was also open, in that no page was preferred by the architecture over another, and all pages were accessible to all. It also emphasized responsibility, because only the owner of a website was able to make sure that their site was available to be visited. P8


Little programs are delightful to write in isolation, but the process of maintaining large-scale software is always miserable. Because of this, digital technology tempts the programmer's psyche into a kind of schizophrenia. There is constant confusion between real and ideal computers. Technologists wish every program behaved like a brand-new, playful little program, and will use any available psychological strategy to avoid thinking about computers realistically.

P8


Standards and their inevitable lack of prescience posed a nuisance before computers, of course. Railroad gauges—the dimensions of the tracks—are one example. The London Tube was designed with narrow tracks and matching tunnels that, on several of the lines, cannot accommodate air-conditioning, because there is no room to ventilate the hot air from the trains. Thus, tens of thousands of modern-day residents in one of the world's richest cities must suffer a stifling commute because of an inflexible design decision made more than one hundred years ago. P9


What do files mean to the future of human expression? This is a harder question to answer than the question “How does the English language influence the thoughts of native English speakers?” At least you can compare English speakers to Chinese speakers, but files are universal. The idea of the file has become so big that we are unable to conceive of a frame large enough to fit around it in order to assess it empirically. P12

Entrepreneurs naturally sought to create products that would inspire demand (or at least hypothetical advertising opportunities that might someday compete with Google) where there was no lack to be addressed and no need to be filled, other than greed. P13


Voluntary productivity had to be commoditized, because the type of faith I'm criticizing thrives when you can pretend that computers do everything and people do nothing. An endless series of gambits backed by gigantic investments encouraged young people entering the online world for the first time to create standardized presences on sites like Facebook. Commercial interests promoted the widespread adoption of standardized designs like the blog, and these designs encouraged pseudonymity in at least some aspects of their designs, such as the comments, instead of the proud extroversion that characterized the first wave of web culture. P14

Instead of people being treated as the sources of their own creativity, commercial aggregation and abstraction sites presented anonymized fragments of creativity as products that might have fallen from the sky or been dug up from the ground, obscuring the true sources. P14


The ascendant tribe is composed of the folks from the open culture/Creative Commons world, the Linux community, folks associated with the artificial intelligence approach to computer science, the web 2.0 people, the anticontext file sharers and remashers, and a variety of others. Their capital is Silicon Valley, but they have power bases all over the world, wherever digital culture is being created. Their favorite blogs include Boing Boing, TechCrunch, and Slashdot, and their embassy in the old country is Wired. Obviously, I'm painting with a broad brush; not every member of the groups I mentioned subscribes to every belief I'm criticizing. In fact, the groupthink problem I'm worried about isn't so much in the minds of the technologists themselves, but in the minds of the users of the tools the cybernetic totalists are promoting. P14

---

On the other hand, I know there is also a distinct tradition of computer science that is humanistic. Some of the better-known figures in this tradition include the late Joseph Weizenbaum, Ted Nelson, Terry Winograd, Alan Kay, Bill Buxton, Doug Engelbart, Brian Cantwell Smith, Henry Fuchs, Ken Perlin, Ben Shneiderman (who invented the idea of clicking on a link), and Andy van Dam, who is a master teacher and has influenced generations of protégés, including Randy Pausch. Another important humanistic computing figure is David Gelernter, who conceived of a huge portion of the technical underpinnings of what has come to be called cloud computing, as well as many of the potential practical applications of clouds.

P15

The same thing is happening again. A self-proclaimed materialist movement that attempts to base itself on science starts to look like a religion rather quickly. P15

---


Pop culture has entered into a nostalgic malaise. Online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media. It is a culture of reaction without action. P16

We have to think about the digital layers we are laying down now in order to benefit future generations. We should be optimistic that civilization will survive this challenging century, and put some effort into creating the best possible world for those who will inherit our efforts. P17


There was an active campaign in the 1980s and 1990s to promote visual elegance in software. That political movement bore fruit when it influenced engineers at companies like Apple and Microsoft who happened to have a chance to steer the directions software was taking before lock-in made their efforts moot. That's why we have nice fonts and flexible design options on our screens. It wouldn't have happened otherwise. The seemingly unstoppable mainstream momentum in the world of software engineers was pulling computing in the direction of ugly screens, but that fate was avoided before it was too late. A similar campaign should be taking place now, influencing engineers, designers, businesspeople, and everyone else to support humanistic alternatives whenever possible. Unfortunately, however, the opposite seems to be happening. Online culture is filled to the brim with rhetoric about what the true path to a better world ought to be, and these days it's strongly biased toward an antihuman way of thinking. P18

The New York Times, for instance, promotes so-called open digital politics on a daily basis even though that ideal and the movement behind it are destroying the newspaper, and all other newspapers. It seems to be a case of journalistic Stockholm syndrome. P18

“Cloud” is a term for a vast computing resource available over the internet. You never know where the cloud resides physically. Google, Microsoft, IBM, and various government agencies are some of the proprietors of computing clouds. P19

--

You might even be eager to embrace wars and tolerate poverty and disease in others to bring about the conditions that could prod the Rapture into being. In the same way, if you believe the Singularity is coming soon, you might cease to design technology to serve humans, and prepare instead for the grand events it will bring. P21


The Rapture and the Singularity share one thing in common: they can never be verified by the living. P21

Bits are presented as if they were alive, while humans are transient fragments. Real people must have left all those anonymous comments on blogs and video clips, but who knows where they are now, or if they are dead? The digital hive is growing at the expense of individuality. Kevin Kelly says that we don't need authors anymore, that all the ideas of the world, all the fragments that used to be assembled into coherent books by identifiable authors, can be combined into one single, global book. Wired editor Chris Anderson proposes that science should no longer seek theories that scientists can understand, because the digital cloud will understand them better anyway. P21


If you believe the distinction between the roles of people and computers is starting to dissolve, you might express that—as some friends of mine at Microsoft once did—by designing features for a word processor that are supposed to know what you want, such as when you want to start an outline within your document. You might have had the experience of having Microsoft Word suddenly determine, at the wrong moment, that you are creating an indented outline. While I am all for the automation of petty tasks, this is different. From my point of view, this type of design feature is nonsense, since you end up having to work more than you would otherwise in order to manipulate the software's expectations of you.

P22

--

The real function of the feature isn't to make life easier for people. Instead, it promotes a new philosophy: that the computer is evolving into a life-form that can understand people better than people can understand themselves. Another example is what I call the “race to be most meta.” If a design like Facebook or Twitter depersonalizes people a little bit, then another service like Friendfeed—which may not even exist by the time this book is published—might soon come along to aggregate the previous layers of aggregation, making individual people even more abstract, and the illusion of high-level metaness more celebrated. P22


“Information wants to be free.” So goes the saying. Stewart Brand, the founder of the Whole Earth Catalog, seems to have said it first. I say that information doesn't deserve to be free. Cybernetic totalists love to think of the stuff as if it were alive and had its own ideas and ambitions. But what if information is inanimate? What if it's even less than inanimate, a mere artifact of human thought? What if only humans are real, and information is not?

Of course, there is a technical use of the term “information” that refers to something entirely real. This is the kind of information that's related to entropy. But that fundamental kind of information, which exists independently of the culture of an observer, is not the same as the kind we can put in computers, the kind that supposedly wants to be free. Information is alienated experience. You can think of culturally decodable information as a potential form of experience, very much as you can think of a brick resting on a ledge as storing potential energy.

P22

---

In the same way, stored information might cause experience to be revealed if it is prodded in the right way. A file on a hard disk does indeed contain information of the kind that objectively exists. The fact that the bits are discernible instead of being scrambled into mush—the way heat scrambles things—is what makes them bits. But if the bits can potentially mean something to someone, they can only do so if they are experienced. When that happens, a commonality of culture is enacted between the storer and the retriever of the bits. Experience is the only process that can de-alienate information. Information of the kind that purportedly wants to be free is nothing but a shadow of our own minds, and wants nothing on its own. It will not suffer if it doesn't get what it wants. But if you want to make the transition from the old religion, where you hope God will give you an afterlife, to the new religion, where you hope to become immortal by getting uploaded into a computer, then you have to believe information is real and alive. So for you, it will be important to redesign human institutions like art, the economy, and the law to reinforce the perception that information is alive. You demand that the rest of us live in your new conception of a state religion. You need us to deify information to reinforce your faith. P23

--

What the test really tells us, however, even if it's not necessarily what Turing hoped it would say, is that machine intelligence can only be known in a relative sense, in the eyes of a human beholder. The AI way of thinking is central to the ideas I'm criticizing in this book. If a machine can be conscious, then the computing cloud is going to be a better and far more capacious consciousness than is found in an individual person. If you believe this, then working for the benefit of the cloud over individual people puts you on the side of the angels. P24

--

People degrade themselves in order to make machines seem smart all the time. Before the crash, bankers believed in supposedly intelligent algorithms that could calculate credit risks before making bad loans. We ask teachers to teach to standardized tests so a student will look good to an algorithm. We have repeatedly demonstrated our species' bottomless ability to lower our standards to make information technology look good. Every instance of intelligence in a machine is ambiguous. P25

A significant number of AI enthusiasts, after a protracted period of failed experiments in tasks like understanding natural language, eventually found consolation in the adoration for the hive mind, which yields better results because there are real people behind the curtain. Wikipedia, for instance, works on what I call the Oracle illusion, in which knowledge of the human authorship of a text is suppressed in order to give the text superhuman validity. Traditional holy books work in precisely the same way and present many of the same problems. P25

It's so weird to me that Ray Kurzweil wants the global computing cloud to scoop up the contents of our brains so we can live forever in virtual reality. When my friends and I built the first virtual reality machines, the whole point was to make this world more creative, expressive, empathic, and interesting. It was not to escape it. P25

The public reaction to the defeat of Kasparov left the computer science community with an important question, however. Is it useful to portray computers themselves as intelligent or humanlike in any way? Does this presentation serve to clarify or to obscure the role of computers in our lives?

P26

--

When people are told that a computer is intelligent, they become prone to changing themselves in order to make the computer appear to work better, instead of demanding that the computer be changed to become more useful. People already tend to defer to computers, blaming themselves when a digital gadget or online service is hard to use. Treating computers as intelligent, autonomous entities ends up standing the process of engineering on its head. We can't afford to respect our own designs so much. P27

The border between person and nonperson might be found somewhere in the embryonic sequence from conception to baby, or in the development of the young child, or the teenager. Or it might be best defined in the phylogenetic path from ape to early human, or perhaps in the cultural history of ancient peasants leading to modern citizens. It might exist somewhere in a continuum between small and large computers. It might have to do with which thoughts you have; maybe self-reflective thoughts or the moral capacity for empathy makes you human. These are some of the many gates to personhood that have been proposed, but none of them seem definitive to me. The borders of personhood remain variegated and fuzzy. P29

The new twist in Silicon Valley is that some people—very influential people—believe they are hearing algorithms and crowds and other internet-supported nonhuman entities speak for themselves. I don't hear those voices, though—and I believe those who do are fooling themselves. P29

--

The alternative to sprinkling magic dust on people is sprinkling it on computers, the hive mind, the cloud, the algorithm, or some other cybernetic object. P31

--

To call consciousness an illusion is to give time a supernatural quality—maybe some kind of spooky nondeterminism. Or you can choose a different shell in the game and say that time is natural (not supernatural), and that the present moment is only a possible concept because of consciousness. P32

But the danger of an engineer pretending to know more than he really does is the greater danger, especially when he can reinforce the illusion through the use of computation. The cybernetic totalists awaiting the Singularity are nuttier than the folks with the food supplements. P32

According to a new creed, we technologists are turning ourselves, the planet, our species, everything, into computer peripherals attached to the great computing clouds. The news is no longer about us but about the big new computational object that is greater than us.

P34

What these critics forget is that printing presses in themselves provide no guarantee of an enlightened outcome. People, not machines, made the Renaissance. P34

This was as clear as ever when John Updike and Kevin Kelly exchanged words on the question of authorship in 2006. Kevin suggested that it was not just a good thing, but a “moral imperative” that all the world's books would soon become effectively “one book” once they were scanned, searchable, and remixable in the universal computational cloud. P34

The approach to digital culture I abhor would indeed turn all the world's books into one book, just as Kevin suggested. It might start to happen in the next decade or so. Google and other companies are scanning library books into the cloud in a massive Manhattan Project of cultural digitization. What happens next is what's important. If the books in the cloud are accessed via user interfaces that encourage mashups of fragments that obscure the context and authorship of each fragment, there will be only one book. This is what happens today with a lot of content; often you don't know where a quoted fragment from a news story came from, who wrote a comment, or who shot a video. P34

It's easy to imagine an alternate history in which everything that was printed on early presses went through the Church and was conceived as an extension of the Bible. “Strife of Love” might have existed in this alternate world, and might have been quite similar. But the “slight” modifications would have consisted of trimming the alien bits. The book would no longer have been as strange. And that tiny shift, even if it had been minuscule in terms of word count, would have been tragic.

This is what happened when elements of indigenous cultures were preserved but de-alienated by missionaries. We know a little about what Aztec or Inca music sounded like, for instance, but the bits that were trimmed to make the music fit into the European idea of church song were the most precious bits. The alien bits are where the flavor is found. They are the portals to strange philosophies. What a loss to not know how New World music would have sounded alien to us! Some melodies and rhythms survived, but the whole is lost. Something like missionary reductionism has happened to the internet with the rise of web 2.0. The strangeness is being leached away by the mush-making process. Individual web pages as they first appeared in the early 1990s had the flavor of personhood. MySpace preserved some of that flavor, though a process of regularized formatting had begun. Facebook went further, organizing people into multiple-choice identities, while Wikipedia seeks to erase point of view entirely.

If a church or government were doing these things, it would feel authoritarian, but when technologists are the culprits, we seem hip, fresh, and inventive.

P35


It is utterly strange to hear my many old friends in the world of digital culture claim to be the true sons of the Renaissance without realizing that using computers to reduce individual expression is a primitive, retrograde activity, no matter how sophisticated your tools are. P36

A fashionable idea in technical circles is that quantity not only turns into quality at some extreme of scale, but also does so according to principles we already understand. Some of my colleagues think a million, or perhaps a billion, fragmentary insults will eventually yield wisdom that surpasses that of any well-thought-out essay, so long as sophisticated secret statistical algorithms recombine the fragments. I disagree. A trope from the early days of computer science comes to mind: garbage in, garbage out. P36

Computers and chess share a common ancestry. Both originated as tools of war. Chess began as a battle simulation, a mental martial art. The design of chess reverberates even further into the past than that—all the way back to our sad animal ancestry of pecking orders and competing clans. P25



The people who are perhaps the most screwed by open culture are the middle classes of intellectual and cultural creation. The freelance studio session musician faces diminished prospects, for instance. Another example, outside of the world of music, is the stringer selling reports to newspapers from a war zone. These are both crucial contributors to culture and democracy. Each pays painful dues and devotes years to honing a craft. They used to live off the trickle-down effects of the old system, and, like the middle class at large, they are precious. They get nothing from the new system.

P63

Our willingness to suffer for the sake of the perception of freedom is remarkable. P81


As a source of useful information, Wikipedia excels in two areas: pop culture and hard science. In the first category, truth is fiction anyway, so what the wiki says is by definition true; in the second, there actually is a preferred truth, so it is more plausible to speak with a shared voice. Wikipedia was predicted by Douglas Adams's science fiction comedy The Hitchhiker's Guide to the Galaxy. His fictional Guide functioned in a similar way, with one of its contributors able to instantaneously update the entire entry for Planet Earth (from “Harmless” to “Mostly harmless”) with a few taps on a keyboard. Though Earth merited a two-word entry, there were substantial articles about other topics, such as which alien poetry was the worst and how to make strange cocktails. The first thought is often the best thought, and Adams perfectly captured the spirit of much of Wikipedia before it was born. P96-97