User:ThomasW/Notes I am not a gadget
Lanier, Jaron (2011) You Are Not a Gadget, xxx, xx
It would be hard for anyone, let alone a technologist, to get up in the morning without the faith that the future can be better than the past. Back in the 1980s, when the internet was only available to a small number of pioneers, I was often confronted by people who feared that the strange technologies I was working on, like virtual reality, might unleash the demons of human nature. page 7
A single person, Tim Berners-Lee, came to invent the particular design of today's web. The web as it was introduced was minimalist, in that it assumed just about as little as possible about what a web page would be like. It was also open, in that no page was preferred by the architecture over another, and all pages were accessible to all. It also emphasized responsibility, because only the owner of a website was able to make sure that their site was available to be visited. page 8
Little programs are delightful to write in isolation, but the process of maintaining large-scale software is always miserable. Because of this, digital technology tempts the programmer's psyche into a kind of schizophrenia. There is constant confusion between real and ideal computers. Technologists wish every program behaved like a brand-new, playful little program, and will use any available psychological strategy to avoid thinking about computers realistically. page 8
Standards and their inevitable lack of prescience posed a nuisance before computers, of course. Railroad gauges—the dimensions of the tracks—are one example. The London Tube was designed with narrow tracks and matching tunnels that, on several of the lines, cannot accommodate air-conditioning, because there is no room to ventilate the hot air from the train.
Thus, tens of thousands of modern-day residents in one of the world's richest cities must suffer a stifling commute because of an inflexible design decision made more than one hundred years ago. P9
What do files mean to the future of human expression? This is a harder question to answer than the question “How does the English language influence the thoughts of native English speakers?” At least you can compare English speakers to Chinese speakers, but files are universal. The idea of the file has become so big that we are unable to conceive of a frame large enough to fit around it in order to assess it empirically P12
Entrepreneurs naturally sought to create products that would inspire demand (or at least hypothetical advertising opportunities that might someday compete with Google) where there was no lack to be addressed and no need to be filled, other than greed.
P13
Voluntary productivity had to be commoditized, because the type of faith I'm criticizing thrives when you can pretend that computers do everything and people do nothing. An endless series of gambits backed by gigantic investments encouraged young people entering the online world for the first time to create standardized presences on sites like Facebook. Commercial interests promoted the widespread adoption of standardized designs like the blog, and these designs encouraged pseudonymity in at least some aspects of their designs, such as the comments, instead of the proud extroversion that characterized the first wave of web culture. p14
---
Instead of people being treated as the sources of their own creativity, commercial aggregation and abstraction sites presented anonymized fragments of creativity as products that might have fallen from the sky or been dug up from the ground, obscuring the true sources.
P14
The ascendant tribe is composed of the folks from the open culture/Creative Commons world, the Linux community, folks associated with the artificial intelligence approach to computer science, the web 2.0 people, the anticontext file sharers and remashers, and a variety of others. Their capital is Silicon Valley, but they have power bases all over the world, wherever digital culture is being created. Their favorite blogs include Boing Boing, TechCrunch, and Slashdot, and their embassy in the old country is Wired.
Obviously, I'm painting with a broad brush; not every member of the groups I mentioned subscribes to every belief I'm criticizing. In fact, the groupthink problem I'm worried about isn't so much in the minds of the technologists themselves, but in the minds of the users of the tools the cybernetic totalists are promoting. P14
---
On the other hand, I know there is also a distinct tradition of computer science that is humanistic. Some of the better-known figures in this tradition include the late Joseph Weizenbaum, Ted Nelson, Terry Winograd, Alan Kay, Bill Buxton, Doug Engelbart, Brian Cantwell Smith, Henry Fuchs, Ken Perlin, Ben Shneiderman (who invented the idea of clicking on a link), and Andy van Dam, who is a master teacher and has influenced generations of protégés, including Randy Pausch. Another important humanistic computing figure is David Gelernter, who conceived of a huge portion of the technical underpinnings of what has come to be called cloud computing, as well as many of the potential practical applications of clouds.
P15
---
The same thing is happening again. A self-proclaimed materialist movement that attempts to base itself on science starts to look like a religion rather quickly.
P15
---
Pop culture has entered into a nostalgic malaise. Online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media. It is a culture of reaction without action.
P16
We have to think about the digital layers we are laying down now in order to benefit future generations. We should be optimistic that civilization will survive this challenging century, and put some effort into creating the best possible world for those who will inherit our efforts. P17
There was an active campaign in the 1980s and 1990s to promote visual elegance in software. That political movement bore fruit when it influenced engineers at companies like Apple and Microsoft who happened to have a chance to steer the directions software was taking before lock-in made their efforts moot. That's why we have nice fonts and flexible design options on our screens. It wouldn't have happened otherwise. The seemingly unstoppable mainstream momentum in the world of software engineers was pulling computing in the direction of ugly screens, but that fate was avoided before it was too late.
A similar campaign should be taking place now, influencing engineers, designers, businesspeople, and everyone else to support humanistic alternatives whenever possible. Unfortunately, however, the opposite seems to be happening.
Online culture is filled to the brim with rhetoric about what the true path to a better world ought to be, and these days it's strongly biased toward an antihuman way of thinking. P18
The New York Times, for instance, promotes so-called open digital politics on a daily basis even though that ideal and the movement behind it are destroying the newspaper, and all other newspapers. It seems to be a case of journalistic Stockholm syndrome. P18
“Cloud” is a term for a vast computing resource available over the internet. You never know where the cloud resides physically. Google, Microsoft, IBM, and various government agencies are some of the proprietors of computing clouds. P19
--
You might even be eager to embrace wars and tolerate poverty and disease in others to bring about the conditions that could prod the Rapture into being. In the same way, if you believe the Singularity is coming soon, you might cease to design technology to serve humans, and prepare instead for the grand events it will bring. P21
The Rapture and the Singularity share one thing in common: they can never be verified by the living.
P21
Bits are presented as if they were alive, while humans are transient fragments. Real people must have left all those anonymous comments on blogs and video clips, but who knows where they are now, or if they are dead? The digital hive is growing at the expense of individuality.
Kevin Kelly says that we don't need authors anymore, that all the ideas of the world, all the fragments that used to be assembled into coherent books by identifiable authors, can be combined into one single, global book. Wired editor Chris Anderson proposes that science should no longer seek theories that scientists can understand, because the digital cloud will understand them better anyway.
P21
If you believe the distinction between the roles of people and computers is starting to dissolve, you might express that—as some friends of mine at Microsoft once did—by designing features for a word processor that are supposed to know what you want, such as when you want to start an outline within your document. You might have had the experience of having Microsoft Word suddenly determine, at the wrong moment, that you are creating an indented outline. While I am all for the automation of petty tasks, this is different. From my point of view, this type of design feature is nonsense, since you end up having to work more than you would otherwise in order to manipulate the software's expectations of you.
P22
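An aside (mine, not Lanier's): the kind of second-guessing feature described above can be sketched as a crude heuristic. The rule, the autoformat function, and the sample document below are invented for illustration and are not Microsoft Word's actual logic; the point is only that a guess about user intent misfires and then must be managed.

```python
import re

def looks_like_outline_item(line: str) -> bool:
    # Hypothetical rule: a leading "1.", "a)", or "-" means an outline item.
    return bool(re.match(r"^\s*(\d+[.)]|[a-z][.)]|-)\s+", line))

def autoformat(lines):
    # Indent whatever the heuristic flags, whether the user wanted it or not.
    return [("    " + line) if looks_like_outline_item(line) else line
            for line in lines]

doc = [
    "Meeting notes",
    "2 pm is when we start",   # plain sentence, left alone
    "2. pm is when we start",  # add a period and it gets "outlined" by mistake
    "- send the agenda",
]
print("\n".join(autoformat(doc)))
```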
--
The real function of the feature isn't to make life easier for people. Instead, it promotes a new philosophy: that the computer is evolving into a life-form that can understand people better than people can understand themselves.
Another example is what I call the “race to be most meta.” If a design like Facebook or Twitter depersonalizes people a little bit, then another service like Friendfeed—which may not even exist by the time this book is published—might soon come along to aggregate the previous layers of aggregation, making individual people even more abstract, and the illusion of high-level metaness more celebrated. P22
“Information wants to be free.” So goes the saying. Stewart Brand, the founder of the Whole Earth Catalog, seems to have said it first.
I say that information doesn't deserve to be free.
Cybernetic totalists love to think of the stuff as if it were alive and had its own ideas and ambitions. But what if information is inanimate? What if it's even less than inanimate, a mere artifact of human thought? What if only humans are real, and information is not?
Of course, there is a technical use of the term “information” that refers to something entirely real. This is the kind of information that's related to entropy. But that fundamental kind of information, which exists independently of the culture of an observer, is not the same as the kind we can put in computers, the kind that supposedly wants to be free.
Information is alienated experience.
You can think of culturally decodable information as a potential form of experience, very much as you can think of a brick resting on a ledge as storing potential energy.
P22
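An aside (mine, not Lanier's): the entropy-related sense of information mentioned above can be made concrete with a small sketch. The function name and examples are hypothetical; the point is only that Shannon entropy is computable for any byte string, regardless of whether anyone can culturally decode it.

```python
import math
from collections import Counter

def shannon_entropy_bits_per_byte(data: bytes) -> float:
    # Shannon entropy of the byte distribution, in bits per byte.
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# The measure applies equally to a readable sentence and to arbitrary bytes;
# it says nothing about whether the bits mean anything to anyone.
print(shannon_entropy_bits_per_byte(b"information wants to be free"))
print(shannon_entropy_bits_per_byte(bytes(range(256))))  # each byte once -> 8.0
```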
---
In the same way, stored information might cause experience to be revealed if it is prodded in the right way. A file on a hard disk does indeed contain information of the kind that objectively exists. The fact that the bits are discernible instead of being scrambled into mush—the way heat scrambles things—is what makes them bits.
But if the bits can potentially mean something to someone, they can only do so if they are experienced. When that happens, a commonality of culture is enacted between the storer and the retriever of the bits. Experience is the only process that can de-alienate information.
Information of the kind that purportedly wants to be free is nothing but a shadow of our own minds, and wants nothing on its own. It will not suffer if it doesn't get what it wants.
But if you want to make the transition from the old religion, where you hope God will give you an afterlife, to the new religion, where you hope to become immortal by getting uploaded into a computer, then you have to believe information is real and alive. So for you, it will be important to redesign human institutions like art, the economy, and the law to reinforce the perception that information is alive. You demand that the rest of us live in your new conception of a state religion. You need us to deify information to reinforce your faith. P23
--
What the test really tells us, however, even if it's not necessarily what Turing hoped it would say, is that machine intelligence can only be known in a relative sense, in the eyes of a human beholder.
The AI way of thinking is central to the ideas I'm criticizing in this book. If a machine can be conscious, then the computing cloud is going to be a better and far more capacious consciousness than is found in an individual person. If you believe this, then working for the benefit of the cloud over individual people puts you on the side of the angels.
P24
--
People degrade themselves in order to make machines seem smart all the time. Before the crash, bankers believed in supposedly intelligent algorithms that could calculate credit risks before making bad loans. We ask teachers to teach to standardized tests so a student will look good to an algorithm. We have repeatedly demonstrated our species' bottomless ability to lower our standards to make information technology look good. Every instance of intelligence in a machine is ambiguous.
P25
--
A significant number of AI enthusiasts, after a protracted period of failed experiments in tasks like understanding natural language, eventually found consolation in the adoration for the hive mind, which yields better results because there are real people behind the curtain.
Wikipedia, for instance, works on what I call the Oracle illusion, in which knowledge of the human authorship of a text is suppressed in order to give the text superhuman validity. Traditional holy books work in precisely the same way and present many of the same problems.
P25
--
It's so weird to me that Ray Kurzweil wants the global computing cloud to scoop up the contents of our brains so we can live forever in virtual reality. When my friends and I built the first virtual reality machines, the whole point was to make this world more creative, expressive, empathic, and interesting. It was not to escape it.
P25
--
The public reaction to the defeat of Kasparov left the computer science community with an important question, however. Is it useful to portray computers themselves as intelligent or humanlike in any way? Does this presentation serve to clarify or to obscure the role of computers in our lives?
P26
--
When people are told that a computer is intelligent, they become prone to changing themselves in order to make the computer appear to work better, instead of demanding that the computer be changed to become more useful. People already tend to defer to computers, blaming themselves when a digital gadget or online service is hard to use. Treating computers as intelligent, autonomous entities ends up standing the process of engineering on its head. We can't afford to respect our own designs so much. P27
--
The border between person and nonperson might be found somewhere in the embryonic sequence from conception to baby, or in the development of the young child, or the teenager. Or it might be best defined in the phylogenetic path from ape to early human, or perhaps in the cultural history of ancient peasants leading to modern citizens. It might exist somewhere in a continuum between small and large computers. It might have to do with which thoughts you have; maybe self-reflective thoughts or the moral capacity for empathy makes you human. These are some of the many gates to personhood that have been proposed, but none of them seem definitive to me. The borders of personhood remain variegated and fuzzy. P29
--
The new twist in Silicon Valley is that some people—very influential people—believe they are hearing algorithms and crowds and other internet-supported nonhuman entities speak for themselves. I don't hear those voices, though—and I believe those who do are fooling themselves. P29
--
The alternative to sprinkling magic dust on people is sprinkling it on computers, the hive mind, the cloud, the algorithm, or some other cybernetic object. P31
--
To call consciousness an illusion is to give time a supernatural quality—maybe some kind of spooky nondeterminism. Or you can choose a different shell in the game and say that time is natural (not supernatural), and that the present moment is only a possible concept because of consciousness. P32
But the danger of an engineer pretending to know more than he really does is the greater danger, especially when he can reinforce the illusion through the use of computation. The cybernetic totalists awaiting the Singularity are nuttier than the folks with the food supplements. P32
According to a new creed, we technologists are turning ourselves, the planet, our species, everything, into computer peripherals attached to the great computing clouds. The news is no longer about us but about the big new computational object that is greater than us.
P34
What these critics forget is that printing presses in themselves provide no guarantee of an enlightened outcome. People, not machines, made the Renaissance. P34
This was as clear as ever when John Updike and Kevin Kelly exchanged words on the question of authorship in 2006. Kevin suggested that it was not just a good thing, but a “moral imperative” that all the world's books would soon become effectively “one book” once they were scanned, searchable, and remixable in the universal computational cloud.
P34
The approach to digital culture I abhor would indeed turn all the world's books into one book, just as Kevin suggested. It might start to happen in the next decade or so. Google and other companies are scanning library books into the cloud in a massive Manhattan Project of cultural digitization. What happens next is what's important. If the books in the cloud are accessed via user interfaces that encourage mashups of fragments that obscure the context and authorship of each fragment, there will be only one book. This is what happens today with a lot of content; often you don't know where a quoted fragment from a news story came from, who wrote a comment, or who shot a video. P34
It's easy to imagine an alternate history in which everything that was printed on early presses went through the Church and was conceived as an extension of the Bible. “Strife of Love” might have existed in this alternate world, and might have been quite similar. But the “slight” modifications would have consisted of trimming the alien bits. The book would no longer have been as strange. And that tiny shift, even if it had been minuscule in terms of word count, would have been tragic.
This is what happened when elements of indigenous cultures were preserved but de-alienated by missionaries. We know a little about what Aztec or Inca music sounded like, for instance, but the bits that were trimmed to make the music fit into the European idea of church song were the most precious bits. The alien bits are where the flavor is found. They are the portals to strange philosophies. What a loss to not know how New World music would have sounded alien to us! Some melodies and rhythms survived, but the whole is lost.
Something like missionary reductionism has happened to the internet with the rise of web 2.0. The strangeness is being leached away by the mush-making process. Individual web pages as they first appeared in the early 1990s had the flavor of personhood. MySpace preserved some of that flavor, though a process of regularized formatting had begun. Facebook went further, organizing people into multiple-choice identities, while Wikipedia seeks to erase point of view entirely.
If a church or government were doing these things, it would feel authoritarian, but when technologists are the culprits, we seem hip, fresh, and inventive.
P35
It is utterly strange to hear my many old friends in the world of digital culture claim to be the true sons of the Renaissance without realizing that using computers to reduce individual expression is a primitive, retrograde activity, no matter how sophisticated your tools are.
P36
A fashionable idea in technical circles is that quantity not only turns into quality at some extreme of scale, but also does so according to principles we already understand. Some of my colleagues think a million, or perhaps a billion, fragmentary insults will eventually yield wisdom that surpasses that of any well-thought-out essay, so long as sophisticated secret statistical algorithms recombine the fragments. I disagree. A trope from the early days of computer science comes to mind: garbage in, garbage out.
P36
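An aside (mine, not Lanier's): a toy sketch of "statistical recombination of fragments." The fragments and the first-order Markov recombiner below are invented for illustration and stand in for no real aggregator's algorithm; the output is statistically plausible word-by-word but carries no argument, which is the garbage-in, garbage-out point.

```python
import random
from collections import defaultdict

random.seed(1)

# Hypothetical stand-ins for the "million fragmentary insults".
fragments = [
    "you know nothing about this",
    "this take is nothing but noise",
    "noise is all you ever post",
    "post something smart for once",
]

# Build a first-order Markov model over words across all fragments.
model = defaultdict(list)
for frag in fragments:
    words = frag.split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)

# "Recombine" the fragments by walking the model from a seed word.
word = "you"
output = [word]
for _ in range(12):
    followers = model.get(word)
    if not followers:
        break
    word = random.choice(followers)
    output.append(word)

print(" ".join(output))  # fluent-looking mush, not a well-thought-out essay
```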