
Lessig vs. Rushkoff

An examination of the problems and potential solutions Lawrence Lessig and Douglas Rushkoff present in "Code v2" and "Program or Be Programmed," their respective discussions of the impact digital technologies have on society.

Lessig and Rushkoff approach the same problem from different angles: the proliferation of digital technologies, and their increasing presence in our daily lives, is having a marked impact on individuals and the communities they participate in. Rushkoff focuses on the individual, and on the aspects of their lives people must be conscious of if they are to avoid having those lives dictated by the technology they use, while Lessig takes a wider view, examining the impact these technologies can have at a much larger scale.

Both authors agree that in the digital realm code can be used as a means to control the users of a software system. Rushkoff focuses on the notion that all technologies carry intrinsic biases in the ways they seek to be used and understood. These biases push people into the behavior patterns necessary to accommodate the technology, but the fact that our behaviors may be altered by the technology we use is not intrinsically bad. He maintains, however, that unless we understand these biases we cannot determine their true effects, and so cannot decide how positive or negative those effects may be. "It's not the networking of the dendrites in our skulls that matters so much as how effective and happy we are living that way and, in the case of digital media, how purposefully we get ourselves there. Recognizing the biases of the technologies we bring into our lives is really the only way to stay aware of the ways we are changing in order to accommodate them, and to gauge whether we are happy with that arrangement." (Rushkoff 2010, p. 34) Rushkoff's primary concern is that people do not understand the technology they use on a daily basis, and thus may be unaware of how it is shaping their behavior. His ten commandments are instructions meant to help everyday users stay mindful of the specific biases of digital media so that they can defend themselves against the system.

Lessig's approach to control is more concrete: in a digital system, code is law. By this interpretation, our behaviors are not dictated by some intrinsic property of the technology but are controlled by a programmer who has encoded into the system the set of behaviors we are allowed to engage in. From Lessig's point of view, then, we are not so much subject to the biases of the digital technology we use as to the biases of its authors. This is an important distinction, because different authors working with the same base technology may provide very different operating environments to their users. A further problem is that code can not only contradict actual laws but displace them as a mechanism for enforcing behavior. (Lessig 2006, p. 175) This is especially troubling because code cannot be ignored in the way real laws can. If a user disagrees with the author of the code about the behaviors they should be allowed to engage in, the only resolution is to change the code (which may be impossible if it is closed-source and the author cannot be convinced to make the changes). Rushkoff alludes to this as well, noting that "early computers were built by hackers, whose own biases ended up being embedded in their technologies." (Rushkoff 2010, p. 134) Throughout the book, however, he maintains that these biases can be overcome. The simplest means is to understand the technology and its biases, so that we can consciously adjust our behavior when they lead us in a negative direction. Failing that, learning to program and producing our own code is the only way out that still lets us make use of the technology. Unfortunately, Rushkoff's solution to many problems is simply to opt out when possible, and given his perspective on people it is not surprising that this is his best recommendation.
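
To make the distinction concrete, here is a minimal sketch of "code as law" (the function, rule, and limit below are invented for illustration and appear in neither book): a programmer who hard-codes a rule has not recommended a behavior but made every other behavior impossible within the system.

 # Hypothetical illustration: a rule encoded directly into a system.
 # Unlike a real-world law, this constraint cannot be ignored or broken
 # by the user; it can only be changed by whoever controls the code.
 MAX_POST_LENGTH = 280  # the author's bias, hard-coded
 
 def submit_post(text):
     """Accept a post only if it satisfies the encoded rule."""
     if len(text) > MAX_POST_LENGTH:
         # The system simply refuses; there is no appeal from within it.
         raise ValueError("Post rejected: exceeds the allowed length.")
     return {"status": "published", "text": text}

A different author shipping the same base technology could choose a different limit, or none at all, which is precisely why Lessig's distinction between the technology and its authors matters.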

Rushkoff spends a great deal of the book explaining the various biases intrinsic to digital technologies, and he frequently hints at a common thread in human behavior. By following that thread we can discover what he believes the natural biases of the users to be. His earliest example is a television viewer before the invention of the remote control, too lazy to change the channel during a commercial: "Before the remote control, the only other way out of imposed anxiety was to get up out of the recliner, take the popcorn off your lap, manually change the channel, and maybe adjust the rabbit ears." (Rushkoff 2010, p. 27) The introduction of the remote control, a nearly effortless means of changing the channel, then empowers the viewer to escape the advertisers at will. When discussing the impact of an "always on" device like a cell phone, Rushkoff also notes, "Being open to a call from a family member 24/7 doesn't require being open to everyone. The time it takes to program your phone to ring for only certain incoming numbers is trivial compared to the time wasted answering calls from people you don't want to hear from. We are more likely, however, to ignore the timeless bias of the digital and aspire to catching up with its ever-elusive pace." (Rushkoff 2010, p. 31) He carries this forward to software, stating, "our enthusiasm for digital technology, about which we have little understanding and over which we have little control leads us not towards greater agency, but toward less." (Rushkoff 2010, p. 140) The solution, then, is to gain control over the technology through knowledge of programming, which "is really no big deal to learn" (Rushkoff 2010, p. 133). But no matter how easy it is, it still requires more effort than pressing a button on a TV remote, and so Rushkoff believes we remain open to becoming victims of the technology we use.
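
Rushkoff's phone example really does amount to only a few lines of logic. The sketch below is an invented illustration (the numbers and names are placeholders, not anything from the book) of how little "programming" the selective ringing he describes would require:

 # Hypothetical sketch of "ring only for certain incoming numbers".
 # The whitelist is a placeholder; a real phone would expose this as a
 # settings screen rather than a script.
 FAMILY_NUMBERS = {"+10000000001", "+10000000002"}  # invented examples
 
 def should_ring(incoming_number):
     """Ring only for callers we have explicitly chosen to hear from."""
     return incoming_number in FAMILY_NUMBERS
 
 # Everyone else is silently sent to voicemail.
 print(should_ring("+10000000001"))  # True
 print(should_ring("+19999999999"))  # False

The point is Rushkoff's, not the code's: the effort involved is trivial next to the time lost by leaving the device in its always-on default.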

Perhaps the worst bias of all, then, is the one he does not explicitly discuss: humans are biased toward adapting to their environment through the path of least resistance. It was never difficult to change a television channel, but it was still easier to sit through commercials. And while it is not difficult to learn how computers work, it is still easier not to--even if avoiding the topic costs us far more time in the long run. There is some optimism in Rushkoff's writing, however, and he includes several examples of humans adapting the technology instead: "The 'missed call' feature on cell phones ended up being hacked to give us text messaging. Personal computers, once connected to phone lines, ended up being more useful as Internet terminals." (Rushkoff 2010, p. 142) These examples, however, are invariably also cases of a "technological elite" bestowing liberating inventions upon the masses. The same people who produced text messaging technology are the ones he accuses of giving us our always-on-by-default cell phones.

Lessig notes similar problems, and his solution also involves gaining a greater understanding of digital technologies and programming. "Code is technical; courts aren't well positioned to evaluate such technicality. But even so, the failure is not even to try." (Lessig 2006, p. 324) He also emphasizes that a well-educated populace (and technically competent courts) is not sufficient on its own, because "the problems that cyberspace reveals are not problems with cyberspace. They are real-space problems that cyberspace shows us we must now resolve--or maybe reconsider." (Lessig 2006, p. 313) There is still a great deal of ambiguity around how exactly governments should handle cyberspace and digital communications, and there is no technical means of resolving it. How this ambiguity gets resolved will have more to do with the biases of individuals and organizations than with the biases introduced by any medium or technology. Lessig stresses that the more we as the users of the technology--or the inhabitants of cyberspace--participate in the conversation, the more likely it is that these issues will be resolved in our favor. "When those who believe in the liberty of cyberspace, and the values that liberty promotes, refuse to engage with government about how best to preserve those liberties, that weakens liberty. Do-nothingism is not an answer; something can and should be done."

While Rushkoff seems open to, and somewhat optimistic about, the possibility that we can overcome our innate bias toward adaptation and inaction, Lessig is less encouraging. In discussing the relevant court rulings to date, he raises the point that much damage has already been done and that the future of cyberspace is already partly determined. "By striking down Congress's efforts to zone cyberspace, the courts were not telling us what cyberspace is but what it should be. They were making, not finding, the nature of cyberspace; their decisions are in part responsible for what cyberspace will become." (Lessig 2006, p. 317) And even though we are already feeling the effects of these changes, and of others made in opposition to the interests of cyberspace's primary users, it still seems to be too much trouble for most people to engage in building a digital network designed around our own needs and biases rather than ones that may work against us. "There are choices we could make, but we pretend there is nothing we can do. We choose to pretend; we shut our eyes. We build this nature, then we are constrained by this nature we have built." (Lessig 2006, p. 339)


References

Lessig, Lawrence (2006). Code: Version 2.0. First edition. New York: Basic Books.

Rushkoff, Douglas (2010). Program or Be Programmed: Ten Commandments for the Digital Age. First edition. USA: BookMobile.