Precognitive Systems: How Cybernetics Could Control the Future

In my previous essay, titled Cybernetic Ideologies, I discussed how a generation of young activists in the 1960s and 1970s re-appropriated cybernetic theories in their attempts to create a new type of non-hierarchical society. In rebellion against the post-war trends of consumerism and “mechanisation”, the counterculture claimed that this new society would be based on the accessibility of knowledge and powered by the new possibilities offered by computer technologies. The application of cybernetics behind the supposed 'mechanisation of society' came from military-led research into systems theory in the first two decades of the Cold War. Game theory, operations research, and cybernetics – the three 'war sciences' (Galison, 1994: 231) – allowed the American military to rapidly develop new technologies of warfare, among them the computer. The computer became a physical manifestation of the paranoid drive for 'control' over a chaotic system of international relations, and a further expression of the desire to demonstrate the triumph of Capitalism over Communism.

In this essay, I aim to provide a brief overview of the military application of cybernetics in the Cold War era, particularly during the period of intense innovation and investment in the various research and development labs of the 1950s and 1960s. In the United States, the fear of an imminent communist invasion drove an obsession with control, further exacerbated by crises such as the Berlin Blockade and the Cuban Missile Crisis, which very nearly ignited another world war. Around the same time, Norbert Wiener's development of cybernetics – a science of 'control and communication in the animal and the machine' – promised an attractive mathematical framework with which to organise military policy and develop new technologies to combat the Soviet threat. The resulting application of cybernetics can be seen as a science of prediction: Wiener's initial theories were expanded by his contemporaries into complex models that could supposedly predict the behaviour of the Russians, and suggest how to respond in the event of an invasion. As the many research laboratories around the United States enjoyed massive military investment, cybernetics became not only a possible solution to the problems of the era, but an ideology – a belief that the world was composed of complex systems that could be controlled, manipulated, and perhaps most importantly, predicted. As David Mindell states in his paper Knowledge Domains in Engineering Systems, cybernetics became what could be considered a 'philosophy of technology'. (2000: 1)

In the 1940s, Norbert Wiener experimented with creating a model of a manned anti-aircraft gun that could estimate the future position of an enemy aircraft. In developing what he called the Anti-Aircraft Predictor, Wiener had to create a computational system that could account for “not only the mind of an inaccessible Axis opponent, but of the Allied anti-aircraft gunner as well, and then even more widely to include the vast array of human proprioceptive and electro-physiological feedback systems.” (Galison, 1994: 229) Wiener expanded this unifying vision of man and machine into a mathematical framework that he named cybernetics, presenting his studies in the book Cybernetics: Or Control and Communication in the Animal and the Machine (1948), and to great acclaim at the Macy Conferences (1946–1953). From its inception, it is clear that cybernetics was directly concerned with temporality – more specifically, the conditions of a dynamic system at a moment in time. In developing the Predictor, the mathematical challenge lay in 'predicting the future value of a pseudo-random function based on its statistical history (its autocorrelation).' (Mindell, 2000: 2) The result was the augmentation of the human operator's abilities into a servo-mechanical precognitive system, adjusting the aim of the gun to take into account the potential future manoeuvres of the aircraft.
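
The principle can be sketched in a few lines of code. The following Python fragment is a minimal, hypothetical illustration rather than a reconstruction of Wiener's apparatus: the toy 'flight path' and the function names are my own inventions. It fits a linear predictor to a signal's sample autocorrelation via the Yule-Walker equations (the discrete analogue of the problem Mindell describes) and extrapolates the next value:

```python
import numpy as np

def autocorr(x, max_lag):
    """Biased sample autocorrelation of a zero-mean signal at lags 0..max_lag."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

def linear_predictor(x, order):
    """Solve the Yule-Walker equations for coefficients a such that
    x[t] is approximated by a[0]*x[t-1] + ... + a[order-1]*x[t-order]."""
    r = autocorr(x, order)
    # Toeplitz matrix of autocorrelations at lags |i - j|
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])

# Toy 'flight path': a smooth manoeuvre plus observation noise (invented data).
rng = np.random.default_rng(seed=0)
t = np.arange(500)
path = np.sin(0.05 * t) + 0.1 * rng.standard_normal(len(t))
path -= path.mean()  # the predictor assumes a zero-mean signal

a = linear_predictor(path, order=4)
# Estimate the next position from the four most recent observations.
predicted = np.dot(a, path[-1:-5:-1])
print(f"predicted next position: {predicted:.3f}")
```

The solved coefficients weight recent observations so that the prediction extrapolates the manoeuvre one step ahead – precisely the use of 'statistical history' that Mindell describes.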

Faced with the choice of anthropomorphising the mechanical gun or mechanising the human operator, Wiener chose the latter, reasoning that mechanical devices could be described mathematically in far greater detail than the physiological and cognitive functioning of the human body. The human was rendered as a rational subject governed by simplified laws of self-preservation: a 'self-maintaining mechanism'. (Galison, 1994: 246)

This switch in understanding followed the wider trend of the conceptual mechanisation of organic subjects, allowing for greater depths of precognitive control. For example, cybernetic computing systems were seen as an important political tool for analysing the rapidly increasing quantities of data from sources such as the U.S. census. With such systems, the statistics could be processed much more quickly and presented more comprehensibly to policy-makers, shortening the temporal gap between survey and action. The sociological results of this application of cybernetics are readily visible in the control systems of the 21st century – as Brian Holmes writes: “The myriad forms of contemporary electronic surveillance now constitute a proactive force, the irremediably multiple feedback loops of a cybernetic society, devoted to controlling the future.” (2007) In this regard, we can understand cybernetics as a surveillant mathematics: an attempt to make sense of the behaviour of information within a given system.
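
The 'feedback loop' Holmes invokes is the elementary mechanism of cybernetics: a system measures its own output, compares it against a goal, and feeds the error back to steer its future behaviour. A minimal sketch in Python – with an invented target, gain, and function name, purely for illustration – shows how such a loop converges on its goal:

```python
# A toy negative-feedback loop: measure the output, compare it with a
# target, and feed the error back into the next action. The gain value
# and target below are invented for illustration.

def feedback_loop(target, gain=0.5, steps=8):
    """Steer a state towards a target by acting on the measured error."""
    state = 0.0
    for step in range(steps):
        error = target - state      # measurement and comparison
        state += gain * error       # corrective action, proportional to error
        print(f"step {step}: state={state:.4f}, error={error:+.4f}")
    return state

feedback_loop(target=1.0)
```

With a gain between 0 and 1, the error shrinks geometrically at each step: the loop 'controls the future' by continually folding measurements of the present back into its next action.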

During a time of tense Soviet-American relations, the importance of cybernetics as a war science was largely agreed upon amongst academics, while the media portrayed cybernetics to the masses as 'the epitome of computerized modernity'. (Barbrook, 2005: 48) After the shock of the Sputnik satellite launch, the dynamics of funding research into technology were dramatically altered: “Science, technology, and engineering were totally reworked and massively funded in the shadow of Sputnik.” (Hunger, 2008: 6) Fuelled by massive investment from the US military, designed to retain the country's hegemonic position during the tense Cold War, cybernetics became increasingly relevant to the advancement of computer technologies at research institutions such as the National Defense Research Committee, RAND, and ARPA. As computer technologies advanced, so did the perceived ability of the Americans to control the future. By the 1960s, the functionality of these cybernetic computer systems was broad: the military used them to “plan the destruction of Russian cities, organise the invasion of 'unfriendly' countries, […] and pay the wages of its troops and manage its supply chain.” (Barbrook, 2005: 41) That these uses stretched from the purely utilitarian to the planning of potential military campaigns illustrates how cybernetics had permeated as an ideology: it was not simply a mathematical toolkit, but a universal methodology for controlling and understanding flows of information.

While the military purpose of much of the technological advancement made at the research and development laboratories across the US is undeniable, the Americans also gained greatly from a propagandistic perspective. The great funding drive of the 1950s and 1960s was a means to beat the Soviets on a technological war front, publicly illustrating how Capitalism offered a more advanced future for its citizens – the 1964/65 New York World's Fair being an expensive performance of this. The IBM pavilion was a museum for the future, attempting to explain the complexity of cybernetic computer systems to the general public through a spectacle of interactive mainframes and a nine-panel projected animation by Charles and Ray Eames: “The theme of this ‘mind-blowing’ multi-media show was how computers solved problems in the same way as the human mind. The audience learnt that the System/360 mainframes exhibited in the IBM pavilion were in the process of acquiring consciousness: artificial intelligence.” (Barbrook, 2005: 15) This public manifestation of cybernetic computing research at events such as the World's Fair was about the control of appearances: visitors were invited to interact with the machines, re-presenting them not as nuclear missile launchers but as curious and helpful tools that, in the near future, would be present as artificially intelligent devices in the homes of suburban America. Barbrook calls this unrealised fantasy the 'imaginary future' – a propagandistic performance of the future designed to illustrate the great promise of the Capitalist system.

The attempts to demystify the machine and perpetuate the fantasy of the robot servant at the IBM pavilion were partly a reaction to the cinematic representation of technology at the time. Barbrook describes the public's fascination with how characters such as Robby the Robot from the 1956 film Forbidden Planet could become consumer objects in the suspended near-future suggested by IBM at the World's Fair. (2005: 16) While much science fiction rendered cybernetic technologies as friendly and convenient, a number of high-profile films in the 1960s began to engage with the political background of the computer as a control interface.

The most obvious example of this is 2001: A Space Odyssey (1968), in which Stanley Kubrick and Arthur C. Clarke raise the point that every increase in the degree of control ceded to machines also increases the possibility of a 'catastrophic glitch'. In 2001, the relationship between the astronauts and HAL, the artificially intelligent computer on board the spaceship, becomes tense when, after HAL misdiagnoses a technical problem, the astronauts consider shutting the system down as a precautionary measure. A paranoid fear of losing control causes HAL to prioritise self-preservation and retain power over the ship, resulting in a series of catastrophic events for the astronauts on board. The great fear of the catastrophic glitch was not only the infliction of immediate damage on a given system, but the wider attack on the assumed stability of computer systems and on the ideology of cybernetics that had helped create them. For the human operator, a glitch represents a loss of control; a glitch in a precognitive system therefore shatters the illusion of control over the future. The glitch is a reminder that the window onto the near-future provided by cybernetic systems is a simulation, governed by potentiality and statistics.

“In 1961, the influence of the men from RAND increased dramatically. The new president John F. Kennedy turned to them to impose order – not only on nuclear strategy, but the arms race, which was threatening to run out of control.” (Curtis, 1992) The militarisation of cybernetics created a new ideology symptomatic of the Cold War era of anxiety and paranoia. In his documentary Pandora's Box: To the Brink of Eternity, Adam Curtis describes how the advocates of cybernetics and game theory used rational systems to decide what political and military steps should be taken in response to the Cuban Missile Crisis. Faced with the potential of escalating the crisis into a third world war through an incorrect precognition, “they found out they had no idea how the other side would respond to any move they made.” (Curtis, 1992) While science-fiction cinema heightened anxiety through dramatic narratives about runaway machines, the cybernetic ideology was suffering from a different kind of glitch: the very real issue of human error. In the moments of high crisis when the value of the war sciences should have been demonstrated, it became apparent that their cybernetic systems only simulated precognition.

The coupling of the catastrophic technological glitch with human error reveals an inherent unsuitability of cybernetics for military application. In To the Brink of Eternity, the systems theorists working at the many research labs around the United States are portrayed as egotistical and megalomaniacal, using cybernetic theory as an excuse to maintain their personal positions of authority. As a result, the systems themselves were distorted in order to prove the worth of further investment – Curtis gives the example of the US Air Force providing the researchers with flawed data on the Soviet stock of nuclear weapons in order to acquire more aircraft. When the statistical basis of a system is fictitious, its fragile simulation of the future is thrown into complete disarray. At a time when men in various military departments pushed competing agendas for control and investment, the subjectivity of statistical data was an inherent flaw in the supposedly unassailable application of the war sciences. By the early 1950s, Norbert Wiener had come to question the ethics of systematically creating military weaponry, writing that he felt he had lost control over the future of cybernetics; he 'repeatedly stressed the power of cybernetics to save, enslave, or destroy humanity.' (Galison, 1994: 254) While the application of cybernetics prevalent during the 1950s and 1960s largely focused on managing possible future military campaigns and justifying the powerful position of the US Department of Defense, the countercultural re-readings of cybernetic concepts in the late 1960s and 1970s would focus on the first of Wiener's three visions of the future.

Bibliography

Barbrook, Richard (2005) Imaginary Futures

Galison, Peter (1994) The Ontology of the Enemy

Holmes, Brian (2007) Future Map: Or, How Cyborgs Learned to Stop Worrying and Love Surveillance source: http://brianholmes.wordpress.com/2007/09/09/future-map/ accessed 25-03-12

Hunger, Francis (2008) Setun: An Inquiry into the Soviet Ternary Computer

Mindell, David (2000) Cybernetics: Knowledge Domains in Engineering Systems source: web.mit.edu/esd.83/www/notebook/Cybernetics.PDF

Triclot, Mathieu (2006) Norbert Wiener's Politics and the History of Cybernetics

Filmography

Curtis, Adam (1992) Pandora's Box: To the Brink of Eternity source: http://thoughtmaybe.com/video/pandoras-box

Eames, Charles and Ray (1958) The Information Machine source: http://archive.org/details/InformationM

Kubrick, Stanley (1968) 2001: A Space Odyssey

Sargent, Joseph (1970) Colossus: The Forbin Project