Gary Flood reviews an important book on the real roots of computer technology and business – the dollars spent to win Cold Wars I and II
Last night I scared myself silly by renting a curious 1964 black & white Sidney Lumet movie that you probably haven’t heard of: Fail Safe. The scenario: in four enclosed environments – a room underneath the White House, a Strategic Air Command command-and-control centre in Nebraska, a war planning room at the Pentagon, and the cockpit of a nuclear bomber – the tragic consequences of a computer malfunction are played out, climaxing in the US President’s decision to wreak atomic destruction on New York City as the only way to atone for a successful H-bomb attack on Moscow by the misled airmen.
Here in the New World Order of 1997 it’s hard to recapture the nuclear paranoia of the so-called first Cold War (1945 to 1976, say) or the second Cold War (Reagan’s evil empire and Star Wars period, 1980 to circa 1989). Fail Safe – along with the same year’s deeply black comic counterpart Dr Strangelove – is a peek into that past. But apart from a reminder of a sinister and frightening time, it is only after reading Paul Edwards’ The Closed World: Computers And The Politics Of Discourse In Cold War America (MIT Press) that you get a glimpse into the real reason we’re all making a living these days. War, in other words – or more specifically, the decision by military budget controllers to invest literally billions of dollars, as an act of faith, in an unproven technology, the digital computer, as the only way to underpin the global electronic battlefield seemingly mandated by the logic of the Cold War.

Take the early warning defense system called SAGE, the Semi-Automatic Ground Environment. We talk of the Xerox Corp Palo Alto Research Center and its contributions to technology, but those are dwarfed by this endeavor. Invented for the project: magnetic core memory, video displays, light guns, the first effective algebraic computer language, graphic display techniques, simulation, synchronous parallel logic (simultaneous, not serial, bit transmission), analogue-to-digital and digital-to-analogue conversion techniques, digital data transmission over phone lines (some of the very first modems), duplexing and multiprocessing, and, last but not least, the very first networks (automatic data exchange among different computers). Computers as we know them come out of this work. So does the computer industry: IBM built 56 SAGE computers at a cost of $30m each, and at the project’s peak 20% of IBM’s workforce (7,000 staff) were paid by SAGE. In the 1950s as a whole, Big Blue derived over half its income from defense work.
But SAGE was only one of the so-called Big L projects of the 1950s, all having their roots in the large computing devices the Allies had built to crack Axis codes or to better aim anti-aircraft guns and other forms of artillery. The War was a time when scientists and warriors were forced to work together, and found it to their mutual advantage.

But SAGE didn’t work – and was never meant to. Long before it became operational in 1961, the intercontinental ballistic missile had rendered it obsolete (it was designed to track aircraft). Besides, most SAGE sites were built above ground, and at Strategic Air Command air bases, which would have been automatic targets for the Russians anyway. What the Pentagon didn’t bother explaining was that taxpayers were really funding a glorious research and development binge, to give the military proof of concept of the beguiling potential of computers as instruments of war, not to defend their homes. For SAGE was supposed to fight off an attack that would never come: all US nuclear war strategy in the ’50s and ’60s was based on a (secret) first strike against the Soviet Union anyway. If SAGE was meant to do anything, it was to mop up a dazed and ineffective counter-attack.

Artificial intelligence is perhaps the branch of computer science that owes most to the deep pockets of the military. The US Navy funded the 1956 Dartmouth Conference at which the discipline was founded, and successive bodies – the Office of Naval Research, the Advanced Research Projects Agency (renamed in 1972 the Defense Advanced Research Projects Agency), and, in the 1980s, the Strategic Computing Initiative – have provided the vast majority of the money for research into AI.
Four-minute warning
But Edwards, a social scientist at Stanford, has a larger canvas than simple history, and one can take or leave the picture he chooses to paint. If sociology and philosophy leave you cold, there are sections of the book you may want to skip. For his thesis is that the Cold War, the cyborg, movies like Colossus: The Forbin Project, Terminator (1 and 2) and 2001, and novels like Neuromancer are all interlinked in the creation of a particular closed-world identity and discourse. If that doesn’t interest you, how about discovering how in 1979 we came four minutes away from nuclear war, when NORAD’s computers reported that the Soviets had unleashed a submarine-based ballistic missile attack? US defenses tripped into retaliation mode until frantic officers found a training tape still mounted on one of the drives. A recommended read.
The Closed World: Computers And The Politics Of Discourse In Cold War America, Paul N Edwards, MIT PRESS 1996, ISBN: 0-262-05051-X, $40.00.