Author Jon Agar is a historian at Manchester University in the United Kingdom and Associate Director of the National Archive for the History of Computing.
In this book, Agar focuses on the development and importance of the idea of the “stored program,” or the set of instructions that tells a machine what to do. The stored program transforms the modern computer from a “special-purpose machine” to a “universal machine.” This is his starting point for the book, but by the time he’s done, he has developed an investigation into the kind of society that needs a “universal machine” and why and how the computer’s development influences the way we think about everything, including the human mind itself.
Agar describes how the computer is a reflection of modern society, which is dominated by large organizations, corporations, and government bureaucracies. What these organizations have in common is a need for managers, which arose in the 19th century as a feature of the Industrial Revolution. Agar notes, “In answer to the question of what sort of society would ever need a general-purpose machine, we have a clue to where and when to start looking: we know that in the nineteenth century there emerged organizations that embodied in the manager and clerk the general-purpose/special-purpose split. While the work at the bottom of the pyramid is often repetitive and specialized, at the top the skills required are more strategic, more general. This modern two-tier world of general and special-purpose humans was built in the 19th century as a counter-revolution.” Thus, Agar draws a continuous line from the Industrial Revolution to the “computer revolution.”
A large part of the book, and perhaps the best part for the general reader, involves the very knowledgeable Agar tracing the history of computer development from Charles Babbage (1791-1871), through the contribution of the Jacquard Loom to computing (it relied on cards to form its patterns), the impact of the United States Census of 1890 on information processing, and finally, how the requirements of World War II for speedy information prompted computing advances made by Konrad Zuse, Kurt Pannke, and Helmut Schreyer in Germany; Howard H. Aiken, John W. Mauchly, and John V. Atanasoff in the United States; and Alan Turing in Great Britain. Without the war, the history of computer technology would be very different, says Agar.
Agar then presents a short biographical background of Turing. His father worked for the Indian Civil Service during the British Raj; Alan was raised by surrogates in Britain and educated at private schools and Cambridge, gaining a reputation for eccentricity and brilliance in math and science. Agar often refers to Andrew Hodges’ biography of Turing, which frequently makes the reader wish that book were close at hand for reference while reading this one.
The impact of mathematical thinkers like David Hilbert and Kurt Gödel on Turing is credited with encouraging him to develop a machine that could manipulate numbers, “a symbol-generating machine which could be said to have a certain number of states…By imagining these machines, Turing had taken an important step: he had given a straightforward and easily comprehensible meaning to the mathematical concept of ‘decidability.’” A chapter on the famous Enigma code-breaking events at Bletchley Park during WWII provides the background for Turing’s (and others’) questioning of whether the machines developed there could be said to “think.”
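The machine Agar describes can be made concrete with a short sketch. The following is not from the book; it is a minimal illustrative simulator of the kind of device Turing imagined: a finite set of states, a tape of symbols, and a transition table saying what to write, which way to move the head, and which state comes next. The function and example names are the reviewer's own invention.

```python
def run_turing_machine(tape, transitions, state="start", pos=0, max_steps=1000):
    """Run a simple one-tape Turing machine and return the final tape contents.

    `transitions` maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is "R" or "L". Unvisited cells read as the blank symbol "_".
    """
    cells = dict(enumerate(tape))  # sparse tape; blanks default to "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        write, move, state = transitions[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example machine: flip every bit, moving right until a blank is reached.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("1011", flip))  # prints "0100_"
```

The point of the construction, as Agar explains, is that the behavior of any such machine is itself just a table of symbols, which is why one universal machine can imitate them all.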
After his examination of how the first operational, electronic, stored-program computer came to exist in 1948, Agar provides the reader with a kind of summary of the book. He asks, “So how did the universal machine come to be used universally and in what kind of world?… Just explaining the one-off invention of a new machine is not enough, since every time it is reproduced calls for a story to explain why it was useful in those differing circumstances.” This then leads Agar to a discussion of Turing’s approach to whether machines can think.
Rather than answer this specific question, Turing instead invented a way for the question to be answered. He proposed an “imitation game” in which an interrogator is connected by a teleprinter to a man and a woman in another room. The object of the game was to see whether the interrogator could determine, by asking questions of the man and the woman, which answer came from which sex. Turing further asked what would happen if a machine replaced the man. If the interrogator could not tell the difference between the human and the machine, what conclusion could be drawn from that?
Turing’s “game” makes the machine and the human comparable, but gives the human the advantage because the machine is trying to imitate the human, and not the other way around. In 1950, Turing said, “I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.”
Other quotations provided from Turing about the nature of computing machines are fascinating. First, there is the way he speaks of the “human computer,” by which he means the human being doing mathematical calculations, but which opens the way for a psychological link between human thinking and machine processing. Doubters like Dr. G. Jefferson noted in 1949 that, while it is possible to build a machine that simulates feelings, “No mechanism could feel (and not merely artificially signal, an easy contrivance) pleasure at its successes, grief when its valves fuse, be warmed by flattery, be made miserable by its mistakes, be charmed by sex, be angry or depressed when it cannot get what it wants.” And Turing’s answer to these skeptics who insisted that “feelings” indicated a difference between a person and a machine? He said this argument was equivalent to saying that you have to be someone else to really know what that person feels. According to Turing, the entire world could be a simulation, and everyone could be pretending to feel. And “The criticism that a machine cannot have much diversity of behavior is just a way of saying that it cannot have much storage capacity.”
In the final chapter, Agar presents a meditation on the connection between private industry and government in the development of the computer. Going back to Babbage, he notes the continuing difficulty and necessity of obtaining funding. He also discusses how, in order to promote trust in governments increasingly open to influence from the lower classes and the growing welfare state bureaucracies that required many more state workers to maintain, authorities began to “cast the civil service as a machine. A civil service ‘machine’ would be neutral, interest-free, even efficient, and applicable to any task.” Civil servants themselves, although human, came in two types: generalist intellectuals and rule-following ‘mechanicals,’ a split that has survived to modern times.
Agar’s final point is that for the “compulsive programmer,” on whom we rely more and more to address any crisis with more technology, the world is becoming more like a computer. However, Agar himself believes that the “universal machine” represents the “materialization of bureaucracy and managerial capitalism,” and therefore, it is actually “made like the world.”
Overall, this is a fascinating trip through computer development, and while not overly informative about Turing’s life, it offers a worthwhile discussion of his theoretical views and his influence on computer technology.