Charles Petzold



Alan Turing’s Contribution

Turing Day: June 23, 2012
Roscoe, N.Y.

As I was researching my book The Annotated Turing: A Guided Tour through Alan Turing's Historic Paper on Computability and the Turing Machine (Wiley, 2008), my appreciation for Turing's profound contribution to computing was brought into focus by this sentence from an article (not by Turing) about computers published in 1956:

    If it should turn out that the basic logics of a machine designed for the numerical solution of differential equations coincide with the logics of a machine intended to make bills for a department store, I would regard this as the most amazing coincidence that I have ever encountered.
The author of that 1956 article was no dummy. It was Howard Aiken, who had been involved with digital computers since 1937 and who was the primary mind behind the seminal Harvard Mark I, built by IBM. We might forgive Aiken if he were merely talking about computer hardware or software somewhat tailored for different types of applications. But no. He's clearly talking about the "basic logics" of digital computing, and by 1956 he really should have known better.

Now let's hear Alan Turing in his article "Computing Machinery and Intelligence" (the famous "Turing Test" article), published in a 1950 issue of Mind:

    This special property of digital computers, that they can mimic any discrete state machine, is described by saying that they are universal machines. The existence of machines with this property has the important consequence that, considerations of speed apart, it is unnecessary to design various new machines to do various computing processes. They can all be done with one digital computer, suitably programmed for each case. It will be seen that as a consequence of this all digital computers are in a sense equivalent.
Today is the 100th anniversary of the birth of Alan Turing, who was the first person to really understand these concepts in a modern way. Turing's understanding dated from 1936, when at the age of 24 he published a paper with the forbidding title "On Computable Numbers, with an Application to the Entscheidungsproblem." The ostensible purpose of this paper was to answer a question posed by the German mathematician David Hilbert in the field of first-order predicate logic, but it went far beyond that immediate goal.

Alan Turing had a mind that worked unlike that of anyone else, and he wasn't much interested in the way that other people solved mathematical problems. Today, he would probably be diagnosed with Asperger's syndrome (at least to some degree), but what impresses me most is how he was able to understand the nature of mental processes that are universal among all sentient humans.

To solve Hilbert's question in mathematical logic, Turing went deep into himself and analyzed the steps involved in carrying out any numerical process (for example, long division). He then broke these operations down into simple abstract steps, so trivial that it's hard to imagine how they could be made any simpler, and yet they formalize the universal process of working with numbers: during each step of a mathematical recipe, you might write down a symbol (a letter or a number), and you might later erase it or replace it with another symbol. You examine what symbol exists in a particular place, and you base your next step on that.

Turing shows how these individual simple steps can be consolidated in a table that encapsulates the numerical process. Such a table came to be known as a Turing Machine, and it is an abstract formulation of what we now call an "algorithm." Turing shows that this abstract machine is equivalent to any digital computer (even though at the time, digital computers did not yet exist) as well as to human minds working on these same problems. More than anyone else, Turing understood that digital computing has much less to do with hardware than with software.
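To make such a table concrete, here is a minimal sketch of a table-driven machine in Python. The simulator, its variable names, and the choice of language are mine, of course; Turing's paper predates programming languages by more than a decade. The sample table, though, is a rendering of Turing's own first example: a machine that prints 0 and 1 on alternate squares, forever.

    def run(table, steps):
        tape = {}                       # the tape: square position -> symbol
        state, pos = 'b', 0             # Turing calls his starting state 'b'
        for _ in range(steps):
            scanned = tape.get(pos, '')             # examine the current square
            write, move, state = table[(state, scanned)]
            tape[pos] = write                       # print (or erase) a symbol
            pos += {'R': 1, 'L': -1, 'N': 0}[move]  # shift the head one square
        return ''.join(tape.get(i) or ' ' for i in range(min(tape), max(tape) + 1))

    # (state, scanned symbol) -> (symbol to write, head motion, next state)
    table = {
        ('b', ''): ('0', 'R', 'c'),
        ('c', ''): ('',  'R', 'e'),     # skip a square, leaving it blank
        ('e', ''): ('1', 'R', 'f'),
        ('f', ''): ('',  'R', 'b'),
    }

    print(run(table, 20))               # 0 1 0 1 0 1 0 1 0 1

Notice that everything particular to the computation lives in the table; the run function, the "hardware," never changes. That is the hardware/software split in miniature.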

Yet this fundamental equivalence among digital computers is a double-edged sword: All digital computers are computationally equally capable, but also equally deficient. It is a crucial outcome of Turing's paper that certain problems are beyond the reach of any digital computer, including the computer inside our own heads. The big deficiency involves applying the tool of software to itself: We can never be sure that an arbitrary computer program will run correctly, and we can't write a program that can successfully debug any other program. Before anyone had actually built a digital computer, Turing had already demonstrated their intrinsic limitations!
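The core of that argument can be sketched in a few lines. This is the standard modern retelling of Turing's result rather than the notation of the 1936 paper, and the names halts and contrary are mine. The body of halts is deliberately left unwritten: the sketch assumes such a function exists and watches the assumption collapse.

    def halts(program, argument):
        """Hypothetically: True if program(argument) eventually halts,
        False if it runs forever. Turing's result is that no correct
        body for this function can be written."""
        ...

    def contrary(program):
        # Ask the checker about `program` run on itself, then do the opposite.
        if halts(program, program):
            while True:
                pass        # checker said "halts" -- so loop forever
        else:
            return          # checker said "runs forever" -- so halt at once

    # Now consider contrary(contrary):
    #   If halts(contrary, contrary) returns True,  contrary(contrary) loops forever.
    #   If it returns False,                        contrary(contrary) halts at once.
    # Either way, halts gives a wrong answer about some program, so no such
    # function can exist -- and with it goes any program that could reliably
    # debug every other program.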

For those of us who are programmers, Alan Turing's 1936 paper on computable numbers is our foundational document. But its universality has revealed implications far beyond programming, and it continues to contribute to our understanding of ourselves, of our minds, and of the informational nature of the universe.

On this 100th anniversary of Alan Turing's birth, we know that as we stare into the Turing Machine, the Turing Machine stares back at us.
