PETZOLD BOOK BLOG

Charles Petzold on writing books, reading books, and exercising the internal UTM



“The Imitation Game” and Alan Turing’s Real Contribution to Computing

December 11, 2014
New York, N.Y.

As Alan Turing (portrayed by Benedict Cumberbatch) races against the clock to build a machine to crack the Nazi Enigma code in the recent movie The Imitation Game, only Joan Clarke (Keira Knightley) understands the underlying quest of this tortured genius.

“You’re trying to build your Universal Machine,” she says [about 44 minutes into the movie], alluding to Turing’s 1936 paper on Computable Numbers. “You theorized a machine that could solve any problem. It didn’t just do one thing; it did everything. It wasn’t just programmable. It was reprogrammable,” to which Turing grunts an assent.

But the Turing character really shouldn’t agree, for two reasons:

First, contrary to what the Joan Clarke character says — or rather, contrary to the screenplay by Graham Moore — the machine portrayed in the movie being built at Bletchley Park (which was called a “bombe” but which the movie ludicrously names “Christopher”) does not qualify as a Universal Machine. It is a special-purpose machine rather than a general-purpose computer.

Second, although the hypothetical Universal Machine that Turing constructed in his 1936 paper is recognized today as an abstract model of digital computing, it cannot solve every mathematical problem. Some problems are forever out of its reach, and they remain impossible even on today’s sophisticated and powerful computers.

We might excuse a Hollywood biopic such as The Imitation Game for lapses of mathematical accuracy (although the inaccuracies of the movie go far beyond the mathematical), but the same problem pops up in the 2011 British television documentary released in the U.S. as Codebreaker. Here we are told [about 17-18 minutes into the movie] that “Turing’s Universal Machine was purely hypothetical but it laid out the fundamental principle underpinning all computers—that any conceivable mathematical calculation can be done by a single device shuffling 1’s and 0’s back and forth…. And part of Alan Turing’s genius was to realize that a machine like this can compute absolutely anything, because anything can be written as 1’s and 0’s.” Not true! Computers cannot compute everything. They have built-in restrictions that are intrinsic to the mathematics of digital computation.

Everyone agrees that Alan Turing is a seminal figure in the computer revolution, so it's puzzling why these movies deliberately misrepresent his work. We should at least make an attempt to get it right, and particularly get a handle on Turing's most significant contribution to computer science, the 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem.”

The Entscheidungsproblem or “decision problem” was posed by German mathematician David Hilbert in the 1920s. Hilbert was working with a form of mathematical logic, and he asked: Is there a general procedure that can tell us whether any arbitrary statement in this logic is true or false?

Today we call such a general procedure an “algorithm,” a methodical mathematical recipe formulated to be independent of the person who carries it out. The mechanical nature of such processes perhaps suggested to Turing a unique approach for answering Hilbert’s question: Turing imagined the actions of a person carrying out an algorithm as a machine that performs simple rudimentary steps of writing and erasing symbols on paper based on a history of previous steps. Each algorithm is implemented in what Turing called an “automatic machine” but which today we call a Turing Machine.
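
To make this concrete, here is a minimal sketch of such a machine in Python. The rule table and the tape-as-list representation are my own illustrative choices rather than Turing's notation: each rule says, for a given state and the symbol under the head, which symbol to write, which way to move, and which state to enter next.

    # Example machine: flip every 0 to 1 and every 1 to 0, then halt at a blank.
    # Each rule maps (state, symbol under the head) to
    # (symbol to write, head movement, next state),
    # i.e. the simple rudimentary steps of writing and erasing symbols.
    RULES = {
        ("scan", "0"): ("1", 1, "scan"),
        ("scan", "1"): ("0", 1, "scan"),
        ("scan", " "): (" ", 0, "halt"),
    }

    def run(rules, tape, state="scan", head=0):
        """Carry out one rule at a time until the machine halts."""
        tape = list(tape) + [" "]      # a blank cell marks the end of the input
        while state != "halt":
            write, move, state = rules[(state, tape[head])]
            tape[head] = write         # write (or erase) a symbol
            head += move               # move the head along the tape
        return "".join(tape).rstrip()

    print(run(RULES, "10110"))         # prints "01001"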

By formalizing algorithms as sequences of simple uniform steps, Turing could assign each algorithm a unique number that describes these steps. Turing then created a special Turing Machine called the Universal Machine. If you input the number for any Turing Machine into the Universal Machine, it will carry out that algorithm.
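
Continuing the toy sketch above, the whole rule table can itself be serialized into a single integer, and one fixed routine can decode that number and carry out whichever machine it describes; that routine plays the role of the Universal Machine. The encoding scheme below is an arbitrary illustration, not the one Turing used.

    import json

    def encode(rules):
        """Turn a rule table into a single natural number."""
        text = json.dumps([[list(k), list(v)] for k, v in rules.items()])
        return int.from_bytes(text.encode("ascii"), "big")

    def universal(machine_number, tape):
        """Decode a machine number, then simulate the machine it describes."""
        size = (machine_number.bit_length() + 7) // 8
        text = machine_number.to_bytes(size, "big").decode("ascii")
        rules = {tuple(k): tuple(v) for k, v in json.loads(text)}
        # The starting state "scan" is assumed by this toy encoding.
        tape, state, head = list(tape) + [" "], "scan", 0
        while state != "halt":
            write, move, state = rules[(state, tape[head])]
            tape[head] = write
            head += move
        return "".join(tape).rstrip()

    # The bit-flipping machine from the previous sketch, now given as a number:
    FLIP = {
        ("scan", "0"): ("1", 1, "scan"),
        ("scan", "1"): ("0", 1, "scan"),
        ("scan", " "): (" ", 0, "halt"),
    }
    print(universal(encode(FLIP), "10110"))   # prints "01001"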

At the time of Turing's paper, no working digital computer had yet been built. Yet the concepts are all there: A Turing Machine is analogous to the bits and bytes of a computer application, and the Universal Machine is the computer that reads and runs these apps. Only in retrospect after actual digital computers had been built was it evident that Turing had already established a deep understanding of the mathematical foundations of computing. Turing’s work implies that all digital computers and programming languages are fundamentally equivalent in their algorithmic capabilities.

Yet the real intent of Turing’s paper on Computable Numbers was to demonstrate that these machines — and hence human beings carrying out algorithms — were inherently limited in what they could do. There is no general procedure for the Entscheidungsproblem, and the problem goes deeper.

In particular, one important category of Turing Machine is impossible: You cannot define a Turing Machine that can determine if other Turing Machines are working correctly.

Translated to modern terms, this has unnerving implications: Computer programs often contain flaws or errors called bugs, and the work of debugging software is a vital part of a programmer’s job. But this is a task that cannot be automated. You cannot write a generalized computer program that will find all the bugs in other computer programs. (If there were such a thing, the world wouldn’t have held its breath as the year ticked from 1999 to 2000.) The problem is that you must use this program on itself, and because you don’t know if this debugging program is itself entirely free of bugs, you can’t entirely trust it.
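
For readers who want to see why, here is the standard diagonal argument behind this limit, sketched in Python. The analyzer would_halt is hypothetical; the point of the argument is that no correct version of it can ever be written, and a perfect automated bug-finder would have to contain one.

    def would_halt(program_source: str, program_input: str) -> bool:
        """Hypothetical perfect analyzer: True if running program_source
        on program_input would eventually finish."""
        raise NotImplementedError("no correct implementation can exist")

    def contrary(program_source: str) -> None:
        """Do the opposite of whatever the analyzer predicts about a
        program that is fed its own source code."""
        if would_halt(program_source, program_source):
            while True:      # predicted to halt, so loop forever
                pass
        # predicted to loop forever, so halt immediately

    # Feeding contrary its own source code leaves would_halt with no
    # correct answer: whichever way it predicts, contrary does the opposite.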

Or, to rephrase the problem in the form of an age-old question: Who will debug the debugger?

So to say that Turing’s Universal Machine can “solve any problem” or “compute absolutely anything” is simply wrong, and the statement is wrong for today’s digital computers as well. For the makers of The Imitation Game and Codebreaker to ignore this fact is not just a simplification for the non-mathematically inclined. It is a fundamental misrepresentation of the capabilities and limitations of digital computers, and either a deliberate or ignorant failure to describe Alan Turing's work.

Almost 80 years ago, before a single digital computer had been built and before any software for these computers had been written, Alan Turing demonstrated that we can never be certain that such software works correctly. As we increasingly use computers and software to run this world, of course we’d like some assurance everything is working OK. But that’s not possible. A perpetual state of uncertainty is now yet another aspect of the human condition. This is Turing’s real contribution to computing, and it should not be a secret only among computer scientists.

Of course, it’s not Turing’s fault. That’s just the way the math works out: Buggy software is as inevitable as death and taxes.

“Petzold will be a stalwart companion to any reader who undertakes to read Turing's classic with his aid. The Annotated Turing will also be quite enjoyable to a more casual reader who chooses to dip into various parts of the text.” — Martin Davis in American Scientist

Comments:

Where's the e-book version?

Jeff Martens, Thu, 11 Dec 2014 10:52:45 -0500

The Annotated Turing is built around chunks of Turing's original paper reproduced in its original typography. These tend to prevent the book from being issued in a reflowable ebook format. — Charles

Terrific post. I hope that we can look forward to another dissecting the inaccuracies in the movie that you mention that "go far beyond the mathematical."

— Stewart, Thu, 11 Dec 2014 10:57:57 -0500

Kind of... although we can solve a lot of problems without Turing completeness. We can solve a lot of (most) problems in logical systems that are guaranteed to halt -- [http://en.wikipedia.org/wiki/Total_functional_programming].

We should not let programmers off the hook *too* easily for bugs!

Buggy software is inevitable (at the specification level), but we can and should do a lot better than the current state of affairs.

— aProgrammer, Thu, 11 Dec 2014 12:00:19 -0500

Great post. Funnily enough I picked up your book last week and can't put it down, what a pleasant read. Congrats.

— Luiz, Thu, 11 Dec 2014 12:05:52 -0500

Too bad this post also misrepresents Turing's innovation. In fact, he wasn't the first one to find this: Gödel already proved the impossibility of this very problem in ... 1931, and using pretty much the same method, but in Peano arithmetic rather than his own theory. In short: he encoded every problem as a unique natural number and worked on them with arithmetic rules. He then proved that you couldn't have an arithmetic that would be at the same time sound and complete. No doubt Turing was a genius, but it is sad to see how his work is completely misrepresented by ... pretty much everybody speaking about him. It is far too common to give to Turing the achievements of others.

— Pierre, Thu, 11 Dec 2014 12:21:02 -0500

Nobody is denigrating Gödel's contribution, and, of course, Turing referred to Gödel's work in his own paper. And although completeness and decidability are certainly related, they are two different problems. Moreover, both Turing and John von Neumann recognized the unique importance of Turing's work to nascent computer technologies, whereas Gödel's work did not seem to have that applicability. — Charles

You say "We might excuse a Hollywood biopic ... for lapses of mathematical accuracy" but surely film makers are allowed to depict fallible characters. Joan Clarke makes a statement to the main protagonist which the audience may possibly want to ask him themselves. The accuracy of which is unimportant.

— Chris Nash, Sun, 14 Dec 2014 04:07:41 -0500

But if the question is really something the audience wants to ask themselves, why should the answer plunge the viewer into even greater ignorance? — Charles

An interesting question is how humans are capable of bridging the gap from Turing machines to discovering new mathematical facts. Maybe the human brain is more than just a neural network and humans are more than Turing machines.

Cristian Minoiu, Wed, 31 Dec 2014 19:16:18 -0500



(c) Copyright Charles Petzold
www.charlespetzold.com