PETZOLD BOOK BLOG
Turing Day: June 23, 2012
Roscoe, N.Y.
As I was researching my book The Annotated Turing: A Guided Tour through Alan Turing's Historic Paper on Computability and the Turing Machine (Wiley, 2008), my appreciation for Turing's profound contribution to computing was brought into focus by this sentence from an article (not by Turing) about computers published in 1956:
The author of that 1956 article was no dummy. It was Howard Aiken, who had been involved with digital computers since 1937 and who was the primary mind behind IBM's seminal Harvard Mark I. We might forgive Aiken if he had merely been distinguishing between hardware or software somewhat tailored for different types of applications. But no. He's clearly talking about the "basic logics" of digital computing, and by 1956 he really should have known better.
Now let's hear Alan Turing in his article "Computing Machinery and Intelligence" (this is the famous "Turing Test" article) for a 1950 issue of Mind:
Today is the 100th anniversary of the birth of Alan Turing, the first person to understand these concepts in a truly modern way. Turing's understanding dated from 1936, when at the age of 24 he published a paper with the forbidding title "On Computable Numbers, with an Application to the Entscheidungsproblem." The ostensible purpose of this paper was to answer a question posed by German mathematician David Hilbert in the field of first-order predicate logic, but it went far beyond that immediate goal.
Alan Turing had a mind that worked unlike that of anyone else, and he wasn't much interested in the way that other people solved mathematical problems. Today, he would probably be diagnosed with Asperger's syndrome (at least to some degree), but what impresses me most is how he was able to understand the nature of mental processes that are universal among all sentient humans.
To solve Hilbert's question in mathematical logic, Turing went deep into himself and analyzed the steps involved in carrying out any numerical process (for example, long division). He then broke down these operations into simple abstract steps. These steps are so trivial, it's hard to imagine how much simpler they could be, and yet they formalize the universal process of working with numbers: during each step of a mathematical recipe, you might write down a symbol (or letter or number), and you might later erase it or replace it with another symbol. You examine what symbol exists in a particular place, and base your next step on that.
Turing shows how these individual simple steps can be consolidated into a table that encapsulates the numerical process. Such a table came to be known as the Turing Machine, and it is an abstract formulation of what we now call an "algorithm." Turing shows that this abstract machine is equivalent to any digital computer (even though at the time, digital computers did not yet exist) as well as to human minds working on these same problems. More than anyone else, Turing understood that digital computing has much less to do with hardware than with software.
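The kind of table Turing describes can be sketched in a few lines of Python. This is my own illustrative example, not Turing's notation: each (state, symbol) entry says what symbol to write, which way to move the head, and which state comes next. The names (`run`, `increment`, the blank symbol `"_"`) are all assumptions of the sketch.

```python
# A minimal Turing machine simulator (an illustrative sketch, not
# from Turing's paper). The machine's entire behavior is one table.

def run(table, tape, state="start"):
    tape = dict(enumerate(tape))   # sparse tape; unwritten cells are blank "_"
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, "_")
        write, move, state = table[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    # Read the tape back in order, dropping the surrounding blanks.
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example table: add 1 to a binary number, head starting at the left end.
increment = {
    ("start", "0"): ("0", "R", "start"),   # scan right to the end
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),   # past the right end; turn around
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry = 0, keep carrying
    ("carry", "0"): ("1", "L", "done"),    # 0 + carry = 1, carry absorbed
    ("carry", "_"): ("1", "L", "done"),    # carry past the left end
    ("done",  "0"): ("0", "L", "done"),    # walk back to the start
    ("done",  "1"): ("1", "L", "done"),
    ("done",  "_"): ("_", "R", "halt"),
}

print(run(increment, "1011"))   # 11 + 1 = 12, i.e. "1100"
```

Each line of the table is exactly the kind of trivial step described above: read one symbol, write one symbol, move one square, change state. Nothing else is needed.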
Yet this fundamental equivalence among digital computers is a double-edged sword: All digital computers are computationally equally capable, but also equally deficient. It is a crucial outcome of Turing's paper that there are certain problems beyond the reach of any digital computer, including the computer inside our own heads. The big deficiency involves applying the tool of software to itself: We can never be sure that an arbitrary computer program will run correctly, and we can't write a program that can successfully debug any other program. Before anyone had actually built a digital computer, Turing had already demonstrated their intrinsic limitations!
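The heart of that limitation can be recast as a short Python sketch of the diagonal argument (illustrative names throughout; no real halting oracle exists, which is exactly the point):

```python
# Sketch of the diagonal argument behind Turing's result.
# Suppose some function halts(program) could always predict
# whether program() eventually stops.

def make_paradox(halts):
    """Build a program that does the opposite of whatever the
    candidate oracle predicts about it."""
    def paradox():
        if halts(paradox):
            while True:      # predicted to halt, so loop forever
                pass
        # predicted to loop, so halt immediately
    return paradox

# A candidate oracle that predicts "loops forever" for everything:
guess = lambda program: False
p = make_paradox(guess)
p()   # returns at once, refuting the prediction

# A candidate that predicts "halts" for everything fares no better:
# its paradox program would loop forever, again refuting it. No
# possible implementation of halts escapes this trap.
```

Any proposed debugger that decides whether programs finish can be fed its own paradoxical counterexample, which is why no such program can exist.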
For those of us who are programmers, Alan Turing's 1936 paper on computable numbers is our foundational document. But its universality has revealed implications far beyond programming, and it continues to contribute to our understanding of ourselves, of our minds, and of the informational nature of the universe.
On this 100th anniversary of Alan Turing's birth, we know that as we stare into the Turing Machine, the Turing Machine stares back at us.
“Petzold will be a stalwart companion to any reader who undertakes to read Turing's classic with his aid. The Annotated Turing will also be quite enjoyable to a more casual reader who chooses to dip into various parts of the text.” — Martin Davis in American Scientist
(c) Copyright Charles Petzold
www.charlespetzold.com
Comments:
No question that Turing's contribution is enormous and we would be suffering without it. But he might not have been the sole inventor of the ideas that he contributed. He might have been standing on the shoulders of another giant. Before anyone had actually built a digital computer, Lady Ada had already demonstrated their capabilities.
— Partly capable, partly limited, Sun, 24 Jun 2012 19:47:39 -0400
Why don't you have a kindle version available? I was about to buy your book until then. Surely a book about the fundamentals of computers deserves to be available to be read on a computer?
— Ben, Sun, 24 Jun 2012 22:16:29 -0400
I am flattered that you believe I have the power to determine how my books should be published, but it is actually the publisher (in this case, Wiley) who makes that decision.
I believe the main issue involving an "Annotated Turing" ebook is a typographical one. Turing's original paper is reproduced in the book (as I discussed a bit in this blog entry), and that — together with a frightful amount of hairy math — makes the book very resistant to the type of reflow that readers expect on ebook readers such as the Kindle.
Sometime in the future I hope to write a book about "the fundamentals of computers" that will contain nothing but prose, and that book surely will be readable on a computer. — Charles
<<Sometime in the future I hope to write a book about "the fundamentals of computers">>
I'm interested in knowing how this will differ from Code.
— John, Mon, 25 Jun 2012 21:29:30 -0400
Maybe the word "fundamentals" was not entirely accurate. But there are several books that I would like to write on various contributions to computing between 1600 and 1930, and for at least a couple of these books, I'd like to try to tell the story entirely in prose.
Unfortunately, writing a book these days is a threat to one's solvency. — Charles
Write a book that portrays Turing as a vampire, you could make a fortune!
— John, Tue, 26 Jun 2012 09:25:38 -0400
How can anyone have trouble selling a book about kinky sex and death? His portrait alone might soon be worth 10 pounds:
http://finance.yahoo.com/news/uk-lawmakers-call-alan-turing-190615797.html
Bonus video:
http://vimeo.com/44202270
— Don't believe her, I'm really the computer., Wed, 27 Jun 2012 22:34:54 -0400
Thanks for the links! — Charles
Hopefully this will get some traction:
Parliamentary bill launched for Alan Turing pardon...
http://www.guardian.co.uk/uk/the-northerner/2012/jul/25/alan-turing-private-members-bill-lord-sharkey
And slightly related:
Scotland to legalise same-sex marriages in church and civil ceremonies...
http://www.guardian.co.uk/society/2012/jul/25/scotland-legalise-same-sex-marriage
"We believe that in a country that aspires to be an equal and tolerant society, as we do in Scotland, then this is the right thing to do," - Nicola Sturgeon, the deputy first minister.
As usual the Scots take point. Again.
Enjoy.
— Rusty, Wed, 25 Jul 2012 17:56:30 -0400