Alan Turing’s Everlasting Contributions to Computing, AI and Cryptography

An enigma machine on display outside the Alan Turing Institute entrance inside the British Library, London.

Credit: Shutterstock/William Barton

Suppose someone asked you to devise the most powerful computer possible. Alan Turing, whose reputation as a central figure in computer science and artificial intelligence has only grown since his untimely death in 1954, applied his genius to problems such as this one in an age before computers as we know them existed. His theoretical work on this problem and others remains a foundation of computing, AI and modern cryptographic standards, including those NIST recommends.

The road from devising the most powerful computer possible to cryptographic standards has a few twists and turns, as does Turing’s brief life.

Alan Turing

Credit: National Portrait Gallery, London

In Turing’s time, mathematicians debated whether it was possible to build a single, all-purpose machine that could solve all problems that are computable. For example, we can compute a car’s most energy-efficient route to a destination, and (in principle) the most likely way in which a string of amino acids will fold into a three-dimensional protein. Another example of a computable problem, important to modern encryption, is whether a given number can be expressed as the product of two smaller numbers. For example, 6 can be expressed as the product of 2 and 3, but 7 cannot be factored into smaller integers and is therefore a prime number.
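To make the factoring example concrete, here is a minimal Python sketch (ours, not part of the original article) that answers the question by trial division; the function name smallest_factor is illustrative:

```python
from typing import Optional

def smallest_factor(n: int) -> Optional[int]:
    """Return the smallest factor of n greater than 1, or None if n is prime."""
    d = 2
    while d * d <= n:      # a composite n must have a factor no larger than sqrt(n)
        if n % d == 0:
            return d
        d += 1
    return None

print(smallest_factor(6))  # 2, because 6 = 2 * 3
print(smallest_factor(7))  # None, because 7 is prime
```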

Some prominent mathematicians proposed elaborate designs for universal computers that would operate by following very complicated mathematical rules. It seemed overwhelmingly difficult to build such machines. It took the genius of Turing to show that a very simple machine could in fact compute all that is computable.

His hypothetical device is now known as a Turing machine. The centerpiece of the machine is a strip of tape divided into individual boxes. Each box contains a symbol (such as A, C, T and G, the letters of the genetic code) or a blank space. The strip of tape is analogous to today’s hard drives that store bits of data. Initially, the string of symbols on the tape corresponds to the input, containing the data for the problem to be solved. The string also serves as the memory of the computer: the Turing machine writes onto the tape any data it needs to access later in the computation.
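A minimal sketch of the tape just described, assuming a Python representation of our own devising (the names Tape, read, write and move are illustrative, not from the article):

```python
BLANK = " "

class Tape:
    """A sketch of the tape: boxes of symbols plus a read/write head."""

    def __init__(self, input_string: str):
        self.cells = list(input_string)  # one symbol per box
        self.head = 0                    # index of the box being scanned

    def read(self) -> str:
        if 0 <= self.head < len(self.cells):
            return self.cells[self.head]
        return BLANK                     # beyond the input lies blank tape

    def write(self, symbol: str) -> None:
        while self.head >= len(self.cells):
            self.cells.append(BLANK)     # grow the tape on demand
        self.cells[self.head] = symbol   # (a fuller version would also grow leftward)

    def move(self, step: int) -> None:
        self.head += step                # +1 moves right, -1 moves left
```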

Credit: NIST

The device reads an individual symbol on the tape and follows instructions on whether to change the symbol or leave it alone before moving to another symbol. The instructions depend on the current state of the machine. For example, if the machine needs to decide whether the tape contains the text string "TC," it can scan the tape in the forward direction while switching between the states "previous letter was T" and "previous letter was not T." If, while in the state "previous letter was T," it reads a C, it goes to a state "found it" and halts. If it encounters the blank symbol at the end of the input, it goes to the state "did not find it" and halts. Nowadays we would recognize the set of instructions as the machine’s program.
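The article describes this machine only in prose; here is a hedged sketch of the same "find TC" scan as an ordinary Python loop, with the two states written out as strings:

```python
FOUND, NOT_FOUND = "found it", "did not find it"

def contains_tc(tape: str) -> str:
    state = "previous letter was not T"
    for symbol in tape + " ":          # " " plays the role of the blank at the end
        if symbol == " ":
            return NOT_FOUND           # ran off the input: halt
        if state == "previous letter was T" and symbol == "C":
            return FOUND               # halt in the accepting state
        state = ("previous letter was T" if symbol == "T"
                 else "previous letter was not T")

print(contains_tc("GATTACA"))  # did not find it
print(contains_tc("ATCG"))     # found it
```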

It took some time, but eventually it became clear to everyone that Turing was right: The Turing machine could indeed compute all that seemed computable. No number of additions or extensions to this machine could extend its computing capability.

To understand what can be computed, it is helpful to identify what cannot be computed. In a previous life as a university professor I had to teach programming a few times. Students often encounter the following problem: "My program has been running for a long time; is it stuck?" This is called the Halting Problem, and students often wondered why we simply couldn’t detect infinite loops without actually getting stuck in them. It turns out that such a program is an impossibility. Turing showed that there does not exist a machine that detects whether or not another machine halts. From this seminal result followed many other impossibility results. For example, logicians and philosophers had to abandon the dream of an automated way of detecting whether an assertion (such as whether there are infinitely many prime numbers) is true or false, as that is uncomputable. If you could do this, then you could solve the Halting Problem simply by asking whether the statement "this machine halts" is true or false.
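Turing’s impossibility argument can be sketched in Python rather than in Turing machines. Suppose, for contradiction, that someone handed us a perfect halting detector (the function halts below is hypothetical; its body is a stand-in):

```python
def halts(program, argument) -> bool:
    """Hypothetical: returns True iff program(argument) eventually halts.
    Turing proved no such function can exist; this body is a placeholder."""
    raise NotImplementedError

def paradox(program):
    if halts(program, program):
        while True:   # if halts says "paradox(paradox) halts," loop forever
            pass
    # if halts says "paradox(paradox) loops forever," halt immediately

# Now ask: does paradox(paradox) halt? If halts answers yes, paradox loops
# forever; if it answers no, paradox halts. Either answer is wrong, so no
# correct halts() can exist.
```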

Turing went on to make fundamental contributions to AI, theoretical biology and cryptography. His involvement with this last subject brought him honor and fame during World War II, when he played a very important role in adapting and extending cryptanalytic techniques invented by Polish mathematicians. This work broke the German Enigma machine encryption, making a significant contribution to the war effort.

Turing was gay. After the war, in 1952, the British government convicted him for having sex with a man. He stayed out of jail only by submitting to what is now called chemical castration. He died in 1954 at age 41 by cyanide poisoning, which was initially ruled a suicide but may have been an accident according to subsequent analysis. More than 50 years would pass before the British government apologized and pardoned him (after years of campaigning by scientists around the world). Today, the highest honor in computer science is called the Turing Award.

Turing’s computability work provided the foundation for modern complexity theory. This theory tries to answer the question: "Among those problems that can be solved by a computer, which ones can be solved efficiently?" Here, "efficiently" means not in billions of years but in milliseconds, seconds, hours or days, depending on the computational problem.
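A small illustration of what "efficiently" means, assuming (for the sake of the arithmetic, our numbers, not the article’s) a machine that performs a billion steps per second, comparing a polynomial-time cost of n³ steps with an exponential-time cost of 2ⁿ steps:

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365
STEPS_PER_SECOND = 1e9

for n in (10, 50, 100):
    poly_seconds = n**3 / STEPS_PER_SECOND
    expo_years = 2**n / STEPS_PER_SECOND / SECONDS_PER_YEAR
    print(f"n={n:>3}: n^3 -> {poly_seconds:.1e} seconds, 2^n -> {expo_years:.1e} years")
```

At n = 100, the exponential algorithm needs on the order of 10¹³ years, which dwarfs the few billion years the Sun has left.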

For example, much of the cryptography that currently safeguards our data and communications relies on the belief that certain problems, such as decomposing an integer into its prime factors, cannot be solved before the Sun turns into a red giant and consumes the Earth (currently forecast for 4 billion to 5 billion years from now). NIST is responsible for cryptographic standards that are used throughout the world. We could not do this work without complexity theory.

Technology sometimes throws us a curve, such as the discovery that if a sufficiently big and reliable quantum computer were built, it would be able to factor integers, thus breaking some of our cryptography. In this situation, NIST scientists must rely on the world’s experts (many of them in-house) in order to update our standards. There are deep reasons to believe that quantum computers will not be able to break the cryptography that NIST is about to roll out. Among these reasons is that Turing’s machine can simulate quantum computers. This implies that complexity theory gives us limits on what a powerful quantum computer can do.
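A hint at why an ordinary (Turing-machine-style) program can simulate a quantum computer: a quantum state is just a vector of amplitudes, and gates are just matrix multiplications. The tiny state-vector simulation below is our illustration, not NIST code; the cost of this approach grows exponentially with the number of qubits, but it is computable.

```python
import numpy as np

# One qubit starting in |0>, then a Hadamard gate: equal superposition.
state = np.array([1.0, 0.0])                  # amplitudes for |0> and |1>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # the Hadamard gate
state = H @ state

probabilities = np.abs(state) ** 2            # Born rule: |amplitude|^2
print(probabilities)                          # [0.5 0.5]
```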

But that is a topic for another day. For now, we can celebrate how Turing provided the keys to much of today’s computing technology and even gave us hints on how to solve looming technological problems.
