1953: The Year That Revolutionized Life, Death, and the Digital Bit

Three technological eras began in 1953: thermonuclear weapons, stored-program computers, and modern genetics.

At 10:38 p.m. on March 3, 1953, in a one-story brick building at the end of Olden Lane in Princeton, New Jersey, Italian Norwegian mathematical biologist Nils Aall Barricelli inoculated a 5-kilobyte digital universe with random numbers generated by drawing playing cards from a shuffled deck. "A series of numerical experiments are being made with the aim of verifying the possibility of an evolution similar to that of living organisms taking place in an artificially created universe," he announced.

A digital universe -- whether 5 kilobytes or the entire Internet -- consists of two species of bits: differences in space, and differences in time. Digital computers translate between these two forms of information -- structure and sequence -- according to definite rules. Bits that are embodied as structure (varying in space, invariant across time) we perceive as memory, and bits that are embodied as sequence (varying in time, invariant across space) we perceive as code. Gates are the intersections where bits span both worlds at the moments of transition from one instant to the next.
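A minimal sketch of that distinction in Python (the code and names here are illustrative only, nothing from the original account): a word sitting in memory is bits varying in space; shifting it out one bit per clock tick turns it into bits varying in time; latching the stream back in restores the structure.

```python
def serialize(word, width=40):
    """Structure -> sequence: emit a stored word one bit per 'clock tick'."""
    for i in range(width):
        yield (word >> i) & 1  # least significant bit first

def deserialize(stream, width=40):
    """Sequence -> structure: latch a stream of bits back into a stored word."""
    word = 0
    for i, bit in zip(range(width), stream):
        word |= (bit & 1) << i
    return word

stored = 0b1011001110001              # bits at rest: varying in space, invariant in time
in_flight = list(serialize(stored))   # bits in motion: varying in time, invariant in space
assert deserialize(in_flight) == stored
```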

The term bit (the contraction, by 40 bits, of "binary digit") was coined by statistician John W. Tukey shortly after he joined von Neumann's project in November of 1945. The existence of a fundamental unit of communicable information, representing a single distinction between two alternatives, was defined rigorously by information theorist Claude Shannon in his then-secret Mathematical Theory of Cryptography of 1945, expanded into his Mathematical Theory of Communication of 1948. "Any difference that makes a difference" is how cybernetician Gregory Bateson translated Shannon's definition into informal terms. To a digital computer, the only difference that makes a difference is the difference between a zero and a one.
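Shannon's measure makes the claim quantitative: a choice between two equally likely alternatives carries exactly one bit, and anything less balanced carries less. The snippet below only illustrates the standard formula H = -sum(p * log2 p) from the 1948 paper; the function name and the example distributions are invented for illustration.

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: average information per symbol."""
    return sum(-p * log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 -- one fair binary choice carries exactly one bit
print(entropy([0.9, 0.1]))  # ~0.47 -- a lopsided choice carries less
print(entropy([1.0]))       # 0.0 -- no difference, no information
```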

That two symbols were sufficient for encoding all communication had been established by Francis Bacon in 1623. "The transposition of two Letters by five placeings will be sufficient for 32 Differences [and] by this Art a way is opened, whereby a man may expresse and signifie the intentions of his minde, at any distance of place, by objects ... capable of a twofold difference onely," he wrote, before giving examples of how such binary coding could be conveyed at the speed of paper, the speed of sound, or the speed of light.
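Bacon's "five placeings" amount to a 5-bit code: two symbols in five positions yield 2^5 = 32 distinct patterns, more than enough for an alphabet. A rough sketch of the idea follows; Bacon's own table used a 24-letter alphabet, so the modern 26 letters and the encode/decode helpers here are purely illustrative.

```python
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encode(text):
    """Map each letter to five two-valued 'placeings' written as a/b."""
    return " ".join(
        format(ALPHABET.index(ch), "05b").replace("0", "a").replace("1", "b")
        for ch in text.upper() if ch in ALPHABET
    )

def decode(code):
    """Recover the letters from the five-place a/b groups."""
    return "".join(
        ALPHABET[int(group.replace("a", "0").replace("b", "1"), 2)]
        for group in code.split()
    )

print(encode("BACON"))           # aaaab aaaaa aaaba abbba abbab
print(decode(encode("BACON")))   # BACON
```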

That zero and one were sufficient for logic as well as arithmetic was established by Gottfried Wilhelm Leibniz in 1679, following the lead given by Thomas Hobbes in his Computation, or Logique of 1656. "By Ratiocination, I mean computation," Hobbes had announced. "Now to compute, is either to collect the sum of many things that are added together, or to know what remains when one thing is taken out of another. Ratiocination, therefore is the same with Addition or Substraction; and if any man adde Multiplication and Division, I will not be against it, seeing ... that all Ratiocination is comprehended in these two operations of the minde." The new computer, for all its powers, was nothing more than a very fast adding machine, with a memory of 40,960 bits.
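The point can be made concrete: single-bit logic operations are enough to build an adder, and repeated addition is enough to build multiplication, which is all Hobbes asked of ratiocination. The sketch below is illustrative only; it is not the IAS machine's actual circuitry, and the 40-bit word width is simply borrowed from the text.

```python
def full_adder(a, b, carry_in):
    """One bit of arithmetic built purely from logic on zeros and ones."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add(x, y, width=40):
    """Ripple-carry addition over width-bit words, one full adder per bit."""
    result, carry = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

def multiply(x, y, width=40):
    """Hobbes's reduction: multiplication as repeated addition."""
    total = 0
    for _ in range(y):
        total = add(total, x, width)
    return total

assert add(19, 53) == 72
assert multiply(6, 7) == 42
```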

In March of 1953 there were 53 kilobytes of high-speed random-access memory on planet Earth. Five kilobytes were at the end of Olden Lane, 32 kilobytes were divided among the eight completed clones of the Institute for Advanced Study's computer, and 16 kilobytes were unevenly distributed across a half dozen other machines. Data, and the few rudimentary programs that existed, were exchanged at the speed of punched cards and paper tape. Each island in the new archipelago constituted a universe unto itself.

In 1936, logician Alan Turing had formalized the powers (and limitations) of digital computers by giving a precise description of a class of devices (including an obedient human being) that could read, write, remember, and erase marks on an unbounded supply of tape. These "Turing machines" were able to translate, in both directions, between bits embodied as structure (in space) and bits encoded as sequences (in time). Turing then demonstrated the existence of a Universal Computing Machine that, given sufficient time, sufficient tape, and a precise description, could emulate the behavior of any other computing machine. The results are independent of whether the instructions are executed by tennis balls or electrons, and whether the memory is stored in semiconductors or on paper tape. "Being digital should be of more interest than being electronic," Turing pointed out.
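A toy simulator shows how little machinery Turing's definition requires: a tape of marks, a head position, a state, and a finite table of rules for reading, writing, erasing, and moving. Everything below -- the rule format, the example increment machine -- is an illustration under those assumptions, not Turing's own formulation.

```python
def run(tape, head, state, rules, blank="_", halt="halt"):
    """A minimal Turing machine: read a symbol, write, move, change state."""
    tape = dict(enumerate(tape))                        # structure: marks on a tape
    while state != halt:
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]     # sequence: the rule table
        tape[head] = write
        head += {"L": -1, "R": +1, "N": 0}[move]
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, blank) for i in cells).strip(blank)

# Example machine: binary increment, head starting on the rightmost digit.
INCREMENT = {
    ("carry", "1"): ("0", "L", "carry"),   # 1 plus carry -> 0, carry moves left
    ("carry", "0"): ("1", "N", "halt"),    # 0 plus carry -> 1, done
    ("carry", "_"): ("1", "N", "halt"),    # ran off the left edge: new high bit
}

print(run("1011", head=3, state="carry", rules=INCREMENT))  # 1100
```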

Von Neumann set out to build a Universal Turing Machine that would operate at electronic speeds. At its core was a 32-by-32-by-40-bit matrix of high-speed random-access memory -- the nucleus of all things digital ever since. "Random access" meant that all individual memory locations -- collectively constituting the machine's internal "state of mind" -- were equally accessible at any time. "High speed" meant that the memory was accessible at the speed of light, not the speed of sound. It was the removal of this constraint that unleashed the powers of Turing's otherwise impractical Universal Machine.
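The arithmetic behind that matrix: 32 x 32 x 40 = 40,960 bits, or 1,024 words of 40 bits each -- the 5 kilobytes of Barricelli's universe -- and a 10-bit address reaches any one of those words directly, which is all "random access" means. A sketch, with the store/load helpers invented here for illustration:

```python
# 32 x 32 x 40 = 40,960 bits: 1,024 words of 40 bits, i.e. 5 kilobytes.
WORDS, WORD_BITS = 32 * 32, 40
assert WORDS * WORD_BITS == 40_960
assert WORDS * WORD_BITS // 8 == 5_120            # 5 kilobytes

memory = [0] * WORDS                              # every location equally reachable

def store(address, word):
    """A 10-bit address selects a word; the word is masked to 40 bits."""
    memory[address & 0x3FF] = word & (2**WORD_BITS - 1)

def load(address):
    return memory[address & 0x3FF]

store(0b1010101010, 123_456_789)
assert load(0b1010101010) == 123_456_789
```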

Electronic components were widely available in 1945, but digital behavior was the exception to the rule. Images were televised by scanning them into lines, not breaking them into bits. Radar delivered an analog display of echoes returned by the continuous sweep of a microwave beam. Hi-fi systems filled postwar living rooms with the warmth of analog recordings pressed into vinyl, with none of the losses introduced by digital approximation. Digital technologies -- Teletype, Morse code, punched card accounting machines -- were perceived as antiquated, low-fidelity, and slow. Analog ruled the world.

Excerpt from:
1953: The Year That Revolutionized Life, Death, and the Digital Bit
