June 23, 2012 would have been the 100th birthday of the British polymath Alan Turing. Among his achievements, Turing contributed substantially to the field of computing, and his name shows up multiple times in the lexicon of IT. Reflecting on this made me wonder who else I might find represented in the vocabulary of the field. Lots of people, it turns out.
Let's start with Turing. In 1936 (before computers), Turing formulated a hypothetical "automatic machine," now known as a Turing machine. Although a Turing machine isn't a practical design for an actual computer, it's a useful way to think about what sorts of things a machine can compute. If you happened to go to the Google home page on Turing's birthday, you might have seen that day's Google Doodle, which was a brilliant interactive model of a Turing machine.
(Someone has also built a Turing machine out of LEGO.)
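The idea is simpler than it might sound: a Turing machine is just a tape of symbols, a read/write head, and a table of rules. Here's a minimal sketch of a simulator in Python; the machine, the rule table, and all the names (`run_turing_machine`, `rules`, and so on) are my own illustration, not anything from Turing or Google. This particular rule table increments a binary number.

```python
# A minimal Turing machine simulator (illustrative sketch, not a
# standard library or API). The tape is a dict of position -> symbol;
# rules map (state, symbol) -> (symbol to write, head move, next state).
from collections import defaultdict

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Run a Turing machine until it reaches the 'halt' state."""
    tape = defaultdict(lambda: blank, enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape[head]
        write, move, state = rules[(state, symbol)]  # look up the rule
        tape[head] = write                           # write a symbol
        head += move                                 # move left (-1) or right (+1)
    cells = [tape[i] for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip(blank)

# Rules for binary increment: scan right to the end of the number,
# then move left, turning 1s to 0s until a 0 (or blank) absorbs the carry.
rules = {
    ("start", "0"): ("0", +1, "start"),
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "carry"),
    ("carry", "1"): ("0", -1, "carry"),
    ("carry", "0"): ("1", -1, "halt"),
    ("carry", "_"): ("1", -1, "halt"),
}

print(run_turing_machine(rules, "1011"))  # binary 11 + 1 = 12, i.e. "1100"
```

Everything a real computer does can, in principle, be expressed as a rule table like this one, which is what makes the Turing machine such a useful theoretical yardstick.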
Along these lines, a computer or computer language is said to be Turing complete if it can theoretically compute any algorithm. These days, not only is your desktop computer Turing complete, but so is your phone. Turing also left us the Turing test, which tries to distinguish humans from computers, as I mentioned in the article about CAPTCHAs.
Many computer languages are named for people. The language Ada honors Ada Lovelace, a 19th-century mathematician (and the daughter of Lord Byron) who worked with Charles Babbage on an early mechanical computer. The computer scientist Niklaus Wirth created two languages named for people: Pascal for Blaise Pascal, and Euler for the mathematician Leonhard Euler. The language Haskell is named for the American mathematician Haskell Curry, Erlang is named for the Danish mathematician Agner Krarup Erlang, and Gödel is named for the logician Kurt Gödel. There's even a language named for the artist M. C. Escher. (If you're familiar with Escher's work, you might be able to imagine why it would inspire computer scientists.) In a less formal process, the name of the operating system Linux arose as a combination of the originator's first name (Linus Torvalds) and Unix, the name of the operating system it was modeled on.
The computer industry is rich in "laws" and principles, some of which you might know, that are named after people. In 1965, Intel's Gordon Moore observed that the capacity of computer chips seemed to double every year. Moore's Law, as it was eventually called, was prescient: although Moore only projected the trend a decade ahead, Moore's Law has described computer chip technology for nearly 50 years, which enabled the kind of miniaturization that lets you carry a computer (i.e., a smartphone) in your pocket. Many people have probably heard, and possibly suffered, some variation of Brooks's Law ("Adding manpower to a late software project makes it later"), which comes from Fred Brooks's legendary 1975 book The Mythical Man-Month. Linus's Law, named in honor of Linus Torvalds, says that "given enough eyeballs, all bugs are shallow," which is a kind of programmer version of the proverb "many hands make light work." An important design principle in software is Postel's Law (also known as the robustness principle), named for Jon Postel, which states that designs should be "conservative in what they send, liberal in what they accept," a maxim that sounds nearly Biblical in its wisdom. This principle has been critical for the development of the Internet; it's what makes it practical for us to request a web page that might live on an entirely different computer halfway around the world.
Certain programming techniques have their eponyms as well. George Boole was an English logician who invented Boolean algebra. The logic in digital computers ultimately just consists of head-spinningly complex chains of Boolean tests (on/off, true/false) and of Boolean operators (AND, OR, and NOT). A more obscure programming technique is currying, which I mention because it's a second computing term, in addition to the programming language, that's named for Haskell Curry. And there are names pretty well hidden in the LZ compression algorithm, named for Abraham Lempel and Jacob Ziv, which you and I take advantage of every time we use .GIF or .PNG files.
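Currying, for the curious, means transforming a function that takes several arguments into a chain of functions that each take one. Haskell (the language) does this automatically; here's a hand-rolled sketch of the idea in Python, where the function names (`add`, `curry3`, `add_five`) are just my illustrations:

```python
# Currying: turn f(x, y, z) into f(x)(y)(z), a chain of one-argument
# functions. This also makes partial application natural.

def add(x, y, z):
    return x + y + z

def curry3(f):
    """Curry a three-argument function into nested one-argument calls."""
    return lambda x: lambda y: lambda z: f(x, y, z)

curried_add = curry3(add)
add_five = curried_add(2)(3)   # partially applied: still waiting for z
print(add_five(10))            # prints 15
```

The partially applied `add_five` can be passed around and reused, which is a big part of why curried functions are idiomatic in functional languages.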
Programmers also frequently use a couple of near-eponyms, so to speak. If you write an addition problem as "+ 4 1" (add 4 and 1), you're using something called Polish notation, which is useful in certain computer contexts. The "Polish" in Polish notation refers indirectly to its inventor, the logician Jan Łukasiewicz. Reasonably enough, if instead you write an expression as "4 1 +", you're using Reverse Polish notation (RPN), which is useful in different computing contexts. In fact, some readers might be old enough to remember that RPN was actually how you used some old calculators and adding machines.
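Part of RPN's appeal to calculator makers was that it needs no parentheses and evaluates with a simple stack: push operands as they arrive, and each operator consumes the two most recent values. Here's a minimal sketch of such an evaluator (the function name `eval_rpn` and the supported operators are my own choices):

```python
# Evaluate a Reverse Polish notation expression with a stack.
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def eval_rpn(expression):
    stack = []
    for token in expression.split():
        if token in OPS:
            right = stack.pop()          # most recent operand
            left = stack.pop()           # next most recent
            stack.append(OPS[token](left, right))
        else:
            stack.append(float(token))   # operand: just push it
    return stack.pop()

print(eval_rpn("4 1 +"))      # prints 5.0
print(eval_rpn("3 4 2 * +"))  # 3 + (4 * 2), prints 11.0
```

Notice that "3 4 2 * +" needs no parentheses to mean 3 + (4 × 2); the order of the tokens carries all the grouping, which is exactly what made RPN attractive for early calculators.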
Another near-eponym in programming is Hungarian notation, where things are named using a prefix to suggest their function — for example, iCounter might be a counter that's an integer, and txtName might be a text box for entering a name. One theory is that Hungarian notation is named for its inventor, the Hungarian programmer Charles Simonyi. It might also be named for the fact that personal names in Hungarian are reversed (last name, then first name). Or a probably apocryphal story says it was named Hungarian because the prefixes looked strange — "like Hungarian." Maybe all three stories explain the name.
Most of these name-related terms aren't used much in everyday English, of course, although several (Turing test, Moore's Law, Linux) appear often in general-interest articles or books about computers. But it's pleasing that the everyday life of programmers does remember people who have contributed so much to our understanding of the field.