Computers truly came into their own as great inventions in the last two decades of the 20th century. But their history stretches back more than 3,000 years to the abacus: a simple calculator made from beads that slide along rods, divided into two sections. Developed by the Mesopotamians and later improved by the Chinese, it was used to add and multiply by exploiting the place value of digits and the positions of the beads. The difference between an ancient abacus and a modern computer seems vast, but the principle—making repeated calculations more quickly than the human brain—is exactly the same. Abacuses are still used in some parts of the world today.
In the early 1600s, the Scottish mathematician John Napier invented logarithms and a calculating aid known as Napier's bones (or Napier's rods): numbered rods that can be used to multiply any number by a single digit from 2 to 9. In 1642, the French scientist and philosopher Blaise Pascal invented the first practical mechanical calculator. It was a machine made up of gears used for adding numbers quickly: numbered toothed wheels, each with its own place value, controlled the addition and subtraction operations.
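The carrying mechanism described above can be sketched in a few lines of code. This is an illustrative toy, not a historical design: each decimal place is a wheel, and when a wheel passes from 9 back to 0, a carry advances the next wheel by one step.

```python
# Toy sketch of Pascaline-style addition: one "wheel" per decimal place,
# stored least-significant digit first, with carries rippling upward.

def pascaline_add(wheels, amount):
    """Add `amount` to a little-endian list of digit wheels, rippling carries."""
    carry, position = amount, 0
    while carry:
        if position == len(wheels):
            wheels.append(0)           # a real machine would simply overflow here
        total = wheels[position] + carry
        wheels[position] = total % 10  # the wheel's new resting digit
        carry = total // 10            # carry handed on to the next wheel
        position += 1
    return wheels

wheels = [8, 9, 9]          # represents 998 (least-significant digit first)
pascaline_add(wheels, 7)    # 998 + 7 = 1005
print(wheels)               # [5, 0, 0, 1]
```

The interesting step is the ripple: adding 7 to the units wheel forces every 9 above it to roll over, just as Pascal's sautoir mechanism did mechanically.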
Several decades later, in 1671, the German mathematician and philosopher Gottfried Wilhelm Leibniz improved on the adding machine, constructing a new machine that could perform multiplication and division as well. Instead of plain gears, it used a "stepped drum" (a cylinder with teeth of increasing length around its edge), an innovation that survived in mechanical calculators for centuries afterward.
Leibniz is remembered for another important contribution to computing: he devised binary code, a way of representing any decimal number using only the two digits zero and one. Although Leibniz made no use of binary in his own calculator, it set others thinking. In 1854, more than a century after Leibniz's death, the Englishman George Boole (1815–1864) used the idea to invent a new branch of mathematics called Boolean algebra. In modern computers, binary code and Boolean algebra together allow a machine to make simple decisions by comparing long strings of zeros and ones.
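The two ideas in this paragraph can be shown concretely. The short sketch below (a modern illustration, not anything Leibniz or Boole wrote) converts a decimal number to binary by repeated division, then combines truth values with the basic Boolean operations:

```python
# Leibniz's idea: any decimal number can be written with only 0s and 1s.
def to_binary(n):
    """Represent a non-negative decimal number using only the digits 0 and 1."""
    bits = ""
    while n:
        bits = str(n % 2) + bits   # the remainder gives the lowest bit
        n //= 2
    return bits or "0"

print(to_binary(13))   # "1101", because 13 = 8 + 4 + 1

# Boole's idea: decisions built from AND, OR, and NOT on truth values.
a, b = True, False
print(a and b)         # False (AND: both must hold)
print(a or b)          # True  (OR: at least one holds)
print(not b)           # True  (NOT: negation)

# A computer applies the same logic to whole strings of bits at once:
x, y = 0b1101, 0b1011
print(bin(x & y))      # 0b1001, the bitwise AND of the two bit strings
```

The last line is exactly the "comparing long strings of zeros and ones" the paragraph mentions: each bit of the result is the Boolean AND of the corresponding bits of the inputs.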
Neither the abacus nor the mechanical calculators constructed by Pascal and Leibniz really qualified as computers, however. Calculators evolved into computers when people devised ways of making entirely automatic, programmable calculators.
In the early 1800s, Joseph Marie Jacquard invented the Jacquard loom, which used punched cards to control the pattern being woven; in effect, the entire operation ran under a program's control. With the invention of punched cards, the era of storing and retrieving information began, and it greatly influenced later inventions and advances.
The first person to attempt to build what we would now call a computer was an English mathematician named Charles Babbage. Many regard Babbage as the "father of the computer" because his machines had an input (a way of feeding in numbers), a memory (something to store those numbers while complex calculations were taking place), a processor (the number-cruncher that carried out the calculations), and an output (a printing mechanism)—the same basic components shared by all modern computers.
Initially, he designed a Difference Engine to calculate logarithmic tables to a high degree of precision, but the machine was never completed. In fact, during his lifetime Babbage never finished a single one of the hugely ambitious machines he designed, including the Analytical Engine, which remained in the conceptual phase. Little of Babbage's work was widely known after his death, but when, by chance, his notebooks were rediscovered in the 1930s, computer scientists finally appreciated the brilliance of his ideas.
Augusta Ada Byron, Countess of Lovelace, daughter of the poet Lord Byron, was an enthusiastic mathematician and a long-time friend of Babbage who collaborated with him on the Analytical Engine. She created plans for how the machine could calculate Bernoulli numbers, refining Babbage's ideas for making his machine programmable, and in doing so wrote what is regarded as the first computer program. For this she is often referred to as the world's first computer programmer.
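To give a sense of what Lovelace's program computed, here is a modern sketch that generates Bernoulli numbers using the standard recurrence (this is an assumption for illustration: it is not her exact method or numbering, which differ from the modern convention):

```python
# Bernoulli numbers B_0..B_n via the standard recurrence
#   B_0 = 1,  B_m = -1/(m+1) * sum_{k=0}^{m-1} C(m+1, k) * B_k
# Exact rational arithmetic keeps the results precise.
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 through B_n (convention B_1 = -1/2)."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(Fraction(-s, m + 1))
    return B

print([str(b) for b in bernoulli(6)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```

Lovelace's Note G laid out an equivalent sequence of operations as a table of instructions for the Analytical Engine, which is why it is counted as a program.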
Toward the end of the 19th century, other inventors were more successful in their efforts to construct "engines" of calculation. The American statistician Herman Hollerith built the first electromechanical punched-card tabulator, which used punched cards for input, output, and instructions. The machine was used by the US Census Bureau to compile its census data.
Soon afterward, Hollerith realized his machine had other applications, so he set up the Tabulating Machine Company in 1896 to manufacture it commercially. In 1911 it merged with other firms to become the Computing-Tabulating-Recording (C-T-R) Company, which, in 1924, acquired a new name: International Business Machines (IBM).
At around the time C-T-R was becoming IBM, the US government scientist Vannevar Bush built a series of unwieldy contraptions with equally cumbersome names, such as the New Recording Product Integraph Multiplier. Later he built a machine called the Differential Analyzer, which was used to solve differential equations, and his ultimate calculator was an improved machine named the Rockefeller Differential Analyzer. Machines like these were known as analog calculators—analog because they stored numbers in a physical form (as so many turns on a wheel or twists of a belt) rather than as digits. Although they could carry out incredibly complex calculations, it could take days before the results finally emerged.
One of the key figures in the history of 20th-century computing, Alan Turing was a brilliant Cambridge mathematician whose major contributions were to the theory of how computers process information. In 1936, at the age of just 23, Turing wrote a groundbreaking mathematical paper called "On Computable Numbers, with an Application to the Entscheidungsproblem," in which he described a theoretical computer now known as a Turing machine (a simple information processor that works through a series of instructions, reading data, writing results, and then moving on to the next instruction). Turing's ideas were hugely influential in the years that followed, and many people regard him as the father of modern computing—the 20th century's equivalent of Babbage.
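The read-write-move cycle described in the parentheses above can be simulated in a few lines. The machine below is a toy example of my own (not from Turing's paper): its rule table inverts a string of bits and halts on the first blank cell.

```python
# A minimal Turing machine simulator: a rule table maps
# (state, symbol) -> (symbol to write, head move, next state),
# applied repeatedly to a tape until the halting state is reached.

def run_turing_machine(tape, rules, state="start", halt="halt"):
    tape = dict(enumerate(tape))                 # sparse tape: position -> symbol
    head = 0
    while state != halt:
        symbol = tape.get(head, "_")             # "_" marks a blank cell
        write, move, state = rules[(state, symbol)]
        tape[head] = write                       # write the new symbol
        head += 1 if move == "R" else -1         # move the head one cell
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Rule table: in state "start", flip each bit and move right; halt on a blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("10110", rules))   # 01001
```

Despite its simplicity, this read/write/move scheme is the model Turing showed could, given the right rule table, carry out any computation a more elaborate machine can.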
Although essentially a theoretician, Turing did get involved with real, practical machinery, unlike many mathematicians of his time. During World War II, he played a pivotal role in the development of code-breaking machinery that, itself, played a key part in Britain’s wartime victory; later, he played a lesser role in the creation of several large-scale experimental computers including ACE (Automatic Computing Engine), Colossus, and the Manchester/Ferranti Mark I. Today, Alan Turing is best known for conceiving what’s become known as the Turing test, a simple way to find out whether a computer can be considered intelligent by seeing whether it can sustain a plausible conversation with a real human being.
Just before the outbreak of World War II, in 1938, the German engineer Konrad Zuse constructed his Z1, the world's first programmable binary computer. The following year, the American physicist John Atanasoff and his assistant, electrical engineer Clifford Berry, assembled a prototype of a more elaborate binary machine that they named the Atanasoff-Berry Computer (ABC). These were the first machines to use electrical switches to store numbers: when a switch was "off," it stored the number zero; flipped to its other, "on," position, it stored the number one. Hundreds or thousands of switches could thus store a great many binary digits. These machines were digital computers: unlike analog machines, which stored numbers using the positions of wheels and rods, they stored numbers as digits.
Professor Howard Aiken of Harvard University constructed the Mark I, an automatic, general-purpose electromechanical computer that could multiply two 10-digit numbers in 5 seconds – a record at the time. It was a large machine: stretching 15 m in length, it resembled a huge mechanical calculator built into a wall. It stored and processed numbers using electromagnetic relays, which, although impressive, needed a lot of power to make them switch.
During World War II, the military co-opted thousands of the best scientific minds, recognizing that science would win the war. Things were very different in Germany: when Konrad Zuse offered to build his Z2 computer to help the army, they couldn't see the need and turned him down. On the Allied side, great minds began to make great breakthroughs. In 1943, a team of mathematicians including Alan Turing built a computer called Colossus to help them crack secret German codes. Colossus was the first fully electronic computer. Instead of relays, it used a better form of switch known as a vacuum tube (invented in 1904 by John Ambrose Fleming).
Just like the codes it was trying to crack, Colossus was top secret, and its existence wasn't confirmed until after the war ended. As far as most people were concerned, vacuum tubes were pioneered by a more visible computer that appeared in 1946: the Electronic Numerical Integrator and Computer (ENIAC). ENIAC's inventors, two scientists from the University of Pennsylvania, John Mauchly and J. Presper Eckert, were originally inspired by Bush's Differential Analyzer. But the machine they constructed was far more ambitious: it contained nearly 18,000 vacuum tubes (many times more than Colossus), was around 24 m (80 ft) long, and weighed almost 30 tons.
After ENIAC came other first-generation computers such as EDVAC, EDSAC, and UNIVAC I.