When was the first computer invented? It sounds like it should be a simple question to answer, right?
It’s not so straightforward; you’ll get different answers depending on who you ask. Keep reading to find out more.
The question of who invented the first computer is greatly influenced by how you define the word.
There’s some disagreement, even among dictionary publishers. Here’s how the UK’s Oxford English Dictionary defines it:
“An electronic device for storing and processing data, typically in binary form, according to instructions given to it in a variable program.”
And here’s how Merriam-Webster defines “computer” in the U.S.:
“A programmable usually electronic device that can store, retrieve, and process data.”
The critical difference is that Merriam-Webster's "usually electronic" leaves open whether a computer must be electronic at all. Most experts agree that computers can be subdivided into analog computers and digital computers, and analog computers do not necessarily need an electrical supply.
Depending on your opinion, many different contenders for the honor emerge.
Who Invented the First Computer?
The most commonly cited name when considering who invented the first computer is Charles Babbage.
Babbage (1791-1871) was a British polymath who specialized in several fields, including mathematics and mechanical engineering.
His two most notable machines were the Difference Engine and the Analytical Engine. The Difference Engine (begun in 1822) could tabulate values of polynomial functions, useful for producing navigational and mathematical tables; the more complex Analytical Engine (proposed in 1837) was the first computer design that could be considered "Turing complete".
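The Difference Engine's trick, the method of finite differences, reduces polynomial tabulation to pure addition, which meshing gears can perform. Here is a minimal Python sketch of the idea; the coefficient-list representation and the unit step size are illustrative assumptions, not Babbage's own notation:

```python
# A minimal sketch of the method of finite differences that the Difference
# Engine mechanized. Coefficients are listed lowest power first (an
# illustrative convention), and the tabulation step is fixed at 1.

def tabulate(coeffs, start, count):
    """Tabulate a polynomial at start, start+1, ... using only additions."""
    n = len(coeffs) - 1  # degree of the polynomial

    def p(x):
        return sum(c * x**k for k, c in enumerate(coeffs))

    # Seed the registers: the value and its 1st..nth differences at `start`.
    # For a degree-n polynomial, the nth difference is a constant.
    row = [p(start + i) for i in range(n + 1)]
    state = []
    for _ in range(n + 1):
        state.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]

    # Every further value needs only n additions -- this is all the
    # engine's gear columns did, cranked once per table entry.
    results = [state[0]]
    for _ in range(count - 1):
        for i in range(n):
            state[i] += state[i + 1]
        results.append(state[0])
    return results

print(tabulate([0, 0, 1], 0, 5))  # squares: [0, 1, 4, 9, 16]
```

Note that after the seeding step, no multiplication occurs at all; each table entry costs one addition per gear column, which is what made the scheme practical to build in brass.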
The Analytical Engine had many of the same traits as a modern computer, including a precursor to a CPU (which Babbage called the "Mill") and memory (called the "Store").
Babbage never had enough funding to build the Analytical Engine. It was not until 1991 that London's Science Museum built a complete, working Babbage machine (the Difference Engine No. 2), using only techniques that were available in Babbage's time.
Computing in Ancient Times
Although Babbage is rightly considered to be the father of modern computing, two ancient devices are often thought to be the first analog computers: the south-pointing chariot in China and the Antikythera mechanism in Greece.
The south-pointing chariot was an adaptation of a 5th century BC armored carriage called the Dongwu Che. The south-pointing feature was added around the 1st century BC. It did not use magnets; the direction was set at the start of a journey and relied on a gear system linked to the wheels to adjust its heading.
The Antikythera mechanism was an orrery (used to determine astronomical positions). It was discovered in 1901 in a shipwreck off the Greek island of Antikythera. The device has been dated to sometime between 205 BC and 60 BC. It contained more than 30 meshing gear wheels, a fixed ring dial, and a hand crank.
After the collapse of Ancient Greece, the technology was lost for more than a millennium. It wasn’t until the arrival of mechanical astronomical clocks in Europe in the 14th century that civilization saw similar levels of technological complexity.
When Was the First Programmable Computer Invented?
German pioneer Konrad Zuse built the world’s first programmable computer—dubbed the Z1—in Berlin between 1935 and 1938.
The Z1 could read instructions from a perforated 35 mm film but never worked efficiently due to inaccuracies in the 30,000 metal parts. The computer was destroyed in an air raid during World War II.
Undeterred, Zuse went on to create the Z2 (1940), Z3 (1941), and Z4 (1949). The Z3 was the world’s first working programmable, fully automatic digital computer. It was a binary 22-bit floating point calculator. The Z3 had loops, but no conditional jumps; the memory and calculation units were based on telephone relays.
The First Electronic Computer Inventor: Tommy Flowers
If you believe computers inherently need to be electronic, then British telephone engineer Tommy Flowers can make a strong claim to have invented the first computer.
Flowers designed and built Colossus, which the British used during World War II to decipher encrypted messages passed between the German High Command and its field commanders.
The computer could perform Boolean and counting operations using thermionic valves. It was the world’s first programmable, electronic, digital computer.
But Colossus was still programmed with switches and plugs rather than stored programs; changing its program meant a lengthy process of rewiring and restructuring.
The Manchester Baby
The world’s first electronic stored-program computer was the Small-Scale Experimental Machine (SSEM)—nicknamed the Manchester Baby.
It was created by Frederic Williams, Tom Kilburn, and Geoff Tootill at the Victoria University of Manchester, England. The Manchester Baby ran for the first time on 21 June 1948.
Oddly, the machine was never intended to be a practical, usable computer. Instead, it was a test bed for the Williams-Kilburn tube, the world's first random-access memory.
Williams, Kilburn, and Tootill quickly refined the SSEM into the Manchester Mark I (1949). By 1951, the Mark I had evolved into the Manchester Electronic Computer (the Ferranti Mark 1), the world's first commercially available general-purpose computer.
The Modern Contenders: John Blankenbaker, Xerox, and IBM
Of course, the Manchester Electronic Computer was still a long way from the machines we use today. But by the mid-1950s, the pace of development was growing exponentially. The rate of development is one of the many reasons why you shouldn’t bother future-proofing your computer.
- 1953: IBM unveils the 701, its first commercial scientific computer.
- 1955: MIT's Whirlwind becomes the first computer with magnetic-core RAM.
- 1956: MIT demos the first transistorized computer.
- 1964: Italian engineer Pier Giorgio Perotto unveils the Programma 101, widely considered the first desktop computer; around 44,000 were sold.
- 1968: Hewlett-Packard starts selling the HP 9100A, the first mass-marketed desktop computer.
And so to the 1970s. American John Blankenbaker created what many experts consider to be the first personal computer, the Kenbak-1. It went on sale in 1971, and just 50 machines were built. Each sold for $750, roughly $5,000 in today's money.
But even the Kenbak-1 was a far cry from today’s machines. It used a series of switches and lights for inputting data.
The first computer that resembled a modern machine was the Xerox Alto (1973). It had a display, a GUI, and a mouse. Programs opened in windows, and icons and menus were commonplace across the operating system. The Alto never went on general sale, but about 500 were used in universities around the world.
Steve Jobs received a demo of the Alto in 1979; the concepts it used formed the basis of the Apple Lisa and Macintosh systems.
Finally, in August 1981, IBM released its Personal Computer. The open-architecture machine was instantly popular, giving rise to a host of compatible programs and peripherals. Within a year of its release, there were 753 software packages available, more than four times as many as the Apple Macintosh could claim a year after its own release.
Who Invented the Computer?
There are other contenders that we’ve not touched on. There’s Blaise Pascal, who invented the mechanical calculator in 1642, and Ismail al-Jazari (1136-1206), whose castle clock is considered to be the earliest programmable analog computer.
And what about Alan Turing? He theorized the Turing machine in 1936 and designed the Automatic Computing Engine (ACE) in the post-war years.
So, who deserves the crown? We can’t decide, but make sure you let us know what you think in the comments.
And what about the future? Will computers take over the world? Well, there are definitely some jobs that computers can never do.