Technology Explained

Who Invented the First Computer and When? We Investigate

Dan Price 07-03-2019

When was the first computer invented? It sounds like it should be a simple question to answer, right?


It’s not so straightforward; you’ll get different answers depending on who you ask. Keep reading to find out more.

Defining “Computer”

The question of who invented the first computer is greatly influenced by how you define the word.

There’s some disagreement, even among dictionary publishers. Here’s how the UK’s Oxford English Dictionary defines it:

“An electronic device for storing and processing data, typically in binary form, according to instructions given to it in a variable program.”

And here’s how Merriam-Webster defines “computer” in the U.S.:

“A programmable usually electronic device that can store, retrieve, and process data.”

The critical difference is that Merriam-Webster leaves open whether a computer needs to be electronic at all. Most experts agree that computers can be subdivided into analog computers and digital computers, and analog computers do not necessarily need an electrical supply.


Depending on your opinion, many different contenders for the honor emerge.

Who Invented the First Computer?

[Image: Babbage's Difference Engine. Credit: Wikimedia Commons]

The most commonly cited name when considering who invented the first computer is Charles Babbage.

Babbage (1791-1871) was a British polymath. He specialized in several fields, including mathematics and mechanical engineering.


His two most notable machines were the Difference Engine and the Analytical Engine. The Difference Engine (begun in 1822) could tabulate values of polynomial functions, useful for producing navigational and astronomical tables; the more complex Analytical Engine (proposed in 1837) was the first machine design that could be considered "Turing complete".

The Analytical Engine had many of the same traits as a modern computer, including a precursor to a CPU (which Babbage called the "Mill") and memory (called the "Store").

Babbage never had enough funding to build the Analytical Engine, and no complete Analytical Engine has ever been constructed. In 1991, however, the London Science Museum built a complete and working model of his Difference Engine No. 2 using techniques that were available in Babbage's time.

Computing in Ancient Times

[Image: The Antikythera mechanism. Credit: Wikimedia Commons]


Although Babbage is rightly considered to be the father of modern computing, two ancient devices are often thought to be the first analog computers: the south-pointing chariot in China and the Antikythera mechanism in Greece.

The south-pointing chariot was an adaptation of a 5th century BC armored carriage called the Dongwu Che. The south-pointing feature was added around the 1st century BC. It did not use magnets; the direction was set at the start of a journey and relied on a gear system linked to the wheels to adjust its heading.

The Antikythera mechanism was an orrery (used to determine astronomical positions). It was discovered in 1901 in a shipwreck off the Greek island of Antikythera, from which it takes its name. The device has been dated to sometime between 205 BC and 60 BC. It contained more than 30 meshing gear wheels, a fixed ring dial, and a hand crank.

After the collapse of Ancient Greece, the technology was lost for more than a millennium. It wasn’t until the arrival of mechanical astronomical clocks in Europe in the 14th century that civilization saw similar levels of technological complexity.


When Was the First Programmable Computer Invented?

German pioneer Konrad Zuse built the world’s first programmable computer—dubbed the Z1—in Berlin between 1935 and 1938.

The Z1 could read instructions from a perforated 35 mm film but never worked efficiently due to inaccuracies in the 30,000 metal parts. The computer was destroyed in an air raid during World War II.

Undeterred, Zuse went on to create the Z2 (1940), Z3 (1941), and Z4 (1949). The Z3 was the world’s first working programmable, fully automatic digital computer. It was a binary 22-bit floating point calculator. The Z3 had loops, but no conditional jumps; the memory and calculation units were based on telephone relays.

The First Electronic Computer Inventor: Tommy Flowers

[Image: A Colossus computer. Credit: Wikimedia Commons]

If you believe computers inherently need to be electronic, then British telephone engineer Tommy Flowers can make a strong claim to have invented the first computer.

Flowers designed and built Colossus. It was used by the British during World War II to decipher encrypted messages sent between the German High Command and its army commands.

The computer could perform Boolean and counting operations using thermionic valves. It was the world’s first programmable, electronic, digital computer.

But Colossus was still programmed with switches and plugs rather than stored programs; changing its program meant a lengthy process of rewiring and restructuring.

The Manchester Baby

[Image: The Manchester Baby. Credit: Wikimedia Commons]

The world’s first electronic stored-program computer was the Small-Scale Experimental Machine (SSEM)—nicknamed the Manchester Baby.

It was created by Frederic Williams, Tom Kilburn, and Geoff Tootill at the Victoria University of Manchester, England. The Manchester Baby ran its first program on 21 June 1948.

Oddly, the machine was never intended to be a practical, usable computer. Instead, it was a test bed for the world's first random-access memory, the Williams-Kilburn tube.

Williams, Kilburn, and Tootill quickly went about refining the SSEM into the Manchester Mark 1 (1949). By 1951, the Mark 1 had evolved into the Manchester Electronic Computer (Ferranti Mark 1)—the world's first commercially available general-purpose computer.

The Modern Contenders: John Blankenbaker, Xerox, and IBM

Of course, the Manchester Electronic Computer was still a long way from the machines we use today. But by the mid-1950s, the pace of development was growing exponentially. The rate of development is one of the many reasons why you shouldn’t bother future-proofing your computer.

  • 1951: MIT's Whirlwind goes online; in 1953 it becomes the first computer to run on magnetic-core RAM.
  • 1953: IBM unveils the 701, its first commercial scientific computer.
  • 1956: MIT demos the TX-0, the first transistorized computer.
  • 1964: Italian Pier Giorgio Perotto unveils the Programma 101, the first desktop machine. 44,000 were sold.
  • 1968: Hewlett-Packard starts selling the HP 9100A, the first mass-marketed desktop computer.

And so to the 1970s. American John Blankenbaker created what many experts consider to be the first personal computer—the Kenbak-1. The computer went on sale in 1971; just 50 machines were built. They sold for $750, roughly $5,000 in today's money.

But even the Kenbak-1 was a far cry from today's machines. It used switches to input data and lights to display output.

[Image: The Xerox Alto. Credit: Wikimedia Commons]

The first computer that resembled a modern machine was the Xerox Alto (1973). It had a display, a GUI, and a mouse. Programs opened in windows, and icons and menus were used throughout the operating system. The Xerox Alto never went on general sale, but about 500 were used in universities around the world.

Steve Jobs received a demo of the Alto in 1979; the concepts it used formed the basis of the Apple Lisa and Macintosh systems.

Finally, in August 1981, IBM released its Personal Computer. The open architecture machine was instantly popular, giving rise to a host of compatible programs and peripherals. Within a year of its release, there were 753 software packages available, more than four times as many as on the Apple Macintosh a year after its release.

Who Invented the Computer?

There are other contenders that we’ve not touched on. There’s Blaise Pascal, who invented the mechanical calculator in 1642, and Ismail al-Jazari (1136-1206), whose castle clock is considered to be the earliest programmable analog computer.

And what about Alan Turing? He theorized the Turing machine in 1936 and designed the Automatic Computing Engine (ACE) in the post-war years.

So, who deserves the crown? We can’t decide, but make sure you let us know what you think in the comments.

And what about the future? Will computers take over the world? Well, there are definitely some jobs that computers can never do.


  1. csaba
    April 21, 2020 at 12:41 pm

    The father, or midwife is Johnny von Neumann
    Many people have acclaimed von Neumann as the "father of the computer" (in a modern sense of the term) but I am sure that he would never have made that mistake himself. He might well be called the midwife, perhaps, but he firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing— in so far as not anticipated by Babbage… Both Turing and von Neumann, of course, also made substantial contributions to the "reduction to practice" of these concepts but I would not regard these as comparable in importance with the introduction and explication of the concept of a computer able to store in its memory its program of activities and of modifying that program in the course of these activities

  2. Sarah Joy Waingi
    May 2, 2019 at 5:05 am

I don't get why we have to have various definitions of a simple man-made device. To this day the human brain is still a complex biological phenomenon, yet we have a simple name and definition for it; one which all countries in the world agree to. However, we're still arguing over the definition and inventor of a computer.

  3. Sam C
    March 15, 2019 at 10:11 am

    If you believe in a higher being then God created the first computer called the Brain. If not, then nature created the first computer called the Brain.

    My first computer was the Commodore 64.

  4. Ricardo Leon
    March 14, 2019 at 6:47 pm

    In this article you are talking about the PERSONAL computer, not mainframes,like the IBM 1401, 1620, 360 and the like.

  5. Peter Shaw
    March 13, 2019 at 9:31 am

    A notable exception, usually unknown to American audiences, is the LEO (Lyons Electronic Office) which was the first computer used for commercial business applications. Leo III was my training ground. It has an interesting history and they ran from 1954 to 1981, initially valve and later transistor machines. There are lots of Google references and some videos.

  6. Rick Pike
    March 12, 2019 at 4:27 pm

    How about the Atanasoff computer from Iowa State in the 1930s?

  7. Adrian Zeffert
    March 8, 2019 at 4:47 pm

    Hi all:
    There are several pieces of history missing from the article. The article was very interesting but:
    Alan Turing provided practical 'electrical' computing to deciphering the Enigma codes in 1943 or so.
    In 1962 I was at the British bureau of Standards in Teddington, and saw one of the first practical computations for hydroplaning. A car doing 120 mph on the, then, first section of the new M1 Motorway, crashed after it became airborne. The computer was in a very large room full of racks of octal valve (tube) flip-flops. The main control panel had dozens of tube amplifiers, and a digital voltmeter. I was there to service the digital voltmeter and trade out some amplifiers.
    Next, the Timex/Sinclair 1000, the Apple 2, the National Semiconductor 4-bit processors and many others which came along between 1947 and 1982. In 1966 I was attending Newark College Of Engineering, now NJIT. Back then our main computer was an RCA Spectra 70. It filled a very large air-conditioned room and could batch process Hollerith cards coded with Fortran IV equations.
    So, I believe that your article should be corrected to include the early kit computers, Cray 1, etc.

  8. dragonmouth
    March 7, 2019 at 12:47 pm

    Where in this history does ENIAC fit? Becoming operational in December of 1945, it pre-dates both "The Manchester Baby" and "The Modern Contenders".

    • Dan Price
      March 7, 2019 at 5:44 pm

      Yeah, it slots between Colossus and the Manchester Baby. I didn't include it as it wasn't the "first" for anything, there were already electronic computers and it wasn't a stored program machine. An important piece of kit though, no doubt.