Imagine a world without computers. A world where humanity’s knowledge is no longer at your fingertips. A world where a tool you use every day simply no longer exists. You would not be reading this article or viewing this website right now. Computers have found their way into nearly every facet of our lives. But how did they become so ubiquitous? Let us discuss the history of the invention of the computer and answer questions such as when the first computer was invented and how.
Today, the word computer refers to the electronic devices we use to work, connect, and play. Historically, however, it described a machine used to perform calculations with numbers. This article will therefore trace the evolution of the earliest devices used for computation and how they became the computers we depend on today.
One may ask when the first computer was ever made. Historians consider the abacus the first calculator. It was a computational tool that people used for thousands of years. It consists of a wooden frame holding parallel rods, each strung with several wooden beads that slide freely along its length. Users moved the beads up and down with their fingers while performing calculations.
People used the abacus to perform addition, subtraction, multiplication, and division. The exact origin of the device is still unknown, but the Sumerian abacus appeared as early as 2700 to 2300 BCE in Mesopotamia. It has been used in numerous civilizations throughout history, including ancient Egypt, Persia, Greece, Rome, India, and China, until the end of the 20th century.
Abacus – the first known computer in history
Another famous calculating device, the astrolabe, was used to measure the elevation of celestial bodies in the sky. Its earliest known references date to around the 2nd century BCE, in the Hellenistic civilization. In addition to its value to astronomers, the astrolabe became indispensable to sailors, since it allowed them to determine their local latitude on long voyages.
One defining quality of modern computers that separates them from simple calculators is that they can be programmed. This allows them to perform specific tasks automatically, without continual human input.
In 1617, the Scottish mathematician John Napier described a calculating device known as Napier’s Bones. It consisted of a wooden box containing rotating cylinders marked with the digits 0 to 9. It could multiply, divide, and find square roots of numbers using simple addition and subtraction. Napier’s most significant achievements include this early calculating aid and the invention of logarithms.
Napier’s Bones
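To illustrate the underlying idea, the sketch below shows in modern Python how a multiplication can be reduced to additions of single-digit partial products, which is essentially what reading values off the rods accomplished. The function name and layout are illustrative only, not a description of the physical device.

```python
def multiply_with_partial_products(n, digit):
    """Multiply n by a single digit using only single-digit products
    and additions, mimicking how partial products are read off
    Napier's rods and then summed by place value."""
    total = 0
    place = 1  # 1, 10, 100, ... for the ones, tens, hundreds columns
    while n > 0:
        partial = (n % 10) * digit   # one single-digit product (one rod cell)
        total += partial * place     # shift to its place value and add
        n //= 10
        place *= 10
    return total

print(multiply_with_partial_products(425, 6))  # 2550
```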
Blaise Pascal, a French mathematician, invented a calculating machine called the Pascaline in 1642, when he was only 19 years old. The Pascaline used rotating wheels, each divided into ten segments marked with the digits 0 to 9.
Calculations were performed by rotating the wheels: when one wheel completed a full rotation, the next wheel advanced by one digit, carrying the value over. The machine had several small windows for displaying the result and could perform addition and subtraction on whole numbers.
Pascaline
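The carrying behaviour described above can be sketched in a few lines of Python. The WheelRegister below is a hypothetical, simplified model: each element of the list stands for one wheel holding a digit from 0 to 9, and a completed rotation of a wheel advances its neighbour by one, much as in the Pascaline.

```python
class WheelRegister:
    """A simplified model of Pascaline-style decimal wheels with carry."""

    def __init__(self, wheels=6):
        self.digits = [0] * wheels  # index 0 is the ones wheel

    def add(self, amount):
        """Add a whole number by turning the ones wheel and carrying."""
        position = 0
        while amount > 0 and position < len(self.digits):
            total = self.digits[position] + amount % 10
            self.digits[position] = total % 10
            # A completed rotation advances the next wheel by one digit.
            amount = amount // 10 + total // 10
            position += 1

    def value(self):
        return int("".join(str(d) for d in reversed(self.digits)))

register = WheelRegister()
register.add(278)
register.add(345)
print(register.value())  # 623
```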
In 1822, the English mathematician Charles Babbage started working on a large calculating machine he called the Difference Engine. It was about the size of a room.
Babbage worked on this machine for many years but could not complete it, and the project was eventually cancelled after the British Government stopped funding it. He later conceived the Analytical Engine, the first design for a programmable mechanical computer. It would use punched cards to input the instructions the machine was to carry out. Unfortunately, the design proved too complex to produce economically, and the technology of the time was not advanced enough, so Babbage could never finish it either. Nevertheless, his work laid the foundation for modern digital computers, which are still based on the idea of the Analytical Engine.
Charles Babbage is known as the father of the modern digital computer because of these contributions to the invention of the computer.
Analytical Engine
Which of the following machines did Charles Babbage invent?
- A) Difference Engine only
- B) Analytical Engine only
- C) Hollerith Desk
- D) Both Difference Engine and Analytical Engine
Since Charles Babbage invented both the Difference Engine and the Analytical Engine, the correct answer is option D.
In 1890, the American inventor Herman Hollerith built a tabulating machine called the Hollerith Desk to help with the 1890 United States census. It consisted of a card reader that sensed holes punched in cards, a gear-driven mechanism that could count, and a large set of dial indicators to display the results. After building the Hollerith Desk, Hollerith founded the Tabulating Machine Company, which through later mergers became International Business Machines (IBM).
Hollerith Desk
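In spirit, a tabulating machine simply counted how many cards carried a hole in each position. The snippet below is a loose modern analogue in Python, assuming cards are represented as dictionaries with made-up answer fields; it is not based on Hollerith’s actual card layout.

```python
from collections import Counter

# Each "card" records one person's census answers (hypothetical categories).
cards = [
    {"state": "NY", "occupation": "farmer"},
    {"state": "NY", "occupation": "clerk"},
    {"state": "PA", "occupation": "farmer"},
]

# Like the gear-driven counters, each dial tallies one category.
state_dial = Counter(card["state"] for card in cards)
occupation_dial = Counter(card["occupation"] for card in cards)

print(state_dial)       # Counter({'NY': 2, 'PA': 1})
print(occupation_dial)  # Counter({'farmer': 2, 'clerk': 1})
```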
Building on the idea of the logarithm, the English mathematician William Oughtred developed a device called the Slide Rule around 1622. It had three parts: the slide, the rule (the fixed body), and a transparent sliding cursor. It was handy for solving problems that involved multiplication and division.
Electronic pocket calculators replaced the Slide Rule in the early 1970s.
Slide Rule
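The slide rule works because multiplying two numbers is equivalent to adding their logarithms, so sliding one logarithmic scale along another adds lengths. The short Python check below illustrates that identity; the numbers chosen are arbitrary.

```python
import math

a, b = 6.0, 7.0

# Sliding the scales adds the logarithms of a and b...
added_logs = math.log10(a) + math.log10(b)

# ...and reading the result off the scale undoes the logarithm.
product = 10 ** added_logs

print(product)                       # ~42.0, up to floating-point rounding
print(math.isclose(product, a * b))  # True
```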
Howard Aiken completed one of the first large-scale digital computers, the Mark-I, at Harvard University in 1944. It was one of the first machines to use electrical switches to store numbers. The Mark-I could perform three additions or subtractions in a single second and could print its results on punched cards or an electric typewriter. It was 50 feet long, 8 feet high, and weighed about 5 tons. It used around 3,000 electric switches: when a switch was off, it stored a zero; when on, it stored a one. Modern computers follow this same binary principle. Aiken later also supervised the development of the Mark II, Mark III, and Mark IV, which had extended capabilities.
Mark-I Computer
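The same on/off idea is easy to demonstrate in Python: a number can be stored as a row of "switches", each either off (0) or on (1). This is only an illustration of the binary principle, not of the Mark-I’s actual relay circuitry.

```python
def to_switches(n, width=8):
    """Represent n as a list of 0/1 'switch' states, most significant first."""
    return [(n >> bit) & 1 for bit in range(width - 1, -1, -1)]

def from_switches(switches):
    """Read the stored number back from the switch states."""
    value = 0
    for state in switches:
        value = value * 2 + state
    return value

switches = to_switches(42)
print(switches)                 # [0, 0, 1, 0, 1, 0, 1, 0]
print(from_switches(switches))  # 42
```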
The 20th century saw analog computers develop further as scientists put them to work on complex mathematical problems. The most famous example is the differential analyzer, built at MIT by Vannevar Bush in the 1920s. Bush later became involved in the Manhattan Project to produce nuclear weapons, and his ideas about linking and organizing information inspired the World Wide Web nearly 50 years before its creation.
World War II led to a decisive leap in computer technology as nations tried to gain the upper hand over their adversaries. During this period, scientists developed computers to calculate firing tables, improve artillery accuracy, and break enemy codes to gain valuable intelligence.
The evolution of the computer has not stopped in the modern era; it is a continuous process. For example, computer scientists are developing new systems for voice recognition and natural-language understanding. Furthermore, High-Performance Computing (HPC), which uses parallel processing to run advanced application programs efficiently, reliably, and quickly, is used in today’s data centers for fast processing.
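At a very small scale, the parallel idea behind HPC can be sketched with Python’s standard multiprocessing module: a task is split into chunks that run on several processor cores at once. The workload below (summing squares) is only a toy stand-in for real HPC applications.

```python
from multiprocessing import Pool

def sum_of_squares(chunk):
    """The work assigned to one worker process."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    # Split the numbers 0..999,999 into four interleaved chunks
    # and process them on four workers in parallel.
    chunks = [range(i, 1_000_000, 4) for i in range(4)]
    with Pool(processes=4) as pool:
        partial_sums = pool.map(sum_of_squares, chunks)
    print(sum(partial_sums))  # Same result as a single sequential loop.
```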
Moreover, please read our articles on which invention allowed computers to be smaller and on What is Transall in Information Technology.