The answer may surprise you!
Man has been calculating since ancient times. Even in comic strips, we see how a man makes four vertical slashes and then a diagonal slash across them to indicate a count of five.
The earliest “machine” to help man do arithmetic was the abacus. The abacus has been used in Europe, the Near East, China, and Russia for centuries, and it is still used in some areas today.
The abacus is constructed with a series of rods held in a wooden frame. Each rod holds movable beads that represent digits. A number is set up on the beads, and then a mathematical operation involving a second number is performed.
The first important development in modern computing came in the 1800s. Charles Babbage, born in London and educated at Cambridge, was a first-class mathematician. He is called the “Father of the Computer” because he developed the first mechanical computer, which led to the modern electronic computer. The machine was called the difference engine. It was a mechanical calculating machine that tabulated complex mathematical functions using nothing but repeated addition, a technique known as the method of finite differences. It operated in the decimal system and was powered by cranking a handle. Babbage’s design was sound, but the metalworkers of his time could not produce the gears and other parts to the required tolerances. Years later, in the 1980s, the Science Museum in London used Babbage’s plans to build a working section of the difference engine. In 1991, a complete, functioning difference engine was built from Babbage’s original plans. The required tolerances were achieved, and the finished machine was a success, proving that Babbage’s design would have worked.
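The trick the difference engine mechanized can be sketched in a few lines of code. For a polynomial of degree n, the nth differences between successive values are constant, so once the starting value and its difference columns are set, every later value follows by addition alone — exactly what a crank-driven machine can do. The function below is an illustrative sketch, not Babbage’s actual procedure; the example polynomial is one often used in difference-engine demonstrations.

```python
def difference_table(poly, start, count):
    """Tabulate poly(start), poly(start+1), ... using only addition,
    after computing the initial difference column directly.
    `poly` lists coefficients from the constant term upward."""
    degree = len(poly) - 1

    def p(x):  # evaluate the polynomial only to seed the differences
        return sum(c * x**i for i, c in enumerate(poly))

    # Seed column: p(start), then 1st, 2nd, ... differences.
    diffs = [p(start + i) for i in range(degree + 1)]
    for level in range(1, degree + 1):
        for i in range(degree, level - 1, -1):
            diffs[i] -= diffs[i - 1]

    results = []
    for _ in range(count):
        results.append(diffs[0])
        # One "turn of the crank": each column absorbs the one below it.
        for level in range(degree):
            diffs[level] += diffs[level + 1]
    return results

# p(x) = x^2 + x + 41 (Euler's prime-generating polynomial)
print(difference_table([41, 1, 1], 0, 5))  # [41, 43, 47, 53, 61]
```

Note that after the seed column is computed, the loop performs no multiplication at all — only additions, one per column per crank.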
Babbage went on to design a more advanced machine, the analytical engine, which could perform all four of the major mathematical operations: addition, subtraction, multiplication, and division. This machine had all the essential parts and functions of modern computers: a central processing unit, controlled flow using branches and loops, and memory. All of today’s electronic computers share this same structure. The analytical engine was still mechanical, though general purpose (it could be programmed); modern computers, of course, are electronic, with no moving parts. Babbage was unable to construct a working model of the analytical engine, again because the machining technology of the day could not produce the necessary gears.
Alan Turing’s life story was told in the movie The Imitation Game. Many consider Turing to be the father of theoretical computer science and artificial intelligence, but England (his home country) didn’t fully recognize his accomplishments for two reasons. First, as shown in the movie, there was a strong prejudice against homosexuals, and second, much of his work was shrouded in government secrecy. He conceived the Turing machine, not a physical device but a mathematical model of a general-purpose computer that remains central to the theory of computation. The movie also showed that he was the brains behind breaking the extremely complex German naval codes, which allowed the Allies to defeat the Nazis in many crucial battles, including the famous Battle of the Atlantic. Some estimates say Turing’s work saved over 14 million lives and shortened the war by more than two years. He continued working on computers after the war and designed one of the first stored-program computers.
Here’s the problem: You are fighting a war using artillery weapons. You would like to hit the target on the first shot, but there are many variables to consider — the angle of the gun, the type of gun, the type of ammunition, the density of the air, wind speed and direction, and more.
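Even a drastically simplified model shows why a table, rather than a single formula, was needed. The sketch below ignores air density, drag, and wind entirely (the real firing-table calculations could not, and integrated drag numerically, which is what made them so labor-intensive); it only shows how range varies with elevation angle for a fixed muzzle velocity. All numbers are illustrative.

```python
import math

def vacuum_range(muzzle_velocity_mps, elevation_deg, g=9.81):
    """Range on flat ground with no air resistance: v^2 * sin(2*theta) / g."""
    theta = math.radians(elevation_deg)
    return muzzle_velocity_mps**2 * math.sin(2 * theta) / g

# A toy "firing table" row: range for several elevations at 400 m/s.
for angle in (15, 30, 45, 60):
    print(f"{angle:2d} deg -> {vacuum_range(400, angle):7.0f} m")
```

In this drag-free model the maximum range falls at 45 degrees; with real drag the optimum shifts and depends on every variable listed above, which is why each gun, shell, and atmospheric condition needed its own precomputed table entries.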
John von Neumann, working at the Aberdeen Proving Ground, where the U.S. Army tests and evaluates military weapons and equipment, contacted John William Mauchly and J. Presper Eckert, scientists at the University of Pennsylvania. He asked them to design and develop a computer that could calculate firing tables, which took all of these variables into account so that gunnery officers could aim and fire their weapons with greater precision. In 1946 they completed the Electronic Numerical Integrator and Computer (ENIAC), the first general-purpose large-scale electronic digital computer. ENIAC itself was programmed by setting switches and plugging cables; the idea of storing both the program and the data in the computer’s memory came with its successor design, the EDVAC, which von Neumann described in a famous 1945 report. That stored-program concept is the basis of most modern computer designs.
The ENIAC was unveiled to the public on February 14, 1946. Though it had been intended to help the war effort, the war was over by that time. ENIAC was nevertheless employed by the military to do a variety of calculations, for example: the design of the hydrogen bomb, weather prediction, cosmic-ray studies, and wind-tunnel design. It was built from 17,468 vacuum tubes and weighed more than 60,000 lbs. — at the time, it was the largest single electronic apparatus in the world. The system could perform 5,000 additions and 300 multiplications per second — slow by today’s standards but 1,000 times faster than any existing machines. It was also highly reliable. It marked the beginning of a long road of computer technology development.
Even though Mauchly and Eckert produced the first large-scale electronic computer, they were unable to patent their work. As they worked on their machine, they made frequent trips to Iowa to pick the brain of John Vincent Atanasoff, an Iowa State College professor. Atanasoff had designed and built a small computing device in the late 1930s and early 1940s that incorporated all the concepts Mauchly and Eckert later used.
The dispute eventually went to court in a trial, beginning in 1971, to determine whether Atanasoff or Mauchly had the right to patent the computer. Atanasoff established clearly that he had made sketches, ideas, and plans available to Mauchly in June 1941, plans that could be used to construct an electronic digital computer. He was able to prove firmly that he had conceptualized the major elements of an electronic digital computer.
The trial lasted 135 days. The judge’s opinion was issued on October 19, 1973, and the finding was clear. Mauchly’s basic ENIAC ideas were “derived from Atanasoff, and the invention claimed in ENIAC was derived from Atanasoff.” The judge further found: “Eckert and Mauchly did not themselves first invent the automatic electronic digital computer, but instead derived that subject matter from one Dr. John Vincent Atanasoff.”
And so, the inventor of the computer? Dr. John Vincent Atanasoff!