The logical first step in becoming computer literate is to appreciate the origins of computers. Computers are the result of a long history of mathematical exploration and innovations. They have their earliest roots in primitive systems of counting that relied on fingers and toes or stones to enumerate objects.
Historically, the most important early computing instrument is the abacus, which has been known and widely used for more than 2,000 years. It is simply a wooden rack holding parallel wires on which beads are strung. When these beads are manipulated (moved along the wire) according to “programming” rules that the user must memorize, all ordinary arithmetic operations can be performed. Another computing instrument, the astrolabe, was also in use about 2,000 years ago for navigation.
Blaise Pascal is widely credited with building the first “digital calculating machine” in 1642. It performed only addition of numbers entered by means of dials and was intended to help Pascal’s father, a tax collector. In 1671, Gottfried Wilhelm von Leibniz designed a machine, finally built in 1694, that could add and, by successive adding and shifting, multiply. Leibniz invented a special “stepped gear” mechanism for introducing the addend digits, a mechanism that remained in use in mechanical calculators for centuries. The machine of Leibniz was in a sense a forerunner of the mechanical desk calculator invented by Charles Xavier Thomas de Colmar in 1820.
The first real computer didn’t change the world. It was never built. It existed, in fantastic detail, in the mind of a grumpy English mathematics teacher named Charles Babbage in the mid-nineteenth century. He loved problems and puzzles, as do computer people today. He taught himself arithmetic, and by the time he went to college, he knew more algebra than his teacher. He invented speedometers and a machine for playing tic-tac-toe. Later he built an adding machine, his “difference engine,” that could solve a particular kind of problem. Then he began to design an “analytical engine” that could solve any kind of arithmetic problem. Babbage put together the idea of instructions stored on punched cards with the idea of a calculating machine. To set up the machine to solve a new problem—weave a new arithmetic pattern—he would just change cards. The two ideas added up to a sum vastly greater than its parts. Inside Babbage’s head was the first true computer. His design was practical, but it required cogwheels and gears and other parts that the machinists of his time could not make, and so the analytical engine had to wait a hundred years to be translated from a brilliant idea into a working machine.
The next influential invention was the census machine of Herman Hollerith. In the late nineteenth century, census taking had become a major task; tabulating such a vast amount of data was slow and problematic. In an effort to find a faster way to compile raw statistical data, the Census Bureau sponsored a contest. Herman Hollerith’s device was chosen as the most effective and practical.
Hollerith had designed a device that read data from punched cards and kept track of the count. The keypunch system of data processing remained popular for many years, although it eventually gave way to faster and less cumbersome methods. Hollerith was so successful that he left the Census Bureau in 1896 to form the Tabulating Machine Company, a forerunner of International Business Machines Corporation (IBM), a recognized leader in the field of data-processing technology even today.
This concept led to systems using electromechanical devices, in which electric power provided mechanical motion—such as for turning the wheels of an adding machine. Such systems soon included features to automatically feed in a specified number of cards from a “read-in” station and to feed out cards punched with results. By modern standards punched-card equipment was slow, typically processing from 50 to 250 cards per minute, with each card holding up to 80 decimal digits. At the time, however, punched cards were an enormous step forward.
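To put those figures in perspective, a quick back-of-the-envelope calculation, using only the rates and card capacity quoted above, shows how little data such equipment actually moved per minute:

```python
# Throughput of punched-card equipment, using the figures cited above:
# 50-250 cards per minute, each card holding up to 80 decimal digits.
CARD_COLUMNS = 80  # decimal digits per card

def digits_per_minute(cards_per_minute: int) -> int:
    """Total decimal digits processed per minute at a given card rate."""
    return cards_per_minute * CARD_COLUMNS

for rate in (50, 250):
    print(f"{rate} cards/min -> {digits_per_minute(rate)} digits/min")
# 50 cards/min  ->  4,000 digits/min
# 250 cards/min -> 20,000 digits/min
```

Even at the top rate, a full minute of processing handled only about 20,000 digits, a quantity a modern processor moves in well under a microsecond.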