History of the Microcomputer Revolution - The Historical Background
Human beings have been thinking about computing for hundreds of years; it is not really unique to the 20th century. Since humans first walked the earth and engaged in commerce, there has been a need for a system of counting and calculating. For thousands of years this was done either in people's heads or with simple but clever devices such as the abacus.

In the early 1800's a French inventor named Joseph-Marie Jacquard revolutionized the weaving industry with a loom that could weave extremely complicated designs by reading instructions punched onto cards. The holes in the cards - which were strung together into a continuous chain of instructions - told the loom which threads to use and what to do with them.

In the mid 1800's a British inventor named Charles Babbage conceived of an Analytical Engine that would perform mathematical computations using this same concept of storing instructions on cards, but he lacked the technology to actually build the powerful engine he envisioned.

A contemporary of his, Augusta Ada Byron - daughter of the poet Lord Byron and later known as Ada Lovelace - was a gifted mathematician who immediately grasped the concepts and possibilities of Babbage's Analytical Engine. She expanded his concept into actual theoretical steps and procedures to be used in the computations, and she is credited by some as the first computer programmer.

In the late 1800's an American inventor named Herman Hollerith invented a punchcard counting machine that was used successfully to tabulate the statistics of the 1890 census. Hollerith's business eventually ran into financial difficulties, and he was forced to sell out to a company named CTR - the Computing-Tabulating-Recording Company.

A young salesman named Tom Watson had started his career selling pianos off the back of a horse-drawn cart. Watson worked his way up through corporate America - spending time at the National Cash Register Company along the way - and by the time he arrived at CTR he recognized the potential of selling punchcard-based calculating machines to American business. Watson would later take over the company himself and, in the 1920's, rename it the International Business Machines Corporation - IBM.

Necessity is the mother of invention, and the modern mainframe computer as we know it was born of the United States military's need to calculate such things as shell trajectories in a minimal amount of time. ENIAC, an electronic vacuum-tube computer operational in 1945, was a thousand times faster than the older electro-mechanical calculating machines previously used for such tasks.

The inventors of this computer, J. Presper Eckert and John Mauchly, went on to become part of the Univac corporation, a name which became synonymous with computers until IBM fought back in the late 1950's and went on to dominate the industry with its System/360 mainframe in the 1960's.

In the 1960's a new generation of computer appeared - the minicomputer - introduced by Digital Equipment Corporation. Physically smaller and far less expensive than mainframe computers, and in some ways better, it was still exclusively a business computer - far beyond the budget of individuals.

Vacuum tubes were replaced by transistors; transistors merged into integrated circuits; the age of microelectronics was born. The long-haired hippies of the 60's would soon turn into the bell-bottom disco dancers of the 70's.

In 1969 a small California electronics company named the Intel Corporation received an order from a Japanese firm named Busicom to design a set of chips for programmable calculators. But a young Intel engineer named Ted Hoff had a better idea, and next week we'll learn how this tiny company became the architect of the microcomputer revolution.
