Marcian Hoff
Before the invention of the microprocessor, computers filled entire rooms, and different integrated circuit chips were needed for every application a computer performed. The relatively inexpensive and compact central control systems we know today didn't exist until Ted Hoff invented the microprocessor.
Hoff developed a chip small enough and cheap enough to fit into almost any device, enabling computers, cameras, calculators, and dozens of other appliances and machines to "think."
The story of the microprocessor actually began in the late 1950s, when two engineers, Jack Kilby and Robert Noyce, independently discovered that large numbers of transistors and their connections could be etched onto a single piece of silicon. However, these first chips had one key weakness: they were hardwired, meaning they could perform only the tasks they were originally designed for.
Hoff's breakthrough was to design a small set of chips that worked together to perform a device's functions. He realized that if one chip were designed to run conventional computer programs on its own, acting as a Central Processing Unit (CPU), processing power could be made far more versatile. The CPU Hoff had in mind was the size of a thumbnail and contained 2,300 transistors. Despite its small size, it matched the computing power of machines that cost thousands of dollars more and had central processing units the size of a large desk.
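The distinction Hoff exploited, fixed wiring versus a stored program, can be illustrated with a minimal sketch. The toy interpreter below is purely illustrative; its four "opcodes" are invented for this example and are not the Intel 4004's actual instruction set. The point is that the fetch-decode-execute loop (the "hardware") stays the same while the program held in memory decides what the chip does:

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical toy opcodes -- invented for illustration,
   not the Intel 4004's real instruction set. */
enum { OP_LOAD = 0, OP_ADD = 1, OP_PRINT = 2, OP_HALT = 3 };

int main(void) {
    /* The "program" is just data in memory: opcode/operand pairs.
       This one loads 2, adds 3, and prints the result (5). */
    uint8_t program[] = { OP_LOAD, 2, OP_ADD, 3, OP_PRINT, 0, OP_HALT, 0 };

    uint8_t acc = 0;   /* accumulator register */
    size_t  pc  = 0;   /* program counter */

    /* Fetch-decode-execute loop: the same fixed circuitry runs
       whatever program sits in memory. */
    for (;;) {
        uint8_t opcode  = program[pc];
        uint8_t operand = program[pc + 1];
        pc += 2;

        switch (opcode) {
        case OP_LOAD:  acc = operand;        break;
        case OP_ADD:   acc += operand;       break;
        case OP_PRINT: printf("%u\n", acc);  break;
        case OP_HALT:  return 0;
        }
    }
}
```

Swapping in a different `program` array changes the device's behavior without touching the loop itself, which is exactly why a general-purpose CPU could replace a separate hardwired chip for every application.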