About this Article
Written by: Matthew Michihara
Written on: May 2nd, 2008
Tags: computer science
Thumbnail by: cokebottle/stock.xchng
About the Author
In Spring 2008, Matthew was a sophomore at USC's Viterbi School of Engineering and the College of Letters, Arts and Sciences, set to graduate in May 2010 with degrees in Computer Engineering and Computer Science, and in East Asian Languages and Cultures. After graduation, he planned to pursue a graduate degree in Computer Science.

Volume X Issue II > Microprocessors: The Silicon Revolution
The microprocessor can be considered one of the greatest inventions of the twentieth century, replacing an entire room of computer equipment with a single chip. The fundamental operations of a microprocessor are basic, yet it has allowed so much to be accomplished. As transistors, the building blocks of microprocessors, approach their minimum size limits, creative ways to continue increasing computing power have emerged, including technologies such as pipelining and the multi-core paradigm. The microprocessor has proven to be a versatile invention, branching out into numerous fields beyond the personal computer.


Microprocessors have revolutionized the world, especially in the area of electronics. A myriad of modern items, ranging from cell phones and digital watches to elevators and washing machines, contain microprocessors. It is incredible that, just a few decades ago, the microprocessor did not even exist, and yet today it can be found almost anywhere.

What is a "Microprocessor"?

A microprocessor is essentially an entire basic computer fitted on a single chip [1]. Sure, computers purchased today usually come with peripherals like monitors, hard drives, and DVD drives, but the most important component of the system is the microprocessor. You most likely have heard of companies such as Intel or AMD, and probably even have some version of one of their microprocessors inside your desktop or laptop, but what is this device? The microprocessor's job is to perform all the calculations and computations inside that system. At a fundamental level, computers accomplish these tasks by controlling the flow of electric current through a circuit.

From Tubes to Transistors

Basic computers that could do the same thing as the microprocessor have been around for a long time, but it wasn't until the invention of the transistor that microprocessors could be made. A transistor is essentially a switching device that either allows or blocks the flow of electric current, and many transistors working together make up the processor [2]. Before transistors, computers were gigantic machines taking up entire rooms. Instead of using transistors as switching devices, they used large and inefficient "vacuum tubes" [2]. The invention of the microprocessor was so significant because it allowed one chip to replace entire rooms full of computers (Fig. 1).
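To get a feel for how simple on/off switches can add up to computation, here is a minimal sketch (illustrative only, not from the article): each transistor is modeled as a switch that passes current only when its control input is "on," and wiring two such switches in series yields a basic logic gate.

```python
# Illustrative model: a transistor as an on/off switch.
# Current passes through only when the gate (control) input is on.
def transistor(gate: bool, current_in: bool) -> bool:
    return current_in and gate

# Two transistors in series form an AND gate: current reaches the
# output only if BOTH control inputs are on. Billions of such gates,
# combined, make up a modern processor's circuitry.
def and_gate(a: bool, b: bool) -> bool:
    return transistor(b, transistor(a, True))

# Truth table for the gate:
for a in (False, True):
    for b in (False, True):
        print(f"{a!s:5} AND {b!s:5} -> {and_gate(a, b)}")
```

The same series/parallel idea extends to OR, NOT, and the rest of Boolean logic, which is all a processor ultimately needs to compute.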
Matt Gibbs/Wikipedia
Figure 1: A closeup shot of an Intel microprocessor, released in 1992. Microprocessors are an integral component of almost all modern electronic devices.
The first microprocessor was the Intel 4004, created in 1971 primarily for use in calculators [3]. By today's standards, this microprocessor would widely be considered inadequate, but at the time it was state-of-the-art. The 4-bit processor was made up of 2,300 transistors and had a clock speed of 108 kHz [4]. Keep in mind that modern processors are 64-bit, are made up of billions of transistors, and have clock speeds thousands of times higher.
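As a quick back-of-the-envelope check of that comparison (the 3 GHz figure below is an assumed representative modern clock speed, not taken from the article):

```python
# Comparing the Intel 4004's 108 kHz clock [4] to a modern processor.
intel_4004_hz = 108_000        # 108 kHz
modern_cpu_hz = 3_000_000_000  # ~3 GHz, an assumed representative value

ratio = modern_cpu_hz / intel_4004_hz
print(f"A 3 GHz processor has a clock rate roughly {ratio:,.0f} times higher.")
```

The ratio comes out to roughly 28,000, consistent with the article's "thousands of times higher" (and clock rate alone understates the gap, since modern chips also do far more work per cycle).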