You may have seen them, and you know what to call them, but you don't know what they are. In the new millennium manufacturers plaster ads all over magazines, television, and billboards hawking their wares as if we've stepped from our paltry realities into a Ridley Scott movie.

Small rectilinear packages. Stubs of metal protrude from the sides or bottom. You'd have to get technical with someone to find out these are DIPs and SIPs--Dual and Single Inline Packages. On some there are just little nibs of metal, balls, or tiny stripes. These are BGAs--ball grid arrays--or other surface-mount packages.

You know the black rectangle is a chip somehow. An electronic brain. But you don't know why or how, or what's really inside. It's as if the will of the mad scientist is incarnate in our world and now the distilled essence of something wet, loving, and intelligent has been siphoned from the living and presented in dense, concentrated form.

We technologists are happy to confuse you with metaphor. We smile and spout specialized jargon that makes you feel like the only person on earth who doesn't understand the inside joke.

Megahertz. Gigabytes. MIPS. LVD. SCSI.

And some you've never heard of:

Tantalum Silicide. Parasitic Interconnect Capacitance. Deep UV. Clock Skew. Band Gap Reference.

It makes us seem like magicians.

Now I will tell you what a chip is and you will understand it and know it is not magic. It is something built by humans who go home every night, eat their dinners, kiss their spouses, watch Hollywood Squares, and actually think it's funny.

It started with boards

Chips are also known as integrated circuits, or ICs. The term integrated was coined in the 1960s, when all circuits were made of discrete, individual parts, each of which did exactly one thing. These discrete parts were wired together on circuit boards. The circuit board was a flat piece of phenolic resin with holes drilled into it and copper plated in interesting patterns between the holes. The copper traces served as wires.

Each discrete part was a little, sometimes colorful jellybean of metal or plastic with tiny wire legs coming out of it. The legs were put through the holes and soldered to the copper traces on the board. If you've ever looked inside your computer or radio or television you've seen these little jellybeans glued by melted metal to the green or tan phenolic board. That's how circuits like radios and televisions were made in the old days, and they still are. Only now there are fewer of the little jellybeans and more of these square black things--the fabled "chips".

It's all just electricity

All electronic circuits are pathways for electricity. If you were to take a simple paper clip and connect the metal at the top of a battery to the metal at the bottom, electrons would flow through the clip. It might even get hot because of the electric current flowing. This is a circuit. It's a loop that electrons can follow.

A battery shorted by a wire is not horribly interesting. It doesn't play music or show us pictures. But it is a circuit, and all electronics are derived from the simple idea that electrons come out of one place, travel through wires, and end up in another place.

To make something interesting happen you need parts that modulate--change the flow of--electricity. A transistor does just that. It's like a valve. Most transistors have three connection points. Think of them as in, out, and control. Electricity flows from the in to the out, but only in amounts allowed by the way we adjust the control.

The analogy most used is that of a faucet on a kitchen sink. The water comes out fast or slow or not at all depending on how much we crank the handle. We can turn the faucet on or off, but we can also leave it in between, or even make the flow of water warble if we screw around with the handle.

Transistors do exactly the same thing to electricity. Electricity that warbles in the right way sounds like Paul McCartney when connected to a loudspeaker that changes the electricity to air pressure. Some electricity warbles like the television show Survivor and paints pictures of wiry idiots in remote places when it is used to paint a phosphor screen with an electron beam. Warbling electricity makes the computer you're working with seem like it's alive, and makes you hate technology when Windows crashes.

Making things smaller

Most transistors in the electronic gadgets you have in your house are made of silicon with extra sauce added. Silicon is the second most abundant element in the earth's crust, after oxygen. Silicon dioxide is regular old glass, it's the main ingredient of beach sand, and it's used to make transistors, too. In Silicon Valley, we make all kinds of jokes about being in the "sand" business. They're not very funny jokes. We're all geeks here.

Transistors are made on silicon through a semiconductor fabrication process. They use crystals of silicon instead of the amorphous stuff you find on the beach because crystalline silicon has the interesting property of being able to act as a wire, or as a slab of ceramic dinner plate, depending on conditions. Those exact physical conditions are the subject of why transistors work, and have to do with atoms and electrons and energy levels and all sorts of interesting physics beloved by people who have never kissed anyone but their mother. The thing to remember is that electricity can flow through a wire, and it can't flow through a ceramic dinner plate, which is why you'd better not be playing golf in a rainstorm with clubs with metal shafts.

Silicon is called a semiconductor because sometimes it acts as a wire, and sometimes not. Sometimes it conducts, and sometimes it's an insulator.

The integrated circuit

In the beginning, people put exactly one transistor on a little chunk of crystalline silicon. William Shockley, who co-invented the transistor at Bell Labs, started a semiconductor company out in California. Eight of his employees, among them Robert Noyce and Gordon Moore, left and started a company called Fairchild Semiconductor, and there they wondered if you could put more than one transistor on a slab of silicon. They did just that: they put a couple of transistors on one silicon slab, used some metal painted on the silicon to hook the transistors together as wires, and voila, suddenly you didn't need to have a board to put them on anymore.

Two of the Fairchild founders, Gordon Moore and Robert Noyce, later went off with Andy Grove and started another company called Intel, which stands for "Integrated Electronics". (Lots of other interesting companies like National Semiconductor are omitted for the sake of brevity.) They called it integrated because the transistors were all together in one chunk instead of each sitting in its own jellybean of metal with three wires coming out. Instead of being planted on a phenolic board, they were manufactured together on a slab of silicon.

The Intel guys took the idea of lots of transistors on one slab of silicon and ran with it. The modern Pentium III microprocessor is a slab of silicon about the size of a U.S. postage stamp with tens of millions of transistors on it. The postage stamp of silicon is fragile. Crystalline silicon shatters like glass. Also, sparks of static electricity, some so small you can't feel them, can destroy the tiny transistors on the silicon. So handling the chip is risky business. All integrated circuits are glued into a package (sometimes black plastic, but other materials are used). Wires are connected from the silicon chip itself to the package on the inside. Those wires connect to bigger wires, or metal balls or nibs, on the outside. The package is sealed up. That's what you hold in your hand when you buy a memory chip or a processor chip in a box.

What do all those transistors do?

The transistors in a computer chip are usually MOSFETs. MOSFETs have three terminals, as I explained above, but the control signal is used mostly to turn the MOSFET completely on or completely off, like a light switch on the wall in your house. You can, but usually don't, use a MOSFET like a dimmer switch, warbling the current through all sorts of levels. At least in the Pentium, most of those transistors are simply turning on or off.

As weird as it sounds, all the behavior you experience on your computer--playing MPEGs, downloading porn, crashing OSes, killing aliens, chatting with friends--takes place through millions of transistors simply turning on or off like light switches. The trick is having enough of them to switch, and switching them fast enough to make it seem like a lot is happening.
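If it helps to see the light-switch idea written down, here's a toy sketch in Python. It's my own illustration, not anything a real chip runs: each "transistor" is just a switch that conducts when its control is on, and putting switches in series or in parallel gets you the simplest logic there is.

    # Toy model of transistors as light switches -- not real device physics.
    def switch(control):
        """Conducts (returns True) only when the control input is on."""
        return control

    def and_gate(a, b):
        # Two switches in series: current gets through only if BOTH are on.
        return switch(a) and switch(b)

    def or_gate(a, b):
        # Two switches in parallel: current gets through if EITHER is on.
        return switch(a) or switch(b)

    for a in (False, True):
        for b in (False, True):
            print(a, b, "AND:", and_gate(a, b), "OR:", or_gate(a, b))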

A simple example is your computer display. Your computer paints a picture on the monitor based on the contents of the video memory in your computer. The memory is a chip that is just an array of transistors. You turn them on or off, and they stay that way until you do something else to them. Each transistor represents a bit of information. On or off represents a number, one or zero. This is binary and you need not be concerned with how those numbers work for now. Just remember that each bit in a memory is a transistor that's turned on or off and stays that way until you come back to look at it or turn it another way. (Actually, most memory bits require between four and six transistors--but that's irrelevant for the sake of this explanation.)

If you take a whole bunch of these transistor switches, create an array, and allow each transistor to represent a particular "spot" (pixel) on the screen, you can flip the switches to make a picture. For instance, your monitor can paint a black spot on a part of its screen if the bit for that spot is turned on, or paint white if it's turned off.

And this is exactly what your computer is doing. It keeps flipping the switches in the video memory array and shows your monitor what's in there something like sixty times per second. Your monitor paints a section of its screen depending on whether the transistor that represents that part of the screen is on or off. It does this one picture at a time, many times per second, and so it looks like it's moving (just like TV or movies). Color is achieved by having several transistor bits control each pixel on the screen. Then you have a number for each pixel, rather than just on or off (black or white), and the number is turned into a color. It really is no more complicated than that.
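Here's a toy version of that video memory, sketched in Python. The sizes and the characters are made up for illustration, but the idea is the real one: each element is a switch that stays however you set it, and "refreshing" the screen is just reading the switches back out.

    # A tiny one-bit "video memory": 0 = white spot, 1 = black spot.
    WIDTH, HEIGHT = 16, 4
    video_memory = [[0] * WIDTH for _ in range(HEIGHT)]

    # Flip some switches on to draw a black bar.
    for x in range(4, 12):
        video_memory[2][x] = 1

    def refresh(memory):
        """What the monitor does ~60 times a second: paint each spot from its bit."""
        for row in memory:
            print("".join("#" if bit else "." for bit in row))

    refresh(video_memory)
    # Color is the same trick with several bits per pixel: store a bigger
    # number for each spot and translate that number into a color.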

All those transistors in your Pentium III are doing more than controlling video, of course. Mathematicians have invented a whole science around working with lots of tiny switches that are only on or off. It's called Boolean logic, and computer science is built on top of it. You can add, multiply, or subtract any numbers you want using only on-or-off transistors. You can compare numbers. You can control devices like disks and cameras. You can solve complicated math equations.
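As a taste of how arithmetic falls out of nothing but on/off switches, here's a sketch of a "half adder", the smallest possible adding circuit, written in Python rather than in transistors (my own illustration, not Intel's circuitry):

    # Add two one-bit numbers using only on/off logic.
    def half_adder(a, b):
        total = a ^ b   # sum bit: on if exactly one input is on (XOR)
        carry = a & b   # carry bit: on only if both inputs are on (AND)
        return total, carry

    for a in (0, 1):
        for b in (0, 1):
            total, carry = half_adder(a, b)
            print(f"{a} + {b} -> sum {total}, carry {carry}")

Chain enough of these together and you can add numbers of any size; everything else the chip does is built the same way, out of slightly fancier piles of switches.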

But in its most reduced form all that's happening is a transistor being turned on or off like a faucet. If you have enough of these switches going fast enough, it seems like something smart is happening. The goal of software developers is to control the switches in a way that makes it seem like you're shooting aliens.

How they make integrated circuits

The name of the IC game is miniaturization. The smaller you can make your transistors, the more of them you can cram onto a silicon postage stamp. The more transistors you have, the more interesting things you can make your electronic device do. And the smaller the transistors are, the easier they are to switch, so the faster they can run (ignoring an icky detail called interconnect effects--not a concern for this article).

ICs are fabricated in extremely clean places. The reason for this is simple: the typical transistor in today's IC is somewhere around a quarter of a millionth of a meter across. This is much smaller than the typical particle of invisible dust. If dust gets in the way while a transistor is being made, it screws up the process. This is why you see people in Tyvek bunny suits in every picture of an IC fab.

Fabs are built around clean rooms. A clean room is pressurized above the air pressure outside so that air only flows out of the room and no dust can be sucked in. Anything going into the room is cleansed and sterilized. Germs are bigger than most transistors.

Clean rooms are rated by how much dust of a certain size can be found in a given volume of air. A class-10 clean room has no more than 10 particles of half a micron or bigger per cubic foot of air. This is mighty clean. You have to go into outer space to get much cleaner. A class-10,000 clean room is also very clean, but can have 10,000 such particles per cubic foot. Your bedroom, by contrast, has hundreds of thousands of particles this size or bigger per cubic foot.

Fabs for the most modern ICs are almost completely robotic. Robots can be made cleaner than people, and they don't complain that those bunny suits don't breathe.

The smallest features on the most modern, miniaturized transistors are smaller than a wavelength of red light. If you didn't understand that sentence, think of it this way--some parts of a transistor are only a few hundred silicon atoms wide. That's far too small to see with an ordinary microscope; you need an electron microscope just to get a decent look at one.

Nowadays we're talking about making transistors that are 90 nanometers wide. That's 90 billionths of a meter. We're starting to talk about switches the size of viruses.

How the hell do you make something that small?

It's called lithography. In its simplest form, it's a type of chemistry that has its origins in the photographic world. It's often called photolithography because at heart it's a photographic process.

Integrated circuits are built up out of layers of material. Silicon. Silicon dioxide. Aluminum. Copper. Some of these layers are blasted with phosphorus or arsenic ions to change their electrical characteristics. Each layer is created using a sort of photographic pattern we call a mask.

To understand what we're photographing when we create a mask, you need to know the basics of what we're putting on an IC. Here's a simplified version of the process of realizing an IC design. Keep in mind that most of this process is automated these days--meaning we have computer programs that do what I'm suggesting humans do below.

Chip designers envision their circuits in terms of functions: memory, controllers, multipliers, etc. These are things people can understand easily. They put these functions together and make, say, an arithmetic logic unit which is a fancy name for the thing in your G4 chip that does math. This function description would be represented as an abstract drawing of boxes and lines or as a sort of computer program. For instance, you could represent the addition function of the ALU in this program by saying A+B->C, which means add the number in the 'A' memory with the number in the 'B' memory and put the result in the 'C' memory. This is also called register transfer language or RTL.
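Just to make the A+B->C idea concrete, here's roughly what that register transfer means, acted out in Python. (Real designers write this sort of thing in a hardware description language like Verilog or VHDL, not Python, and the register contents here are invented for illustration.)

    # Pretend registers. One "step" performs the transfer A + B -> C.
    registers = {"A": 12, "B": 30, "C": 0}

    def step(regs):
        regs["C"] = regs["A"] + regs["B"]   # add A to B, put the result in C

    step(registers)
    print(registers)   # {'A': 12, 'B': 30, 'C': 42}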

Then, circuit designers take that drawing or program and break it down into connections of transistors that behave the way the function demands. This representation would also be a sort of drawing or another type of computer program that's much more detailed than the original function. That original A+B->C gets turned into something that looks like, "connect the OUT leg of transistor #1 to the CONTROL leg of transistor #2..." etc. for lots of pages. The circuit that gets drawn would do the adding of 'A' to 'B' and put it in 'C'.
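If you wrote that transistor-level description down as data instead of prose, it might look something like the fragment below. The part names and legs are made up for illustration; a real netlist for a chip like this runs to millions of lines.

    # A made-up scrap of a transistor-level "netlist": nothing but a long list
    # of which leg of which part connects to which leg of which other part.
    netlist = [
        ("T1", "OUT", "T2", "CONTROL"),
        ("T2", "OUT", "T3", "IN"),
        ("T3", "OUT", "C_bit0", "IN"),
        # ...and so on, for lots of pages...
    ]

    for part_a, leg_a, part_b, leg_b in netlist:
        print(f"connect the {leg_a} leg of {part_a} to the {leg_b} leg of {part_b}")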

Then, layout designers take the interconnected transistor description and, using CAD tools, create a pattern of silicon, implants, and aluminum or copper that represents the transistors and connection wires you want the factory to manufacture. Many layers of silicon, implants, and metal are represented. Each layer looks like a pattern of lines and rectangles. This is the mask design. And the mask design would be used to build the transistors that add 'A' to 'B' and put it in 'C'.

Modern IC processes require upwards of 30 mask layers to represent the transistors that need to be made. A mask set for a modern, state-of-the-art chip like the Pentium III costs about $1,000,000. One set is used for each "assembly line", and there are many lines running at once for a popular chip like the Pentium III. In addition, mask sets are swapped out after so many days or weeks of use because they can become damaged. So to supply the world with Pentiums, Intel is spending millions of dollars per month on masks alone. (The cost of running a state-of-the-art IC fabrication facility runs to millions of dollars per day. Depreciation alone is about $1M per day.)

In older IC fabrication processes, patterns for a particular mask are drawn on quartz glass through photography. The glass is covered with a photosensitive material. The computer-generated pattern is then "projected" onto the photosensitive layer. The exposed glass is then etched in a process that's like developing a picture. The result is a piece of clear glass with a pattern of rectilinear black squiggles on it.

That piece of glass is then put into a machine (called a stepper) that is used to expose the crystalline silicon wafers. Photosensitive material is painted onto the flat, round, thin silicon wafer crystals. Light shines through the mask and the image of the squiggles is projected onto the wafer. Many chips will be made on the wafer at once, and so the image may be exposed a number of times on the same wafer until a matrix of images is formed. The image is "developed": everywhere the light hit, the photosensitive chemical will etch away, and where it didn't hit, it will stay fixed.

That wafer with the pattern on it will then be exposed to a beam of ions or a flow of gas that will deposit a material onto it. Naturally, the ions or gas molecules will only touch the wafer where the photosensitive material (photoresist) has been etched away. That material, like aluminum for instance, will stick to the silicon but not to the photoresist. Then the photoresist is washed off. It goes, the deposited material stays, and voila, one layer of stuff is on top of the silicon wafer in the right places.

This process is repeated for tens of steps to complete an integrated circuit.

Complications arise when we start talking about devices that are so small they're nearly the size of the waves of light you want to use to expose the photosensitive material. Modern IC fabs use beams of electrons, ultraviolet lasers, or even X-rays to expose the photosensitive materials. Sometimes beams of electrons draw directly on the wafer without a photographic mask, just like the electron beam that draws the picture on your computer monitor.

When the process is complete, the wafer is diced up into little squares, each being one chip. The chips are then tested briefly to see if they have any rudimentary functioning. If so, they're packaged and then tested more thoroughly. The resulting parts are sold to people who put them on boards.

Where they make integrated circuits

Most integrated circuits in the world today come from fabrication facilities in Taiwan. TSMC and UMC are two of the largest fabricators in the world. Chartered in Singapore is growing at a rapid rate. These businesses manufacture chips other companies design.

Owning a fabrication facility is an expensive, low-margin business, which is why smaller companies no longer own their own fabs. Modern, state-of-the-art fabs cost upward of $1B U.S. to construct. Intel and Motorola still fabricate their own parts, as do Mitsubishi and Toshiba. But even these companies will go to TSMC for overflow work, because they can only afford to make their highest-margin parts in their own expensive fabs.

Typically, companies continue to operate their own fabs if they have special manufacturing needs or are afraid to put production of their product at the whim of a third party.

In general, the cost of creating an integrated circuit increases faster than linearly as feature sizes shrink. The reason has to do with the physics of the situation. When devices become atomic in size, you start to have to worry about strange wave-particle effects, both in your design and in the lithography you use to create it. And it takes ever higher energies to draw ever smaller lines. Modern, state-of-the-art ICs are exposed with high-energy ultraviolet light, and IBM has done work using X-rays. This equipment is hideously expensive, as fabs start to become physics experiments rivaling the likes of CERN or Fermilab in miniature.

But is it really a brain?

The ubiquitous integrated circuit is simply a collection of transistors, most of which are operating like light switches. Because there are so many of them in one place switching so rapidly, our experience of electronic devices tends to be anthropomorphic. We believe some intelligence to be behind the operation. But the great and mighty Oz behind the curtain is an army of geeks who break down big problems into little ones and draw transistor circuits that solve them. Lots of tiny solved problems can be assembled to seem like complicated behavior. Perhaps the moral of the IC story is this: put enough of something in one place and it seems like god made it.

I'll leave you with this thought. In the middle part of the 20th century a brilliant man named Alan Turing proposed a test which now bears his name. The ramifications of the test are somewhat disturbing to those who fear mankind's creations, and wondrous to those who are creators.

Turing suggested putting people in separate rooms connected by simple computer terminals that could do only the equivalent of chat. The people could neither see nor hear each other. Typically, people typing to each other using IRC/chat can tell if they're "speaking" to another human. There are computer programs, progeny of the original Eliza, that respond to queries and statements with human-like replies, but even with today's technology most people can tell within a couple of minutes whether they're conversing with a program or a living, breathing human.
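For the curious, the whole trick behind Eliza-style programs is a list of canned patterns and canned replies. A toy version in Python, nowhere near as elaborate as the real chatterbots but built on the same idea (the rules below are my own invention), might look like this:

    import re

    # A few canned rules: if the input matches the pattern, echo part of it back.
    RULES = [
        (r"\bI am (.*)", "Why do you say you are {0}?"),
        (r"\bI feel (.*)", "How long have you felt {0}?"),
        (r"\bbecause (.*)", "Is that really the reason?"),
    ]

    def respond(line):
        for pattern, template in RULES:
            match = re.search(pattern, line, re.IGNORECASE)
            if match:
                return template.format(*match.groups())
        return "Tell me more."

    print(respond("I am tired of Windows crashing"))
    # -> Why do you say you are tired of Windows crashing?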

However, the pace of computer design and program design is making Eliza-style programs more sophisticated, and many can envision a day when a test subject chatting into a computer would not know if she was "speaking" electronically to another living creature or just to a clever program.

And so, on the day an integrated circuit runs a software program that fools people into thinking it's human, it will have passed the Turing Test. Then, will all those switching transistors be any less a living person than the test subject, just because we know how it works? For there may be a God somewhere who knows how all the neurons in our brains work to make us seem real.

Would we have created electronic life out of billions of switches and aluminum?

Sources for this article include: "Introduction to VLSI Systems" by Mead and Conway; "Basic Integrated Circuit Engineering" by Hamilton and Howard; "Physics and Technology of Semiconductor Devices" by Grove; www.intel.com; and "Scheduled Revolution" by Mastroianni, Electronic News, May 2001.
