Distributed Production
David Ashley, September 9, 2011

The manufacture of computer chips involves huge capital investment. The foundries cost billions of dollars to set up. This truly is the highest of tech: shaping matter with precision measured in fractions of a millionth of a meter. It's really amazing.

The business model depends on massive, massive volume. The silicon wafers that are used are something like 10 inches in diameter, and as many copies of the circuit as will fit are spread over the surface. It's been said this is the most expensive real estate on the planet. Big circuits take up more area and cost more. Building up the electronic components involves many repeated steps: vacuum depositing material over the entire surface, using photographic techniques (projecting an image onto the surface) to expose certain regions to light but not others, and then etching away material. The details are so small that electron beams might even be used instead of light.

The model heads for smaller and smaller feature sizes, faster and faster clock rates (how fast the circuit operates), lower and lower power requirements, and greater and greater production volume to reduce costs as much as possible. It takes a lot of chips to earn back the billions of dollars in startup costs.

So a centralized manufacturing approach is used, in which large numbers of very small, intricate devices are produced that operate at very high speeds, because the demand for computing power is insatiable and higher speed means more computing power. The business model carries a huge barrier to entry. Expertise in the industry is expensive as well. It's almost certainly harder to build a state-of-the-art chip manufacturing plant than it is to build an auto-making plant. The air in these places is far cleaner than the air in an operating room; a single speck of dust can ruin an entire chip.

Might there be a better approach to the whole thing?

I imagine an approach where instead of a very small number of high-volume production sites we have a very large number of low-volume production sites, say one per household garage. And instead of trying to make the chips as small as possible and making many at once, you make one at a time, and it can be pretty big, say as big as a 3x5 notecard. And instead of making the features as small as possible and running the circuits at many-gigahertz clock rates, you have features of perhaps 10 to 100 microns and you run the circuit at maybe 100 megahertz. 1000 microns is a millimeter, so 10 microns is 1/100th of a millimeter and 100 microns is 1/10th of a millimeter.

And instead of building the circuit as a single flat 2D layer, you build it up in successive layers to make it 3D. You make it as thick as you want. That makes it much easier to connect circuit elements together, because wire lengths can be very much shorter.

I imagine an approach where a device like an old 35 inch vacuum tube television is used to spray material onto your 3x5 inch target, which can be just a piece of glass. It would be similar to an inkjet printer, only instead of ink the device sprays different types of material at the target with precise control of quantity and position. Some material can act as an insulator, some can act as a conductor, and some can be P- or N-type semiconductor. The device is operated by a computer that selects which type of material needs to be sprayed (there can be a wheel holding reservoirs of different materials that rotates to select one at a time). For a layer, all the insulating material can be sprayed, then all the conducting material, then each of the other types in turn. Once a layer is done, move on to the next layer, building the circuit up a level at a time. The target is placed in the "printer", the chamber is sealed closed, the air is pumped out, and the machine starts printing.
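
Here is a minimal sketch, in Python, of the control loop such a printer might run. The layer format and every call on the head object (seal_and_pump_down, select_material, spray, step_up, vent_and_open) are assumptions made up for illustration, not a real device interface.

    # Sketch of the layer-by-layer print loop. The layer format and the
    # hardware calls are assumptions for illustration, not a real device API.
    MATERIALS = ["insulator", "conductor", "p_type", "n_type"]

    def print_circuit(layers, head):
        """layers: one dict per layer, mapping material -> list of (x, y, quantity)."""
        head.seal_and_pump_down()                # close the chamber, pump out the air
        for layer in layers:
            for material in MATERIALS:           # finish one material before the next
                head.select_material(material)   # rotate the reservoir wheel to it
                for x, y, quantity in layer.get(material, []):
                    head.spray(x, y, quantity)   # deposit a controlled amount here
            head.step_up()                       # raise the head to start the next layer
        head.vent_and_open()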

It might take a week to build one circuit. One can build an entire computer, with CPU and memory. Connections can come out that would attach to a display, a keyboard, power, and various input devices like mice. It doesn't really matter how long it takes to build something, because it will be months to years before it's obsolete. We solve the problem of low clock rate by going parallel. Rather than a single processor running at 2 gigahertz, we have 100 processors running at 100 megahertz, and we're still ahead.
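
A quick back-of-the-envelope check of that trade, counting only raw clock cycles per second and ignoring memory and communication overhead (a Python sketch):

    # Raw cycles per second, ignoring memory and communication overhead.
    fast_chip  = 1 * 2_000_000_000     # one processor at 2 GHz
    slow_chips = 100 * 100_000_000     # one hundred processors at 100 MHz
    print(slow_chips / fast_chip)      # 5.0 -- five times the raw cycle count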

With a feature size of 100 microns (1/10th of a millimeter) in a 5cm by 5cm by 1cm volume of material we have 25 million building blocks we can use. A lot of interesting circuits can be built with 25 million parts. If we can get our feature size down to 10 microns we can have 25 billion blocks in the same volume (1000 times as many). A lot more interesting circuits can be built with that many parts.
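
The same arithmetic, spelled out as a small Python calculation (assuming a cubic building block one feature on a side):

    # Count feature-sized cubes in the stated volume (1 cm = 10,000 microns).
    def blocks(feature_um, x_cm=5, y_cm=5, z_cm=1):
        um = 10_000
        return (x_cm * um // feature_um) * (y_cm * um // feature_um) * (z_cm * um // feature_um)

    print(blocks(100))  # 25000000     -- 25 million blocks at 100 micron features
    print(blocks(10))   # 25000000000  -- 25 billion at 10 microns, 1000 times as many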

When it's time to upgrade, we recycle the old components to recover the various materials. We can keep using them over and over again in new products. Circuit designs would be the easy part. There are so many talented designers who understand this stuff that if you just gave them the machine they'd create useful circuits just for the pleasure of doing so. I'd be one of them myself.

In this model each individual just purchases one of the production machines and from then on he only needs power (to run it), a supply of raw material, and designs for circuits to be built. The designs are just a form of information that can be transmitted over the internet. The raw material would be cheap and if done right could be recycled. Not a lot of it would be needed anyway. Ounces to pounds.

If done right the printer can actually be used to construct the parts for another printer. So it is self-reproducing, as long as the raw materials are available. Someone who has one of the printers can make one for his neighbor, and he can pass it on, and so on. Open source hardware! Freely copyable. Anyone can build onto what's there. No massive startup costs. No centralized control. Anyone can be an innovator.
