Athenæum


04/06/2004: Technologica

Moore's Second Law
by Michael S. Malone, Wired

The biggest impediment to our technological future isn't extending Moore's law. Thanks to recent breakthroughs at the semiconductor manufacturing level, by 2010 top-tier processors should be stuffed with a billion transistors and running at more than 20 gigahertz. No, the biggest challenge to progress is much more ordinary: It's battery life. What good is a super-functional cell phone if it runs out of juice after 20 minutes? Or a laptop supercomputer that weighs 15 pounds and singes your thighs?

The problem can be stated in a single word: wireless. When Intel cofounder Gordon Moore made his famous proclamation in 1965, he may have anticipated the existence of untethered electronics. But in those days of core memory and wired logic, integrated circuits were seen as astounding breakthroughs in energy conservation. No one could have imagined that billions of chips would be in use, each packed with millions of transistors - and that so many of the chips would unplug themselves from the wall.

But that's what happened. So now, even after a decade of impressive innovation in battery and "green" chip technologies, we're beginning to lose the race to power our wireless electronics. Moore's first law is a two-edged sword - more transistors for the same price is great for computers, but it's hell on batteries: as processing power doubles, power consumption rises right along with it. An inability to run the next generation of chips at their full capability will play havoc with the semiconductor business, consumer electronics, telecommunications, the PC industry, and ultimately the world's economy. Moore's law could come to an ironic end - not because we can't build the next generation of chips, but because we can't run them.
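
A quick back-of-the-envelope look at chip physics shows why. To first order, the dynamic power of a CMOS processor follows the textbook relation

    P ≈ α · C · V² · f

where α is the switching activity, C is the total capacitance being switched (which grows with the transistor count), V is the supply voltage, and f is the clock frequency. Pile on more transistors and crank up the clock without lowering the voltage, and the power drain climbs multiplicatively - exactly the bind a battery finds itself in.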

So how about a second Moore's law, derived, remarkably enough, from the very paper in which Gordon Moore unveiled the first? As Applied Materials VP Tom Hayes notes, Moore's legendary paper didn't end with its most famous passage. Moore went on to outline successful strategies for companies that chose not to keep up with the torrid pace set by the law.

In Moore's own prophetic words, "it may prove more economical to build large systems out of smaller functions, which are separately packaged and interconnected. The availability of large functions, combined with functional design and construction, should allow the manufacturer of large systems to design and construct a considerable variety of equipment both rapidly and economically."

It may not have the pop of Moore's legendary logarithmic memory-chip chart, but implicit in those words are the roots of a new rule regarding the efficiency of electronic devices. Like Moore's first law, the second is actually a pact. The first, as explained by networking pioneer Robert Metcalfe, was a promise made by the chip industry that it would strive mightily, for as long as physically possible, to double net chip performance along the three axes - speed, miniaturization, and price - every 18 to 24 months. Increasing overall system efficiency requires much greater collusion.

What we need is a fourth axis of development - a systematic improvement of overall system efficiency, from the individual silicon gate, through motherboards and displays, all the way up to the Internet itself. How do we do it? Exhaustively.

At the recent Intel Developer Forum, CTO Patrick Gelsinger called for an "architectural paradigm shift" in semiconductor design to address the power problem.

But that alone won't do it. We need to improve system layouts and cooling techniques. We must create better interconnects, reduce sloppy software code, eschew processors that are faster than necessary, and, of course, build better batteries. As Moore's words suggest, inefficiency is a system problem that can be solved only with system-wide solutions. What's required is a pact made by the whole electronics industry to push forward at the limits of human ingenuity. By necessity, such an agreement would include semiconductor equipment manufacturers, chipmakers, telecommunications firms, battery companies, academia, and the federal government. (Speaking of which: If Washington is looking for a Darpa-like project to keep the US competitive in the world, it could do worse than fund core research in this effort. A battery project might not have the sizzle of missions to Mars, but it would have a much greater impact on our lives.) The Semiconductor Industry Association, the American Electronics Association, and all of the other electronics trade groups should band together to make this the great joint endeavor of this century. The final piece of the pact - frustrated consumers demanding and willing to pay for continuous improvement in electronic devices - probably won't be difficult to find.

That leaves one question: What should be the second law's equivalent to the first's famous "double every 18 to 24 months" formulation? We need something sufficiently Herculean without being impossible.

Let's try this. Moore's second law: Overall net efficiency of any electronic system will double every 24 months.
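
Compound that rate and the ambition becomes clear: doubling every 24 months works out to

    2^5 = 32x per decade, and roughly 1,000x over twenty years

in overall system efficiency.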

This will probably be easy at first, then within a few years seem beyond human imagination. But that was also true for the first Moore's law, and yet here we are, almost four decades after Moore's paper, still racing along at the pace he set for us and still making miracles. Who says we can't do the same this time around?