Moore's Law Lives: The Future Is Still Alive
In a week of big stories, the biggest didn’t take place in Pakistan or Washington, D.C., but in Santa Clara, California. Unlike Osama bin Laden, we managed to dodge a bullet. If we hadn’t, it wouldn’t have ended modern civilization, but it might have sent it off on a much different, and much less happy, path.
You probably didn’t read this story, so here it is, put simply: Intel Corp. announced Wednesday that Moore’s Law isn’t going to end anytime soon. Because of that, your life and the lives of your children and grandchildren are going to be a whole lot better than they might have been.
Today, almost a half-century after it was first elucidated by legendary Fairchild and Intel co-founder Dr. Gordon Moore in an article for a trade magazine, it is increasingly apparent that Moore’s Law is the defining measure of the modern world. Every predictive tool for understanding life in the developed world since WWII -- demographics, productivity tables, literacy rates, econometrics, the cycles of history, Marxist analysis, and on and on -- has failed to predict the trajectory of society over the decades ... except Moore’s Law.
Alone, this oddly narrow and technical dictum -- that the power, miniaturization, and density of integrated circuit chips will, together, double every couple of years -- has done a better job than any other in determining the rhythm of daily life, the ups and downs of the economy, the pace of innovation, and the creation of new companies, fads, and lifestyles. It has been said many times that, beneath everything, Moore’s Law is ticking away as the metronome, the heartbeat, of the modern world.
Why this should be so is somewhat complicated. But a simple explanation is that Moore’s Law isn’t strictly a scientific law -- like, say, Newton’s Laws of Motion -- but rather a brilliant observation of an implied contract between the semiconductor industry and the society it serves. What Gordon Moore observed back in the mid-1960s was that each generation of memory chips (in those days they could store a few thousand bits, compared to a few billion today), which appeared about every 18 months, had twice the storage capacity of the generation before. Plotting the exponential curve of this development on logarithmic paper, Moore was pleased to see a straight line ... suggesting that this developmental path might continue into the foreseeable future.
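The arithmetic behind that straight line can be sketched in a few lines of Python. (The starting capacity and the steady 18-month cadence here are illustrative assumptions for the sake of the sketch, not historical Intel or Fairchild data.)

```python
import math

def capacity(months_elapsed, initial_bits=1024, doubling_months=18):
    """Bits of storage after a given time, assuming steady doubling
    every `doubling_months` months (an idealized Moore's Law)."""
    return initial_bits * 2 ** (months_elapsed / doubling_months)

# Each generation doubles the one before, so the raw numbers explode...
for gen in range(5):
    months = gen * 18
    bits = capacity(months)
    # ...but log2(capacity) grows linearly with time -- the straight
    # line Moore saw when he plotted the data on logarithmic paper.
    print(f"{months:3d} months: {bits:8,.0f} bits  (log2 = {math.log2(bits):.1f})")
```

Run it and the rightmost column climbs by exactly 1.0 per generation: exponential growth rendered as a straight line, which is what suggested to Moore that the trend might simply continue.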
This discovery has been rightly celebrated for years. But often forgotten is that there was no technological determinism behind the Law. Computer chips didn’t make themselves. And so, if the semiconductor industry had decided the next day to slow production or cut its R&D budgets, Moore’s Law would have died within weeks. Instead, semiconductor companies around the world, big and small, and not least because of their respect for Gordon Moore, set out to uphold the Law -- and they have done so ever since, despite seemingly impossible technical and scientific obstacles. Gordon Moore not only discovered Moore’s Law, he made it real. As his successor at Intel, Paul Otellini, once told me, “I’m not going to be the guy whose legacy is that Moore’s Law died on his watch.” And that’s true for every worker in the semiconductor industry. They are our equivalent of medieval craftsmen, devoting their entire careers to building a cathedral whose completion they will never see.
And so, instead of fading away like yet one more corporate five-year plan, Moore’s Law has defined our age, and done so more than any of the more celebrated trend-setters, from the Woodstock generation to NASA to the personal computer. Moore’s Law today isn’t just microprocessors and memory, but the Internet, cellular telephony, bioengineering, medicine, education, and play. If, in the years ahead, we reach that Singularity of man and computer that Ray Kurzweil predicts for us, that will be Moore’s Law too. But most of all, the virtuous cycle of constant innovation and advancement, of hot new companies that regularly refresh our economy, and of a world characterized by continuous change -- in other words, the world that was created for the first time in history only about sixty years ago, and outside of which we can now hardly imagine living -- is the result of Moore’s Law.
When Gordon Moore first enunciated his Law, only a handful of industries -- the first minicomputers, a couple of scientific instruments, a desktop calculator or two -- actually exhibited its exponential rate of change. Today, every segment of society either embraces Moore’s Law or is racing to get there. That’s because they know that if only they can get aboard that rocket -- that is, if they can add a digital component to their business -- they too can accelerate away from the competition. That’s why none of the inventions we Baby Boomers expected as kids to enjoy as adults -- atomic cars! personal helicopters! ray guns! -- have come true; and also why we have even more powerful tools and toys instead. Whatever can be made digital, if not in whole then at least in part -- marketing, communications, entertainment, genetic engineering, robotics, warfare, manufacturing, service, finance, sports -- will go digital, because going digital means jumping onto Moore’s Law. Miss that train and, as a business, an institution, or a cultural phenomenon, you die.
So, what made this week’s announcement by Intel so important? It is that almost from the moment the implications of Moore’s Law became understood, there has been a gnawing fear among technologists and those who understand technology that Moore’s Law will someday end -- having run up against the limits of, if not human ingenuity, then physics itself. Already compromises have been made -- multiple processor cores instead of a single one on a chip, exotic new materials to stop leaking electrons -- but as the channels get narrower and bumpier with molecules and the walls thinner and more permeable to atomic effects, the end seems to draw closer and closer. Five years away? Ten? And then what? What will it be like to live in a world without Moore’s Law ... when every human institution now depends upon it?
But the great lesson of Moore’s Law is not just that we can find a way to continuously better our lives -- it is that human ingenuity knows no bounds, nor can it ever really be stopped. You probably haven’t noticed over the last decade the occasional brief scientific article about some lab at a university, or at IBM, Intel, or HP, coming up with a new way to produce a transistor or electronic gate out of just two or three atoms. Those stories are about saving Moore’s Law for yet another generation. But that’s the next chapter. Right here and now, the folks at Intel were almost giddy in announcing that what had been one of those little stories a decade ago -- tri-gate transistors -- would now be the technology in all new Intel chips.
I’m not going to go into technical detail about how tri-gate transistors work, but suffice to say that since the late 1950s, when Jean Hoerni, along with the other founders of the semiconductor industry at Fairchild (including Gordon Moore), developed the "planar" process, all integrated circuits have been structurally flat: a series of layers of semiconductors, insulators, and wiring "printed" on an equally flat sheet of silicon. For the first time, Intel’s new tri-gate technology leaves the plane of the chip and enters the third dimension. It does so by raising three "fins" of silicon up from beneath the surface, so that they stick up into the top, transistor, layer. The effect is kind of like draping a mattress over a fence -- and then repeating that over a billion fences, all just inches apart. The result is a much greater density of gates, lower power consumption, faster switching, and fewer quantum side-effects. Intel claims that more than 6 million of these 22-nanometer Tri-Gate transistors can fit in the period at the end of this sentence.
The first processors featuring Tri-Gate transistors will likely appear later this year. And you can be sure that competitors, with similar designs, will appear soon after. But that’s their battle.
What counts for the rest of us is that Moore’s Law survives. The future will arrive as quickly as ever ...