Opportunity's Unexpected Turns


Two stories from history . . . and a third story that we are living through at this very moment.

First story: The transcontinental railroad was one of the greatest – perhaps the greatest – engineering achievements of the 19th century. But it is surprising to note that when the great Union Pacific/Central Pacific line finally linked the two coasts of the United States in May 1869, the achievement was most celebrated for what it meant to trade and travel outside the country, rather than within.

Here’s a description of that thinking from Stephen Ambrose’s Nothing Like it in the World:

“Throughout the building of the [rail]road, its proponents had predicted that the China-Japan-India trade from the East Coast of America and with Europe would pass through San Francisco and then over the transcontinental railroad to points east, or to be shipped to Europe via New York. The first through-car on the transcontinental line carried a shipment of India tea, forerunner of the future.

“But trade with Asia didn’t happen, certainly not to the extent that people hoped. . .”

Instead, the real impact of the transcontinental railroad was upon trade and migration within the U.S.; indeed, it opened up continental America to everyday citizens, not just intrepid pioneers. Meanwhile, the much-promised international trade instead found its way through another new engineering marvel: the Suez Canal.

Second story: The microprocessor, perhaps the single most ubiquitous and influential invention of the 20th century, was invented by a team of four scientists – Federico Faggin, Ted Hoff, Stan Mazor and Masatoshi Shima – at the end of that annus mirabilis, 1969. The microprocessor, it almost goes without saying, went on to sell by the billions, and transform every corner of modern life, from cars and planes to personal computers and cell phones to medical equipment to games to the World Wide Web. In the process, as the leading supplier of microprocessors, Intel quickly became a hugely wealthy company – and was justly described as “the world’s most important company.”

Less well known is the fact that, at first, Intel wasn’t even sure it wanted this so-called ‘computer on a chip.’ Intel had been founded just a couple of years before as a memory chip company and had done extremely well in that business. Indeed, the microprocessor project had been merely a way to help one of Intel’s calculator clients – Japan’s Busicom – build a competitive product that would, it was hoped, help sell more Intel memory chips in the future. In fact, when Busicom complained about the project taking too long, Intel agreed to a lower fee, but demanded ownership of the technology, believing that it could be used to sell Intel memory chips to other companies.

Now, with the Intel 4004 and 8008 microprocessors completed, Intel found itself not only with a fast-growing existing business, but also with ownership of the most celebrated new chip technology on the planet. The Intel leadership troika of Bob Noyce, Gordon Moore and Andy Grove were smart enough businesspeople to know that a young company had neither the workforce nor the resources to tackle two exploding businesses at the same time. So the growing consensus inside the company was to stick to what Intel knew – memory – and license away the microprocessor.

But within Intel, there were two true believers in the future of the microprocessor. One was marketing director Ed Gelbach, who began vigorously promoting Intel’s microprocessors in speeches and ads even as the rest of the company was having second thoughts. The other was the company PR person, Regis McKenna, who would go on to become Silicon Valley’s most famous marketer. Regis, for his part, was so convinced of the importance of the microprocessor that, on his own, he prepared a series of notebooks showing potential applications for the new device.

It was a pretty bizarre list, ranging, in Regis’ words, “from automatic toilet flushers to cow milking machines, airport marijuana sniffers, electronic games and blood analyzers.” But that list, combined with the orders beginning to trickle in from Gelbach’s heroic efforts, finally convinced Intel’s senior management to stick with the microprocessor.

The rest is history. It is estimated that there are more than 25 billion microprocessors in use around the world right now, with the number of transistors they contain equal to the number of raindrops that fall on North America in a year.

But here’s the punchline to this story: None of the applications that Regis described – the ones that convinced the Intel execs to stay with the microprocessor – ever panned out . . . well, at least not for a decade or more. In other words, the Invention of the Century was sold on falsehoods.

And that brings us up to today, and the news – of the disputed election in Iran – that is breaking as I write this column.

Even five years ago, it might have been possible for the Mullahs of Iran – facing rioting and protests in the streets and the potential of revolution or civil war – to institute a news blackout both within the country and to the outside world. All it would have taken would have been a shutdown of the phone system (which would have stopped the Internet as well) and the tossing out of AP, Reuters and CNN reporters. Iran would have looked like Burma/Myanmar or North Korea, and all we would have known about events there surrounding the election would have come from refugee reports and, a week from now, perhaps a few jumpy, grainy amateur videos.

Instead, despite the government’s crackdown on news coverage, we have been deluged with blogs, cellphone videos, Facebook entries, and Twitter tweets covering every aspect of the protest. And every attempt by the Iranian government to shut down these sources only seems to make more of them pop up. The Mullahs are faced with an unsolvable dilemma: in order to make Iran a regional and nuclear power, they have to put in place the same sophisticated digital infrastructure that will keep Iran from ever again being a closed society. They are going to lose this fight, if not now, then soon, because their old autocratic apparatus for running the country has now proven to be incompatible with life in the 21st century.

But there is another story in this too, one that harkens back to those other historical examples I just gave. It is that great technologies are typically sold, incorrectly, on applications we know – but succeed on applications we can’t yet guess. Trade with Asia was what sold the transcontinental railroad; populating the western United States was what made it a historic milestone. Automatic toilet flushing sold the microprocessor to Intel; the personal computer, cellphone, iPod and video game player are what made Intel and its competitors rich and famous.

The same is true, I think, for the technologies of Web 2.0. MySpace and Facebook saw their initial success as platforms for people to connect socially online. Twitter began as a side project, a novelty application for smartphone users to share their day with friends.

But now, under the press of history, these technologies are beginning to morph before our eyes. As you read the Facebook postings of Iranian protesters, it suddenly becomes apparent that social networks are becoming their own pseudo-nation-states, complete with voluntary citizens, laws (often in conflict with their real-life counterparts) and degrees of sovereignty.

Meanwhile, Twitter (and video counterparts like YouTube and Qik) is becoming the new wire service, replacing newspapers and television as “the first draft of history.” I think we are going to see the same thing happen with other Web 2.0 companies, such as LinkedIn, which will likely become the vast global job pool for protean corporations.

Like the people of the late 1860s and the early 1970s, we have seen the future – but it has only been through a glass, darkly. Now, that glass is becoming clear . . .

[Michael S. Malone’s new book The Future Arrived Yesterday is available in bookstores and from Amazon.com.]