Mark Twain would undoubtedly be able to identify with Gordon Moore, or more specifically with the "law" that the co-founder of Fairchild Semiconductor and Intel devised more than 50 years ago. In Twain's case, reports of his death were "greatly exaggerated." In the case of Moore's Law, its so-called death is more irrelevant than overblown.
In this article, I want to underscore that the technological forces that got us to where we are today, a period of out-and-out transformation, won't drive us into the future in the same way. But they won't slow us down either; quite the opposite.
For those who need a brief recap, Moore’s Law addresses the growth in computer chip technology alongside an accompanying drop in price. In 1965, Moore observed that, between 1958 and 1965, the number of transistors on an integrated circuit had doubled every 18 to 24 months. At the same time, Moore noted, the price of those integrated circuits had dropped by half.
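To make that doubling arithmetic concrete, here is a minimal sketch of how quickly a fixed doubling cadence compounds. The starting transistor count and the two-year period are illustrative assumptions for the example, not Moore's actual 1965 figures:

```python
def transistors_after(years, start=2_000, doubling_period_years=2):
    """Project a transistor count, assuming it doubles every fixed period.

    The defaults (2,000 transistors, doubling every two years) are
    hypothetical values chosen only to illustrate the compounding.
    """
    return start * 2 ** (years // doubling_period_years)

# Ten doublings in twenty years: a roughly thousand-fold increase.
for years in (0, 10, 20):
    print(years, transistors_after(years))
```

The point of the sketch is that the growth is multiplicative, not additive: each doubling builds on all the doublings before it, which is why the curve looks modest at first and then becomes dramatic.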
The concept has held true for more than half a century. In recent years, though, predictions of the death of Moore's Law have become widespread. Some argue that transistors can only get so much smaller or more powerful; others say the growing expense of producing chips makes continually falling costs impossible.
That may lull some into complacency: the feeling that the pace of change we've seen in recent years is finally hitting the brakes. I don't think that's true.
The perceived death of Moore's Law obscures a more important conclusion. For one thing, researchers are pursuing approaches such as carbon nanotubes and silicon photonics that have the potential to increase chip speeds without shrinking transistors to quantum scale. That makes size beside the point.
For another, chip development is no longer focused on mere computational muscle. It’s more specialized, meaning that overall computing power will continue to improve as functions such as storage and network processing are spread out over a number of chips. That emphasizes the importance of the overall computing system rather than just one particular element.
The important message that the so-called death of Moore's Law obscures is this: computing power is not merely a matter of speed, but of what can be accomplished with new and powerful tools.
Consider the proliferation of semiautonomous cars. Are they capable of going that much faster than cars that were on the road 10 or 20 years ago? No, and that’s not the point. The focus is on greater safety and features that enhance the overall driving experience.
Additionally, the fact that Moore’s Law may be “dying” misses the essential issue that new and innovative technology has no intention of slowing down—it’s only going to become faster, and exponentially so. It won’t necessarily be Moore’s Law that drives that accelerated rate of change, but rather the exponential growth of the capabilities of the overall ecosystem, of which chips are merely one part.
Moreover, it’s essential to focus on what can be done rather than just speed and cost. For instance, today’s smartphones have greater computing power than the enormous mainframes of years past. Using tools such as Siri, you can access the world’s most powerful supercomputers with a few spoken words. That’s not an issue of speed but of the overall capability of an entire system.
All this points to tremendous opportunities for those who recognize the speed with which every element of our lives is being transformed.
Daniel Burrus is considered one of the world’s leading futurists on global trends and innovation. The New York Times has referred to him as one of the top three business gurus in the highest demand as a speaker. He is a strategic advisor to executives from Fortune 500 companies, helping them accelerate innovation and results by developing game-changing strategies based on his proven methodologies for capitalizing on technology innovations and their future impact. His client list includes companies such as Microsoft, GE, American Express, Google, Deloitte, Procter & Gamble, Honda, and IBM.

He is the author of seven books, including the New York Times and Wall Street Journal best-seller Flash Foresight and his latest book, The Anticipatory Organization. He is a featured writer with millions of monthly readers on the topics of innovation, change, and the future, and has appeared in Harvard Business Review, Wired, CNBC, and Huffington Post, to name a few. He has been the featured subject of several PBS television specials, has appeared on programs such as CNN, Fox Business, and Bloomberg, and is quoted in a variety of publications, including The Wall Street Journal, Financial Times, Fortune, and Forbes.

He has founded six businesses, four of which were national leaders in the United States in their first year. He is the CEO of Burrus Research, a research and consulting firm that monitors global advancements in technology-driven trends to help clients profit from the technological, social, and business forces that are converging to create enormous, untapped opportunities. In 1983 he became the first and only futurist to accurately identify the twenty technologies that would become the driving force of business and economic change for decades to come. He also linked exponential computing advances to economic value creation. His specialties are technology-driven trends, strategic innovation, strategic advising and planning, and business keynote presentations.