One of the biggest news subjects of the past few years has been artificial intelligence. We have read about how Google DeepMind's AlphaGo beat the world's best players at Go, widely considered the most complex board game humans have created; watched IBM's Watson beat human champions on Jeopardy!; and taken part in a wide-ranging discussion of how A.I. applications will replace many of today's human jobs in the years ahead.
Way back in 1983, I identified A.I. as one of 20 exponential technologies that would increasingly drive economic growth for decades to come. Early rule-based A.I. applications were used by financial institutions for loan applications, but once the exponential growth of processing power reached an A.I. tipping point, and we all started using the Internet and social media, A.I. had enough power and data (the fuel of A.I.) to enable smartphones, chatbots, autonomous vehicles and far more.
As I advise the leadership of many leading companies, governments and institutions around the world, I have found that we all have different definitions of, and different levels of understanding about, A.I., machine learning and other related topics. If we lack common definitions and a shared understanding of what we are talking about, we will likely create an increasing number of problems going forward. With that in mind, I will try to add some clarity to this complex subject.
Artificial intelligence applies to computing systems designed to perform tasks usually reserved for human intelligence using logic, if-then rules, decision trees and machine learning to recognize patterns from vast amounts of data, provide insights, predict outcomes and make complex decisions. A.I. can be applied to pattern recognition, object classification, language translation, data translation, logistical modeling and predictive modeling, to name a few. It’s important to understand that all A.I. relies on vast amounts of quality data and advanced analytics technology. The quality of the data used will determine the reliability of the A.I. output.
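To make the "if-then rules" concrete, here is a minimal sketch of the kind of rule-based loan screening the early financial A.I. applications mentioned above performed. The field names and thresholds are hypothetical, chosen only to illustrate the idea:

```python
# Illustrative sketch only: a toy rule-based loan screen.
# Field names and thresholds are hypothetical, not any real lender's criteria.

def screen_loan(application: dict) -> str:
    """Apply fixed if-then rules to a loan application."""
    if application["credit_score"] < 600:
        return "decline"
    if application["debt_to_income"] > 0.45:
        return "decline"
    if application["credit_score"] >= 720 and application["debt_to_income"] < 0.30:
        return "approve"
    return "refer to human underwriter"

print(screen_loan({"credit_score": 750, "debt_to_income": 0.25}))  # approve
print(screen_loan({"credit_score": 650, "debt_to_income": 0.40}))  # refer to human underwriter
```

Note that a system like this never improves on its own; the rules are fixed by its programmers. That limitation is exactly what machine learning, described next, removes.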
Machine learning is a subset of A.I. that utilizes advanced statistical techniques to enable computing systems to improve at tasks with experience over time. Chatbots like Amazon's Alexa, Apple's Siri, or any of the others from companies like Google and Microsoft all get better every year thanks to the constant use we give them and the machine learning that takes place in the background.
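The core idea of "improving with experience" can be shown with a deliberately tiny example: a nearest-neighbor classifier whose prediction for the same input gets better once it has seen more labeled data. The features and labels here are hypothetical, standing in for whatever signals a real system would use:

```python
# Illustrative sketch only: a toy nearest-neighbor classifier showing how
# more data ("experience") improves predictions. The features (message
# length, number of exclamation marks) and labels are hypothetical.

def nearest_neighbor(train, point):
    """Predict the label of `point` from its closest example in `train`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda ex: dist(ex[0], point))[1]

# With a single training example, every message looks like "spam".
train = [((5, 4), "spam")]
print(nearest_neighbor(train, (50, 0)))   # spam -- wrong: too little experience

# After one more labeled example, the same message is classified correctly.
train.append(((48, 0), "not spam"))
print(nearest_neighbor(train, (50, 0)))   # not spam
```

Real systems use far more sophisticated statistical techniques, but the principle is the same: nothing in the program was rewritten; only the data changed.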
Deep learning is a subset of machine learning that uses advanced algorithms to enable an A.I. system to train itself to perform tasks by exposing multi-layered neural networks to vast amounts of data, then using what has been learned to recognize new patterns contained in the data. Learning can be human-supervised learning, unsupervised learning and/or reinforcement learning, the approach Google DeepMind used to learn how to beat humans at the complex game Go. Reinforcement learning will drive some of the biggest breakthroughs.
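Reinforcement learning means learning by trial, error and reward rather than from labeled answers. As a minimal sketch of the idea, assuming a toy "corridor" world of my own invention (vastly simpler than Go), here is tabular Q-learning discovering the best action in every position purely from the rewards it receives:

```python
import random

# Illustrative sketch only: tabular Q-learning on a tiny corridor. The same
# trial-and-error principle, at enormously greater scale, underlies systems
# like the ones DeepMind used for Go. All numbers here are hypothetical.

N_STATES = 5            # positions 0..4; reaching position 4 earns the reward
ACTIONS = (-1, +1)      # step left or step right
ALPHA, GAMMA = 0.5, 0.9 # learning rate and discount factor

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
rng = random.Random(0)

for episode in range(200):
    s = 0
    while s != 4:
        a = rng.choice(ACTIONS)                    # explore by trial and error
        s2 = min(max(s + a, 0), N_STATES - 1)      # environment transition
        reward = 1.0 if s2 == 4 else 0.0
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        # Q-learning update: adjust the value estimate toward reward + discounted future value
        Q[(s, a)] += ALPHA * (reward + GAMMA * best_next - Q[(s, a)])
        s = s2

# The learned policy: from every position, move right toward the reward.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(4)}
print(policy)
```

No one told the program that "right" was correct; it inferred that from rewards alone, which is why reinforcement learning scales to problems where humans cannot supply the answers.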
Autonomous computing uses advanced A.I. tools such as deep learning to enable systems to be self-governing and capable of acting according to situational data without human command. A.I. autonomy includes perception, high-speed analytics, machine-to-machine communications and movement. For example, autonomous vehicles use all of these in real time to successfully pilot a vehicle without a human driver.
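The sense, decide, act loop described above can be reduced to a sketch. This is an illustration only, assuming a made-up braking model and hypothetical sensor readings, not how any real vehicle's software works:

```python
# Illustrative sketch only: the perception -> analytics -> action loop of an
# autonomous system, reduced to a toy vehicle reacting to an obstacle ahead.
# The braking model and all readings are hypothetical.

def decide(distance_m: float, speed_mps: float) -> str:
    """Map situational data to an action with no human command."""
    stopping_distance = speed_mps ** 2 / (2 * 4.0)   # assume 4 m/s^2 braking
    if distance_m < stopping_distance:
        return "brake"
    if distance_m < 2 * stopping_distance:
        return "coast"
    return "maintain speed"

# Simulated perception feed: (distance to obstacle in meters, speed in m/s)
for reading in [(50.0, 10.0), (20.0, 10.0), (10.0, 10.0)]:
    print(reading, "->", decide(*reading))
```

A real autonomous vehicle runs loops like this continuously and in parallel, fusing many sensors and machine-to-machine messages, which is why the high-speed analytics the paragraph mentions are essential.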
Augmented thinking is where this is heading. Over the next five years and beyond, A.I. will become increasingly embedded at the chip level into objects, processes, products and services, and humans will augment their personal problem-solving and decision-making abilities with the insights A.I. provides, getting to better answers faster.
A.I. advances represent a Hard Trend: they will happen and will continue to unfold in the years ahead, and the benefits of A.I. are too big to ignore.
Technology is neither good nor evil; what matters is how we as humans apply it. Since we can't stop the increasing power of A.I., I want us to direct its future, putting it to the best possible use for humans. Yes, A.I., like all technology, will take the place of many current jobs. But A.I. will also create many new jobs, if we are willing to learn new things. There is an old saying: "You can't teach an old dog new tricks." It's a good thing we aren't dogs!
Start off the new year by anticipating disruption and change: read my latest book, The Anticipatory Organization.
Daniel Burrus is considered one of the world's leading futurists on global trends and innovation. The New York Times has referred to him as one of the top three business gurus in the highest demand as a speaker. He is a strategic advisor to executives from Fortune 500 companies, helping them to accelerate innovation and results by developing game-changing strategies based on his proven methodologies for capitalizing on technology innovations and their future impact. His client list includes companies such as Microsoft, GE, American Express, Google, Deloitte, Procter & Gamble, Honda, and IBM. He is the author of seven books, including The New York Times and Wall Street Journal best-seller Flash Foresight and his latest book, The Anticipatory Organization. He is a featured writer with millions of monthly readers on the topics of innovation, change and the future, and has appeared in Harvard Business Review, Wired, CNBC, and Huffington Post, to name a few. He has been the featured subject of several PBS television specials, has appeared on programs such as CNN, Fox Business, and Bloomberg, and is quoted in a variety of publications, including The Wall Street Journal, Financial Times, Fortune, and Forbes. He has founded six businesses, four of which were national leaders in the United States in their first year. He is the CEO of Burrus Research, a research and consulting firm that monitors global advancements in technology-driven trends to help clients profit from the technological, social and business forces that are converging to create enormous, untapped opportunities. In 1983 he became the first and only futurist to accurately identify the twenty technologies that would become the driving force of business and economic change for decades to come. He also linked exponential computing advances to economic value creation. His specialties are technology-driven trends, strategic innovation, strategic advising and planning, and business keynote presentations.