Chatbot Champions – Training Bots Using Human Senses

PV Kannan 08/06/2018

As the Olympics kick into high gear, I’m looking forward to watching people who were once ordinary humans perform superhuman feats. This doesn’t happen by accident. These humans have worked incredibly hard to do things we didn’t think were physically possible a few years ago, and with each Olympics, we see new records broken. That’s what’s happening in AI for customer experience. By incorporating human senses, and with lots of training, super-human feats are possible.

Companies like Google, IBM and Microsoft are all looking at ways to incorporate AI into our everyday lives, to re-create the way humans interact with the world in order to create better experiences for consumers. With sensors becoming more ubiquitous, there are more opportunities than ever before for bots to interact with the world using vision, hearing, taste, smell and touch. The Internet of Things promises to open up all kinds of possibilities for companies to incorporate the five senses to make people’s lives better, and make money while they’re at it.

Vision

Google recently announced that it’s in the process of revamping its “Google Goggles” technology and re-introducing it as “Google Lens,” which it will integrate with Google Assistant. Basically, it compares images taken on your phone to those in a giant database. It’s easy to imagine several scenarios where image recognition could be connected to an AI-powered bot.

For example, you could take a picture of someone wearing a jacket you like and ask a bot “where can I buy this jacket?” The bot can recognize the image and say “I found you one at Neiman Marcus for $300,” or “I can’t find an exact match, but here are some jackets that are similar.” In another example, imagine setting up your set-top cable box, but it’s not working. You could use your phone to show the bot how you set it up, and it could respond with “I see what you did wrong there. Try switching the input on the white cable.” In yet another example, you could open a box from Ikea and lay the parts out across the floor, and the bot could look at the parts and instruct you to “take the panel on the left and use the two long screws to connect it to the base.”
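Under the hood, the jacket scenario is essentially a similarity search: the bot turns the photo into a feature vector, compares it against a product catalog, and answers differently depending on how close the best match is. Here is a minimal sketch of that decision logic in Python; the catalog entries, vectors, and thresholds are all illustrative assumptions, not any real product API.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def find_jacket(query, catalog, exact_threshold=0.95, similar_threshold=0.80):
    """Return ('exact' | 'similar' | 'none', matching item names) for a query image vector.

    `catalog` maps item names to pre-computed feature vectors. The thresholds
    decide whether the bot says "I found you one" or "here are some similar ones".
    """
    scored = sorted(
        ((cosine_similarity(query, vec), name) for name, vec in catalog.items()),
        reverse=True,
    )
    best_score, best_name = scored[0]
    if best_score >= exact_threshold:
        return "exact", [best_name]
    similar = [name for score, name in scored if score >= similar_threshold]
    return ("similar", similar) if similar else ("none", [])
```

A real system would produce the feature vectors with a trained image model; the point of the sketch is the three-way response (“exact match,” “similar items,” or “nothing found”) that the dialog in the example depends on.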

The same technology could be applied to auto accidents. Insurance companies could ask themselves “should I really send someone to the field?” A bot could process the pictures from both parties in an accident, compare them to police reports and determine liability and payout amounts.

Hearing

With recent advancements in natural language processing, some bots do this today. New technology is replacing outdated interactive voice response (IVR) systems with AI-powered speech applications, and today you can tell a bot that you’re trying to book a flight, rent a car and stay at a specific hotel, and it will understand you and make your reservations.

In the future, bots will be able to understand the inflections in someone’s voice and determine if they’re angry, happy or sad, or if they’re congested with a cold. It’s a matter of connecting the AI “brain” to the technology that can recognize changes in tone, biometric voiceprints, etc. When that happens, you can really have a conversation with a bot as if it’s your best friend, because it will really know you. It’s not that a machine will understand this simply based on your tone; rather, it will get to know you over time and can be trained to recognize and respond to your moods, if you want it to. “I can tell that you’re sad, Dave. Can I buy you some ice cream?” Which brings me to my next sense.
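The “it will get to know you over time” idea can be sketched as a simple per-user model: the bot records labeled voice-feature samples (say, pitch and energy) as it learns your moods, then classifies a new sample by whichever stored mood profile it sits closest to. The feature choices, mood labels, and nearest-profile rule below are illustrative assumptions, not how any particular product works.

```python
def classify_mood(sample, mood_profiles):
    """Pick the mood whose stored feature profile is closest (Euclidean) to the sample."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(mood_profiles, key=lambda mood: dist(sample, mood_profiles[mood]))

class MoodModel:
    """Learns a user's per-mood voice profile over time, as a running average
    of labeled feature samples (e.g. [pitch_hz, energy])."""
    def __init__(self):
        self.sums = {}    # mood -> component-wise sums of observed samples
        self.counts = {}  # mood -> number of samples seen

    def observe(self, mood, features):
        if mood not in self.sums:
            self.sums[mood] = [0.0] * len(features)
            self.counts[mood] = 0
        self.sums[mood] = [s + f for s, f in zip(self.sums[mood], features)]
        self.counts[mood] += 1

    def profiles(self):
        """Average feature vector per mood, ready to pass to classify_mood."""
        return {m: [s / self.counts[m] for s in self.sums[m]] for m in self.sums}
```

The more samples the model observes, the more the profiles reflect that one person, which is the “it will really know you” part; a production system would extract far richer acoustic features, but the learn-then-compare loop is the same.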

Taste

Gamers are already incorporating taste and smell into their games to make the experiences more immersive. There is also technology that can analyze food and approximate what it tastes like. Imagine taking a slice of deep-dish pizza from your favorite restaurant and putting it on a plate that can sense its flavor. The bot could offer an approximation of the ingredients and say “I’ve found a recipe that’s pretty close. Here it is.” Eventually, the bot could learn to recognize the foods you like and proactively offer up recipes. It could also suggest restaurants based on your tastes. Imagine how that will transform restaurant reviews and advertising.

Smell

Man’s best friend has a sense of smell more than 10,000 times as sensitive as a human’s. We now have technology that can replicate this. A bot could smell that you left the stove on and alert you to gas in the house. It could tell you when it’s time to change the kitty litter, or to throw out an overly ripe banana. Perhaps the most exciting applications could be in medicine, where a bot could analyze your scent, compare it to what you normally smell like, and detect the presence of illness. The AI could compare the new smell to a database of known illnesses and not only tell you that you should see a doctor, but which doctors you should see.
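The medical scenario implies two steps: first detect that today’s scent “fingerprint” has drifted from your personal baseline, then match the change against a database of known conditions to suggest a specialist. A minimal sketch of that logic follows; the sensor channels, condition names, signatures, and drift threshold are all hypothetical placeholders.

```python
def match_scent(sample, baseline, illness_db, alert_threshold=0.1):
    """Compare a scent-sensor reading to the user's baseline.

    Returns None if the reading is close to normal; otherwise returns
    (condition_name, specialist) for the known condition whose signature
    best explains the change. `illness_db` maps condition names to dicts
    with a "signature" delta vector and a "specialist" recommendation.
    """
    # Drift: mean absolute change per sensor channel relative to baseline.
    drift = sum(abs(s - b) for s, b in zip(sample, baseline)) / len(sample)
    if drift < alert_threshold:
        return None  # nothing unusual detected

    # Match the observed change against each known condition's signature.
    delta = [s - b for s, b in zip(sample, baseline)]
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    condition = min(illness_db, key=lambda c: dist(delta, illness_db[c]["signature"]))
    return condition, illness_db[condition]["specialist"]
```

The personal baseline is what makes this work: the bot isn’t comparing you to an average human, but to you on a normal day, which is exactly the “what you normally smell like” comparison described above.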

Touch

Touch technology is making its way into a variety of applications, from entertainment to military training. But one of the main reasons some people avoid e-commerce is that they want to touch what they buy. Again, it’s about a bot being familiar with your preferences and anticipating your needs. I can envision a database of stored tactile sensations, and an interface that could reproduce the feel of an object on the other end, perhaps through virtual reality gloves. The consumer could give the bot her likes and dislikes, and based on those the bot could make recommendations. “You’re going to like this blanket, Carol. It’s on sale at Target for $49. Feel it.” Gestural technology could be incorporated too. Imagine being able to walk into a store and “pick up” the objects.

Right now, all of these technologies are available only piecemeal; however, the technology exists to tie them together. When all of these senses do come together, it will be powerful. Perhaps the most exciting thing is what we can’t imagine yet.

The Sixth Sense

In the movie “The Sixth Sense,” a young boy, Cole Sear (played by Haley Joel Osment), is able to see and talk to the dead. Over the course of the film he works with child psychologist Malcolm Crowe (Bruce Willis), who tries to help him make sense of his gift and come to terms with it. In the real world, AI-powered chatbots are like children, and some of them possess remarkable gifts. The greatest minds in business and technology are trying to come to terms with those gifts and guide these children in a positive direction.

We are rapidly moving towards a future where chatbots will not only be able to incorporate every one of the above five senses, but will also be able to connect the dots in unexpected ways. They will be able to see patterns we humans can’t see and find new ways to make our experiences better. Their ability to predict what we want, when we need something and how we would like it will seem uncanny… perhaps almost psychic. I look forward to that day.


PV Kannan

Innovation Expert

PV Kannan is the Co-Founder and CEO of [24]7.ai. Since 2000, he has been leading the revolution to make customer service easy and enjoyable for consumers. In 1995, PV's first company, Business Evolution Inc., developed the first generation of email and chat solutions. The company was acquired by Kana in 1999 and PV became part of the management team. At [24]7.ai, PV was a pioneer in integrating customer service technology with business process operations to improve all aspects of the customer experience. PV has been at the forefront of customer experience, from creating contact center agent services, developing a big data predictive analytics platform, and creating omnichannel solutions for the web, mobile, chat, social, and speech IVR, to innovating mobile-centric applications. Over the years PV has been a thought leader in global customer service and has been featured in the books The World is Flat and That Used to Be Us by Thomas L. Friedman, India Inside by Nirmalya Kumar and Phanish Puranam, and Reinventing Management: Smarter Choices for Getting Work Done by Julian Birkinshaw. PV is on the Board of Directors for Achievers. He has over 20 patents (issued and pending). PV has degrees in accounting and finance from the Institute of Chartered Accountants and The Institute of Cost and Works Accountants of India.

   