P.V. Kannan, Co-founder and CEO
As the Olympics kick into high gear, I’m looking forward to watching people who were once ordinary humans perform superhuman feats. This doesn’t happen by accident. These humans have worked incredibly hard to do things we didn’t think were physically possible a few years ago, and with each Olympics, we see new records broken. That’s what’s happening in AI for customer experience. By incorporating human senses, and with lots of training, superhuman feats are possible.
Companies like Google, IBM and Microsoft are all looking at ways to incorporate AI into our everyday lives, to re-create the way humans interact with the world in order to create better experiences for consumers. With sensors becoming more ubiquitous, there are more opportunities than ever before for bots to interact with the world using vision, hearing, taste, smell and touch. The Internet of Things promises to open up all kinds of possibilities for companies to incorporate the five senses to make people’s lives better, and make money while they’re at it.
Google recently announced that it’s in the process of revamping its “Google Goggles” technology and re-introducing it as “Google Lens,” which it will integrate with Google Assistant. Basically, it compares images taken on your phone to those in a giant database. It’s easy to imagine several scenarios where image recognition could be connected to an AI-powered bot.
For example, you could take a picture of someone wearing a jacket you like and ask a bot, “Where can I buy this jacket?” The bot could recognize the image and say “I found you one at Neiman Marcus for $300,” or “I can’t find an exact match, but here are some jackets that are similar.” In another example, imagine setting up your set-top cable box, but it’s not working. You could use your phone to show the bot how you set it up, and it could respond with “I see what you did wrong there. Try switching the input on the white cable.” In yet another example, you could open a box from IKEA, lay the parts out across the floor, and the bot could look at the parts and instruct you to “take the panel on the left and use the two long screws to connect it to the base.”
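To make the “where can I buy this jacket?” idea concrete, here’s a toy sketch of the matching step: a photo is reduced to a feature vector and compared against a catalog of product embeddings, answering with the closest item or “no exact match.” The catalog names, vectors and threshold below are all invented for illustration; a real system would use a trained vision model and a database of millions of products.

```python
import math

# Invented mini-catalog: product name -> illustrative feature vector.
CATALOG = {
    "leather jacket": [0.9, 0.1, 0.3],
    "denim jacket":   [0.7, 0.4, 0.2],
    "rain coat":      [0.1, 0.8, 0.6],
}

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(query, threshold=0.95):
    """Return the closest catalog item, or None if nothing is similar enough."""
    name, score = max(
        ((n, cosine_similarity(query, v)) for n, v in CATALOG.items()),
        key=lambda pair: pair[1],
    )
    return name if score >= threshold else None

print(best_match([0.88, 0.12, 0.31]))  # very close to the jacket -> "leather jacket"
print(best_match([0.5, 0.5, 0.5]))     # nothing similar enough -> None
```

The “I can’t find an exact match, but here are some similar jackets” reply falls out of the same machinery: instead of returning `None`, the bot would return the top few items below the threshold.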
The same technology could be applied to auto accidents. Insurance companies could ask themselves “should I really send someone to the field?” A bot could process the pictures from both parties in an accident, compare them to police reports and determine liability and payout amounts.
With recent advancements in natural language processing, some bots can do this today. New technology is replacing outdated interactive voice response (IVR) systems with AI-powered speech applications, and today you can tell a bot that you’re trying to book a flight, rent a car and stay in a specific hotel, and it will understand you and make your reservations.
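The core of that flight/car/hotel example is intent detection: pulling the user’s goals out of one free-form sentence. Here’s a deliberately crude keyword-based sketch of that step; real speech applications use trained language models, but the intent names and keyword lists below (all invented) show the shape of the problem.

```python
# Invented intent names and trigger keywords, for illustration only.
INTENT_KEYWORDS = {
    "book_flight": ["flight", "fly"],
    "rent_car":    ["rent a car", "rental car"],
    "book_hotel":  ["hotel", "room"],
}

def detect_intents(utterance):
    """Return the set of travel intents whose keywords appear in the text."""
    text = utterance.lower()
    return {
        intent
        for intent, keywords in INTENT_KEYWORDS.items()
        if any(kw in text for kw in keywords)
    }

print(detect_intents("I need to book a flight, rent a car and stay at a hotel"))
# all three intents are spotted in a single utterance
```

Once the intents are identified, each one can be handed off to its own booking workflow, which is why a single sentence can trigger three reservations.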
In the future, bots will be able to understand the inflections in someone’s voice and determine if they’re angry, happy or sad, or if they’re congested with a cold. It’s a matter of connecting the AI “brain” to the technology that can recognize changes in tone, biometric voiceprints and so on. When that happens, you can really have a conversation with a bot as if it’s your best friend, because it will really know you. It’s not that a machine will understand this simply based on your tone; rather, it will get to know you over time and can be trained to recognize and respond to your moods, if you want it to. “I can tell that you’re sad, Dave. Can I buy you some ice cream?” Which brings me to my next sense.
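Mechanically, “getting to know your moods over time” can be pictured as a classifier trained on one person’s voice. The toy sketch below classifies two made-up acoustic features (pitch variation and energy) against per-mood centroids that a bot might have learned for a single user; every number and mood label here is invented for illustration.

```python
# Invented per-user centroids: mood -> (pitch_variation, energy).
# In a real system these would be learned from the user's speech over time.
MOOD_CENTROIDS = {
    "happy": (0.8, 0.9),
    "sad":   (0.2, 0.2),
    "angry": (0.6, 1.0),
}

def classify_mood(pitch_variation, energy):
    """Return the mood whose learned centroid is closest to the observed features."""
    return min(
        MOOD_CENTROIDS,
        key=lambda m: (MOOD_CENTROIDS[m][0] - pitch_variation) ** 2
                    + (MOOD_CENTROIDS[m][1] - energy) ** 2,
    )

print(classify_mood(0.25, 0.15))  # flat, low-energy voice -> "sad"
```

Because the centroids are per-user, the same flat tone that means “sad” for one person could be perfectly normal for another, which is the sense in which the bot “knows you” rather than just your tone.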
Gamers are already incorporating taste and smell into their games in order to make the experiences more immersive. There is also technology that can analyze food and determine what it tastes like. Imagine being able to take a slice of deep-dish pizza from your favorite restaurant and put it on a plate that can sense what it tastes like. It could offer an approximation of its ingredients and say “I’ve found a recipe that’s pretty close. Here it is.” Eventually, the bot could learn to recognize the foods you like and proactively offer up recipes. It could also offer up restaurant choices based on what you like. Imagine how that will transform restaurant reviews and advertising.
Man’s best friend has a sense of smell that’s more than 10,000 times as sensitive as a human’s. We now have technology that can replicate this. A bot could smell that you left the stove on and alert you to gas in the house. It could tell you when it’s time to change the kitty litter, or to throw out an overly ripe banana. Perhaps the most exciting application could be in medicine, where a bot could analyze your scent and compare it to what you normally smell like to detect the presence of illness. The AI technology could compare the new smell to a database of known illnesses and not only tell you that you should see a doctor, but which doctors you should see.
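The medical-screening idea above has two steps: notice that your scent has drifted from your personal baseline, then look up the nearest match in a database of illness profiles. Here’s a toy sketch of both steps, where a “scent signature” is a vector of chemical sensor readings; the baseline, the illness profiles and the threshold are all invented for illustration.

```python
import math

# Invented personal baseline and illness scent profiles (sensor-reading vectors).
BASELINE = [0.2, 0.1, 0.05]
ILLNESS_PROFILES = {
    "respiratory infection": [0.6, 0.1, 0.05],
    "metabolic disorder":    [0.2, 0.5, 0.3],
}

def distance(a, b):
    """Euclidean distance between two sensor-reading vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def screen(reading, alert_threshold=0.2):
    """Flag a reading that drifts from baseline and name the nearest illness profile."""
    if distance(reading, BASELINE) < alert_threshold:
        return None  # close enough to your normal scent
    return min(ILLNESS_PROFILES, key=lambda n: distance(reading, ILLNESS_PROFILES[n]))

print(screen([0.21, 0.09, 0.06]))   # near baseline -> None
print(screen([0.55, 0.12, 0.04]))   # drifted -> nearest known profile
```

The “which doctor to see” step is then just metadata attached to whichever profile matched.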
Touch technology is making its way into a variety of applications, from entertainment to military training. But one of the main reasons some people avoid e-commerce is that they want to touch what they buy. Again, it’s about a bot being familiar with your preferences and anticipating your needs. I can envision a database of stored tactile sensations, and an interface that could reproduce the feel of an object on the other end, perhaps through virtual reality gloves. The consumer could provide the bot with her likes and dislikes, and based on those, the bot could make recommendations. “You’re going to like this blanket, Carol. It’s on sale at Target for $49. Feel it.” Gestural technology could be incorporated too. Imagine being able to walk into a virtual store and “pick up” the objects.
Right now, these technologies are available only piecemeal; however, the technology exists to tie them all together. When all of these senses do come together, that will be powerful. Perhaps the most exciting thing is what we can’t imagine yet.
In the movie “The Sixth Sense,” a young boy, Cole Sear (played by Haley Joel Osment), is able to see and talk to the dead. Over the course of the film he works with child psychologist Malcolm Crowe (Bruce Willis), who tries to help him make sense of his gift and come to terms with it. In the real world, AI-powered chatbots are like children, and some of them possess remarkable gifts. The greatest minds in business and technology are trying to understand those gifts and guide these children in a positive direction.
We are rapidly moving towards a future where chatbots will not only be able to incorporate every one of the above five senses, but will also be able to connect the dots in unexpected ways. They will be able to see patterns we humans can’t see and find new ways to make our experiences better. Their ability to predict what we want, when we need something and how we would like it will seem uncanny… perhaps almost psychic. I look forward to that day.