Human interactions are complex, which is why most consumers still prefer a human agent for complicated issues. Through artificial intelligence, it is now possible to handle those issues the way a human would. Today we announced that [24]7 AIVA is now powered with emotional intelligence, making it the first virtual agent able to detect human emotion during interactions. The chatbot can acknowledge a customer's frustration and respond empathetically.
Using advanced AI techniques to detect emotion, [24]7 AIVA can not only understand how customers feel but also take the logical actions your best human agents would. Customers feel heard, while live agents are freed up. This new capability is part of the [24]7.ai Summer 2018 Release and furthers the company's vision of humans and chatbots working side by side: the live agent can now see the customer's emotional state and handle the interaction accordingly.
Awarded the highest rating in the 2018 edition of Opus Research's Decision Makers' Guide to Enterprise Intelligent Assistants report, [24]7 AIVA enables a new level of self-service: the bot interprets the complex layers of human speech and the different possible meanings of phrases to detect positive or negative emotion. [24]7 AIVA can also gauge the intensity of those emotions and react accordingly, handing the conversation over to a human agent as needed. The result is a better overall experience and higher customer satisfaction scores.
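To make the detect-intensity-then-escalate flow concrete, here is a toy lexicon-based sketch in Python. The word lists, scores, and threshold are invented purely for illustration; they are in no way the product's actual (proprietary) emotion-detection technique:

```python
# Toy lexicon-based emotion scoring -- illustrative only, not the
# vendor's actual model. Words, weights, and the escalation
# threshold are all assumptions made up for this example.

NEGATIVE = {"terrible": 2, "frustrated": 2, "angry": 3, "slow": 1, "broken": 2}
POSITIVE = {"great": 2, "thanks": 1, "love": 3, "helpful": 2}

def score_emotion(utterance: str) -> int:
    """Return a signed intensity: negative = frustration, positive = satisfaction."""
    score = 0
    for word in utterance.lower().split():
        word = word.strip(".,!?")
        score -= NEGATIVE.get(word, 0)
        score += POSITIVE.get(word, 0)
    return score

def route(utterance: str, escalation_threshold: int = -3) -> str:
    """Hand off to a human agent when negative intensity crosses a threshold."""
    score = score_emotion(utterance)
    if score <= escalation_threshold:
        return "escalate_to_human"
    if score < 0:
        return "respond_empathetically"
    return "continue_self_service"
```

For example, `route("I am angry, this is broken!")` escalates, while a mildly negative message only triggers an empathetic response. A production system would of course use learned models over the full conversation, not a static word list.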
As human-chatbot interactions grow in volume and complexity, capabilities like emotion detection will go a long way toward a truly interactive customer engagement solution and a memorable experience.
The Summer 2018 Release also includes the following enhancements:
Easier Conversational Model Design – In an industry first, [24]7 AIVA features an improved modeling workbench that exposes the AI behind the solution to end users, giving them self-serve options. The new graphical user interface lets business analysts and other non-data scientists easily build and test conversation models. The workbench also incorporates machine learning to proactively suggest improvements to intent models, producing a better understanding of what the consumer is trying to do. These intent models can be deployed quickly to anticipate customer needs and improve the customer experience and Net Promoter Score (NPS).
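The idea of an intent model can be illustrated with a minimal bag-of-words classifier of the sort an analyst might assemble from example phrases. The training utterances and matching rule below are assumptions for demonstration, not the workbench's actual machine learning:

```python
# Minimal bag-of-words intent matching -- an illustrative sketch,
# not the actual modeling technology. Intents and phrases are invented.
from collections import Counter

TRAINING = {
    "check_balance": ["what is my balance", "how much money do I have"],
    "pay_bill": ["pay my bill", "make a payment on my account"],
    "reset_password": ["I forgot my password", "reset my login password"],
}

def _bag(text: str) -> Counter:
    return Counter(text.lower().split())

# Build one word-frequency profile per intent from its example phrases.
PROFILES = {
    intent: sum((_bag(u) for u in utterances), Counter())
    for intent, utterances in TRAINING.items()
}

def classify(utterance: str) -> str:
    """Pick the intent whose profile shares the most words with the utterance."""
    words = _bag(utterance)
    def overlap(intent: str) -> int:
        return sum(min(words[w], PROFILES[intent][w]) for w in words)
    return max(PROFILES, key=overlap)
```

A real workbench would pair something like this with suggested improvements: surfacing frequent customer phrases that match no intent well, so the analyst can add them as new training examples.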
New Visualization Capabilities – [24]7 Journey Analytics now features new visualization capabilities that let analysts explore dominant customer behavioral paths interactively, making it easier to connect the dots between data and insights. The Single Customer Journey Viewer provides a consolidated view of customer engagement, including interactions across channels and transactions across systems, for improved customer service and complaint resolution. And with the unstructured-data player, analysts can get to the root cause more quickly by listening to calls and reading chat transcripts from within the workspace.
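At its simplest, finding "dominant behavioral paths" means ranking the most frequent sequences of touchpoints customers take. The journey data below is hypothetical, invented only to show the shape of the computation; it is not the Journey Analytics implementation:

```python
# Counting dominant journey paths -- an illustrative sketch with made-up
# data, not the actual analytics engine.
from collections import Counter

# Each tuple is one customer's ordered sequence of touchpoints (hypothetical).
journeys = [
    ("web", "chat", "resolved"),
    ("web", "chat", "phone", "resolved"),
    ("web", "chat", "resolved"),
    ("app", "phone", "resolved"),
]

def dominant_paths(journeys: list, top: int = 3) -> list:
    """Rank full journey paths by how many customers followed them."""
    return Counter(journeys).most_common(top)
```

Here `dominant_paths(journeys)` puts the web-to-chat-to-resolution path first, since two of the four customers followed it; a visualization layer would then render such rankings as an interactive path diagram.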