Chatbots are all the rage in customer service, and though they’ve come a long way in recent years, like their human counterparts, they’re still not perfect. Sometimes chatbot technology gets it wrong. When it does, it usually makes for an amusing anecdote, but as we become more accustomed to interacting with technology, our tolerance for poor exchanges is shrinking.
This is especially true when it comes to customer service chatbots. Companies have had more than enough time to test the waters and find out what works. If, in 2019, they’re still offering a bot that doesn’t truly help their customers, it can actually damage their reputation. A recent Digitas survey found that 73% of customers would never use a chatbot again after one bad experience – so in today’s digital age, companies truly have one chance to get it right. But with more and more organizations deploying chatbots – Business Insider estimates that by 2020, 80% of companies will have a chatbot – the potential for things to go wrong is very real.
In advance of the webinar, let’s take a look at some entertaining examples of previous chatbot deployments that didn’t quite perform the way they were supposed to.
Let’s start with Siri, the chatbot we all know and love. The Apple invention, first launched in 2011, was designed to make our lives easier, but she doesn’t always deliver. Though she’s mainly used for entertainment purposes and is reliable enough when tasked with simple questions or basic tasks, she still struggles to derive meaning from what we’re asking if we’re not specific (sometimes, even when we are specific), as you can see from the examples below.
Conversational technology has advanced enough today that we’re able to interact with it the same way we’d interact with friends or family, so it’s frustrating when we can’t. Not to say it isn’t fun to spend time interacting with Siri, but with Amazon Alexa and Google Home gaining traction every day in the virtual personal assistant market, Siri will have to find a way to be a little more helpful if she wants to compete.
Want to experience the silly side of Siri? Ask her:
Another popular example of a chatbot not exactly living up to expectations is Poncho, the now defunct weather app. Poncho was packed with personality and made weather forecasting fun – until it didn’t. Users eventually found that Poncho’s answers and attention span left a lot to be desired and that perhaps they were better off getting their weekend weather from a more traditional source.
In an interview with Gizmodo, Sam Mandel, CEO of Poncho, said that less than 24 hours after the bot launched, they ran into unanticipated problems and that fixes were needed fast “because tolerance for a mediocre bot is much less than for a mediocre app.” He was right, and Poncho was pulled after a few years of frustrating forecasts.
Perhaps the most famous incident of a chatbot gone bad is Microsoft’s Tay (though humanity might be more to blame than poor chatbot technology). Microsoft launched Tay in an effort to learn about natural language and artificial intelligence. The intent was to let Tay interact with Twitter users so she could develop a unique personality based on those interactions. The more people chatted with Tay, the smarter she’d get – or so they thought. Of course, Twitter was very… Twitter about things, and within 24 hours Tay was pulled after becoming incredibly racist. You can find some of Tay’s more colorful exchanges online if you’re so inclined, but be warned: they’re not pretty.
Chatbot fails don’t have to be this grand to be considered problematic. Any time your chatbot can’t help a customer find the information they’re after, it’s failing to do its job. This can happen for a number of reasons, such as poor chatbot architecture or the wrong choice of development framework, but often it’s because organizations severely underestimate the complexity of enterprise-grade chatbots and what they need to succeed. Don’t let your deployment go off the rails – hear more tales of chatbots gone rogue and get advice to avoid becoming a statistic. Register today for our Bot Experimentation is Over webinar happening January 30th, featuring guest speaker Sarah Blocker.