‘Hi Rosie, I can see you’re thinking of leaving us,’ a customer services adviser called Ann wrote via the live chat service on my broadband provider’s website.
‘Yes, I’ve been trying to speak to the call centre but I’ve been on hold for 42 minutes.’
‘I understand that,’ came the message back. But did she?
The chances are high that ‘Ann’ was not a real person and was, instead, a chatbot — an automated answering service which uses artificial intelligence (AI) to read written messages, calculate what a customer wants, then answer their questions.
Automated chat services are usually accessed via a pop-up on the company’s website or through messaging on social media channels such as Facebook and Twitter.
Live chat channels can be manned by real people, but chatbots are increasingly common.
They could be involved in up to 85 per cent of all our customer service interactions with companies, according to U.S. research firm Gartner.
Their use has become far more widespread as companies struggle to keep up with a surge in enquiries from customers during the pandemic, at a time when many have fewer staff manning their call centres due to social distancing rules.
Use of live chat soared during each coronavirus lockdown, according to call answering service Moneypenny, which runs phones and live chat services for thousands of UK businesses.
Some firms have even removed their email addresses from their websites or stopped answering phones, meaning chat is now the only way to get in touch.
And that’s likely to remain the case beyond the pandemic, as chat services are much cheaper for businesses to run than call centres.
When it works well, live chat is quicker than email and easier than making a phone call.
Research shows younger customers prefer to use live chat, and one U.S. survey found that more than half of all consumers would rather contact customer services via a chatbot if it saves them ten minutes.
But another U.S. survey found 86 per cent of customers still prefer to interact with a human – and only 30 per cent believe chatbots make solving problems easier.
And nearly half of 5,000 consumers in Europe, the U.S. and Australia surveyed in 2018 said automated chatbots were ‘annoying’, while 80 per cent said they were ‘too impersonal’. One in five said they would prefer companies to stop using them altogether.
Something the studies do agree on is that customers are only happy using live chat services if they actually resolve their problems — unfortunately, that is often not the case.
Going round in circles
‘Chatbots are great at providing quick, simple, factual information,’ says Nigel Cannings, co-founder of technology company Intelligent Voice.
‘But they are not designed to solve complex problems and complaints that require a little “out of the box” thinking by a human.’
And customers who deal with a chatbot that cannot understand their problems can end up feeling like they’re banging their head against a wall.
Take parcel courier Hermes, which has come under fire from frustrated customers on social media who claim its chatbot takes them round in a ‘loop’ of questions without answering their queries and only provides information that is already available on its website.
Twitter users describe the chatbot as ‘like [talking to] a brick wall’ and ‘poorly designed’, saying it makes it ‘impossible’ to speak to a real person. It appears unable to answer the question Hermes customers ask most often: ‘Where is my parcel and why is it delayed?’
Money Mail asked Hermes to comment on the allegations but the company did not reply.
Consumers can get even more frustrated when chatbots fail to answer their questions and then prevent them from contacting a real person who might be able to help. Sainsbury’s has been accused of having a chatbot that rarely connects shoppers to a human.
Sainsbury’s says its chatbot is designed to ‘answer frequently asked questions and direct customers to a member of the team where this is needed’.
Ikea’s chatbot has regularly come under fire, with customers complaining online it ‘doesn’t know anything’ and is ‘the antithesis of artificial intelligence’.
Ikea says its chatbot is ‘designed to provide simple answers to customers’ queries’ and that there are other ways to get in touch if it cannot help, including calling or live chat with a person – though it admits these channels do become very busy at ‘peak periods’, despite the company taking on more customer service staff since the start of the pandemic.
‘Most customers are contacting customer services because they need to actually speak to someone who understands their specific problem – so being fobbed off by an automated chatbot that can only handle simple queries is no use at all,’ says Martyn James, of consumer complaints service Resolver.
Resolver has handled around half a million complaints since lockdown began and, remarkably, more than half of them – 260,000 – involved customers being unable to call, email or otherwise contact a business, including thousands that specifically cited chatbots as part of the problem.
‘A lot of customers give up after being sent round in circles by useless chatbots, which, of course, then means the company doesn’t have to deal with their complaint.
‘Chatbots should only be used to augment customer service — they should never be a blocker designed to stop people taking things further,’ he adds.
No human touch?
Frustratingly, it can be hard to know whether the messages you receive through live chat come from a real person or a bot.
Businesses use all sorts of tricks to make customer service chatbots seem real, such as giving them a human name and using friendly language, exclamation marks and emojis – which can be all the more infuriating when they fail to answer your questions.
In my chat with ‘Ann’, for example, she responded to my answers to security questions with ‘Cool!’, ‘Brill!’ and ‘Thank you very much [with a smiley face emoji]’.
Some are programmed to wait a certain number of seconds before sending a message so it looks like a human is typing.
Others use artificial intelligence to send more personalised messages to customers who have contacted the service before, such as: ‘Good to speak with you again, Rosie, how can I help you this time?’
‘Small details like these often put customers at ease and make conversations feel slightly less transactional,’ wrote Bruce Hogan, chief executive of technology research firm SoftwarePundit, in a report.
But many customers hate it if chatbots pretend to be a real person, says Sarah Cantillon, partner at digital media agency Movement.
‘There is nothing more grating than when, hoping to speak to a real person, you are confronted with a bot wasting your time with false friendly language – and, worst of all, pretending to be a human,’ she adds.
She believes chatbots work best if it’s clear they are automated and don’t try to mislead customers.
Even when live chat is manned by real people, they often still use chatbot technology to save typing time. Chatbots can assist customer service advisers with basic checks, including security details, or by sending holding messages such as ‘Please wait while I check for you’.
All this makes it quicker for advisers to deal with each customer, so they can handle multiple queries at once – up to ten times as many as they could by email or phone, according to Moneypenny.
So even if you are speaking to a real person on live chat, it’s likely they are assisted by a chatbot. Some responses will be automated and you’re likely to be competing for their attention with a handful of other customers at the same time.
Too few staff trying to handle a large volume of enquiries is likely to be why a Moneypenny survey found nine in ten customers have had delays starting a live chat conversation since lockdown began.
Other common complaints include the conversation cutting out or restarting if you don’t respond quickly enough – sometimes within just a few minutes. This can be particularly difficult for disabled or elderly people who need longer to type or think about answers.
So when do chatbots work?
‘If you are looking to find out simple information like what your balance is or how you pay a bill, chatbots cut out the need to interact with a human,’ says Nigel Cannings.
‘They’re also good at gathering information that can be later assessed by a human.’
Ms Cantillon from Movement agrees bots can help save time for both the customer and the adviser when it comes to ‘tasks that require form filling – such as booking plane or train tickets and hotels’.
She cites the example of ChatBotlr from the Marriott Hotel Group, which answers simple questions from guests via Facebook Messenger, text or the messaging service Slack.
Another positive example is shoe store Foot Locker, whose live chat tells users they are talking to a bot, asks three simple questions and if it can’t solve the problem connects customers to a real person.
Having a written record of customer service conversations can also help consumers – you can copy and paste any chat conversation into a document on your computer to save for future reference.
This can prove invaluable if you later need evidence of what was discussed, for example, when making a complaint.
Chatbots are also getting more sophisticated – and more human. Mr Cannings says the newest technology can pick up on customers’ emotions – and detect fraudsters. And if you can’t stand talking to a chatbot, there may be a way around it.
Typing ‘Can I speak to a human?’ reportedly connects you to a real person in Amazon’s customer services chat, for example. Some chatbots may connect you to a person if they detect you are writing angry messages.
But all of this only helps us as consumers if chatbots are used to supplement, rather than replace, human customer service.
‘A good chatbot should identify if it can help in no more than three questions – and if it can’t, it needs to connect you to someone who can,’ says Mr James.
‘Today’s chatbots might be able to answer simple questions but they can’t address the emotional complexity that comes with a complaint.’