For over a year, I talked to a lot of people who were very sceptical about chatbots. They told me about chatbots that didn’t understand them and others that just sent them around in circles, leaving them frustrated and no nearer to a solution than when they started.
Few said they would choose a chatbot over other support channels.
So, given widespread scepticism, why did we build one? Demand for support from HMRC, while always considerable, typically ebbs and flows with the movement of the financial year. Our digital assistant was designed to complement telephone and webchat support, helping customers self-serve by offering quick and queue-free answers to simple questions and providing much-wanted 24/7 support for an increasingly online audience.
To date, it has handled more than 3 million customer queries but is still very much in its infancy in terms of capability and a work in progress. Here are some of the content design and user research lessons we have learnt along the way.
1. Know your users
Before you add new content to a chatbot, make sure you have plenty of time for research.
Always ask yourself whether the chatbot is the right solution:
- for this user?
- in this situation?
Is it the quickest or simplest way to help the user? Or could the problem you’re looking to solve be better fixed by a tweak to the service itself, its guidance, or communications?
You may also want to reconsider if, for example, you know you have a high number of users who:
- need help in very complex or very emotional situations where asking them to interact with a chatbot may be ineffective or inappropriate
- don’t speak English as a first language - if there is a lot of variation in how queries are expressed, you’d need to consider whether the chatbot would be able to identify the correct answer
- lack confidence or access online
We have chosen not to use a digital assistant on our HMRC extra support pages, instead routing these potentially vulnerable users straight to a webchat adviser.
2. Know your topic
Not all queries are equally well suited to a chatbot.
The ideal chatbot query:
- is commonly asked
- leads to one simple answer or action, or a small number of tailored answers or actions
We’ve found a simple question and answer structure works best, for example: ‘how do I tell HMRC about a change of address?’ By adding such queries to the chatbot you can reduce the amount of time your human advisers spend repeating simple information, instructions, or signposting.
Where one size doesn’t fit all, your chatbot can ask users one or more follow-on questions about their circumstances to help them understand how rules apply to them. For example, we use this structure to help tax credit customers understand what action, if any, they need to take to renew their tax credits. This structure is particularly useful when you know that users are often anxious and looking for reassurance that they are doing the right thing, or when new rules have been introduced.
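As a sketch of how a follow-on question like this might be wired up, a simple branching structure is often enough. The intent name, question and answers below are invented for illustration - they are not HMRC's actual chatbot content:

```python
# Illustrative follow-on question flow. The intent, question and answers
# are hypothetical examples, not HMRC's actual chatbot content.

DIALOGUE = {
    "renew_tax_credits": {
        "question": "Have you received your renewal pack?",
        "branches": {
            "yes": "Check your pack - it tells you whether you need to renew.",
            "no": "Packs are sent out between April and June. Contact us if "
                  "yours has not arrived by the end of June.",
        },
    },
}

def respond(intent, reply=None):
    """Ask the follow-on question, or give a tailored answer once the user replies."""
    node = DIALOGUE[intent]
    if reply is None:
        return node["question"]
    # Re-ask the question if the reply isn't one we recognise.
    return node["branches"].get(reply.strip().lower(), node["question"])
```

In practice each branch can itself ask a further question; the point is that the user ends on an answer tailored to their circumstances rather than on generic guidance.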
Keep an eye out for opportunities for the chatbot to guide users through a simple series of steps that will allow them to complete a task without the need for human assistance. For example, the HMRC digital assistant has been the first port of call for more than 1.3 million customers who needed help to recover their Government Gateway username or password. This is a significant time saver for both customers and HMRC advisers who are freed up to help those who genuinely need human assistance.
3. Your chatbot needs to set expectations
Our user research found that when someone thinks they are talking to a human, they will often write long and complex questions, or conversely, questions that leave out key details that they expect to be self-evident. They may also share sensitive information. None of these are ideal scenarios for a chatbot.
It's important to make clear to users from the very start:
- that they are talking to a computer
- the specific topics it can help with
This also allows the user to make an informed decision over which support channel they use.
4. It needs to know what it doesn’t know
Expanding a chatbot’s scope and comprehension are the biggest challenges you will face.
You need a plan for how you will:
- ensure that it has sufficient content to be truly useful
- keep it up to date
- avoid duplicating existing content while still meeting user needs
If you had to choose between quality (expertly crafted, tailored answers on a small number of topics) and quantity (less tailored, perhaps automatically generated answers on a wide range of topics), which would you pick? Which would better meet the needs of your users? The answer may depend on the complexity of your subject matter and how familiar your users are with the policy and terminology.
Every time a chatbot doesn’t understand a question or offers up a wrong answer it reinforces negative expectations. We don’t want that, so we train ours to recognise common subject areas that are not currently within scope and redirect these users to appropriate help, reducing user frustration and ensuring they don’t hit a dead end.
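One way to picture this is a routing layer that distinguishes in-scope topics, known out-of-scope topics, and everything else. The keyword matching and wording below are illustrative assumptions - a production assistant would use a trained intent classifier rather than substring checks:

```python
# Hypothetical routing sketch: known out-of-scope topics get a redirect
# to appropriate help instead of a dead-end "I don't understand".

IN_SCOPE = {
    "change of address": "You can tell HMRC about a change of address online.",
}

OUT_OF_SCOPE = {
    "passport": "I can't help with passports, but a webchat adviser can point "
                "you to the right service.",
}

FALLBACK = ("Sorry, I didn't understand that. Try rephrasing your question, "
            "or ask to speak to an adviser.")

def route(query):
    """Answer in-scope queries, redirect known out-of-scope ones, else fall back."""
    q = query.lower()
    for topic, answer in IN_SCOPE.items():
        if topic in q:
            return answer
    for topic, redirect in OUT_OF_SCOPE.items():
        if topic in q:
            return redirect
    return FALLBACK
```

The fallback still offers a way forward (rephrase, or reach a human), so even an unrecognised query is not a dead end.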
5. It needs to be human-centred
Don’t forget the context in which it is used. Almost all your users will have already tried to resolve their query or complete their task themselves. No-one wants to get stuck and have to ask for help. When they start a chat, it is because something has gone wrong for them or an event has prompted them to seek help. When writing this content, keep the stressed, anxious or frustrated customer in mind.
Consider the user’s circumstances. How familiar is this audience likely to be with the content and terminology? What devices do they use to access gov.uk?
Consider the core user need for each question. Do they need help finding information, help understanding it, or reassurance? Getting this right is essential to reduce re-contact.
The ideal chatbot answer respects the user’s time and emotions. It:
- asks for clarification if it doesn’t understand
- simplifies or tailors guidance, rather than just reiterating or linking to it
- presents the user with a definitive answer or action in the chat window wherever possible
- is clear and succinct but not brutal or dismissive when delivering bad news
- avoids leaving the user at a dead end
A genuinely helpful chatbot abides by the medical doctrine to ‘first, do no harm.’ No-one should leave the chatbot more confused or annoyed than they entered.
As we add new content areas to HMRC’s digital assistant, we are moving from pre-defined interactions to a more conversational approach. In the future, we hope it can offer a type of concierge service across a wide range of topics, helping both with navigation on gov.uk and tailored advice.
6. A chatbot’s education is never done
My colleague, Courtney Charles, used to say that a chatbot is like a child. This is a great analogy. You can’t teach it once and assume that that knowledge will carry it through to maturity. You need to keep on teaching it.
Our team carries out regular reviews of real-life chat transcripts. These allow us to:
- map any misunderstood customer queries to the correct answer - this way the chatbot learns from real-life examples and is better prepared to help future customers
- identify new trending topics for inclusion
- identify opportunities to improve HMRC services or guidance - wherever possible solving the problem at source is preferable to creating further support materials
Quantitative data further helps us understand user behaviour and iterate the chatbot accordingly.
Finally, reviewing webchat interactions also helps us identify common questions that could be added to the digital assistant, freeing up the human advisers for users with more complex queries.
Through these three information sources, and ongoing user research, the team is continuously working to build a more effective and intelligent service.
Ultimately, to know if your chatbot is learning and helping, you will need to measure success. You need a plan for this.
Make sure that any feedback you get from your chatbot is actionable. For example, we are currently looking to add in-chat feedback to allow customers to flag helpful or unhelpful answers. That will help us learn what works, what doesn’t, and where to improve.
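As a sketch of what makes such feedback actionable (the field names and structure below are assumptions, not HMRC's implementation), each flag needs to be tied to the specific answer it rates, so that the most-flagged answers can be ranked for review:

```python
# Hypothetical in-chat feedback log: each "was this helpful?" flag is tied
# to the answer it rates, so unhelpful answers can be prioritised for rewriting.

from collections import Counter
from datetime import datetime, timezone

def record_feedback(log, answer_id, helpful):
    """Append one user flag to the feedback log."""
    log.append({
        "answer_id": answer_id,
        "helpful": helpful,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

def answers_to_review(log):
    """Answer IDs flagged unhelpful, most-flagged first."""
    counts = Counter(e["answer_id"] for e in log if not e["helpful"])
    return [answer_id for answer_id, _ in counts.most_common()]
```

Feedback that cannot be traced back to a specific answer tells you only that something went wrong, not what to fix.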
We’re still learning too
Building a genuinely helpful chatbot is challenging but we believe it can be done.
From our first day speaking to users, we have been aware of the common chatbot pitfalls and avoiding these has formed the bedrock of our content design strategy. Through a user-centred design process, we are incrementally building a greater understanding of what works well for our HMRC customers. With each round of research, analysis and design iteration, our users teach us a little more about what makes a chatbot genuinely useful.
The work we are doing forms part of HMRC’s wider ambition to transform the way it interacts with its customers, through increased choice and an improved user experience.
If you are also creating a chatbot for your service, we’d love to hear from you. What challenges have you faced? What have you learnt?
Based on the ConCon10 talk ‘The chatbot will see you now’ by myself, Harry Thompson and Courtney Charles.