Training is a service. Ironically, it took us a while to realise this. It also took us time to realise that it’s one of the most agile live services we’ve ever worked on.
In this blog post, we’ll discuss why the service design training is a great case study of a user-centred agile service for internal users.
We piloted the ‘Introduction to service design’ course in January 2018, and it has been run 35 times since then – most recently during Services Week 2022. Over the years, we have run the training in 10 locations, reaching over 830 people. Feedback on the course is overwhelmingly positive, tickets sell out in hours, and there are hundreds of people on the waiting list.
This has been possible because we have partnered with designers all over government to help us run and update the course – central and local government, many different departments and locations all over the UK.
The course is successful for a few reasons. One is that we use active learning throughout: learners try out new things, reflect, discuss and learn from others. These are the other reasons it works so well.
Focus on user needs
We created the course knowing there was a user need for it. The user-centred design (UCD) community across government was asking for it, and our research evidenced a strong need.
We knew that users of government services need those services to be better, and good service design is one way of achieving that. We knew from our work with teams across government that service design wasn’t well enough understood or used. We knew that service designers often found that the biggest blocker to doing their work was this lack of understanding, and sometimes even suspicion, about service design.
Primary users of the service design training are the civil servants we trained.
Secondary users are the teams and organisations these civil servants work in.
End users are the users who use the services that the civil servants design and deliver.
We piloted the course with a good understanding of user needs and have built on that understanding over time.
Keep improving your service
As mentioned, we have run the course 35 times, and it’s been different every single time.
We’ve taken out and added modules. We’ve changed the way exercises work. We’ve tweaked the language and structure of the course. We’ve added things to create a safe space. We’ve changed our biscuit selection.
Once, we received an email complaining that we always run the course during the school run, so we tried running it in 2 sections over 2 days.
At the beginning of the pandemic, we created a remote version of the course in a matter of days. We had to find the right tools, rework exercises, and develop a new structure for the course as we couldn’t ask people to spend more than 2 hours in front of their screens. We haven’t returned to in-person training since then.
We make these changes based on feedback. The main source of feedback is in the training room, where we are constantly on the lookout for which questions get asked, which exercises don’t work so well, and how ideas could be explained better. We also hold a retrospective at the end of every course to collect feedback, we send out a survey afterwards, and people often send ad-hoc feedback.
We’ve seen the audience and their needs change over time, and we have responded to that.
It has been an absolute joy to work in this flexible way, knowing that we can always do better and that it’s ok to try something new, learn from it and try again.
Keep improving how your team works
We regularly review the users’ feedback on the course and our own processes for delivering the course. We’ve iterated how we describe and promote the course. We’ve changed our ticketing and waitlist processes.
After co-running the training with people from 3 different departments, we had a retrospective for everyone involved in organising and running the course. This created a safe space and allowed people to share their experiences before, during and after the training session. From this, we were able to identify changes that could be made to make improvements for next time.
One of these was the need to communicate more efficiently, both beforehand and on the day, on a platform accessible across all government departments. In response, we set up a Slack workspace that allows participants to discuss tasks during the daily sessions and stay in touch after the training is finished.
As the delivery team has grown across government, we maintain a planning board for scaling the training, including notes, actions, and stats.
Fail fast, learn quickly
We do learn quickly. Every time we run the course, it is a new release. The service’s policy, design and delivery all come from one (non-multidisciplinary) team.
Having a well-tested service gives us the confidence to onboard new trainers. We have detailed speaker and trainer notes, and everyone who runs the course leaves notes for the next person and their future self on what works and which details they’ve tweaked.
Where we failed
We haven’t done enough of this: “demonstrating value to your senior stakeholders (for example, the senior responsible officer, director or deputy director)”.
We spent a lot of our energy thinking about meeting the needs of primary and end users, and not enough on our internal stakeholders.
We still don’t have detailed metrics, nor have we found a way to measure the long-lasting impact of the training on participants.
Keep planning and changing
When the pandemic forced us to transform the course from in-person to remote, the training became much more accessible to civil and public servants all over the country. In addition, the great success of our colleagues with a massive open online course for content design made us look into different ways of scaling training. And based on the questions from hundreds of course participants, we created spin-off formats and more advanced training offers.
Specific questions from participants that we could not answer during the training were addressed in the monthly cross-government ‘Discuss a design challenge’ session. Open questions from people with a deeper interest in service design led to a 2-day ‘Service design in practice’ course and a more focused 1-day ‘Service mapping for practitioners’ course. As all the service design training is modularised, designers have been taking individual exercises and using them in other contexts like away days, conferences, or meetups.
By running training on a broader scale, we can meet the high demand for different courses. The trainers running these courses find value in representing the cross-government user-centred design community and contributing to its overarching objectives. And they can iterate the courses, using their own experiences to adapt the training content.
The training continues throughout 2022 as an agile service offered by service designers and community leads from multiple government organisations.