We first talked about updating the service standard around a year ago. Since then, we’ve talked to hundreds of people in central and local government.
It’s still a work in progress, but we think we’re getting close to a final draft which supports the government’s ambition to deliver joined-up, end-to-end services that meet user needs. So we thought it would be useful to provide some details about the direction it’s going in.
What’s not changing
For the most part, we’ll ask service teams to keep doing what they’re doing.
We’ll still ask service teams to start with user needs and to build services in an agile, iterative way with a multidisciplinary team – based on data and user research.
We’ll still ask service teams to make services that are simple to use, accessible and secure, and to make sure people can still get assisted digital support if they need it.
We’ll still promote open standards and ask service teams to open source new code.
And we’ll still ask service teams to avoid locking in particular technology solutions, to reuse patterns and components where possible, and to automate what can be automated.
While the intent behind large parts of the standard will not change, the format will.
Each point will have a section explaining why it’s important and how it can help service teams make better decisions, deliver faster and build services that work for users.
We want the standard to be useful to all parts of government involved in providing services. People should not be following the standard because they have to: they should be following the standard because it helps them to deliver better services for users.
What’s been added
With the update we’re planning a name change – from the ‘Digital Service Standard’ to the ‘Government Service Standard’. It’s significant because we want it to be useful to the whole of government.
The Government Transformation Strategy is committed to delivering joined-up, end-to-end services. That means bringing together isolated transactions into a service that makes sense from the user’s point of view.
That’s only possible if you understand – and can influence – the wider context for the service.
There are service teams in government doing this already, but we want to make it standard practice for teams to talk during service assessments about how they’re addressing challenges that make it difficult to meet user needs. In particular, the challenges they have with:
- making a transaction part of a wider service that solves a whole problem for users (even when that means working across organisational boundaries)
- delivering a joined-up experience across different channels
- internal systems and processes
- technology platforms
It’s important to recognise that challenges to do with ‘hard’ constraints like legislation, contracts, governance or technology infrastructure cannot be resolved overnight, but we’ll be asking service teams to:
- explain the long term plan to address the issue
- explain how they’re working around it in the meantime
Making services more inclusive
As well as making services accessible to disabled people and those with limited access to technology, we’ll ask service teams to think about inclusion in the broader sense. For example, if you’re asking for proof of where someone lives, have you considered the needs of someone who does not have a fixed address?
Iterating and improving services
It’s probably not practical – or a good use of public money – to assign a full development team to every service, forever. But it’s important that services do not go into stasis.
So we’ll ask service teams to ensure their approach to continuous improvement meets the challenge of changing user needs and developments in technology.
What’s being removed
We’re looking at how we can make more time for service teams and assessors to discuss the things we’ve added. To do this we plan to consolidate some points and remove a couple altogether.
For example, we plan to remove the requirement to encourage people to use the service online. Not because it’s not important, but because we think the lower cost of providing services online means there’s already enough of an incentive to do it.
The scope for service assessments will not change as part of this update. The trigger for an assessment is still the same: technology spending on a digital service. We expect people to have questions about this, so we’re producing some clearer guidance on what’s in scope.
Moving to the new version of the standard
We will not expect people to start using the updated standard immediately – there will be at least a couple of months’ notice between the new version being published in the autumn and service teams having to use it.
If a team has their alpha service assessment before that cut-off date, or they’ve already had their alpha, they can continue to use the old version.
The future for the standard (after this version)
This new version of the standard is designed to nudge things in the direction of joined-up, end-to-end services, without making unreasonable demands on service teams.
But in the future, we want to do more.
In particular, we want to work with the operational delivery and policy professions to develop a joined-up view on how the process of creating and operating services should work. And we want to draw on their expertise to provide more concrete guidance on things like designing for non-digital channels.
It’s been 3 years since we last updated the standard. In future, we want to make smaller, more frequent updates and we’ll provide plenty of notice when things change.
Stephen Gill is content lead for service design and standards at GDS.
If you’re working on a service team in the UK government and have a question about how the change might affect you, contact GDS at email@example.com.