Open data and open standards were given a welcome and practical revival in the early days of the Government Digital Service (GDS). This included the creation of the performance platform, which provided insight into the state of play of public services. Previously, government often lacked the data to know how services were performing, or where improvements were needed.

However, as with earlier attempts to encourage open data, this promising initiative appears to be falling into decline.

Decline and fall …

According to the services data list, there seem to have been few data updates on the performance platform since September 2017, over two years ago – despite performance data being required by the government’s Service Standard. There are also some major gremlins, with inconsistencies between the data that appear on the list and data available elsewhere on the site. For example, the Dart Charge has no data on the list despite data being published elsewhere on the site. Similarly, HMRC’s services appear to lack any data on the list despite it being published elsewhere on the site – the related HMRC Self Assessment data have not been updated since March 2017 and provide no up-to-date information on annual transactions. PAYE is in much the same state.

The main performance services data list omits some major service data and much of what it does provide is out of date or incomplete. Or both. It also shows ‘775 services’, but the file available for download has only 734, with many of those blank or incomplete.
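As an illustration of the kind of check involved, the short sketch below counts the rows in the downloaded file and flags those with no performance data at all. The filename and column names are assumptions for illustration only; the actual download’s format may differ.

```python
import csv

# Hypothetical filename and column names - the actual download from the
# performance platform's services list may be structured differently.
with open("services.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

print(f"Rows in download: {len(rows)}")  # the site itself claims 775 services

# Count rows where every performance field is empty (assumed field names).
data_fields = ["transactions_per_year", "digital_takeup", "cost_per_transaction"]
blank = sum(
    1 for r in rows
    if all(not (r.get(k) or "").strip() for k in data_fields)
)
print(f"Rows with no performance data: {blank}")
```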

When I was advising the House of Commons Science and Technology Committee during their recent inquiry into ‘Digital Government’, the lack of a baseline and of open, reliable and meaningful data to assess progress and performance was frustrating. It made it difficult to assess progress against the definition of ‘digitisation’ provided by the Minister for Implementation, Oliver Dowden MP, during his evidence:

“I think that in general we have succeeded almost completely on digitisation in its simplest term, which is a digital interface. The challenge is how we ensure end-to-end digitisation; that is to say, that all the processes behind are done digitally and, in its simplest, it is not somebody taking something that is produced digitally, printing it out, processing it and then sending someone an email at the end.” 

Q.396

Meaningful service and performance data are essential and it’s important that the performance platform does not slip into further decline, but is instead rebooted, enhanced and routinely updated. As the Service Standard recognises, identifying the right metrics and openly publishing the data related to them can help publicly evidence how well a service is handling the problem it’s meant to solve, as well as informing decisions about how to fix problems and improve services.

Improving performance data

The performance platform was an important development. It made a promising start, but lacked features that would have made it even more useful. Here are a few ideas of what I’d like to see in a revamped platform, drawn in part from the difficulties I encountered in collecting and analysing evidence for the Science and Technology Committee:

  • Establish an agreed baseline against which progress can be assessed. There never seems to have been a full discovery and a published, objective baseline of what already existed when GDS came into being – in terms of online services, government platforms, standards, take-up, etc. – and therefore there is no meaningful way of assessing progress.
  • Mandate routine public performance data updates for all services. Ideally this should be an automated process, with the data made available via open interfaces (APIs) in a standard format so that anyone can access and analyse them (see the sketch after this list). As a minimum fallback, a quarterly manual upload should be an obligatory requirement of the Service Standard.
  • Agree and adopt a consistent, meaningful definition of ‘digital’. Even where a ‘digital’ figure is provided, it’s unclear whether this simply means the front end is digital (e.g. an electronic form, service or information on a web page) or whether the entire service is digital end-to-end (as per the Minister’s ambition in his evidence referenced above). At least one digital service includes “automated phone” in scope. A definition is essential to enable progress towards “fully digital” to be monitored, and to distinguish between the channel(s) used – website, APIs (e.g. apps), phone, etc. It should also indicate whether services meet the Service Standard, including an assessment of a digital service’s impact on accessibility and social exclusion.
  • Agree a taxonomy for the nature of a digital service – making a distinction between e.g. automation, optimisation, transformation – mapped against digital services to monitor and demonstrate progress, and the type of progress, including at the policy and operational levels. This topic is discussed in more detail in my co-authored Computer Weekly article ‘Escaping waterfall government and the myth of digital transformation’.
  • Identify examples that break down organisation-centric services and span more than one provider. The services listed on the performance dashboard all appear to operate within long-standing departmental service silos – and have effectively moved old paper service silos online. Where are the simplified, streamlined services redesigned around citizens’ and businesses’ needs rather than organisational boundaries?
  • Maintain a public audit log of changes and improvements. Many online services pre-date GDS and have not been fully updated to meet the Service Standard (some are even still using Flash, for example). It would be useful to have dates showing when a service originally came online, when it was last improved, and whether and when it passed the Service Standard. Such records should include details of services that have been merged, or even removed, as a result of policy or other design improvements.
  • Include internal digital services too. Public employees are essential users as well, relying on government technology on a daily basis to operate and deliver our public services. Develop a lightweight method of including internal digital services and organisational improvements, not just external ones.
  • Let users rate their public services. Routine public sector employee, citizen and business feedback should be a standard part of all services to help inform and improve them. These data should be openly and routinely published too.
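To make the open-API idea above concrete, here is a minimal sketch of what a standard, machine-readable per-service record might look like, touching several of the bullets (routine updates, channel split, Service Standard status, audit dates). The endpoint, schema, field names and values are all assumptions for illustration – nothing like this is actually published by the platform today:

```python
import json

# Entirely hypothetical schema - invented field names and values, using
# Dart Charge (mentioned above) purely as an example service.
sample_payload = """
{
  "service": "dart-charge",
  "period": "2019-Q3",
  "transactions": 12345678,
  "digital_takeup": 0.93,
  "channels": {"web": 0.88, "api": 0.05, "phone": 0.07},
  "service_standard": {"passed": true, "assessed": "2018-06-01"},
  "first_online": "2014-11-01",
  "last_improved": "2019-07-15",
  "last_updated": "2019-10-01"
}
"""

record = json.loads(sample_payload)

# With records like this published routinely for every service, anyone
# could compute take-up and flag stale data automatically.
print(record["service"], record["digital_takeup"])
print("stale" if record["last_updated"] < "2019-07-01" else "current")
```

Records in a format like this, published via an open API and validated on upload, would make automated cross-government monitoring straightforward – and the quarterly manual fallback largely unnecessary.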

… rise of a renewed performance platform?

Good data, and routine user feedback, form an essential part of understanding the way services are operating and how they could be improved. The effective use of data lies at the heart of most modern, successful organisations. It’s important that the next government commits to the routine provision of open data, and re-energises the performance platform.

Updating the performance platform and ensuring better data – clarifying what has been delivered, the standards it meets, the quality of the citizen and business experience, etc. – would make it much clearer how well the digital government agenda is progressing, and with what value to citizens, businesses and public sector workers. It would also help enable in-flight corrections and improvements.

Overseeing the performance platform and the open data that feeds it is an important function for a central team, helping provide visibility of cross-government progress. Without the publication of objective, routine data, it’s hard to assess how much of what has been done, or what is planned, tackles the real issues of improving policy, organisational and service quality. The performance platform is an essential tool in helping meet the challenge set out by the Minister in his evidence – whether end-to-end digitisation is being achieved, or simply yet more digital interfaces.

