As a Digital Performance Analyst, it’s my job to use analysis to show how the digital products and services in DWP are adding value and meeting user needs. I work alongside delivery teams to identify their service’s Key Performance Indicators (KPIs) and uncover ways that they can measure these goals. By using data and analytics, I provide insights on behaviours and trends to inform user centred designs and improvements to the services.
At the start of 2020, I began supporting the New Style JSA team. The service recently launched into Public Beta during the COVID-19 lockdown.
Supporting a digital service team
From early conversations with the team, one of the challenges they faced was a lack of evidence and rationale behind some of the key decisions and designs made by the previous team working on the service.
To overcome this, the team set out to complete a review of the design, content and user research of the service. This developed the team’s understanding of the user needs, the business context and some of the constraints they would have to work with.
Building on this knowledge, I worked with the team to form their service’s measurement framework. We started with user needs, explored what success and failure would look like for the service and from there discussed what the success factors and KPIs should be.
Having a multidisciplinary group in the conversation really helped, as we were able to cover user, business and data perspectives. Once we had finalised the framework, the team had a better understanding of what we needed to measure, why it was important and how they were going to source this data.
“It’s so great to have data”
We then used this framework to help us plan and prioritise which metrics needed to be measured first. We worked together to enable Google Analytics on the service, alongside a compliant cookie consent mechanism.
Once in place, dashboards were then created to give the service team visibility of the key measures and data that could be sourced from Google Analytics.
I worked with the team on the changes and iterations they had made to their service. Using analytics data, I was able to provide robust analysis and statistically sound insights on what had improved, how it had improved and whether further recommendations or iterations were needed.
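As an illustration of the kind of check involved (this is a sketch, not the team's actual analysis, and the completion figures are invented), a before-and-after comparison of a journey's completion rate can be tested for statistical significance with a two-proportion z-test:

```python
from math import sqrt, erf

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two completion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Invented figures: completions out of journeys started, before and after an iteration
p_a, p_b, z, p = two_proportion_z_test(420, 1000, 465, 1000)
print(f"before={p_a:.1%} after={p_b:.1%} z={z:.2f} p={p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the change in completion rate is unlikely to be noise, which is the kind of evidence that turns "the numbers went up" into a statistically sound insight.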
It was great to see the team becoming more engaged and understanding the benefits of using data. We’re now working together on further hypotheses and iterations in the backlog, so it’ll be exciting to see what happens next.
Telling the service’s story with data
As with most government digital services, the service assessment is typically the milestone at which the team shares what they’ve done in the phase they’ve completed.
At the New Style JSA GDS Beta assessment, the team were able to tell a great story. They shared the hypotheses they worked from and showed the iterations they had made, along with the improvement and change that could be seen using both quantitative data and qualitative research.
By using data to validate the successes and failures of the changes and improvements that had been made, the team were able to demonstrate what they had learned.
As the Digital Performance Analyst supporting the service, I feel proud to have been able to encourage and help the team to use data evidence to make effective decisions that have improved their service for users.
Of course, the cherry on the cake was the service getting a “Met” result from the assessment. Officially this means the service can now continue to the next phase of development in Public Beta. To me it’s validation (and a confidence boost) that our peers in government can see how the service has been built with the Service Standard in mind.