Qualitative and quantitative data are often seen as opposite sides of the research coin. However, maintaining a fundamental distinction between the two types of data limits the ability to achieve deep and lasting data-driven change. While some organisations are willing to see numerical and thematic data as occasional companions, at the Driver and Vehicle Standards Agency (DVSA) we advocate a long-term relationship.
In this blog we show how DVSA is leveraging the full value of quantitative and qualitative data for better user outcomes. We bring the separate disciplines of performance analytics and user research together in an approach that is delivering benefits throughout the service lifecycle.
How the relationship started and why it is important
Changes to our team composition as we moved from a ‘project’ into ‘continuous improvement’ meant we began to look at how to bring qualitative and quantitative data closer together to meet the needs of our users. We quickly established that in user research and performance analytics there are large areas of overlap in approach and philosophy. Both disciplines are driven to:
- adopt an external focus – looking to our customers and their environment to identify and explain data patterns
- empower data consumers to use our outputs to improve services
- reflect on our practice to find better ways to obtain and use data
But even with so many factors in common, why work together? The answer came from considering how our data was challenged by stakeholders. Performance analysts are often asked to explain the context of a data pattern – what it would look like in the users’ world. User researchers are often asked the size of the issue they have uncovered – how often and for how long it occurs. It was a small step to realise that by working together, we could provide the whole picture.
Examples of our work
Vehicle operator licensing is a service for companies which operate heavy goods or passenger service vehicles. Here are a couple of examples of how we have combined performance analytics and user research. We start with a small self-contained piece of work, and end with a longer end-to-end example of working together.
Enhancing research artefacts
Personas are a common user research artefact, but they are sometimes criticised for being idealised representations of users, or for being based on too small a sample. We have a set of four core personas which reflect the different sizes of business that use our services. These contain the usual elements of demographic descriptors, motivation and pain points.
We used the information about fleet size to randomly select 50 user-accounts for each of our personas. These were stripped of any personal data and then examined to identify which of our service features were used, how often and for how long. This was like having access to two hundred very detailed and accurate diary studies.
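The sampling step described above can be sketched in a few lines of pandas. This is a hypothetical illustration only: the column names (`account_id`, `fleet_size_band`) and the synthetic data stand in for the real anonymised DVSA records, and the four bands stand in for the four core personas.

```python
import pandas as pd

# Synthetic stand-in for the anonymised account data; column names
# are assumptions for illustration, not the real DVSA schema.
accounts = pd.DataFrame({
    "account_id": range(1, 401),
    "fleet_size_band": ["owner-operator", "small", "medium", "large"] * 100,
})

# Draw 50 accounts at random per fleet-size band (one band per persona),
# mirroring the sampling approach described above.
sample = (
    accounts.groupby("fleet_size_band", group_keys=False)
            .sample(n=50, random_state=42)
)

print(len(sample))  # 200 accounts in total: 50 per persona
```

Fixing `random_state` makes the draw reproducible, which matters if colleagues later want to re-examine the same set of accounts.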
From the data, we added task-level insights to enhance our personas, making them more robust. This has enabled us to have detailed discussions about design options informed by accurate, easy-to-comprehend, evidence-based knowledge of users’ task-level behaviour.
From hypothesis to intervention and monitoring
Our next example illustrates end-to-end collaboration from observing a pattern to designing an intervention and monitoring its impact.
From the satisfaction survey integrated into our service, we found that users were frustrated with entering and resetting their passwords. We brainstormed the possible reasons for this. One hypothesis was that users were sharing accounts. This was backed up by anecdotal evidence from our call team who told us that when users rang up to reset their passwords, they often had to ask a colleague to provide the answer to the security question: ‘What is your date of birth?’.
We included questions about account access and security in our next round of user research and confirmed that small businesses tended to set up one account which was used by a number of staff from director to transport manager to office administrator. This is in direct contradiction to the instructions issued about operator licensing accounts, which emphasise that personal accounts are key to security and accountability.
By querying our database, our performance analyst determined the extent of the problem. We found that, of the 115,000 organisations registered with our service, 90% had a single registered user. User-account management did indeed seem to be an under-utilised feature. However, contextual knowledge of our user base meant we had to account for the fact that 49% of the organisations using our service are owner-operators with a single vehicle. This group of users might reasonably be expected to have a single user-account associated with their business.
Re-examining the data through the lens of ‘fleet size’, we identified that 46% of registered organisations had a single user account but multiple vehicles. As the size of a business increases, more staff roles can be expected to require access to the licensing service. Our data suggests that around 50,000 operators appear to be allowing multiple people to log in through a single shared account.
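The segmentation above amounts to combining two conditions: one registered user, and more than one vehicle. A minimal sketch, assuming hypothetical column names and illustrative figures rather than the real DVSA data:

```python
import pandas as pd

# Illustrative stand-in for the organisations table; values and
# column names are assumptions, not real DVSA records.
orgs = pd.DataFrame({
    "org_id": [1, 2, 3, 4, 5],
    "registered_users": [1, 1, 2, 1, 1],
    "vehicles": [1, 6, 12, 3, 1],
})

# Organisations with a single registered user overall
single_user = orgs["registered_users"] == 1

# Likely shared log-ins: one account but multiple vehicles, so
# several staff roles probably need access to the service
likely_shared = orgs[single_user & (orgs["vehicles"] > 1)]

share = len(likely_shared) / len(orgs)
print(f"{share:.0%} of organisations look like shared-account users")
```

Separating the owner-operator case (one user, one vehicle) from the multi-vehicle case is what turns a raw 90% figure into an actionable estimate.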
This finding has serious implications for account security. Operators themselves have discussed with us the implications of former employees having access to their licence online. It is also bad for accountability, as it is paramount that it is clear who has taken action on a licence, for example changing the maintenance schedule for a fleet or appointing a new transport manager. This is directly related to compliance and would form an important part of any investigation of an operator by DVSA or the Traffic Commissioners. Our insights have resulted in a targeted communication campaign and we are using data to track behaviour change.
Top tips for developing a cross-disciplinary partnership
From our collaboration, we have learned that together we can make a bigger difference more quickly. We finish with our top tips for developing a similar relationship between user research and performance analytics.
- Do not be afraid to ask questions of each other: alternative perspectives are immensely valuable.
- Knowing how to query data is not all it takes: context and interpretation are required.
- Web reporting is not web analysis: our aim is to make recommendations that our service teams can act on.
- Co-locate as much as possible to increase the interaction between user research and performance analytics; this builds synergy and makes interworking easier.
- Assume positive intent: our differing backgrounds mean that we approach problems differently; take time to hear what is being proposed and why.
- Agree collaborative items: this is not a relay race where one discipline hands off to the other.
- Over-communicate: use all available channels to keep talking about what you are working on and why. Give the team regular updates, and make sure to highlight the changes that occur once your research has been acted on.
- Openly appreciate each other: if we show confidence in each other, then our organisations will be quicker to accept this new collaborative partnership.
Have you seen how user researchers and performance analysts can work together to achieve better outcomes? Let us know in the comments section.