Unseen research is wasted research – Gregg Bernstein
Analysis, which turns research data into valuable findings a team can act on, is the most important part of what we do as user researchers. But it tends to be the least visible and least understood.
You can have well designed research questions, conduct lots of research, and do it well. But without enough careful analysis, you won’t see the impact you’re hoping for. That detailed research report or slick journey map might just end up collecting ‘digital dust’.
At dxw digital, we believe that research is a team sport. And co-analysis is a big part of that.
Through collaborative analysis, teams can build collective knowledge about the people they’re designing for and the problem they’re trying to solve. In doing so, they create a sense of shared ownership that helps teams make better decisions and break out of silos.
It doesn’t matter whether you’re doing a quick data review, a round of usability testing, or a full-on discovery. Doing analysis together will always pay off, helping your team understand, believe in, and act quickly on what they learn.
Developing our research practice
Last year the growing team of user researchers at dxw digital came together to document a more consistent way of working. We agreed a set of principles to guide the way we approach, do, and talk about research.
We’ve also set up a workflow that describes the types of things that user researchers usually do on projects. One of the steps is to “analyse research and produce findings”.
At dxw researchers carefully analyse the different kinds of data we collect. We analyse it in batches and record our findings from sprint to sprint. And we involve our colleagues and clients in analysis to make sure our findings are clear and relevant.
Some time ago, I shared why co-analysis is a golden trick for agile teams and included some tips on how to facilitate successful team sessions. These tips, along with our principles and workflow, have helped guide our work and include our teams along the way.
But we know we can do better.
Understanding the challenges of co-analysis
Over the last 2 years at dxw, our user research team has grown significantly. We now have a great balance of ‘older’ and ‘newer’ researchers, from different backgrounds and with different levels of experience.
As a company, we’re also working on bigger, more complex, and diverse projects, with more and more senior stakeholders. So we often have larger multidisciplinary teams, across several locations.
We continue to work with organisations that are new to user-centred design, and new to doing research in a more agile way. So we need to recognise and overcome blockers and respond to common objections to qualitative research.
In these contexts, getting research insights into the design of products and services can be tricky. While co-analysis is not a magic cure, it’s a vital part of establishing better practices.
As a team, we’ve realised that we’re running into similar challenges with analysis across projects and clients. We’re each using our own rules of thumb, methods, and preferred tools. So now is the time to reflect together, learn from each other, and introduce a bit more consistency.
We need to make sure we’re all doing co-analysis effectively as we continue to grow.
Turning collaborative analysis into a good habit for agile teams
Between the user research, strategy, and service design teams, we’ll run a set of internal workshops to map out how we currently do analysis and understand what works and what doesn’t. It will be a safe space for the teams to:
- reflect on previous experiences of doing analysis in teams
- create an analysis kit that everyone can learn and use
- consider how to involve designers, developers, and delivery leads as we’re in this together
- think about how we’ll know things are working well
We’ll structure our workshop activities around 2 main questions:
- How have we approached analysis on recent projects and why? What has worked and what hasn’t in the past? What effect did that have on the team, the client, the outcomes?
- What’s the minimum a team needs to do analysis well for different kinds of work, such as a mixed methods discovery or usability testing?
For both questions, we’ll prioritise effective methods, but also think about things like time and budget, ease of learning, repeatability, tools, and outputs.
We’ll then have a flexible, shared approach and toolkit for collective analysis that we can document in our playbook.
We’ll be sharing how we get on, so sign up to our newsletter if you’re interested.
Need the details on how to do analysis well?
Sorry if you’re disappointed with the lack of dxw tips or guidance in this post! That’ll come soon. But here are some existing guides on doing analysis and presenting findings that we like:
- (Book) Practical Empathy by Indi Young. Chapters 5 and 6 describe a form of lightweight coding to identify concepts and then look for patterns. Lots of teams have used versions of this to do group analysis
- (Book) Exposing the Magic of Design by Jon Kolko. Chapter 6 includes examples of doing analysis with quite detailed steps
- How we do research analysis in agile is a short but useful introductory read with practical tips on how to do agile analysis when the team is together
- How we ran collaborative user research for a collaborative standard is an interesting case study about running a larger co-analysis session in government
- In User research is a team sport Will Myddelton describes how doing analysis together is only part of it, and why and how we need to think about where to involve our teams and where to work alone