Beyond measurement, towards real understanding
Why you should think more deeply than just chasing and counting page views – and some ideas for how to do it
We spend a lot of our time in the world of measurement, don’t we?
Most of us want to know that we’re working towards something that’s having an effect. Making a difference. A small dent in a small part of the world. Broadly, that involves two kinds of research or analysis.
What are the needs or issues?
How are we doing in addressing them?
In the not-for-profit world, there aren’t always obvious answers. Needs aren’t (usually) defined by market gaps, opportunities or risks, and success isn’t (often) defined by sales figures alone.
So, we measure. We evaluate. We set key performance indicators, we record numbers, and we compare our activities with benchmarks or targets. That often means analytics: website traffic, email open rates, social media engagement. We work to increase them. And, consciously or unconsciously, those increases themselves start to feel like the purpose of what we’re doing.
Analytics paint an incomplete picture
At CharityComms’s recent Charity Content Conference, Ruth Stokes talked about the importance of goals and metrics being meaningful. She pointed out that page views and email open rates do not tell you:
whether the information was what the person expected to get
whether it was useful
how it made someone feel
whether they found it easy to complete the task
why they dropped off part-way through
Even for the things that analytics can measure, numbers will always be incomplete. For websites, they shouldn’t include anyone who hasn’t consented to measurement cookies being placed on their device. Email systems, too, are increasingly blocking the tracking that measures open rates, to better protect users’ privacy.
If you’re using Google Analytics, Google’s new Consent Mode v2 can be set up to fill those gaps by estimating based on the data it has. But counting ‘unique users’ will always be problematic because its goal is something that device makers, developers and many users specifically try to prevent: knowing who you are, regardless of what device you’re using.
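As a rough sketch of what respecting that choice looks like in practice, Consent Mode works by denying analytics storage by default and only granting it after the user accepts. The snippet below assumes a cookie banner that calls `onConsentGranted` – that callback name is illustrative, not part of Google’s API:

```javascript
// gtag() just queues commands onto the dataLayer for Google's
// scripts to pick up; this mirrors the stub Google provides.
globalThis.dataLayer = globalThis.dataLayer || [];
function gtag() { dataLayer.push(arguments); }

// Deny storage by default, before any tags load, so nothing is
// measured for people who haven't consented.
gtag('consent', 'default', {
  analytics_storage: 'denied',
  ad_storage: 'denied',
});

// Hypothetical callback, wired to your cookie banner's accept button.
function onConsentGranted() {
  gtag('consent', 'update', {
    analytics_storage: 'granted',
  });
}
```

With defaults in place, Google Analytics can model the gaps left by non-consenting visitors rather than silently tracking them.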
Incomplete data is an inherent part of respecting users
It’s easy to consider all this as an inconvenience, or a set of barriers to overcome as best we can. But actually, they’re small concessions in the ever-growing tension between corporate needs and citizens’ rights. At William Joseph we recently watched and discussed Carole Cadwalladr’s latest TED talk, on tech overreach and surveillance capitalism. We think it’s reasonable for users to opt out of feeding the machine with their data, and we take seriously the need to respect their wishes.
Activity-based KPIs don’t always help either
Another big theme that came out of the Charity Content Conference, mentioned by Lucy in her blog, was that we don’t always need to create and do more.
All too often, website KPIs are based on the idea that more activity is better, or that more activity than last year is the aim. But this ‘any-growth-is-the-aim’ mindset can lead to a competitive, zero-sum environment across the charity sector collectively. If we wanted to be grandiose, we could go further and reflect on the connections between the continual push for growth and the inequalities that many of us are working to dismantle.
Website KPIs, especially at the beginning of a programme or the launch of a product, are often arbitrary. An initial round number is chosen, and then a year-on-year percentage increase is applied. The degree to which they’re treated as important is usually a direct reflection of the power dynamic between the funder and the funded charity.
Sometimes, more can indicate a problem
There’s also the fact that, particularly for help-based content, it’s not clear whether higher is actually better at all. Is it good news that more people are clicking on your ‘urgent help’ button these days? Will you be happy if more people read the ‘I’m in a crisis’ page this year than last? These aren’t rhetorical questions, by the way – the answer to both might be yes. But it entirely depends.
For some charities, this issue could apply to any interaction with you at all. If website indicators are an end in themselves, you might come to one conclusion. If you take a step back and think about what real-world issues your metrics might be proxies for, it becomes much trickier.
Do you really need analytics? If so, why?
We have yet to work with anyone, ourselves included, who has said no to this question. We understand. But giving it some serious thought can help you to make intentional decisions rooted in your actual goals and values, not just in habits.
Interaction design is one area where we’ve found analytics helpful. They can help you to work out whether a navigation pattern is confusing, whether a page element’s purpose is obvious, or whether something is never used and so could be removed. They can help you to dive into questions like: of the people who reach page X, how many of them found it via the link on page Y?
They can also be a powerful weapon on the side of equity. If you are a grant-making organisation serious about getting help to those who need it, you wouldn’t want someone to rule themselves out based on finding the application process too arduous.
For example, The Film and TV Charity already knew how many people complete the application process for their financial support schemes. But we have added custom analytics that record how many people start the application as well as how many complete it. This won’t cover everyone, but we can compare the two and see what proportion of people seem to be making it to the end.
How else might you measure what’s important?
Let’s return to the examples at the top: the sorts of things that viewing figures don’t tell you.
To look at why people dropped out of the journey or task they were trying to complete, one possibility is the multi-step analytic approach described above. Break the journey into stages and record when each stage is reached. If 90% of people complete steps 1 and 2 but only 40% complete step 3, you have a clear clue about specifically what content or interaction was difficult.
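Once each stage is recorded, the analysis itself is simple arithmetic. Here’s a minimal sketch with made-up step counts (the event recording would happen in your analytics or form system; the numbers and names are illustrative):

```javascript
// Hypothetical counts of how many people reached each stage of a
// multi-step journey, as recorded by per-step custom events.
const stepCounts = { start: 1000, step1: 900, step2: 850, step3: 400, complete: 380 };

// For each stage, what percentage of the people who started got that far?
function completionRates(counts) {
  const total = counts.start;
  const rates = {};
  for (const [step, n] of Object.entries(counts)) {
    rates[step] = Math.round((n / total) * 100);
  }
  return rates;
}

// A sharp drop between adjacent stages points at the specific
// content or interaction that people found difficult.
completionRates(stepCounts).step3; // 40% of starters reached step 3
```

In this invented example, the drop from 85% at step 2 to 40% at step 3 is the clue: something about step 3 deserves a closer look with real users.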
Feedback mechanisms can be simple
If you’re looking to find out whether information was useful, or was what the person expected to receive, one option might be a one-click ‘Did you find this useful?’ feedback component, like the one used on the UK government website.
You might set these up to send custom events into Google Analytics, but then you’d be limited to only those who have accepted tracking cookies. It’s not beyond the realm of possibility that people who have different expectations of privacy, or different fluency with your cookie consent tool, might also differ in their success in using other parts of your website.
An alternative approach is to implement the feedback component as a single-question form, using whatever form system you already have. You could then record anonymous responses directly in your content management system, unaffected by people’s privacy choices.
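The recording side of such a form can be as small as an anonymous counter. This is a sketch, not any particular CMS’s API; `recordFeedback` and the page URLs are invented names, and a real implementation would persist the tallies in your form system or database:

```javascript
// In-memory store of anonymous yes/no answers per page. No cookies,
// no user identifiers: just a page URL and a boolean.
const feedback = new Map();

// Record one answer to 'Did you find this useful?' for a given page.
function recordFeedback(pageUrl, wasUseful) {
  const tally = feedback.get(pageUrl) || { useful: 0, notUseful: 0 };
  if (wasUseful) tally.useful += 1;
  else tally.notUseful += 1;
  feedback.set(pageUrl, tally);
  return tally;
}

// Hypothetical responses from three visitors to the same page.
recordFeedback('/help/applying', true);
recordFeedback('/help/applying', false);
recordFeedback('/help/applying', true);
```

Because nothing identifying is stored, the counts are unaffected by people’s cookie choices, which keeps the sample from skewing towards those comfortable accepting tracking.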
Numbers are just a starting point
Systems like these will only produce crude numbers, without much of a hint of the story or experiences that led to them. At best, they might suggest that there’s something you want to improve or address. And at that point, the best way to get a real picture of the issue is to listen to your users or potential users – whether that be by focus groups, conversations, surveys or something else.
Not to measure, but to understand
What these approaches all have in common is that they’re not about measuring for its own sake, and they’re not rooted in the idea that more activity is better than less. They are rooted in the idea that the real challenge is not to measure, but to understand.
With this different framing in mind, you are not necessarily looking for numbers you can extrapolate from. You’re not claiming to be measuring everyone, or making any statements about quantity.
Instead, you’re looking for stories and themes that shed light on something. On the issue you’re trying to help with, or on how your website or service is received, or on how it relates to the other ways your charity supports its beneficiaries, or on whether and how you are making a difference to people.
That shift of mindset makes all sorts of different options possible too, such as:
surveys with open questions allowing free-text answers
one-to-one interviews
focus groups
comment cards (physical or digital)
speaking to frontline staff to learn from their experiences
data from helplines, emails and chat systems
even quick chats over lunch with supporters, users, staff or other stakeholders.
Choose your approach with intention
At William Joseph we place great value on research with users and other stakeholders, often conducted through one-to-one online conversations. As we have said before, you can learn so much from five to six well-structured research sessions.
We also pride ourselves on helping people to get their analytics setup working, always in line with privacy guidelines. Solving a tricky Google Tag Manager issue is one of the more satisfying parts of life.
As with all things in tech, there are lots of nuances to the decisions about what you use and the approach you take. There’s always an opportunity to pause and consider how your values and your goals – your real, true purpose, not just your current KPIs – relate to that. And that conversation is where the really interesting stuff happens.
Get in touch – we’re always here to talk it through with you.