
This experiment focused on how we might be able to define better ways of measuring content effectiveness.

Transcript

So this screencast is going to be a little bit different. I'm actually going to spend a little bit of time taking you through a bit of an experiment that we've been running over the last couple of months around how we might be able to define better ways of measuring content effectiveness.

Now user flows are interesting, but they're hard to use when you need to present results to others. The navigation summary requires a lot of manual data collation; you have to spend a lot of time going backwards and forwards collecting the numbers.

How do you actually measure content effectiveness? Is your content actually meeting user needs? So for the experiment today, I'm going to choose two pages from the Web Toolkit: the web accessibility standard and the guidance material about the standard. Taken together as a collection of information, these pages provide everything people in agencies need to know about how to comply with the standard, some explanations of how they might go about testing things, and links to the technical resources that will help them implement the standard on their own sites.

So the challenge for us is to look at these pages and try to work out whether they're helping people understand the things they need to know. So I'm going to walk you through some of the thoughts we've been having on that.

If you look at the web accessibility standard, it actually has a lot of information on it and a lot of links out to other material that you may need to read. The same can be said about the guidance that goes with the accessibility standard. Now these pages on their own may meet user needs, but some users will need to follow the links to get to other sites.

To really understand the content, reading it is going to take a few minutes. How do we measure all of that and produce a meaningful report? This is something that we've been thinking about, and it's where custom segments come into play.

Here are the basic criteria we're using. Now, a tip: be ready to press the pause button, because there's a fair bit of information coming up.

The way we think we're going to approach this is to create two custom segments. One is going to measure the interested users, the people that might start to become more engaged with the content. The second segment will measure those users that we think are really engaged by the content. There are slight variations in how you define each one.

We're experimenting with what engagement actually means and what criteria we would use. I don't think we've settled on any golden rules yet; it still needs a bit of refining and adjustment as we go.

Now creating the first segment is simple. You create a condition that counts visits to the site where the user has viewed either one of the pages we're interested in. Then it gets a little tricky; again, get ready to press pause.

For the second segment, how we're going to measure engaged users, you start off with the same basic definition as for an interested user, so we have the same pages that we're looking at, but then we add some additional criteria that help to define the difference between those people that just looked at the content and those users that actually did something with it. So in this example, we've included time on page, plus some very specific links to external websites that those users might have followed.
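To make the two definitions concrete, here's a minimal sketch in Python of how the segments might be expressed if you had session-level data to work with. The page paths, the time-on-page threshold and the external link are all illustrative assumptions, not the exact criteria we used; in practice the definitions live in your analytics tool's segment builder.

from dataclasses import dataclass, field

# Illustrative page paths standing in for the two Web Toolkit pages.
STANDARD_PAGE = "/standards/web-accessibility-standard"
GUIDANCE_PAGE = "/standards/web-accessibility-standard/guidance"
INTEREST_PAGES = {STANDARD_PAGE, GUIDANCE_PAGE}

# Assumed engagement criteria; the real threshold and link list would
# come from your own segment definition.
MIN_SECONDS_ON_PAGE = 120
EXTERNAL_RESOURCES = {"https://www.w3.org/TR/WCAG20/"}

@dataclass
class PageView:
    path: str
    seconds_on_page: int
    outbound_clicks: list[str] = field(default_factory=list)

@dataclass
class Session:
    views: list[PageView]

def is_interested(session: Session) -> bool:
    # Segment 1: the session visited either of the two pages of interest.
    return any(v.path in INTEREST_PAGES for v in session.views)

def is_engaged(session: Session) -> bool:
    # Segment 2: an interested session that also spent real time on the
    # content, or followed one of the external technical resources.
    if not is_interested(session):
        return False
    return any(
        v.path in INTEREST_PAGES
        and (v.seconds_on_page >= MIN_SECONDS_ON_PAGE
             or any(link in EXTERNAL_RESOURCES for link in v.outbound_clicks))
        for v in session.views
    )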

Once you've defined those two segments, you can then do a calculation: the number of engaged users divided by the total number of interested users. Looking at a page overview report, I can see in this example that I've got 90 engaged users and 399 interested users, so I can work out a proportion or a ratio from that. We use unique pageviews because it's as close as you can get to measuring actual unique individuals in this instance.
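As a quick check of the arithmetic with the numbers from that report:

engaged_users = 90
interested_users = 399
engagement_rate = engaged_users / interested_users
print(f"{engagement_rate:.1%}")  # prints 22.6%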

I can now run that report across different time periods, perhaps single months or even a couple of months as a sequence, and see what happens to that particular ratio. Does it go up? Does it go down? Does it stay more or less the same, meaning that there's no real change in user behaviour and it's fairly consistent most of the time? Now this is a repeatable, measurable, and actionable measure, and it's focused on outcomes. It goes well beyond just counting page views and visits to your website.
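Tracking the ratio over time is just the same calculation repeated per period. A small sketch, with made-up counts purely for illustration:

# Hypothetical unique pageview counts per period, purely for illustration.
monthly_counts = {
    "Month 1": {"engaged": 74, "interested": 352},
    "Month 2": {"engaged": 90, "interested": 399},
    "Month 3": {"engaged": 98, "interested": 371},
}

for month, counts in monthly_counts.items():
    rate = counts["engaged"] / counts["interested"]
    print(f"{month}: {rate:.1%}")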

Here's an interesting demonstration of why we think measuring engagement, as opposed to just traffic on your website, is really important. What if you were making changes to your site and could show over time that the engagement rate of your users was increasing, but the web traffic for that particular period was actually trending downwards? Is that actually a problem? Well, I don't think it is, because what you're showing is that you're getting less traffic to your site, but more of that traffic is engaging with the content and information you're providing in the right kind of way.

So then the next question: what's the best thing to measure? What's more useful? Do visits and pageviews to a site mean that users are getting value? Maybe not. Measuring engagement reduces the impact of traffic changes on reports, but on its own, it's still not enough.

Does measuring engagement show comprehension of content, or whether your users are actually having their needs met? Is it a measure of customer satisfaction? No, it's not. But it is one part of how you measure the overall user experience.

What might a definition of engagement be for your site? Go have a play. See what you come up with. I'd love to hear your thoughts on this.

If you’ve got questions or feedback, we’d love to hear from you. Email us at info@digital.govt.nz or visit the Web Toolkit for more information.

Other screencasts in this series

See the first post in this series for the full list of screencasts.

Give us your feedback

We’re looking for feedback from agencies on how well these meet their needs, and if they’re popular, we may be able to produce more. If there’s a specific topic you’d like to see covered, let us know. I’ve moved on to new opportunities in Australia, so email info@digital.govt.nz and someone from the Digital Engagement Team will get back to you.
