We recently completed our first round of user testing since publishing.

We learned a lot. Seeing actual users use your site is always enlightening…and occasionally frustrating. The urge to shake them and yell “it’s right there!” can be strong. Things that seemed so obvious when you did them turn out not to be obvious at all.

Overall, the testing showed we’re on the right track. There are still a few things we need to think about, but people were very positive. Task completion went up from 69% in the last round to 86% in this round, and 54% of those tasks were completed on the first attempt.

A lot of what we learned wasn’t as clear as “change this” or “tweak that”. If we had to sum it up with a theme, that theme was probably “assumptions”. Assumptions we’d made that turned out to be false, and how users’ assumptions influenced their behaviour.

Here are a few examples…


Assumptions about age
It’s generally accepted that you can break your audience down by age. Young people are internet-savvy and want to do things without talking to a person. The older generation still like to make a phone call or see someone face to face, are probably not sure how technology works, and might want a lot of help and extra information. Standard stuff, right?

Except it no longer seems to be true.

The internet has been around for long enough that even your grandmother has probably been using it for years, and people in their teens and 20s have never lived in a world without it. We had an elderly gentleman who was completely internet-savvy and happy to do everything online, followed by a young man in his early 20s who wanted to call someone at some point in every task.

Basic personality types seem to be overriding any age-based factors in how people behave. There are still older people who aren’t confident online, but equally there are young people who just want to talk to someone — and pensioners who’d rather not.

It was a good reminder that people will pick their own channels and ways of thinking, no matter what you present them with or what you think they’ll do. For many people, dealing with government is really daunting — our early 20s participant not only wanted to call someone to check he was doing the right thing, he said he’d call “to ask if it’s OK” to use the service in the first place.

Google search terms

We rely a lot on the keywords people are googling to help us work out what to call things and how to talk about them. What we hadn’t anticipated was that users were trying to anticipate us before they searched.

Over and over we saw users who didn’t search using natural language or terms that felt familiar to them — they’d say those terms, and then ponder a moment, before re-phrasing them into something that felt “governmenty” to them, and searching for that. The trouble was, it still didn’t help them find the content they were looking for.

Users are so used to government having strange terms and odd brand names for services that they think they need to preempt the language we might use before even trying to look for them.

One user, asked to google information on help with living costs, said “I wouldn’t expect to find this under something so simple as ‘help paying my rent’…” — that’s the language she was using, but it wasn’t what she was googling.

Preconceptions about government

We made some assumptions about ourselves, too. We’d assumed that users would approach our site as a fresh, new experience — but to them, it wasn’t. It was a government experience, and they brought all their historical and emotional baggage of dealing with government with them.

Do people want to talk to a real person because they expect a bad experience online? Because they expect government to be too hard for them to do themselves? We don’t know. But we’ve heard a lot of comments like “it’s government: it’s supposed to be hard”.

We also had one lady who told us she felt our information was unreliable. When pressed as to why, she said: “It seems too easy and too straightforward.”

We need to account for the fact that people’s behaviour is biased by the experiences they’ve had in the past. We want to change the experience — but we’ll also have to change their behaviour. And it’s hard to tell how the way they respond to us is influenced by their history and their personal context for “dealing with government stuff”.

Assumptions about assumptions

We’d also made some assumptions we weren’t even aware we’d made.

On one of our pages, we’d broken up the steps in a process with sub-headings, but since we hadn’t numbered them or labelled them explicitly as steps, we were relying on people reading the content underneath to grasp the order of events. But we break things up with headings exactly because people won’t read the content under headings they don’t see as relevant.

We’d assumed the social media “share” links were an affordance everyone was pretty familiar with, so we hadn’t labelled our icons. Turns out, not only did people of all ages not know what they were or how they worked, but they assumed all sorts of things: that they were links to our own social media accounts, that clicking them would somehow log them in or ask them to share information, that they could use them to email or contact a department.

They also had no idea what departments were called, and not just the ones that have recently changed. We heard things like:

“Work and Income…or whatever they’re called now.”

“Transport…whatever their name is.”

Our content challenge? Be explicit. Be really, really explicit. Nothing is obvious.

One final assumption

When it came to looking for information, we’d assumed everyone would go straight to Google…and we were right.

Coming soon…

We'll post the full report — and our next steps — soon. In the meantime, we'd love to hear your user testing stories in the comments.