There is still a battle to fight within many organisations to convince them that spending time and money creating good web content will help their business. I’ve seen it many times: a fortune is spent on development and design, with content being an afterthought.
Recently, I’ve been able to demonstrate that user-focussed content that is well designed and written in plain English can increase uptake of online services.
I’ve been working with business units to redevelop some of the web content that surrounds Work and Income online services. This blog post is about the work we did on one of them.
The service: apply for New Zealand Superannuation
People can apply for NZ Superannuation using either an online form or a paper form. While the criteria are relatively simple (ie age and residency), the application process and the information required to complete an application vary depending on the person’s circumstances.
We want as many people using the online form as possible. It’s part of MSD’s Better Public Services Result 10 targets. Any way we can increase online uptake is a good thing.
We used to provide all of the information about the online application process on a single page. The page was long because it had to cover every type of person who could apply. The result: almost 90% of people never completed the online form.
Our figures confirmed research showing that people lose interest in a page when they’re reading information that isn’t relevant to them. In our case, people would leave the website before reading all the content and then use more expensive channels to complete their application.
I had done work on the content for the online benefit form and wanted to replicate what I’d done there with the NZ Super content. However, the team in Seniors had a better idea. They wanted to tailor content to specific groups. So we came together and came up with a solution that met all of our requirements.
No content had ever been presented on the Work and Income website using a decision tree, so it was exciting for me to build something different.
When doing this sort of work, especially if it’s a change to how things have been done before, it’s vital to spend the time with colleagues explaining what you’re doing and why.
Mapping out what your users actually need
We started with a white board and some post-it notes. OK, lots of post-it notes. We worked out the different client groups and what information they needed.
We then worked out what the various key decision points were (eg is the person single or married; current or non-current) and started to map out all of the scenarios. This worked really well and helped ensure we didn’t miss anything.
Refine (and refine and refine)
We created a high-level flowchart of the ‘information journey’ which helped identify the similarities and differences in information required for each client group. We met regularly and the user journeys were refined numerous times as we identified new scenarios. This approach took time but was really valuable and helped us to start to come up with content solutions.
So that clients only get the information they need, we identified the decision points and developed a short set of questions. Each answer takes the person to the next relevant page. It’s similar to how our online forms work, but using content rather than an online tool.
We reviewed the existing web page content to see what could be reused and what we’d need to rewrite for each of the client groups.
What we built
People only read information relevant to them, so the decision points need to be logical and in the right order. We started with a number of decision points, but were able to refine these down and change the order as we went.
The first thing we ask is if they have a cell phone and an email address. If they don’t, they can’t create a RealMe login and use the online form. These users are sent to a page that tells them they need to use a paper form which they can download or request online. Having easy access to this information online should reduce call volumes to the contact centre.
If the client has a cell phone and email address, they select the circumstance that most closely matches theirs. This list was much longer when we started but we were able to cut it down as content was developed and we could see how similar some of the pages were becoming.
Clients then see only the information they need to complete the online form. Outlier information (useful or important for only a few clients) is linked away from the main pages so it’s still available but not taking up space. Since this outlier content was relevant to only about 20% of clients, we didn’t think it needed to be on the main page. Since publishing the changes, we’ve seen that about 10% of users read this information.
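The routing described above is essentially a small decision tree: one gating question (cell phone and email address), then a choice of circumstance. A minimal sketch of that logic — the page names and circumstance labels here are illustrative, not the actual Work and Income URLs or categories:

```python
# Sketch of the decision-tree routing described in this post.
# Page slugs and circumstance labels are hypothetical placeholders.

def next_page(has_phone_and_email: bool, circumstance: str) -> str:
    """Return the page a client should land on next."""
    if not has_phone_and_email:
        # Without a cell phone and an email address the client can't
        # create a RealMe login, so they're routed to the paper-form page.
        return "how-to-apply-using-a-paper-form"
    # Otherwise, route by the circumstance the client selects.
    pages = {
        "single": "apply-online-single",
        "married-or-partnered": "apply-online-partner",
        "lived-overseas": "apply-online-overseas",
    }
    # Fall back to a general page for anything not listed.
    return pages.get(circumstance, "apply-online-general")
```

The same structure scales to however many circumstances survive the content-refinement process: each new client group is one more entry in the lookup, not a longer page.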
Testing the new user journeys
While it seemed to work well and looked good, we needed to try it out on real people to see if the approach actually worked. We did some face-to-face user research and the feedback was very positive. As always, one of the testers identified a huge logic problem that we’d all completely missed. It was an easy fix, and with speedy approval from managers on the final approach and content, we were able to publish the pages earlier than our target launch date.
Evaluating the changes
I published the new pages on 19 December and have been monitoring usage since.
We now have 12 pages instead of the original single (but very long) page, and clients only read the pages that are relevant to them. In most cases, users now visit only 2 short pages before they reach their relevant content. You can check it all out here:
The new pages have far fewer words and a much higher reading ease score than the old page. Read my previous blog post “Measuring content improvement” for more details.
| Measurement | Old page | Average of new landing pages |
| --- | --- | --- |
| Sentences per paragraph | 1.7 | 1.3 |
| Words per sentence | 16.8 | 13.26 |
| Characters per word | 4.6 | 4.3 |
| Flesch reading ease | 60.3 | 75.9 |
| Flesch-Kincaid grade level | 9.0 | 5.9 |
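For readers who want to track these metrics themselves, both Flesch scores are computed from the same three counts: sentences, words and syllables. A rough sketch of the standard formulas — note the vowel-group syllable counter is only an approximation (real tools use dictionaries or better heuristics), so scores will differ slightly from commercial readability checkers:

```python
import re

def count_syllables(word: str) -> int:
    # Approximation: count runs of vowels as syllables (minimum 1).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_scores(text: str) -> tuple[float, float]:
    """Return (reading ease, Flesch-Kincaid grade level) for the text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / len(sentences)
    syllables_per_word = syllables / len(words)
    # Standard Flesch formulas: higher ease = easier to read;
    # grade level approximates the US school grade needed.
    reading_ease = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    grade_level = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
    return round(reading_ease, 1), round(grade_level, 1)
```

Running this over old and new page text before publishing is a cheap way to check that a rewrite is actually moving the numbers in the right direction.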
I’m still working with the Seniors team to review and refine the content. The team has already signed off on some changes and I’m sure there will be more in the coming months. We’ve got monitoring in place so we can see what’s working and make evidence-based decisions.
We’re already seeing an increase in online forms submitted (around 20% from the same period last year) and hope that we’ll see that increase further as we make more improvements to the content.
I’m also taking this approach with other online services provided by Work and Income, developing new content mapped back to user journeys.