Demonstrating the value of human-centred design in digital transformation

Imagine you're the federal government and you want to make evidence-based economic decisions. For example, you want to review the Cost of Living Index, or make service investment decisions, or even get some insight into what different populations might be eating.

Where do you get the evidence upon which to make your decisions?

One way is to run a massive data collection activity through the Australian Bureau of Statistics (ABS), like the Household Expenditure Survey, which they conduct every six years. Around 10,000 households participate: each completes a questionnaire with an interviewer, then spends two weeks recording the details of every single transaction in a paper diary before the interviewers return to collect it. Over the following year, every submission is manually processed by coders at the ABS, who match each listed item to a preset code to build the data set.

It is a massive production for the ABS.

Now imagine you're the participant.

The information required is meticulous: you must keep track of every transaction you make in a day and enter it into the diary, with details of what you purchased, how much of it you bought, from where, for how much, and how you paid. This is made all the more difficult by the fact that the diary is an A4 paper booklet that is not easily carried around. Despite a reasonable average response rate, the ABS reports a significant drop-off in data quality over the two-week life of a diary.

So with a 2021 collection looming, and facing the production and distribution of another 40,000+ booklets, training of facilitators, and manual coding of data, the ABS decided it was ready to explore a digital approach.

We collaborated with the ABS on an eight-week project to create a testable digital Alpha prototype: something that might show that a digital collection tool could reduce costs while increasing the quantity and quality of data, and make the case for further investment and development.

Understanding the needs

The first challenge in any design project is understanding exactly what you are trying to design. In HCD this is done by observing and talking to users to build empathy and insight into how a solution might fit into their lives.

Here, however, the users (the participants) receive no direct value from completing the collection. There is a societal benefit from the data and the good decisions it enables, but the users' only real need is for their required participation to be as frictionless as possible.

So we began with a deep dive into the entire process: how the collection worked and what data was gathered, but from a user's point of view. We talked through week-long spending scenarios for all members of a family, hunting down common and uncommon transactions, until we had a solid picture of the kind of behaviour and decision-making we needed to support, and the data we needed to capture.

So if I buy carrots, I just enter 'carrots' in the item description...

Well, you can't just say 'carrots'.

Why not?

Well, are they fresh or frozen? Tinned? Or baby carrots? Or chopped up and mixed with corn and peas?

Is that important?

Yes. Very.
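The ambiguity in that exchange is exactly what the manual coding step has to resolve. A minimal sketch of keyword-based item coding, where the code values and keyword rules are illustrative assumptions rather than the ABS's actual classification:

```python
# Hypothetical expenditure codes keyed by descriptive phrases.
# The real ABS classification is far larger and not shown here.
CODE_RULES = {
    "fresh carrots": "0411",   # assumed code: fresh vegetables
    "frozen carrots": "0412",  # assumed code: frozen vegetables
    "tinned carrots": "0413",  # assumed code: preserved vegetables
}

def code_item(description: str) -> str:
    """Match a free-text diary entry to a preset expenditure code."""
    desc = description.lower().strip()
    for phrase, code in CODE_RULES.items():
        if phrase in desc:
            return code
    return "UNCODED"  # ambiguous entries get flagged for manual review

print(code_item("Frozen carrots 1kg"))  # "0412"
print(code_item("carrots"))             # ambiguous, so "UNCODED"
```

A bare 'carrots' falls through to manual review, which is why the interviewers push participants towards more specific descriptions.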

After getting the lay of the land and the scope of the problem, we began sketching out what a good solution might look like.

Designing from the ground up

An ABS data collection is not a one-off "survey" interaction; it requires multiple interactions over a number of weeks. This presents a unique behavioural design challenge: how could we make the user experience as frictionless and sticky as possible to maintain a good response rate?


Sometimes you end up with two user groups with competing needs, and you need to find the right balance.

In making the respondent experience easier, we made the coders' job harder. This was a cost the ABS was willing to take on if it meant a better response in the first place, but we still looked for every opportunity to help the coders.

There is a tension between keeping the interface simple for the core diary experience (adding and viewing entries) and making it robust enough to handle the edge cases and many minor details. Most of the time, an expenditure entry is something small and simple: a morning coffee, a petrol top-up, a takeaway meal. But things like grocery shops or non-routine purchases can quickly become tedious or complex.
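One way to resolve that tension is an entry model where the simple path needs only a couple of fields and the detail is optional. The field names below are illustrative assumptions, not the actual prototype's data model:

```python
# A sketch of a diary entry model that keeps the common case simple
# while still supporting complex purchases. All field names are assumed.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Entry:
    # The minimal fields a morning coffee needs:
    description: str
    price: float
    # Optional detail only the complex cases (grocery shops, etc.) use:
    quantity: Optional[str] = None
    retailer: Optional[str] = None
    payment_method: Optional[str] = None
    line_items: list = field(default_factory=list)

coffee = Entry("flat white", 4.50)  # the frequent case: two fields, done
groceries = Entry("weekly shop", 112.80,
                  retailer="supermarket",
                  line_items=["frozen carrots", "milk", "bread"])
```

The design choice is that the frequent case never pays the cost of the rare one: complexity is opt-in rather than up front.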

So how would you enter something you have no idea how to describe?

Like what?

Like a dinosaur taco holder.

We went through many variations of the core functionality to ensure we were finding the right balance between elements. Adding simple entries was the priority for the path of least resistance, but we found we spent the majority of our time trying to solve for the 10% of more complex scenarios.

Often, we had to challenge what was actually needed, instead of just recreating what was in the paper diary. Trust from the ABS allowed us to take a whole new approach to the experience and keep it focused on the user.

"I'm so on-board with this approach. There are risks, but we have an opportunity to test the risks rather than take the constrained approach from the outset."

With a refined, lightweight design we were happy to take to testing, we had to solve the next big problem: how to get people to actually use it.

Gaining buy-in through orientation

Since selected participants are mandated by the government to respond, it's not hard to imagine some users feeling apathetic towards their participation at the best of times (and at worst, distrustful or resentful). With this in mind, the approach to onboarding became crucial: if they were obligated to participate, then we had to ensure they would be invested enough to do it right.

The diary and its complexities are daunting: the paper diary carries half a dozen full A4 pages of instructions, examples and rules that the participant must read up front (and which we had to constantly double-check ourselves). We knew it was crucial that the participant felt the task was achievable.

The key thing we had to instil in them wasn't the knowledge to make full use of the diary or recall all of the business rules, but the confidence that they were on the right track. Some people will always use it incorrectly or forget some of the requirements; we felt the bigger risk was overwhelming the user with so much instruction that they didn't try at all.

This led us to a checklist format for the orientation: a series of defined tasks to complete before the survey began. This let us break the reams of information into manageable chunks, lay out the minimum requirements for the user, and provide a compelling reason to complete them. The checklist also celebrates every step of progress, so when the participant reaches the end of onboarding they feel accomplished and carry positive momentum into the survey.
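The mechanics of that checklist can be sketched as a small progress tracker. The task names and structure here are illustrative assumptions, not the actual ABS orientation flow:

```python
# A sketch of checklist-style onboarding: each completed task reports
# progress made so far, and the survey unlocks only when all tasks are done.
from dataclasses import dataclass, field

@dataclass
class Orientation:
    tasks: list = field(default_factory=lambda: [
        "Confirm your household details",   # assumed task names
        "Read how to record a purchase",
        "Add a practice diary entry",
        "Set a daily reminder",
    ])
    done: set = field(default_factory=set)

    def complete(self, task: str) -> str:
        self.done.add(task)
        # Celebrate each step: report progress made, not just work remaining.
        return f"{len(self.done)}/{len(self.tasks)} steps complete"

    @property
    def ready(self) -> bool:
        return len(self.done) == len(self.tasks)

o = Orientation()
print(o.complete("Add a practice diary entry"))  # "1/4 steps complete"
```

Framing each message around progress made, rather than instructions remaining, is what builds the momentum described above.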

Finding what works

Testing and iterating is a vital part of an HCD process, but getting useful results from our designs became one of the biggest challenges in the project.

The original plan for Alpha was to deliver bare-bones functionality and design mockups for a baseline MVP. But we reached a point where the behaviour we were trying to capture was too complex. We needed something more technically rich to test that behaviour at scale; clickable wireframe prototypes were not going to provide enough validation, either in our own testing or in the Beta.

How well would people actually use the diary when they received next to no guidance? And linking together all the mockup screens, in all their complexity, was a massive time drain for design.

The choice to focus on a high-fidelity build became crucial in the second round of user testing. With little time between the testing and final delivery, being able to turn around design reworks and deploy them immediately was vital to the success of the finished product.

Working Agile during COVID

This project had a small team and a short timeframe, so we opted for a lo-fi approach to our process. We trimmed as much admin overhead as we could: tracking tasks in a spreadsheet, working off a shared digital collaboration board, and holding regular screen-sharing calls to discuss functionality and intent. We scheduled regular catch-ups with the ABS, similar in structure to a standup, though we used the time to quick-fire questions and, where possible, get immediate answers and sign-offs. This kept the team aligned while progressing quickly.

The conventional approach is to lock the team in a room and see what comes out. This was our first time doing such a highly collaborative project entirely remotely, but we soon found advantages in the new way of working. Our contact was more about frequency than length: instead of muddling through multiple problems in a two-hour meeting, we kept each other on speed dial, checking in at frequent intervals to update and resolve smaller chunks. This enabled a comfortable pattern of repeated divergence and convergence; we were continually engaged with each other but had the focused space to progress our own work.

Using a collaborative digital workspace was crucial to this success, particularly in early concepting. Working together in the digital space felt similar to being in the same room, and there were many other benefits. We plan to keep using this platform, even when we return to our routines.

Laying the foundation for future success

We'd promised packaged designs and bare-bones tech functionality, but in just eight weeks we were able to deliver a polished prototype build, with two rounds of testing and many iterations behind it.

There were many factors that enabled this: a small and focused team, a clearly defined and suitable scope, work scaled to suit the budget, close collaboration with the client, and the right amount of ownership across the whole team. But underpinning it all was the ABS's willingness to take an HCD approach to solving a (relatively) small problem in order to create a model and the capability for solving more.

Ultimately, they were excited by the opportunities we presented. Even demonstrating the benefits of an improved admin workflow as a bonus got the gears whirring on how this could produce a scalable business impact.

If one business unit could improve both its data collection and its business efficiency in one stroke, how might this create change in the rest of the agency?

When it comes to digital transformation, it's easy to lean into thinking about the big picture and the portfolio of programs that need to roll out. But sometimes figuring out how to do one small thing well is more valuable: then you have the knowledge, pathway and capability to solve similar problems. With a fixed price and timeline, a flexible scope, and close collaboration with a good team, you can pull off some incredible results and get that first brick into place.