
Overturning mistaken assumptions about behavioural problems with data

Authored by Research Fellows Kun Zhao (one of our resident data experts) and Peter Slattery.

To help policymakers and program managers use the BehaviourWorks Method to design and deliver more effective behaviour change programs, we are publishing a book throughout 2021, releasing one free chapter per month. In Chapter 3, we discussed how to use data to better understand a behavioural problem.

In this post, we explore a more fundamental issue: why invest the time and money to collect, analyse and interpret data when the problem seems obvious and common sense would appear to suffice?

As we will see, intuitions and assumptions are not always reliable guides for designing behavioural interventions. Below, we share some lessons from projects where the data told us a very different story from what we thought was happening.

Lessons from the road. Who are Australia’s unsafe drivers?

Mental shortcuts, such as the availability and representativeness heuristics, lead us to estimate the likelihood of an event based on how easily examples come to mind or how closely the event matches a pre-existing mental prototype.

For example, imagine you were designing a behavioural intervention to reduce the number of unsafe drivers on the road.

One of the key steps in The Method is to determine the target audience for your intervention. Picture everyone who has been reported to a roads authority for being medically unfit to drive. Which age group do you think should be targeted in your intervention? (a) Those under 75, or (b) those over 75?

Most people think the age group to target should be those aged over 75, assuming that older people are more likely to be medically unfit to drive. The correct answer, however, is (a). When we examined the data in a previous project, we found that two-thirds of people who are reported are, in fact, under 75, simply because there are more younger drivers on the roads.
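The arithmetic behind this base-rate effect can be sketched with invented numbers (these are illustrative assumptions, not the project's actual figures): even if older drivers are reported at a much higher per-driver rate, the far larger pool of younger drivers can still generate most of the reports.

```python
# Hypothetical illustration of a base-rate effect.
# All numbers below are invented for illustration only.

drivers_under_75 = 4_500_000   # assumed licensed drivers under 75
drivers_75_plus = 500_000      # assumed licensed drivers aged 75+

report_rate_under_75 = 0.001   # assumed per-driver reporting rate
report_rate_75_plus = 0.005    # assumed rate, five times higher for 75+

reports_under_75 = drivers_under_75 * report_rate_under_75   # 4,500 reports
reports_75_plus = drivers_75_plus * report_rate_75_plus      # 2,500 reports

# Despite the higher rate among older drivers, most reports
# come from the much larger under-75 group.
share_under_75 = reports_under_75 / (reports_under_75 + reports_75_plus)
print(f"Share of reports from drivers under 75: {share_under_75:.0%}")
```

With these assumed figures, roughly 64% of reports (close to two-thirds) come from drivers under 75, even though each older driver is five times more likely to be reported.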

Checking this data, therefore, allowed us to select an appropriate target audience to deliver a behavioural intervention with the greatest impact for reducing the number of unsafe drivers on the road.

Inclusion confusion. Who is being left out?

Confirmation bias describes our tendency to seek out and prefer information that confirms our pre-existing beliefs. In a study of prejudice and social inclusion, we examined the experiences of a number of marginalised groups. 


One of the groups we initially chose to focus on was older Australians, based on the belief that they were at risk of being left behind. It was not until we broadened our age groups of interest that we also noticed high levels of prejudice and discrimination being reported at the other end of the lifespan, by those aged 24 years and younger.

Looking at the data meant that we could overcome our confirmation bias to produce a far richer story of the multiple points where the problem lies.

Show me the data!

As our examples show, numbers have a way of taking the assumptions out of the problem. While relying purely on data is not foolproof – and data can be misused or misinterpreted, as the recent reproducibility crisis has highlighted – data analysis, when applied appropriately, provides a much more rigorous and objective means of understanding your problem than relying on intuition. This realisation has spurred a significant move in government and industry towards evidence-based policy and decision-making.

The 2012 US presidential election, for instance, became known as the "big data" election. Developments like this have fuelled a data-driven revolution and highlighted the role of data science and analytics. Fortunately, data collection and analysis don't always come at a high cost.

It is also worth noting that many organisations have more data than they realise, whether collected through surveys and polls or sitting within digital and administrative datasets. This is a potent resource.


In this post, we have shared some cases where the data overturned assumptions and beat intuition.

Along with systems mapping and evidence reviews, these are complementary approaches that can direct us with greater precision to the heart of the behavioural problem during the Exploration phase of The Method.

In Chapter 3 of The Method Book, we delve deeper into some tools you can use to analyse data and help you understand your behavioural problem. Our upcoming toolbox training course will teach you to apply some basic data analysis techniques and make sense of these results to support decision-making.