Behaviour change 101 series: How to do a Practice Review

On the value of listening to those with direct experience

In this ‘Behaviour change 101’ series, we draw on our experience and educational resources to discuss approaches for planning and executing behaviour change research and projects.

Authored by Peter Slattery (with support from Jim Curtis and Mark Boulet).

In my last article, I discussed how BehaviourWorks Australia (BWA) uses Rapid Reviews to explore what the research says about getting audience Y to do behaviour X.

In this article, I discuss how we talk to experts in the problem domain to enhance our understanding of that problem using a method known as the Practice Review.

What is a Practice Review?

A Practice Review engages strategically-selected individuals and seeks to capture and collate their insights to help us overcome specific behavioural challenges.

The individuals selected usually have extensive experience in, or knowledge of, the problem and their insights can help us identify optimum behaviours and/or audiences to target.

Ideally, a range of stakeholders and experts is consulted, each of whom engages with the issue from a different perspective.

Why do a Practice Review?

Practice Reviews are particularly useful for situations where the stakeholders and experts have insights and experiences that have not been explicated in the literature, or where problems are new or under-researched. In such cases, practitioners working in that area might be one of the only sources of available knowledge.

Practice Reviews can also assist in ‘re-contextualising’ interventions to new areas; for instance between different countries, audiences or behavioural challenges.

Practice Reviews have several advantages over comparable social science research methods, like surveys and experiments, such as:

  • They offer opportunities for collecting rich data via reflexive questioning and probing.
  • They provide participants with an opportunity to express their honest opinions and be listened to (and, in some cases, to ‘vent’).
  • They help to develop a rapport between researchers and important participants. This can assist with later parts of the research process and dissemination.

What are the limitations of a Practice Review?

Practice Reviews have similar limitations to other types of qualitative, interview-based research:

  • You can’t easily generalise the findings to other contexts and populations.
  • The data can be harder and more time-consuming to synthesise and report.
  • The researcher’s views can bias data collection and results.
  • There can be issues of confidentiality – reporting on key insights can risk revealing the participant who provided those insights.

BehaviourWorks takes a variety of measures to strengthen our processes and avoid these limitations. We often pair Practice Reviews with other types of methods, such as Rapid Reviews, to overcome the limitations of both methods.

In this pairing, the Rapid Review provides a broad, generalisable overview of the area, while the Practice Review provides contextualised insights, nuance and exceptions that the literature often misses.

Additionally, we use a semi-structured interview process delivered by trained researchers, and follow ethics-approved processes to ensure that participant anonymity is maintained.

How we do a Practice Review

There are several ways that you can do a Practice Review. The most common approach, and the one which we generally use, is for the researcher (or the individual doing the data collection) to conduct in-depth, semi-structured, one-on-one interviews with strategically-selected practitioners. Other methods can also be used.

Observations of practice and practitioners can be used when:

  1. an objective third party assessment of the problem is desired
  2. practitioners’ personal biases and blind spots are critical to minimise
  3. the aim is to understand work processes.

Surveys can be used when it is hard to engage practitioners (e.g., due to geographic location or time differences), or when it is particularly important that participants feel anonymous.

In the remaining sections, we discuss how to do a Practice Review using in-depth, semi-structured, one-on-one interviews.

Plan who to interview

When planning recruitment, it is important to counteract the risk of response bias (i.e., only a narrow, unrepresentative group, such as unhappy employees, agreeing to be interviewed), as this can produce inaccurate results.

It is therefore important to draw on interviewees with different opinions, backgrounds and experiences. People and groups you may want to include are:

  • Representatives of the group whose behaviour you want to change.
  • Individuals involved in the implementation or evaluation of prior programs who can offer recommendations for future programs.
  • Policy experts and researchers who have a high-level overview of this class of problem and the different actors, opportunities and challenges involved.

Plan the interview format

Practice Review interviews are typically short; 30-60 minutes is usually enough for a participant to make a valuable contribution. Long interviews can also prevent key people from participating due to a lack of time. Because interviews are short, take a ‘less is more’ approach when deciding what to ask, and ensure all questions are directed by the research questions.

Key categories to ask about include:

  • The practitioner’s background and experience (e.g., their role, connection to the problem and experience with the problem).
  • The problem context (e.g., why it exists, how different groups and individuals perceive it and what is known and not known about it).
  • The actors involved (e.g., who contributes to the problem, who has the most influence, what are the key relationships and who is best to target).
  • The key behaviours (e.g., which behaviours cause or solve the problem, how easily can these be changed and which should be prioritised).
  • Relevant prior work done to address the problem (e.g., program design, assumptions, evaluation criteria, outcomes and lessons to be learnt).

These questions should be used to achieve a depth of understanding across a strategic selection of topics, as opposed to a shallow understanding across a broad range of topics.

Be clear on your priority questions and ensure that the interview is built around them.
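One lightweight way to keep the interview built around your priority questions is to hold the guide in a simple structure and order it by priority, so that if time runs short the most important topics have already been covered. The sketch below is illustrative only: the categories come from the list above, but the specific question wording and priority values are hypothetical, not a prescribed BWA schedule.

```python
# Illustrative semi-structured interview guide. Categories follow the article;
# the question wording and priority numbers are hypothetical examples.
INTERVIEW_GUIDE = [
    {"category": "Background", "priority": 1,
     "questions": ["What is your role and connection to the problem?"]},
    {"category": "Key behaviours", "priority": 1,
     "questions": ["Which behaviours cause, or could solve, the problem?"]},
    {"category": "Problem context", "priority": 2,
     "questions": ["Why does the problem exist?",
                   "How do different groups perceive it?"]},
    {"category": "Prior work", "priority": 2,
     "questions": ["What has been tried before, and what was learnt?"]},
    {"category": "Actors", "priority": 3,
     "questions": ["Who contributes to the problem, and who has influence?"]},
]

def ordered_guide(guide):
    """Return the guide sorted so priority-1 categories come first."""
    return sorted(guide, key=lambda section: section["priority"])

for section in ordered_guide(INTERVIEW_GUIDE):
    print(f'[{section["priority"]}] {section["category"]}')
```

Sorting is stable, so categories with equal priority keep the order in which they were written down, which makes the running order easy to reason about.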

Plan the number of interviews

Because the focus of Practice Reviews is ‘depth over breadth’, they are typically conducted until ‘saturation’ is reached. Saturation refers to the point when further interviews are unlikely to yield new high-level information (i.e. when you are not ‘hearing anything new’).

Based on our experience, 20 interviews are usually sufficient to reach saturation. However, this varies depending on the complexity of the problem, the diversity of practitioners and the purpose of the review.

In some cases, a smaller number of interviews may be sufficient, for instance, if the purpose of the interviews is just to localise the evidence gathered from a literature search.
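One way to make the saturation judgement explicit, rather than an impression, is to log the themes raised in each interview and track how many are new. A minimal sketch, assuming interviews have already been summarised into theme labels (the labels below are made-up placeholders):

```python
# Track how many previously-unseen themes each successive interview adds.
# Theme labels are hypothetical placeholders, not findings from a real review.
def new_themes_per_interview(interviews):
    """For each interview (a set of theme labels), count themes not yet seen."""
    seen = set()
    counts = []
    for themes in interviews:
        fresh = set(themes) - seen   # themes this interview adds
        counts.append(len(fresh))
        seen |= set(themes)
    return counts

interviews = [
    {"time pressure", "unclear guidance"},   # interview 1: two new themes
    {"time pressure", "peer norms"},         # interview 2: one new theme
    {"peer norms", "unclear guidance"},      # interview 3: nothing new
    {"unclear guidance"},                    # interview 4: nothing new
]
print(new_themes_per_interview(interviews))  # → [2, 1, 0, 0]
```

A run of zeros at the end of the list is one rough signal that you are ‘not hearing anything new’, though the judgement about whether saturation has genuinely been reached remains qualitative.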

Consider ethics

Because Practice Reviews involve engaging with people and recording their ideas, opinions and experiences, an ethics application may be required (mandatory for university research).

As part of the ethics application, assurances are typically sought around confidentiality, personal or cultural sensitivities, voluntary participation, how the collected data will be stored and used, and how participants will be recruited. Even if you don’t need to submit an ethics application, it is still worth considering the needs and sensitivities of review participants, as well as your own and your organisation’s.

If you are using project partners to assist with recruitment, it is important that the partner does not have an unequal relationship with potential interviewees who might feel obliged to participate (e.g., this unequal relationship might exist between a regulator and someone they license). This might lead to skewed or biased outcomes from the review. It is also important to ensure that participants who take part on condition of anonymity cannot be identified from your results.


Because it can be challenging to recruit the ‘right’ people for a Practice Review without existing networks and relationships, we recommend asking the project partner to assist with recruitment by identifying and contacting relevant individuals.

Running the interview

Practice Review interviews should meet the needs of the interviewees as this will result in better insights. The modality (e.g., video conferencing, telephone, or in-person) should be chosen to best fit the needs of the participants.

The person conducting the interviews should be a good listener, personable, impartial and empathetic. They should recognise when to probe or adapt and be able to avoid irrelevant topics. Being a good interviewer is a skill which takes time to develop. If it is your first time doing a practice interview, we recommend testing the questions and process with others.

We strongly recommend getting permission to record and transcribe interviews. This captures the interview in full and avoids unnecessary paraphrasing, recall and interpretation. It also makes analysing the data easier and more robust.

Analyse the results

Practice Reviews can be analysed with different degrees of depth. A simple analysis can be a brief summary of the key themes that emerged and recurred across multiple practitioners.

A more rigorous analysis might, for example, use multiple people to thematically analyse the data, and packages such as NVivo to visualise it.

Impartiality is especially important during the analysis, as there is a risk of ‘confirmation bias’, where the interviewer only identifies themes or topics that align with their existing beliefs.
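At the simple end of the spectrum, a ‘brief summary of key themes’ can be made more transparent by counting how many interviews each theme appeared in, so that recurrence across multiple practitioners, rather than a single vivid interview, drives what gets reported. The sketch below assumes each interview has already been hand-coded into theme labels (the labels are hypothetical); a full thematic analysis with multiple coders or NVivo goes well beyond this.

```python
from collections import Counter

# Each inner list holds one interview's hand-coded themes.
# Theme labels are hypothetical placeholders, not real findings.
coded_interviews = [
    ["cost barriers", "peer norms"],
    ["peer norms", "lack of feedback"],
    ["cost barriers", "peer norms"],
]

def recurring_themes(coded, min_interviews=2):
    """Count how many interviews mention each theme; keep those that recur."""
    # set() deduplicates within an interview, so each interview counts once.
    counts = Counter(theme for themes in coded for theme in set(themes))
    return {theme: n for theme, n in counts.items() if n >= min_interviews}

print(recurring_themes(coded_interviews))
```

Counting interviews (not raw mentions) is a deliberate choice here: a theme repeated many times by one talkative participant should not outrank a theme raised independently by several practitioners.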

Examples of completed Practice Reviews

In 2019, we collaborated with the Safer Together program to complete a Practice Review examining the drivers and barriers to bushfire preparedness and response.

Other recent Practice Reviews include:

In summary

Practice Reviews provide a rigorous understanding of the applied knowledge and experience of practitioners and target populations within a particular problem domain.

They work particularly well when used in conjunction with reviews of published and academic literature to offer multiple perspectives on a problem.

In our training, we talk about Practice Reviews and other methods of evidence synthesis and we can tailor our training products to meet specific organisational needs.

Discuss your needs with us. Email: Peter.slattery@monash.edu
