In Australia and elsewhere, government and other funders increasingly require family services to adopt evidence-based programs. For example, Communities for Children—a federally funded program in 52 disadvantaged communities across Australia with a focus on improving early childhood development and wellbeing of children from birth to 12 years—now requires that 50% of the funds for direct service delivery should be used to “purchase high quality evidence-based programs” (p. 11).
This post reflects on the experiences of nine children and parenting support programs in regional and rural NSW in adopting and implementing evidence-based practice. It’s based on work I did through the Family Action Centre (University of Newcastle), funded by the Department of Social Services through the Children and Families Expert Panel. (There’s more information, and related posts, here.)
Evidence-based programs are programs that have research evidence demonstrating they are effective. Generally there is an expectation that evaluations of programs have included:
- Rigorous, systematic, and objective procedures which obtain reliable and valid knowledge about the impact of the program
- The collection and analysis of adequate data to justify the research conclusions
- Appropriate measures that substantiate claims about improved outcomes as the result of participation in the intervention (Williams et al., 2015).
There are a range of registers (e.g., the Australian Institute of Family Studies Program Profiles and the California Evidence-Based Clearinghouse for Child Welfare Program Registry) that list evidence-based programs meeting their criteria. The criteria for inclusion on these registers can vary widely.
The practitioners from rural services we worked with generally supported the use of evidence-based programs and recognised the role they could play in service delivery. They were concerned, however, that many programs had been tested in other contexts and needed to be adapted to suit the families they worked with. For example, evaluations undertaken in an urban setting with non-Aboriginal families were not necessarily relevant to their work with remote Aboriginal families.
A number of the services were also concerned about the lack of parenting programs for parents who did not have the care of their children. Even though the programs had been shown to work with parents generally, they found many were inappropriate when the parents did not have care of their children.
I did have to facilitate Triple P to parents who didn’t have their kids in their care. They had to do Triple P to get their kids back. So with a program like Triple P you can imagine how isolating that is asking a parent who you know doesn’t have custody of their kids, to go home and do homework with their child, and monitor their behaviour and stuff like that – it doesn’t work. But if they didn’t go to the course and turn up, they didn’t get the opportunity to get their kids back…
As can be seen, the main hesitation related to the danger of taking programs “off the shelf” without considering the context. One of the staff involved in introducing evidence-based programs in one of the organisations suggested that:
The whole idea of adaptation needs to be looked at. A lot of the programs are generalist programs and they haven’t been developed with our families’ vulnerabilities in mind. For example parenting programs – a lot of parents are now being mandated to attend parenting programs, and we don’t address that before the parenting program in many cases… When you come in angry, that limits your ability to participate in any of the programs. So even having a couple of weeks where people can debrief and vent about how emotional they feel about [it], and then have a conversation about how a parenting program can help you, and how can you see this as a positive and learn something from the experience.
Some staff had been told in training for evidence-based programs that they should essentially not change anything in the program, and they found it quite liberating to discover that adaptation can be appropriate and, in fact, can be an important aspect of evidence-based programs (Lau, 2006; O’Connor et al., 2007; Bernal et al., 2009). (Click here for more on program fidelity and adaptation.) There were questions and debate about what constituted appropriate adaptation, but they were generally relieved to realise that their professional experience and judgement played an important role in implementing evidence-based programs.
Rural family services often struggled to employ people with relevant training, qualifications and experience, and evidence-based programs provided structure and guidance to new, inexperienced staff. Service managers were able to send new workers to training for relevant evidence-based programs with the knowledge that the programs could provide scaffolding as staff developed skills and experience.
The value of the training, particularly for new staff, depended on its quality. Some services spoke about their staff receiving inadequate training, so that once staff returned they “had trouble actually implementing it.” Services spoke about using coaches, supervision and reflective practice to promote successful implementation. At times managers were “good at helping them [staff] do that [bring evidence-based programs back to the service]” and at helping staff “feel more comfortable and confident implementing it.” But, as one staff member bluntly stated, “others aren’t.”
Where existing staff were already trained in a program, newly trained staff could receive support from their colleagues. Where this was not available, some organisations found it helped to send multiple staff to training for specific programs. As one of the managers explained it:
[Names of evidence-based programs] have been delivered for quite a while now, so even as new people get trained in it we’ve got the old heads to be able to bounce off and get some direction and guidance. But then with the newer programs we tend to go away in either at least twos or more so we’ve got somebody to bounce off. Cos everyone’s got different learning styles so one person will pick up one aspect and then the other interpret it a different [way] and then you can sort of feed off each other.
The cost of training staff in evidence-based programs was significant, especially when travel, accommodation and meal expenses were included. Services in rural and regional NSW often had to send staff to Sydney for training, which greatly increased the cost. It cost one organisation around $25,000 to train eight staff in one of the programs they offered, and another spent $1500 in travel and accommodation to send two staff to “free” training. The managers of services were quite aware of the risk that once staff were trained they could find a new job, particularly as accreditation to implement programs is generally given to individuals rather than organisations. One of the organisations thus required staff who left within 12 months to pay back the cost of the training.
The fees for running some of the programs were considerable as well, particularly when staff with specific qualifications were required. One organisation wanted to offer a particular evidence-based program more frequently but was unable to do so because it had to pay $4500 each time it ran the program.
In selecting programs, most services relied on registries of evidence-based programs, especially the Australian Institute of Family Studies Program Profiles. Only one service had a researcher who could investigate potential programs and consider their evidence base. Most services did not check the actual evidence base of programs listed in registries, but trusted the registries to have done the work of reviewing it. The registries provided them with a list of potential programs, and they then relied on feedback from their own staff, other service providers and families to determine which programs they would deliver.
If funding bodies are to encourage greater reliance on evidence-based programs, we need to ensure that services, particularly rural and regional ones, have the funding and resources they need to implement them, and that they receive appropriate support in introducing and adapting relevant programs. At the same time, we need to be careful of creating an industry that produces evidence-based programs as products for sale. There is a risk that, rather than going to service delivery, funding will go towards buying evidence-based programs and paying for narrow, program-based training.
The post is based on work we did with nine children and parenting support programs in regional and rural NSW to assist them in enhancing their capacity to implement evidence-based programs and practice. The work was funded by the Department of Social Services through the Children and Families Expert Panel.
If you liked this post please follow my blog, and you might like to look at:
- What is evidence-based practice?
- What are evidence-based programs?
- What is evidence-informed practice?
- Rethinking the roles of families and clients in evidence-based practice
- Strengths-based measurement
- Posts from the expert panel work on evidence-based practice
If you find any problems with the blog, (e.g., broken links or typos) I’d love to hear about them. You can either add a comment below or contact me via the Contact page.
- Department of Social Services. (2012). Communities for children facilitating partner operational guidelines. Australian Government. Retrieved from https://www.dss.gov.au/sites/default/files/documents/09_2014/cfc_fp_operational_guidelines_-_v_1_1_5_september_2014.pdf
- Williams, K. E., Berthelsen, D., Nicholson, J. M., & Viviani, M. (2015). Systematic literature review: Research on supported playgroups. Brisbane: Queensland University of Technology. Available from http://eprints.qut.edu.au/91439/
- Lau, A. S. (2006). Making the case for selective and directed cultural adaptations of evidence-based treatments: Examples from parent training. Clinical Psychology: Science and Practice, 13, 295-310. doi: 10.1111/j.1468-2850.2006.00042.x Available from http://onlinelibrary.wiley.com/doi/10.1111/j.1468-2850.2006.00042.x/abstract
- O’Connor, C., Small, S. A., & Cooney, S. M. (2007). Program fidelity and adaptation: Meeting local needs without compromising program effectiveness. What Works, Wisconsin – Research to Practice Series(4). Available from http://fyi.uwex.edu/whatworkswisconsin/files/2014/04/whatworks_04.pdf
- Bernal, G., Jimenez-Chafey, M. I., & Domenech Rodriguez, M. M. (2009). Cultural adaptation of treatments: A resource for considering culture in evidence-based practice. Professional Psychology: Research and Practice, 40(4), 361-368. doi: 10.1037/a0016401 Available from http://ezproxy.newcastle.edu.au/login?url=http://ovidsp.ovid.com?T=JS&CSC=Y&NEWS=N&PAGE=fulltext&D=ovftk&AN=00001326-200908000-00009