Deb Hartman and I have just had an article on evidence-informed practice and the integration of research, policy, teaching and practice in family services published in Developing Practice. The following is the final version we sent them. The citation for the published version (which requires payment or a subscription) is:
Stuart, G., & Hartman, D. (2019). Evidence-informed practice and the integration of research, policy, teaching and practice in family services. Developing Practice: The Child, Youth and Family Work Journal, (53), 34-53. https://search.informit.com.au/documentSummary;dn=972739968885256;res=IELHSS
There is an increasing emphasis on evidence-based programs and practice in the professions of family work, child protection, social work and related fields (Gray, Joy, Plath, & Webb, 2015; McArthur & Winkworth, 2013; Plath, 2017; Shlonsky & Ballan, 2011; Toumbourou et al., 2017; Walsh, Rolls Reutz, & Williams, 2015). In Australia, government funding for family and community services is increasingly dependent on the implementation of evidence-based or evidence-informed practice and evidence-based programs both nationally (Department of Families, 2012; Department of Social Services, 2014; Robinson & Esler, 2016) and at state level (Western Australian Department for Child Protection and Family Support, 2016; NSW Family and Community Services, 2016; State of Victoria, Department of Health and Human Services, 2016). Yet “evidence” is a contested term and there are a range of definitions and conceptualisations of evidence in family services.
This paper describes and analyses lessons from a project funded by the Department of Social Services Children and Families Expert Panel to support nine rural family services in NSW to build their capacity to implement evidence-based programs and practices. As academics who focus on family and community practice, we also wanted to explore strategies for creating dialogue and a deeper understanding between research, teaching, policy and practice. Drawing on discussions from a community of practice that formed part of the project, we discuss how the practitioners used research evidence, their experience of evidence-based programs, their use of their own experience and insights and those of other practitioners (practitioner wisdom) and, briefly, the importance they placed on the experience and insights of families, before exploring implications for evidence-informed practice, evidence-based programs, and the integration of research evidence, practitioner wisdom and family experience and insights.
What counts as evidence?
It is important to recognise the difference between evidence-based practice as a decision-making process, and evidence-based programs as specific interventions. Although there is no universally accepted definition of evidence-based practice in social work and family work (Plath, 2013; Walsh et al., 2015), it is generally described, as shown in Figure 1, as a decision-making process that incorporates the best research evidence and the best clinical experience, and is consistent with family and client values (Austin & Claassen, 2010; Centre for Community Child Health, 2011; Gray et al., 2015; Levant, 2005; Shlonsky & Ballan, 2011; Walsh et al., 2015).
Evidence-based programs, on the other hand, are a collection of practices or activities that have been standardised so that they can be replicated, have been rigorously evaluated and are usually endorsed by a respected independent department or research organisation (Bernal, Jimenez-Chafey, & Domenech Rodriguez, 2009; Cooney, Huser, Small, & O’Connor, 2007; EPISCenter, 2015). These are often delivered to families or specific participants in a sequential package over a set number of sessions.
Increasingly, the term evidence-informed practice is being used rather than evidence-based practice (see for example NSW Family and Community Services, 2016; Plath, 2017; Webber & Carr, 2015). While the terms are sometimes used interchangeably, evidence-informed practice is generally understood as adopting broader approaches to evidence and valuing practice experience as well as research evidence (Nevo & Slonim-Nevo, 2011; Plath, 2017; Webber & Carr, 2015). We return to the nature of evidence-informed practice later in the paper. Regardless of the terms used, there is an emerging consensus around the importance of evidence for the effective implementation of family interventions and for positive outcomes for families. In the project discussed below, we initially used the term evidence-based practice because it was the term used by the funding body, but as the project progressed we started using evidence-informed practice because (as discussed below) this was more consistent with the approach adopted by most of the services.
The traditional approach to evidence in evidence-based programs and practice involves a hierarchy of evidence (see Figure 2) which places greater value on systematic reviews, randomised controlled trials and quasi-experimental designs (Centre for Community Child Health, 2011; Corby, 2006; Gibbs & Gambrill, 2002; Hall, 2008).
Some authors, however, argue for a more inclusive approach to evidence. Webber & Carr (2015) suggest that evidence can be conceptualised in a “more inclusive and non-hierarchical” manner that, “equally values practice wisdom, tacit knowledge and all forms of knowing. It is thereby viewed as integrative, viewing practice and research less in opposition but more in support of one another” (p. 19). Rather than the hierarchy of evidence, Epstein (2011) proposes a Wheel of Evidence (see Figure 3) in which “all forms of research and information-gathering and interpretations would be critically assessed but equally valued” (p. 225).
This conceptualisation seems particularly useful in projects and processes that attempt to integrate research and practice. In addition, family-centred practice and strengths-based approaches suggest that researchers and practitioners need to work in ways that value the views and experiences of families. The traditional definition of evidence-based practice fails to place significant emphasis on the insights and experience of families (Nevo & Slonim-Nevo, 2011) by suggesting that evidence-based practice should merely be “consistent with client and family values” (Austin & Claassen, 2010; Walsh et al., 2015). Rather than recognising the value of the experience and insights of families as evidence about what works, or could work, in their specific contexts, the evidence comes from research and practitioners, with the only requirement being that practices be consistent with client and family values.
Despite a range of challenges in implementing evidence-based programs and evidence-based or evidence-informed practice (Gray, Plath, & Webb, 2009; Hall, 2008; Humphreys et al., 2011; Lau, 2006; Shlonsky & Ballan, 2011) and critiques of an increased emphasis on a narrowly defined evidence-based practice (Epstein, 2009, 2011; Nevo & Slonim-Nevo, 2011; Webb, 2001), many practitioners are supportive of an increased emphasis on evidence and are willing to look critically at their practice (Aarons & Palinkas, 2007; Gray et al., 2015; Plath, 2014). Evidence-based or evidence-informed practice is now considered essential for effective policy and practice, and is the basis of many professional development and academic programs for emerging and continuing practitioners. In this context, it is important to carefully examine the benefits of, and barriers to, the use of evidence to inform programs and practices in the range of family service contexts in Australia and to critically appraise the impact of the recent moves towards evidence-based or evidence-informed programs and practice.
The rural NSW Family Services project
In changing policy directions towards the use of evidence, the Australian Department of Social Services (DSS) recognised the need for capacity building for implementing and evaluating evidence-based programs and practices. In 2014 they funded the Australian Institute of Family Studies (AIFS) to establish the Children and Families Expert Panel to “help service providers deliver evidence-based programs and practices and to continue to build this evidence base through evaluation, with a focus on prevention and early intervention approaches” (Robinson & Esler, 2016, p. 67).
This paper discusses lessons from a project funded under this Expert Panel, in which the university academics who are the authors of this article collaborated with staff from nine rural family services with the aim (as identified by the funding body) of building the capacity of service providers and practitioners in these services to implement evidence-based programs and practices. Between them, the nine services were based in 32 towns with populations from under 2,000 to around 75,000, and were spread over much of the state. Some of the services were local independent services and others were connected to large state or national organisations.
The project activities included: a workshop with managers from each of the services at the start of the project; at least one site visit to each of the services including a workshop with staff about evidence-based practice; four webinars on relevant topics to implementation of evidence in a context (e.g., fidelity and adaptation, outcome measurement); the development of four brief discussion papers on topics suggested by the participants such as developing program logics and evidence in relation to playgroups; and monthly online community of practice (CoP) meetings (some with specific topics such as developing program logics and using research evidence).
Synergies in professional development and academic teaching approaches and content
The project design intentionally included some innovative approaches using technology to overcome the tyranny of distance between the rural locations and Newcastle. These approaches were drawn from our experience of delivering online undergraduate and master’s level courses and programs in Family Studies. We also made specific efforts to get together face-to-face as a whole group at the beginning of the project, and for Graeme Stuart from the university team to visit each location during the project to gain a sense of the geographical and social context of the services. We conducted webinars and regular community of practice sessions via web-based technology. These sessions allowed staff to take part in an online CoP and share ideas around specific topics of interest. The discussions helped them build confidence in using evidence to support the development and adaptation of their own evidence-informed programs and practices in rural settings.
Participants commented that they appreciated these opportunities for professional engagement with peers and with the researchers in the project. These e-communication experiences helped them overcome a sense of isolation in exploring evidence-informed practice and evidence-based programs, and they suggested that this could be a useful approach for promoting ongoing discussion about evidence-informed practice and evidence-based programs. In addition, the materials prepared for the project and learnings from the CoP were incorporated into courses in the academic programs we teach, as relevant and contemporary material from the field.
Incorporating research into capacity building
Although the project was primarily a capacity building initiative, consistent with more inclusive approaches to evidence and integrating research and practice, we took the opportunity to undertake qualitative research exploring the experience of practitioners in relation to evidence-based practice and programs. The research explored three questions:
- How do family workers in rural NSW attempt to incorporate evidence-based programs and practice into their work?
- What do they find are the challenges in adopting evidence-based programs and practice?
- What do they find helps in adopting evidence-based programs and practice?
The qualitative research involved collecting and analysing data from the CoP. The 15 staff who participated in the CoP consented to the CoP being audio recorded. The CoP participants were program managers and family practitioners working with families in a range of contexts including Aboriginal and young families, survivors of family violence and families seeking parenting support in the rural communities. One of the participants had a research role in a large service organisation which auspiced one of the rural services.
Data collected from the CoP were thematically analysed to identify major themes. The process involved re-listening to the recordings and reviewing the detailed notes taken during the CoP to become familiar with the data. Based on the recordings, transcriptions and notes, we developed codes (or labels) that identified important features of the data related to the research questions and, with the assistance of NVivo, grouped the codes into broad themes. Themes were reviewed by looking for supporting or contradictory data, and initial themes were discussed with participants (through the CoP and in follow-up discussions) to ensure the findings were consistent with the experience of the participants.
There were limitations to the research. Given the importance placed on the experience of families in the above discussion, it is a limitation that the research did not include the voices of families, but this was beyond the scope of the funding for the project. It would also have been valuable if participants had been more involved in later stages of data analysis to check that our interpretation was consistent with their experience. This was partly addressed by co-presenting key findings from the research at a conference with one of the research participants.
Pyrko, Dörfler and Eden (2016) suggest CoPs work best when they are allowed to grow organically, and are less likely to meet their full potential if they are set up “in order to obtain knowledge as an output” (p. 390). While the CoP did not grow out of existing relationships (as Pyrko, Dörfler and Eden recommend), its purpose was not purely research. It was a time for the participants to share their experiences, reflect on issues relating to evidence-informed practice and evidence-based programs and to learn from each other. The role of the CoP in the research was to explore the breadth of experience rather than to come to a common agreement about specific topics. The role of the academics in the CoP was to encourage and facilitate the sharing of ideas.
Key themes from the CoP discussion
In the following discussion we identify some of the main themes arising from the CoP in relation to how the practitioners used research evidence; some of the challenges, and what helped, in implementing evidence-based programs; the role of practitioner wisdom; and the importance they placed on the experience and insights of families.
The use of research evidence
The practitioners involved in the project generally felt confident about their work and knew they were making a difference, but many were hesitant about describing themselves as adopting evidence-informed practice. Very rarely was research evidence part of their day-to-day practice. As one manager suggested,
I don’t think we use evidence very effectively and very well. Where we do use [research] evidence, as best as we can, is in the development of program logic where we try to create that link between short, medium outcomes to the bigger picture outcomes that are beyond our capacity to measure… The other way we like to use research is when we do develop a new model but I don’t think we’re very sophisticated in the way we find that research. (CoP 4)
The pressure of direct service delivery meant that practitioners struggled to make the time to search for research evidence and some saw it as a task for somebody else.
We don’t actually get a great deal of time to be innovative in our position at the minute. We don’t have enough time at the moment to be looking into research evidence either to help us because we’re too busy doing the day to day role. (CoP 4)
There’s no one we could flick off: “Oh can you research that this for us?” We’re just working to deliver our programs. (CoP 4)
Some of the CoP participants felt they did not have the necessary experience or skills to find and evaluate research literature, but even if staff were confident looking for research, most organisations did not have access to journal databases. Without free access to journal databases, the cost of downloading articles could be prohibitive.
Most of the managers said they would appreciate support in finding and interpreting the latest research evidence.
Having somebody to find it [research evidence] for us, point it out, highlight it for us and send it through. That would be nice. (CoP 4)
Participants suggested that support could be external to the organisation and could involve having somebody to run ideas past for feedback, or to provide direction on where to look. One avenue identified for external support was the Expert Panel or partnerships with universities. They also valued easy-to-read summaries of research.
The lack of relevant research evidence was seen as a major barrier. For example, the practitioners spoke about the difficulty of finding Australian research; how “most of the evidence has been with mums” (CoP 6) so it was hard to find research that was specifically about fathers; there being “still limited evidence” (CoP 6) about working with Aboriginal families; and the difficulty of finding research evidence relating to specific approaches to working with Australian rural families.
There are some areas where there is so much research evidence around that you could spend years sifting through it and there are some areas that are really research poor and if you’re under a lot of pressure trying to put together a really good tender application, you then have to decide how much time you can invest to back the evidence you’ve gathered through practitioner wisdom and lived evidence to back that through really good solid evidence. (CoP 4)
While using research evidence may not have been part of their every-day practice, services made quite conscious use of it at times. One organisation had a researcher with a focus on evidence-based programs and practice, who had an important role in supporting staff within the organisation. One of the managers consciously incorporated research data (e.g., the Australian Early Development Index) in planning and program delivery.
For example if it tells us that the significant vulnerability for [area] is physical health and wellbeing in children, which is around their gross and fine motor skills, then I think in our programing should be looking at things like our supported playgroup have lots of activities that focus on gross and fine motor skills development. (CoP 5)
At the same time, this manager commented that the discussion generated through the Expert Panel work was the first time that they had “a lot of talk around evidence” with the whole project team.
For some of the services, the main, or only, time they actively looked for research evidence was when they were applying for funds and needed to demonstrate their proposal was evidence-based. Drawing on practitioner wisdom, they designed their proposal and then found research to justify their proposal.
It’s a mix of those four components [research evidence, practitioner wisdom, lived experience and policy] and it’s not just about research evidence. That is probably where we feel we lack the sophistication because we tend to lean more towards the lived experience and practitioner wisdom and when it comes to for example tendering we like to back, so to speak, the lived experience and practitioner wisdom with some clever sounding research evidence. (CoP 4)
While some participants spoke about wanting to make use of research evidence, the day-to-day pressures of service delivery, a perception that research wasn’t always relevant to their context, and difficulties accessing it were all barriers to using research evidence on a regular basis.
Experience of evidence-based programs

The practitioners we worked with broadly supported the use of evidence-based programs and recognised the role these programs could play in service delivery. In selecting programs, most services relied on registries of evidence-based programs, especially the Australian Institute of Family Studies Program Profiles. Only one service had a researcher who could investigate potential programs and consider their evidence base. Most of the practitioners did not check the actual evidence base of programs listed in registries, but trusted the registries to have done the work of reviewing it. The registries provided them with a list of potential programs, and they then relied on feedback from their own staff or other service providers, and feedback from families, to determine which programs they would deliver.
They were concerned, however, that many programs had been tested in other contexts, and their perception was that many programs needed to be adapted to address the context of the families they worked with. For example, in the site visits, staff spoke about evaluations undertaken in an urban setting with non-Aboriginal families not necessarily being relevant to their work with remote Aboriginal families.
The main hesitation related to the danger of taking programs off the shelf without considering the context. One of the CoP participants involved in introducing evidence-based programs in one of the organisations suggested that:
The whole idea of adaptation needs to be looked at. A lot of the programs are generalist programs and they haven’t been developed with our families’ vulnerabilities in mind. For example parenting programs – a lot of parents are now being mandated to attend parenting programs, and we don’t address that before the parenting program in many cases… When you come in angry, that limits your ability to participate in any of the programs. So even having a couple of weeks where people can debrief and vent about how emotional they feel about and then have a conversation about how a parenting program can help you and how can you see this as a positive and learn something from the experience. (CoP 6)
Some staff reported being told in training for evidence-based programs that they should essentially not change anything in the program, and they found it quite liberating to discover that adaptation can be appropriate and, in fact, can be an important aspect of evidence-based programs (Bernal et al., 2009; Lau, 2006; O’Connor, Small, & Cooney, 2007; Robinson, Tyler, Jones, Silburn, & Zubrick, 2012). There were questions and debate about what constituted appropriate adaptation, but they were generally relieved to realise that, with care, their professional experience and judgement could play a role in implementing evidence-based programs (Kemp, 2016; O’Connor, Small, & Cooney, 2007; Robinson, Tyler, Jones, Silburn, & Zubrick, 2012). For example, some parenting programs they offered had been developed in the days before the wide use of social media by children, and the practitioners believed it was important to include discussion of how parents could respond to issues relating to social media.
A number of the services were also concerned about the lack of parenting programs for parents with children in out-of-home care. Even when programs had been shown to work with parents generally, they believed many of them were inappropriate for parents with children not in their care.
I did have to facilitate [name of program] to parents who didn’t have their kids in their care. They had to do [the program] to get their kids back. So with a program like [name of program] you can imagine how isolating that is asking a parent who you know doesn’t have custody of their kids, to go home and do homework with their child, and monitor their behaviour and stuff like that – it doesn’t work. But if they didn’t go to the course and turn up, they didn’t get the opportunity to get their kids back. (CoP 8)
Participants suggested that a major benefit of evidence-based programs was that they provided structure and guidance to new, inexperienced staff, which was particularly important in rural contexts where family services often struggled to employ people with relevant training, qualifications and experience. Service managers were able to send new workers to training for relevant evidence-based programs with the knowledge that the programs could provide scaffolding for practice while the staff developed skills and experience.
The practitioners—especially managers—were aware, however, of the gap between being trained in an evidence-based program and being able to successfully implement it.
We know that if we send people [staff] off to training, only 10% are going to come back and be able to use that training immediately without any assistance. (CoP 8)
Some CoP participants suggested that the effectiveness of the training, particularly for new staff, was sometimes undermined by its poor quality, and spoke about staff from their service receiving inadequate training so that, once they returned, they “had trouble actually implementing it” (CoP 6). In addition to the training itself, coaching, supervision and reflective practice were thus seen as important in supporting staff to ensure successful implementation of evidence-based programs. Managers were seen as having an important role in implementation, but there could be variation in how well they were able to provide that support. At times managers were “good at helping them [staff] do that [bring evidence-based programs back to the service]” and at helping staff “feel more comfortable and confident implementing it.” But, as one staff member bluntly stated, “Others aren’t” (CoP 6).
The CoP discussions helped the practitioners explore their practice, and a similar approach was seen as having the potential to help bring staff together to explore implementation.
We need people coming together to have discussions about what’s working, what isn’t working, problem solving. You need like a team, an implementation team, that’s set up to monitor it. And so you have an understanding, you have your logic model, and you understand what the core components of it are and you’re making sure you’re not dropping any of those, but you’re adapting to the needs of the families. And that is a lot of work and a lot of discussion around how you do that well. (CoP 6)
Where existing staff were already trained in a program, newly trained staff could receive support from their colleagues. Where this was not available, some organisations found it helped to send multiple staff to training for specific programs. As one of the managers explained it:
[Names of evidence-based programs] have been delivered for quite a while now, so even as new people get trained in it we’ve got the old heads to be able to bounce off and get some direction and guidance. But then with the newer programs we tend to go away in either at least twos or more so we’ve got somebody to bounce off. Cos everyone’s got different learning styles so one person will pick up one aspect and then the other interpret it a different and then you can sort of feed off each other. (CoP 6)
The significant cost of training staff in evidence-based programs was sometimes a barrier to the services’ ability to offer a range of evidence-based programs, especially when travel, accommodation and meal expenses were included. Services in rural NSW often had to send staff to Sydney for training, which greatly increased the cost. It cost one organisation around $25,000 to train eight staff in one of the programs they offered, and another spent $1,500 in travel and accommodation to send two staff to “free” training. Because accreditation to implement programs is generally given to individuals rather than to an organisation, managers were aware there was always a risk of staff turnover, and that if this happened shortly after a staff member received expensive training, the investment was lost. To combat this, one of the organisations required staff who left within 12 months to repay the cost of the training.
In addition to the cost of training, the fees for running some of the programs could be considerable, particularly when staff with specific qualifications were required. One organisation wanted to offer a particular evidence-based program more frequently but was unable to do so because of the $4,500 fee each time the program was run.
Practitioner wisdom

As discussed above, practitioners relied heavily on their own experience and insights, and those of other practitioners, in planning and undertaking their work. Where there were skilled staff, practitioner experience had a more solid foundation than when there were inexperienced staff with little training. In developing new programs, rather than drawing on published resources, most services relied on practitioner wisdom: they reflected on what had worked in the past, talked with other staff and designed a project based on this experience.
For example years ago we were funded to deliver a generic family worker service in rural NSW. It was an early intervention and prevention with core components very similar to CAPS [Children and Parenting Support] – supported playgroups, parent education program, home visiting – and through our process of reflective practice, through conversations, supervision, peer meetings and quarterly reviews the one thing that kept popping up was that in our community we saw an awful lot of very young women with children who weren’t accessing our services. So as part of this whole reflective process we then problem solved what the barriers were and in consultation with the local staff member we then came up with a whole new service model targeting especially those teenage mothers, which then, interestingly led to us putting forward a proposal [to another funding round] and we were successful with that. (CoP 4)
Drawing on practitioner wisdom was described as being more reflective than a purely gut reaction. Most of the services had systems in place to promote critical reflection by practitioners. Particularly through supervision and team meetings, practitioners were encouraged to reflect on their practice and to think about improvements.
The team member will evaluate what has worked, what doesn’t work and there’s a little bit of problem solving where people have to think about why something has worked or why something hasn’t worked. They have to submit a good news story and then we sort of link the pieces and based on that information and we try to make improvements to our service. (CoP 4)
One of the challenges identified by staff from the rural family services was that new staff sometimes arrived without a great deal of training or experience, and so less experienced staff needed support and guidance as they developed the skills and knowledge required to form a solid basis for practitioner wisdom.
Experience and insights from families
Practitioners generally felt they were family-led in their work and were thus able to incorporate the experience and insights of families. All the services believed it was important to obtain data and feedback from families through more formal processes, and CoP participants identified a range of strategies for doing so, including:
- Using measurement tools that came with evidence-based programs
- Using standardised measurement tools, or published measurement tools that had not been standardised
- Using self-developed surveys and feedback sheets
- Following-up with families (in person or by phone) after they had participated in programs
- Using data about families to identify emerging or changing issues
While some of the services did use published tools (e.g., the Parent Empowerment and Efficacy Measure developed by Freiberg, Homel, & Branch, 2014, or ones they were required to use as a condition of offering specific evidence-based programs), a major focus of data gathering was on meeting funding requirements (e.g., the number of families supported and satisfaction data). With the increasing emphasis on outcome measurement (Izmir, 2004; NSW Family and Community Services, 2016; State of Victoria‚ Department of Health and Human Services, 2016) most of the services wanted more assistance in developing processes that could incorporate measurement into their practice and in finding relevant measurement tools, especially brief, strengths-based ones.
While recognising they needed to obtain quantifiable data, they also valued more informal processes that helped to inform their service delivery. In particular, they wanted information that could help improve their practice.
We do a follow up after group just to touch base on how they thought it went, how they’re going implementing it, if they need any further advice or guidance around it and then if needing it, doing some one on one work more specific to their child. (CoP 6)
At times the service providers were disturbed by what they discovered from their follow-up with families and saw the need to reassess the way in which they were implementing evidence-based programs.
I’ve followed up with families and had a look to see how they’re implementing the program [a commonly used parenting program] and how they’re utilising it, and very few families are. So I think we really need to be looking at what this program is designed to do and how we enable the family to actually be able to change their practice in the house. (CoP 6)
Even though this service was using an evidence-based program, conversations with families suggested they were not using the information provided in the program. The service was thus thinking about how they could make changes to increase the effectiveness of their work. As discussed below, it highlights the importance of combining research evidence, practitioner wisdom and the experience and insights of families.
Evidence-informed practice: Integrating research evidence, practitioner wisdom and family experience and insights
In exploring approaches to evidence-based practice, evidence-informed practice and evidence-based programs with the rural practitioners, an issue raised in the literature became evident. As identified by Shonkoff (2000), science (or research as it is referred to by others such as Arney, Bromfield, Lewig, & Holzer, 2009; McArthur & Winkworth, 2013), policy and practice, each have distinct cultures and there can be a range of challenges in working across these different cultures (e.g., different priorities, ways of thinking and timeframes). One of the aims of this project was to create a dialogue and a deeper understanding between these sectors of the field and thereby break down the barriers between them, to the benefit of the families attending the services or those in need of them.
While recognising the “imperative of combining the best of these three perspectives” (p. 187), Shonkoff (2000) emphasises the role of science in defining knowledge. For example he speaks of the “transmission of knowledge from the academy to the worlds of social policy and human service delivery” (p. 183) and suggests that “established knowledge is defined by the scientific community” (p. 183, his emphasis). Like Shonkoff, traditional approaches to evidence-based practice privilege research evidence and are often based on a hierarchy of evidence (e.g., Gibbs & Gambrill, 2002), and focus on developing the skills of practitioners so that they can incorporate this evidence into their practice. By adopting a broader approach to evidence, we argue that evidence-informed practice is more relevant to the experience of rural family workers and, with support, can assist them in using evidence from a range of sources in “a creative and discriminating way throughout the intervention process” (Nevo & Slonim-Nevo, 2011, p. 1178).
Humphreys et al. (2011) propose a knowledge diamond (see Figure 4) that suggests a less hierarchical approach to knowledge by giving equal prominence to research evidence, practitioner wisdom, lived experience and policy. In the workshops offered as part of this project, this broad understanding of knowledge was appreciated by the practitioners in the rural services. Like the broader family sector, the services we worked with described themselves as being strengths-based and, to a lesser extent, family centred, both of which recognise and value the skills, expertise and insights of the families they work with (Dunst, Trivette, & Hamby, 2007; Scerra, 2012). While more research and discussion are needed, we believe that approaches to evidence-informed practice that incorporate evidence from research, practice and lived experience are more likely to be supported by rural family practitioners than approaches to evidence-based practice that have a narrower understanding of what constitutes evidence.
We thus propose that, in the context of family services, evidence-informed practice should value and incorporate research evidence, practitioner wisdom and family experience and insights (see Figure 5). In the workshops and CoP this understanding of evidence-informed practice resonated with the practitioners and could provide a useful framework for the work that services in rural areas undertake with diverse families.
There is real potential for family services to be a crucial part of a coordinated strategy to prevent a range of priority health and social problems, but this requires them to continue developing their expertise in evidence-informed practice and the delivery and evaluation of evidence-based programs (Toumbourou et al., 2017). This project was initiated by DSS in order to increase the capacity of rural family services to do so, because the effectiveness of evidence-informed practice depends largely not only on the skills of practitioners but also on the organisational support of the services they work in (Plath, 2013, 2017). For example, skills and experience play an important role in practitioner wisdom (Epstein, 2009; Nevo & Slonim-Nevo, 2011) and so new practitioners with limited training need support in drawing on practitioner experience, although it is important not to undervalue the lived and other experience of these practitioners.
In order for family practitioners to be able to draw on research evidence, practitioner wisdom, and family experience and insights, there also needs to be changes to research, policy and teaching. Building on Epstein (2009) we suggest there is a need to:
- Recognise and value the experience and insights of families, practitioner wisdom, and research evidence, and to incorporate them into decision making by researchers, policy-makers and practitioners
- Support theoretical, applied and practice-based research
- Create and promote collaborative relationships between researchers, policy makers and practitioners
- Empower practitioners as co-creators of knowledge that informs practice and research
- Recognise and value a wide range of research methodologies and strategies.
By funding projects such as the one described above, DSS has recognised the need for capacity building within family services. The learnings from this project on implementing evidence-based programs and evidence-informed practice indicate that practitioners generally support their use and agree that greater support and capacity building in the sector is required. While they generally felt able to draw on their knowledge of, and relationships with, families to incorporate their strengths and experience into practice, and to draw on their own practitioner wisdom, they felt less able to obtain, interpret and incorporate research evidence. Participants in this capacity building project felt that regular or ongoing support networks, such as communities of practice; easy-to-access resources (e.g., research summaries, measurement tools); and mentoring could support them in incorporating current research into their work, undertaking practice-based research and measuring the impact of their programs.
A potentially contentious issue is finding a balance between fidelity (staying true to the original design of an evidence-based program) and adaptation because there is debate about the extent to which programs can be adapted without impacting their effectiveness (see for example Horne, 2016; Kemp, 2016; O’Connor, Small, & Cooney, 2007; Walsh et al., 2015). The practitioners in this project believed that programs sometimes needed to be adapted to meet the contexts they were working in, and a commitment to integrating research evidence and practitioner wisdom would involve listening to these concerns and working with them. There are approaches to fidelity and adaptation that do allow some flexibility through strategies such as identifying core elements of a program where fidelity is important and, for other elements, facilitating adaptations to meet local contexts (Castro, Barrera & Martinez, 2004; Kemp, 2016; Moore, Bumbarger & Cooper, 2013) which could be explored further.
While increasing the capacity of practitioners in adopting evidence-informed practice and evidence-based programs is important, changes in research, teaching and policy could also assist. While more discussion is needed, the following are some starting suggestions. In research there could be a greater emphasis on partnering with practitioners in planning, designing and undertaking practice-based research; creating accessible, plain-language dissemination strategies that clearly articulate the implications for practice; developing strengths-based, user-friendly measurement tools that are freely available and supporting practitioners to incorporate them into practice; and providing support to adapt evidence-based programs for local contexts. In tertiary level teaching there could be a greater emphasis on providing alternatives to traditional degree programs so that professional training is flexible, can be delivered in multiple modes and has ‘stackable’ modules that could lead to multiple pathways to tertiary awards; providing training in regional areas and visiting services; and providing specific training in measuring outcomes, adapting programs without reducing their effectiveness, and evidence-informed practice. In policy there could be a greater emphasis on supporting partnerships between family services and researchers (e.g., funding initiatives like the Expert Panel and free mentoring or consultation opportunities); providing information about programs that have been demonstrated to work in a range of settings (including identifying elements that should not be modified and elements that can); and supporting the creation and dissemination of publications discussing the implications of research for practice.
Collaborative processes involving academic researchers and teachers, and practitioners at various levels of service organisations, such as the one on which this paper is based, can help build the capacity of multiple stakeholders. Our work with the services demonstrated that collaborations require capacity building and a willingness to change on the part of all partners in the collaboration (Stuart, Hartman, & Crawlee, 2016). In particular there is value in researchers and academics working in partnership with service providers, and in policy makers helping to create the opportunities for such collaborations to occur.
In closing we want to acknowledge the work done by the Australian Institute of Family Studies (AIFS), and the Government’s ongoing funding of it, in promoting many strategies that help translate research into practice; promote practice-based research; support family services in accessing relevant research and information about evidence-based programs; and answer questions practitioners may have. We often draw on their publications and refer students and practitioners to their website and services and believe they provide an extremely valuable service.
The reference list from the article is below.
Aarons, G., & Palinkas, L. (2007). Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34(4), 411-419. doi: 10.1007/s10488-007-0121-3
Arney, F. M., Bromfield, L. M., Lewig, K., & Holzer, P. (2009). Integrating strategies for delivering evidence-informed practice. Evidence & Policy: A Journal of Research, Debate & Practice, 5(2), 179-191.
Austin, M. J., & Claassen, J. (2010). Implementing evidence-based practice in human service organizations: Preliminary lessons from the frontlines. In M. J. Austin (Ed.), Evidence for child welfare practice. London: Routledge.
Bernal, G., Jimenez-Chafey, M. I., & Domenech Rodriguez, M. M. (2009). Cultural adaptation of treatments: A resource for considering culture in evidence-based practice. Professional Psychology – Research & Practice, 40(4), 361-368. doi: 10.1037/a0016401
Castro, F. G., Barrera, M., & Martinez, C. R. (2004). The cultural adaptation of prevention interventions: Resolving tensions between fidelity and fit. Prevention Science, 5(1), 41-45. doi: 10.1023/B:PREV.0000013980.12412.cd
Centre for Community Child Health. (2011). Evidence-based practice and practice-based evidence: What does it all mean? Policy Brief: Translating early childhood research evidence to inform policy and practice (21). Retrieved from http://ww2.rch.org.au/emplibrary/ecconnections/Policy_Brief_21_-_Evidence_based_practice_final_web.pdf
Cooney, S. M., Huser, M., Small, S., & O’Connor, C. (2007). Evidence-based programs: An overview. What Works, Wisconsin – Research to Practice Series (6). Retrieved from https://fyi.uwex.edu/whatworkswisconsin/files/2014/04/whatworks_06.pdf
Corby, B. (2006). Applying research in social work practice. Maidenhead, England: Open University Press.
Department of Families, Housing, Community Services and Indigenous Affairs. (2012). Family support program: Future directions discussion paper. Canberra: Department of Families, Housing, Community Services and Indigenous Affairs. Retrieved from https://www.dss.gov.au/sites/default/files/documents/10_2012/fsp_discussion_paper.pdf
Department of Social Services. (2014). Families and communities programme: Families and children guidelines overview. Retrieved from https://www.dss.gov.au/grants/grant-programmes/families-and-children
Dunst, C. J., Trivette, C. M., & Hamby, D. W. (2007). Meta-analysis of family-centered helpgiving practices research. Mental Retardation & Developmental Disabilities Research Reviews, 13(4), 370-378. doi: 10.1002/mrdd.20176
EPISCenter. (2015). Defining evidence based programs. Retrieved 5 April, 2018, from http://www.episcenter.psu.edu/ebp/definition
Epstein, I. (2009). Promoting harmony where there is commonly conflict: Evidence-informed practice as an integrative strategy. Social Work in Health Care, 48(3), 216-231. doi: 10.1080/00981380802589845
Epstein, I. (2011). Reconciling evidence-based practice, evidence-informed practice, and practice-based research: The role of clinical data-mining. Social Work, 56(3), 284-288.
Freiberg, K., Homel, R., & Branch, S. (2014). The Parent Empowerment And Efficacy Measure (PEEM): A tool for strengthening the accountability and effectiveness of family support services. Australian Social Work, 67(3), 405-418. doi: 10.1080/0312407X.2014.902980
Gibbs, L., & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objections. Research on Social Work Practice, 12(3), 452-476. doi: 10.1177/1049731502012003007
Gray, M., Joy, E., Plath, D., & Webb, S. A. (2015). What supports and impedes evidence-based practice implementation? A survey of Australian social workers. British Journal of Social Work, 45, 667-684. doi: 10.1093/bjsw/bct123
Gray, M., Plath, D., & Webb, S. A. (2009). Evidence-based social work: A critical stance. Hoboken: Taylor & Francis.
Hall, J. C. (2008). A practitioner’s application and deconstruction of evidence-based practice. Families in Society: The Journal of Contemporary Social Services, 89(3), 385-393. doi: 10.1606/1044-3894.3764
Horne, C. S. (2016). Assessing and strengthening evidence-based program registries’ usefulness for social service program replication and adaptation. Evaluation Review. doi: 10.1177/0193841X15625014
Humphreys, C., Marcus, G., Sandon, A., Rae, K., Wise, S., Webster, M., & Waters, S. (2011). Informing policy with evidence: Successes, failures and surprises. In K. Dill & W. Shera (Eds.), Implementing evidence-informed practice: International perspectives. Toronto: Canadian Scholars’ Press.
Izmir, G. (2004). Measuring outcomes in child and family services. Developing Practice: The Child, Youth and Family Work Journal, 11, 13-16.
Kemp, L. (2016). Adaptation and fidelity: A recipe analogy for achieving both in population scale implementation. Prevention Science, 17(4), 429-438. doi:10.1007/s11121-016-0642-7
Lau, A. S. (2006). Making the case for selective and directed cultural adaptations of evidence-based treatments: Examples from parent training. Clinical Psychology: Science and Practice, 13, 295-310. doi: 10.1111/j.1468-2850.2006.00042.x
Levant, R. F. (2005). Report of the 2005 Presidential Task Force on Evidence-Based Practice. American Psychological Association. Retrieved from https://www.apa.org/practice/resources/evidence/evidence-based-report.pdf
McArthur, M., & Winkworth, G. (2013). Powerful evidence: Changing policy and practice through research. Developing Practice: The Child, Youth and Family Work Journal (35), 41-53.
Moore, J. E., Bumbarger, B. K., & Cooper, B. R. (2013). Examining adaptations of evidence-based programs in natural contexts. The Journal of Primary Prevention, 34(3), 147-161. doi: 10.1007/s10935-013-0303-6
Nevo, I., & Slonim-Nevo, V. (2011). The myth of evidence-based practice: Towards evidence-informed practice. British Journal of Social Work, 41(6), 1176-1197. doi: 10.1093/bjsw/bcq149
NSW Family and Community Services. (2016). Targeted Earlier Intervention Programs: Reform directions – local and client centred. Retrieved from https://www.fams.asn.au/sb_cache/associationnews/id/42/f/TEI%20Program%20Reform%20Directions%20-%20local%20and%20client%20centred%20%28002%29.pdf
O’Connor, C., Small, S. A., & Cooney, S. M. (2007). Program fidelity and adaptation: Meeting local needs without compromising program effectiveness. What Works, Wisconsin – Research to Practice Series (4). Retrieved from http://fyi.uwex.edu/whatworkswisconsin/files/2014/04/whatworks_04.pdf
Plath, D. (2013). Organizational processes supporting evidence-based practice. Administration in Social Work, 37(2), 171-188. doi: 10.1080/03643107.2012.672946
Plath, D. (2014). Implementing evidence-based practice: An organisational perspective. British Journal of Social Work, 44, 905-923. doi: 10.1093/bjsw/bcs169
Plath, D. (2017). Engaging human services with evidence-informed practice. Washington, DC: NASW Press.
Pyrko, I., Dörfler, V., & Eden, C. (2017). Thinking together: What makes Communities of Practice work? Human Relations, 70(4), 389–409. doi: 10.1177/0018726716661040
Robinson, E., & Esler, M. (2016). The expert panel project: Towards better outcomes for families. Family Matters (97), 67-72.
Robinson, G., Tyler, W., Jones, Y., Silburn, S., & Zubrick, S. R. (2012). Context, diversity and engagement: Early intervention with Australian Aboriginal families in urban and remote contexts. Children & Society, 26(5), 343-355. doi: 10.1111/j.1099-0860.2010.00353.x
Scerra, N. (2012). Strengths-based practices: An overview of the evidence. Developing Practice: The Child, Youth and Family Work Journal (31), 43-52.
Shlonsky, A., & Ballan, M. (2011). Evidence-informed practice in child welfare: Definitions, challenges and strategies. Developing Practice: The Child, Youth and Family Work Journal (29), 25-42.
Shonkoff, J. P. (2000). Science, policy, and practice: Three cultures in search of a shared mission. Child Development, 71(1), 181-187. doi: 10.1111/1467-8624.00132
State of Victoria‚ Department of Health and Human Services. (2016). Roadmap for reform: Strong families, safe children: The first steps. Melbourne: Victorian Government. Retrieved from https://www.strongfamiliessafechildren.vic.gov.au/18389/documents/34590
Stuart, G., Hartman, D., & Crawlee, D. (2016). Planning and implementing evidence-based programs and practice in family services in rural and regional NSW. Paper presented at the Measuring success in the family and relationship sector, Canberra. Retrieved from https://sustainingcommunity.wordpress.com/2016/11/30/frsa-conference/
Toumbourou, J. W., Hartman, D., Field, K., Jeffery, R., Brady, J., Heaton, A., . . . Heerde, J. A. (2017). Strengthening prevention and early intervention services for families into the future. Deakin, ACT: FRSA and Deakin University. Retrieved from https://frsa.org.au/wp-content/uploads/2016/05/FRSA-Research-Report-Printable.pdf
Walsh, C., Rolls Reutz, J., & Williams, R. (2015). Selecting and implementing evidence-based practices: A guide for child and family serving systems (2nd ed.). San Diego, CA: California Evidence-Based Clearinghouse for Child Welfare. Retrieved from http://www.cebc4cw.org/files/ImplementationGuide-Apr2015-onlinelinked.pdf
Webb, S. (2001). Some considerations on the validity of evidence-based practice in social work. British Journal of Social Work, 31, 57-79.
Webber, M., & Carr, S. (2015). Applying research evidence in social work practice: Seeing beyond paradigms. In M. Webber (Ed.), Applying research evidence in social work practice. London: Palgrave.
Western Australian Department for Child Protection and Family Support. (2016). Building a Better Future: Out-of-Home Care Reform in Western Australia. Retrieved from http://mypeer.org.au/files/2013/05/My-Peer-Toolkit-V1-Constructing-a-Program-Logic-Model.pdf