In some ways we live in an exciting time, where we are learning more and more about what works—and what doesn’t—in helping to nurture strong families and communities. Research and research evidence play a crucial role in these discoveries and in ensuring that our work makes a difference.
When I started working with communities and families over 30 years ago, there were very few rigorously evaluated programs or approaches in family and community work. Youth and family workers often relied on their gut instincts. Now there are many programs and approaches that have been shown to make a difference and, as a field, we are thinking much more critically about how we know we make a difference.
There is no doubt that research and research evidence play an important role in innovation and new ways of working.
The greater emphasis on trauma-informed care has been built on research and evidence from a range of fields and has led to a transformation in how many services approach their work. Some of my colleagues in the Family Action Centre’s fathers and families research team are exploring the importance of rough and tumble play in the development of children and the implications for parenting, and evaluating the use of SMS as a tool to support new fathers and to promote parenting partnerships.
But here, my focus is more specifically on evidence-based (or evidence-informed) practice and evidence-based programs.
Evidence-based practice is generally defined as a decision-making process that integrates the best available research evidence with clinical expertise, and is consistent with family and client values. Evidence-based practice grew out of the medical field, but the family and community sectors face quite different types of issues. We face a wide range of complex problems in which there are no clearly defined “solutions” and what might help one family may not help another.
In this context, it is important that families play a more active role in deciding what is likely to work. I thus argue that we should actively draw on the experience and insights of families, rather than simply ensuring that our approach is “consistent” with their values (as in the traditional definition).
Although evidence-based practice is often presented as a linear five-step process undertaken by individual workers, Debbie Plath1 argues that it is better understood as a cyclical process. Her five phases are:
- Define and redefine practice questions
- Gather evidence from a range of sources
- Critically appraise the evidence for its relevance and reliability
- Make practice decisions regarding principles, interventions, programs and practices
- Evaluate the evidence-based practice process and client outcomes, again using a variety of sources of evidence.
It also involves organisational processes to engage staff and move programs through the various phases.
It’s important to be clear about what we mean by the term evidence. A narrow view places evidence in a hierarchy, with systematic reviews of multiple randomised controlled trials treated as the best evidence and qualitative evidence, near the bottom, given far less weight.
Defining evidence too narrowly, however, can lead to important, useful information being ignored2. For innovation we need a much broader understanding of evidence. Irwin Epstein3 proposes a wheel of research evidence in which “all forms of research and information-gathering” are equally valued (p. 225). What the wheel leaves out is practitioner wisdom and the insights and experience of families, which are also crucial.
When we think about evidence-based programs (programs that can be replicated after having been standardised and evaluated), there is a need to balance fidelity (staying true to the original program design) and adaptation (ensuring the program is appropriate for the context). A rigid approach to fidelity that allows no adaptation is a significant barrier to innovation.
Adapting evidence-based programs is a crucial part of innovation and often leads to better outcomes than a rigid application of a program4.
Just because a program has been shown to work in one context, there is no guarantee that it will work in another. We wouldn’t walk into a chemist or drug store and pick a medicine simply because it is evidence-based. We would need to make sure that it is appropriate for our particular circumstance.
We also need to make sure that programs can be successful in real life conditions. There is a difference between efficacy and effectiveness. Efficacy involves demonstrating that a program can work under controlled (often ideal) conditions, whereas effectiveness involves demonstrating that it works under the conditions typically encountered in the field.
There are a range of challenges in adopting evidence-based practice including:
- We work in a context where a great deal of policy is not evidence-based (take the law and order debate)
- What we measure becomes important (which is why strengths-based measurement is so important)
- Well-researched programs (or programs that have the money for research) get supported—not necessarily the most effective ones, which may lack the money for expensive evaluation studies
- Evidence-based programs and practice take time and money
- There is a need to support organisations—particularly small ones—to develop expertise in evidence-based practice
- There can be a tension between the requirements of research and practice.
Evidence-based practice has many advantages including:
- It helps us to be as effective as we can
- It helps us to be sure that families and communities are better off because of our work
- Using data and evidence from a range of sources can help us achieve better results
- Critical reflection leads to better practice and innovation.
To finish, I want to suggest a few questions for services to consider as they explore and adopt evidence-based practice:
- How can we demonstrate (and actually be) evidence-informed?
- How can we tell if our innovations are really making a difference?
- How can we draw on research, practitioner wisdom and lived experience in our day-to-day work?
- How can we promote critical reflection?
- Are we willing to take risks?
This post is based on a talk I gave at a Social Innovation Summit organised by Community Compass.
If you liked this post please follow my blog, and you might like to look at:
- What is evidence-based practice?
- What are evidence-based programs?
- What is evidence-informed practice?
- What are complex problems?
- Rethinking the roles of families and clients in evidence-based practice
- What are program logic models?
If you find any problems with the blog (e.g., broken links or typos), I’d love to hear about them. You can either add a comment below or contact me via the Contact page.
1. Plath, D. (2014). Implementing evidence-based practice: An organisational perspective. British Journal of Social Work, 44(4), 905-923. doi: 10.1093/bjsw/bcs169 Available from http://bjsw.oxfordjournals.org/content/44/4/905
2. Gray, M., Plath, D., & Webb, S. A. (2009). Evidence-based social work: A critical stance. Hoboken: Taylor & Francis.
3. Epstein, I. (2011). Reconciling evidence-based practice, evidence-informed practice, and practice-based research: The role of clinical data-mining. Social Work, 56(3), 284-288.
4. Levant, R. F. (2005). Report of the 2005 presidential task force on evidence-based practice: American Psychological Association. Available from https://www.apa.org/practice/resources/evidence/evidence-based-report.pdf