Program fidelity is an important concept in evidence-based programs. It is the “extent to which an enacted program is consistent with the intended program model” [1, p. 202]. In other words, it’s about ensuring we stay true to the original design of an evidence-based program when we implement it. In the short video above about program fidelity, I like the analogy of fidelity being similar to following a recipe when baking a cake.
When we bake a cake, if we want the cake in the recipe, we essentially need to follow the recipe. But at times we need to adapt it. For example, if we don’t have self-raising flour, we can use plain flour and baking powder, or we might need to change the cooking time because our oven is hotter or cooler than the one used to develop the recipe.
There are some things you can change without changing the end result, but if you change the recipe too much, or change key components, you end up making something completely different. It won’t necessarily be worse, but it won’t be the cake in the recipe.
When we first start learning to cook, it is probably important to follow the recipe closely. As we become more experienced, we can begin to make changes without ruining it.
It is the same when working with families and communities. It is quite appropriate for experienced, skilled practitioners to modify programs to meet the specific context they are working in. Less skilled or experienced practitioners need to be more careful. (When my daughter first started cooking she nearly used two tablespoons of cumin rather than two teaspoons of cinnamon in some muffins!) I sometimes cringe when I hear practitioners make off-the-cuff comments that undermine the key messages of a program. I’ve also heard people capture the essence of a program in a simple and clear example.
Programs cannot simply be taken from one context and placed into another without considering differences in the intended participants. I worry when I hear people say that you can’t change evidence-based programs: a parenting program developed and tested in Sydney with middle class Caucasians will need to be adapted to work with a remote Aboriginal community in Central Australia.
O’Connor, Small and Cooney [2] discuss acceptable and unacceptable (or risky) adaptations to evidence-based programs. Acceptable adaptations include:
- Changing language – translating and/or modifying vocabulary
- Replacing images to show children and families that look like the target audience
- Replacing cultural references
- Modifying some aspects of activities such as physical contact
- Adding relevant, evidence-based content to make the program more appealing to participants (p. 2).
Unacceptable or risky adaptations include:
- Reducing the number or length of sessions or how long participants are involved
- Lowering the level of participant engagement
- Eliminating key messages or skills learned
- Removing topics
- Changing the theoretical approach
- Using staff or volunteers who are not adequately trained or qualified
- Using fewer staff members than recommended (p. 20).
The analogy of following a recipe made me think of complex problems. Glouberman and Zimmerman [3] highlight the difference between simple problems, complicated problems and complex problems by comparing following a recipe (a simple problem), sending a rocket to the moon (a complicated problem) and raising a child (a complex problem). Running a parenting program or supporting families is not as simple as following a recipe: we can’t just follow a formula. Because raising children is a complex problem, what will work in one context will not work in another, and what worked one time may not work in the same way at another time.
Evidence-based practice is built on the intersection between research evidence, practitioner wisdom and experience, and the experience and insights of families. Evidence-based programs are based on research evidence, but in implementing them we also need to draw on practitioner wisdom to ensure the program is appropriate for the specific contexts of the families involved.
When we do adapt a program, it is important to monitor the changes and to ensure that our adaptations still mean the program makes a difference to the families or communities we work with. Collecting evidence that indicates the difference we are making to those involved needs to be part of our work.
[Thanks to Cathy Stirling for passing on this video and for our ongoing discussion about evidence-based practice.]
This post came from a project I’m working on supporting nine children and parenting support programs in regional and rural NSW to enhance their capacity to implement evidence-based programs and practice. The project was funded by the Department of Social Services through the Children and Families Expert Panel. You can see other posts relating to this work at https://sustainingcommunity.wordpress.com/resources-for-students/expert-panel-caps/.
If you liked this post please follow my blog (top right-hand corner of the blog), and you might like to look at:
- What are program logic models?
- Research evidence for family (and community) workers
- Finding literature on working with families
- A literature review on supported playgroups
- What works in connecting families, communities and schools?
- Some good articles/links – evidence-based programs and practice
[1] Century, J., Rudnick, M., & Freeman, C. (2010). A framework for measuring fidelity of implementation: A foundation for shared language and accumulation of knowledge. American Journal of Evaluation, 31(2), 199-218. doi: 10.1177/1098214010366173
[2] O’Connor, C., Small, S. A., & Cooney, S. M. (2007). Program fidelity and adaptation: Meeting local needs without compromising program effectiveness. What Works, Wisconsin – Research to Practice Series(4). Available from http://fyi.uwex.edu/whatworkswisconsin/files/2014/04/whatworks_04.pdf (Last accessed 13 April 2016)
[3] Glouberman, S., & Zimmerman, B. (2002). Complicated and complex systems: What would successful reform of Medicare look like? Ottawa: Commission on the Future of Health Care in Canada. Available from http://publications.gc.ca/collections/Collection/CP32-79-8-2002E.pdf (Last accessed 13 April 2016)