
Building on what we know: moving beyond effectiveness to consider how to implement, sustain and spread successful health interventions
Celia Laur1,2, Lauren Ball1,3, Heather Keller4,5 and Noah Ivers2,6

1 NNEdPro Global Centre for Nutrition and Health, St John’s Innovation Centre, Cambridge, UK
2 Women's College Research Institute, Women's College Hospital, Toronto, Ontario, Canada
3 Menzies Health Institute Queensland, Griffith University, Brisbane, Queensland, Australia
4 Faculty of Applied Health Sciences, University of Waterloo, Waterloo, Ontario, Canada
5 Schlegel-University of Waterloo Research Institute for Aging, Waterloo, Ontario, Canada
6 Department of Family and Community Medicine and Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, Ontario, Canada

Correspondence to Dr Celia Laur, NNEdPro Global Centre for Nutrition and Health, St John’s Innovation Centre, Cambridge, UK; c.laur{at}nnedpro.org.uk


Health research needs to demonstrate impact. It is no longer sufficient to claim only that a treatment or behavioural intervention is ‘efficacious’. There is little sense in building evidence of efficacy without systematically working out how to make an intervention work in the real world and how to keep it working over time and across contexts. This editorial provides a brief introduction to why considering implementation, sustainability and scalability can help achieve impact in health research, and it encourages submission of articles to a Special Collection on Implementing Effective Interventions in Healthcare. Details for submission are included in Box 1.

Box 1

Special Collection: Implementing effective interventions in healthcare

To have an impact on individual and population health, we need to consider how to implement effective interventions and practice changes. This collection encourages submission of articles about how to effectively change practice, with a focus on prevention across all areas of health. Articles can come from any setting (eg, community, primary care or hospital) and any health-related topic area, particularly nutrition and physical activity, as long as there is a focus on prevention and changing practice.

Types of articles that are encouraged include:

  • Knowledge translation studies.

  • Implementation and quality improvement initiatives.

  • Interventions applied at scale (throughout a country, region, etc).

  • Studies where implementation was not successful (lessons learnt should be discussed).

  • Pilot projects (considered only if they address the feasibility of real-world implementation).

  • Theory-based interventions, with particular emphasis on behaviour change.

  • All study designs, including quantitative, qualitative and mixed methods.

Terminology used to describe the process of putting evidence into practice varies across disciplines and over time.1 Knowledge Translation (KT) is a commonly used umbrella term that encompasses both the science of how to implement a new intervention, building on evidence from previous implementation research (implementation science), and the practice of putting interventions into place (implementation practice). KT also includes the science and practice of sharing new information, more specifically called dissemination science and practice. Evidence generated by KT scientists and practitioners takes interventions that work in the ideal setting and makes them a reality in everyday practice.1 2 It can also help ensure that scientific advances lead to changes that reduce (rather than exacerbate) health disparities.3

It is time for nutrition research to move beyond pilot studies and even multisite efficacy studies to build on what we already know and reliably apply it to benefit people and populations. We need to use relevant theory, include the people who will be involved and impacted, follow an equity-oriented approach and learn from our mistakes and successes to develop sustainable interventions with strong impact.

Use theory and frameworks

To build on what we know, we need to understand the evidence. Theories provide a way to understand results and incorporate them into what has already been learned by others. A ‘good’ theory provides a clear explanation of how and why specific relationships lead to specific events. Theory-informed nutritional interventions tend to reflect mechanisms of action, relating to how the treatment enables better outcomes, and help develop new understandings of a phenomenon. Likewise, implementation theory can provide a structured way to think through an implementation plan and to put treatments into place across real-world contexts, thus supporting the sustainability and scalability of interventions.

Unfortunately, theory is often an afterthought. As Nilsen notes, ‘Poor theoretical underpinning makes it difficult to understand and explain how and why implementation succeeds or fails, thus restraining opportunities to identify factors that predict the likelihood of implementation success and develop better strategies to achieve more successful implementation’.4 In implementation work, we need ‘strategies, not solutions’, and theory helps inform those strategies.

Theories and frameworks also have an important role to play in understanding the core mechanisms by which an intervention works, allowing it to be adapted or reproduced in another facility or setting. For example, the Theoretical Domains Framework (TDF) can be used to systematically and comprehensively understand why a desired behaviour is not occurring.5 The key issues identified can then be systematically mapped to appropriate Behaviour Change Techniques.6 For instance, if interviews with doctors about the nutrition training they receive demonstrated a gap in ‘Beliefs about Capabilities’ (a TDF domain similar to self-efficacy) regarding delivery of nutrition counselling, evidence suggests that Behaviour Change Techniques such as ‘Problem Solving’, ‘Instructions on How to Perform the Behaviour’, ‘Demonstrations of the Behaviour’ and ‘Behavioural Practice and Rehearsal’ would be useful approaches for improving the nutrition training intervention.7
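To make the mapping step concrete, the sketch below shows one way such a TDF-to-Behaviour Change Technique lookup could be organised. This is a minimal, hypothetical Python sketch and not part of the cited studies; the single entry shown simply restates the ‘Beliefs about Capabilities’ example above, and any further domains or technique lists would need to come from published TDF-BCT matrices rather than from this editorial.

```python
# Illustrative sketch only: a minimal lookup from TDF domains identified as
# barriers to candidate Behaviour Change Techniques (BCTs). The single entry
# below mirrors the 'Beliefs about Capabilities' example in the text; other
# domains would need to be filled in from published TDF-to-BCT mappings.

TDF_TO_BCTS = {
    "Beliefs about Capabilities": [
        "Problem Solving",
        "Instructions on How to Perform the Behaviour",
        "Demonstrations of the Behaviour",
        "Behavioural Practice and Rehearsal",
    ],
    # e.g. "Knowledge": [...], "Environmental Context and Resources": [...]
}


def suggest_bcts(identified_domains):
    """Return candidate BCTs for each TDF domain identified as a barrier."""
    return {
        domain: TDF_TO_BCTS.get(
            domain, ["(no mapping recorded - consult published matrices)"]
        )
        for domain in identified_domains
    }


# Example: interviews with doctors suggest a gap in 'Beliefs about
# Capabilities' regarding delivery of nutrition counselling.
print(suggest_bcts(["Beliefs about Capabilities"]))
```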

Involve relevant people

Working together with those who understand the context and are affected by the intervention is a significant facilitator of implementation. It is important to understand what is needed, whether it is the right time, and other factors that may be unknown to the researcher. This type of collaborative work has many names, including Integrated Knowledge Translation.8

The More-2-Eat implementation project is a strong example of Integrated Knowledge Translation.9–11 In this project, five hospitals across Canada were supported to improve nutrition care. Champions, typically dietitians, from each hospital worked closely with researchers, with each champion driving the nutrition care changes that made sense for their hospital. Relevant people were brought in at key times, such as involving the admission nurse when setting up nutrition screening, and the food service team when organising food intake monitoring. This integrated approach contributed to the sustained improvement of nutrition care in these hospitals.11

When involving relevant people, individuals with lived experience should also be included in the partnership. These individuals, who typically have personal experience as a patient or care partner, act as advocates and provide valuable insight. For example, a patient who has received nutrition care from a doctor or has spent time in a hospital will have first-hand experience of nutrition care within the healthcare system. Ideally, these partners will be involved from the initial development of the research question or project plan through to completion of the study; however, there is still much to learn about how to build these partnerships appropriately.12

Consider equity

Implementation research has an important role to play in decreasing health disparities.3 When working with communities, it is important to acknowledge the diverse experiences that shape implementation efforts and to account for social influences such as sociopolitical forces, physical structures and economics.3 To build on existing evidence, an equity lens should be applied to implementation research. For example, Woodward and colleagues developed the Health Care Disparities Framework by adding an equity focus to an existing implementation framework.3

Healthcare inequities research and implementation science share a common goal: to improve the quality and outcomes of services and to make treatments and services available to multiple communities and settings.13 14 To achieve this goal, Baumann and Cabassa14 list five strategies: (1) focus on reach from the very beginning; (2) design and select interventions for vulnerable populations and low-resource communities with implementation in mind; (3) implement what works and develop implementation strategies that can help reduce inequities in care; (4) develop the science of adaptations; and (5) use an equity lens for implementation outcomes.14

As cross-cutting fields with strong and direct impacts on patients and populations, nutrition and prevention provide an ideal environment in which to combine implementation and health equity research. For example, a study of how to implement a falls and nutrition risk screening programme for older adults living in rural communities in Northern Ontario identified barriers and facilitators unique to that context.15 By working with these providers and patients, the local health authorities gained a deeper understanding of how to support these underserved communities.

Try, try and try again

Kohlmeier has observed that nutrition research is ‘not for the faint hearted’; neither is implementation.16 Projects fail. An initially great idea may not become part of routine practice, or another pilot project is implemented with the previous plan forgotten. We need to learn from this. Why didn’t it work? What changed from the initial plan? Was the intervention implemented as intended? Were the right outcome measures used? Were the right process measures used?

In implementation science and practice, you learn and adapt as you go. There is a constant balancing act between following rigorous research methods and adapting to the changing context while applying what is learned. This balance needs to be acknowledged in practice and in publications. For example, more researchers could create tables like the one by Marshall et al, which outlines each intervention component, whether it was part of the original plan or added later, the ways in which the component was implemented and the extent to which it was used. This colour-coded table shows the reality of implementation.17 Acknowledging and publishing these changes can help us learn for next time.

To spread and scale a treatment or behavioural intervention, it is crucial to understand whether, why and how the intervention works. It is not easy to know why something works and how it might work differently in different contexts. For this reason, it is important to use multiple types of methods, including quantitative, qualitative and mixed methods approaches, which allow for contributions from various perspectives. For example, if a quantitative survey evaluating an intervention to improve the nutrition knowledge of healthcare providers found that confidence to deliver nutrition care decreased after training, this might initially be seen as a failure of the intervention. However, interviews with attendees may show that confidence decreased because the training demonstrated that there was much more to consider when delivering nutrition care, and therefore more to learn.18 In all implementation work, context is key. What works in one place may not work in another, and using multiple methods builds understanding of how and why an intervention works or does not.

Conclusion

Implementation research is challenging but essential for achieving impact. Without considering feasibility, fidelity, implementation, sustainability and scalability, a health-related intervention is unlikely to succeed outside a controlled situation and unlikely to have an impact. In the field of nutrition, we need to move beyond effectiveness studies by building on implementation evidence and theory; involving the right people early on; applying an equity lens to interventions; and acknowledging and learning from what did not go as planned. Authors are encouraged to keep these factors in mind when submitting papers to the Special Collection on Implementing Effective Interventions in Healthcare.

Acknowledgments

CL is Associate Director of the NNEdPro Global Centre for Nutrition and Health as well as lead of the Global Innovation Panel. CL is funded by a Canadian Institutes of Health Research Health System Impact Fellowship (postdoctoral). LB is Global Strategic Lead for NNEdPro and is a National Health and Medical Research Council Fellow. NI is supported by a Canada Research Chair (Tier 2) in Implementation of Evidence Based Practice and as a clinician scholar by the Department of Family and Community Medicine at the University of Toronto. HK is the Schlegel-UW Research Chair in Nutrition & Aging.

References

Footnotes

  • Twitter @Celia_Laur, @laurenball01

  • Contributors CL led on the writing. All authors contributed to and approved the final version.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Provenance and peer review Commissioned; internally peer reviewed.