
Abstract

Interventional researchers face many design challenges when assessing intervention implementation in real-world settings. Intervention implementation requires holding fast to internal validity needs while incorporating external validity considerations (such as uptake by diverse subpopulations, acceptability, cost, and sustainability). Quasi-experimental designs (QEDs) are increasingly employed to achieve a balance between internal and external validity. Although these designs are often referred to and summarized in terms of logistical benefits, there is still uncertainty about (a) selecting from among various QEDs and (b) developing strategies to strengthen the internal and external validity of QEDs. We focus here on commonly used QEDs (pre-post designs with nonequivalent control groups, interrupted time series, and stepped-wedge designs) and discuss several variants that maximize internal and external validity at the design, execution and implementation, and analysis stages.
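To make the analysis-stage point concrete, the sketch below shows a segmented regression for an interrupted time series, one of the QEDs named above. This is a minimal illustration, not the authors' method: the simulated data, variable names (`post`, `time_since`), the change point, and the lag choice are all assumptions, and heteroskedasticity-and-autocorrelation-consistent (HAC) standard errors are one common way to hedge against serial correlation in the series.

```python
# Minimal segmented-regression sketch for an interrupted time series (ITS).
# All data here are simulated for illustration; only the model form
# (level change + slope change at the interruption) is the standard ITS setup.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_months, t0 = 48, 24                     # assumed: 24 months pre, 24 post
t = np.arange(n_months)
post = (t >= t0).astype(int)              # indicator: intervention in effect
time_since = np.where(post == 1, t - t0, 0)  # months since interruption

# Simulated outcome: baseline trend, a level drop at t0, and a slope change.
y = 100 + 0.5 * t - 8 * post - 0.3 * time_since + rng.normal(0, 2, n_months)
df = pd.DataFrame({"y": y, "t": t, "post": post, "time_since": time_since})

# OLS with Newey-West (HAC) standard errors; maxlags=3 is an assumption.
model = smf.ols("y ~ t + post + time_since", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 3})
print(model.params)  # 'post' = immediate level change; 'time_since' = slope change
```

In practice, a controlled ITS would add a nonequivalent comparison series and interaction terms, which strengthens internal validity against co-occurring events (history bias).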


Article Type: Review Article