Poor implementation causes significant reductions in effect sizes when interventions with demonstrated efficacy are translated to real-world settings (Elliott & Mihalic, 2004; Henggeler, 2004). Although recent reviews have identified multiple dimensions of program implementation that influence outcomes (e.g., Durlak & DuPre, 2008), four dimensions occur within the delivery of program sessions and, as a result, constitute potentially modifiable sources of disconnect between the program as designed and the program as implemented in real-world settings: 1) provider fidelity (the amount of the program curriculum delivered), 2) provider additive adaptations (unscripted additions made during delivery), 3) provider quality of delivery (the skill with which providers deliver material and interact with participants), and 4) participant responsiveness (attendance, active participation, and use of program skills).

The PI for this proposal and her colleague (both Early Stage Investigators) have submitted an R01 to NIDA to study implementation in the NIDA-funded effectiveness trial (R01 DA026874-01A1) of the New Beginnings Program (NBP). The grant received promising scores for resubmission (score = 30; percentile = 22). That R01 has two goals with important implications for implementation research: 1) to examine effective and efficient ways of collecting implementation data (Schoenwald et al., 2011), and 2) to test two theoretical models of implementation (Berkel et al., 2011). These aims have implications for provider training and technical assistance and for maintaining the effects of EBPs in the real world.

The proposed pilot research seeks to lay the foundation for the R01 by examining effective and efficient methods of assessing fidelity, adaptations, and quality. First, we plan to pilot the measures on a subsample of videos collected as part of the efficacy trial.
Ce-PIM support is requested to facilitate working with a consultant with long-standing expertise in coding these dimensions in a school-based substance use prevention program (e.g., Hansen et al., 1991). Second, we request the assistance of Ce-PIM faculty member Hendricks Brown to develop an analytic plan for identifying the most efficient and effective sampling plan for selecting sessions/activities to code.

References

Berkel, C., Mauricio, A. M., Schoenfelder, E., & Sandler, I. N. (2011). Putting the pieces together: An integrated model of program implementation. Prevention Science, 12, 23-33.

Durlak, J., & DuPre, E. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327-350.

Hansen, W. B., Graham, J. W., Wolkenstein, B. H., & Rohrbach, L. A. (1991). Program integrity as a moderator of prevention program effectiveness: Results for fifth-grade students in the adolescent alcohol prevention trial. Journal of Studies on Alcohol, 52, 568-579.

Schoenwald, S. K., Garland, A. F., Chapman, J. E., Frazier, S. L., Sheidow, A. J., & Southam-Gerow, M. A. (2011).
Effective start/end date: 6/1/12 → 8/30/13

Funding: HHS-NIH: National Institute on Drug Abuse (NIDA): $12,802.00