Contextual Influences and Strategies for Dissemination and Implementation in Mental Health

Abstract and Keywords

Implementation science has emerged to bridge the gap between research and practice. A number of conceptual frameworks have been developed to advance implementation research and illuminate the contextual influences that can facilitate or impede the implementation of evidence-based practices. Contextual factors important in the dissemination and implementation of evidence-based practices may occur at the system, organizational, and provider levels. System-level barriers may include external policies, incentives, and peer pressure. Organizational-level factors that are important in implementation include organizational culture and climate and implementation climate. At the individual provider level, barriers may involve provider attitudes, knowledge, and self-efficacy. Additional barriers, such as client-level factors and characteristics of the intervention itself, can also exist. Implementation strategies have been developed that can be used to overcome contextual barriers when implementing evidence-based practices in new settings. Several exemplar implementation strategies are discussed, including the Availability, Responsiveness, and Continuity intervention; the Community Development Team model; and the Interagency Collaborative Team Model.

Keywords: dissemination, implementation, organizational culture and climate, mental health services, implementation strategies

Introduction

Evidence-based practices (EBPs) in mental health are specific interventions with demonstrated efficacy based on controlled research with a defined population (Chambless & Hollon, 1998). A wealth of research documents the existence of numerous EBPs in mental health, such as cognitive-behavioral therapy for a variety of client populations and presenting problems (Butler, Chapman, Forman, & Beck, 2006; Chorpita et al., 2011). Nevertheless, most people do not have access to EBPs in their communities (New Freedom Commission on Mental Health, 2003; U.S. Department of Health and Human Services, 1999). Although the need to disseminate and implement EBPs from research to practice settings has been identified (Institute of Medicine, 2001), progress has been slow and previous efforts have had limited success (Stewart & Chambless, 2007; Weissman et al., 2006). The limited efficacy of many dissemination and implementation efforts to date is likely due, at least in part, to a lack of attention paid to contextual factors at the individual, organizational, and system levels that affect the delivery of EBPs in mental health (Aarons, Hurlburt, & Horwitz, 2011; Damschroder et al., 2009). The purpose of this chapter is threefold: to provide a brief introduction to implementation science; to describe the many contextual factors that can facilitate or impede the dissemination and implementation of EBPs; and to provide an overview of some of the implementation strategies that can be used to overcome anticipated barriers to the integration of EBPs into new settings.

Brief Introduction to Implementation Science

Implementation science, “the scientific study of methods to promote the systematic uptake of research findings and other EBPs into routine practice, and, hence, to improve the quality and effectiveness of health services” (Eccles & Mittman, 2006, p. 1), has emerged to bridge the gap between research and practice. Implementation science also recognizes the importance of the study of contextual influences on professional and organizational behavior (Eccles & Mittman, 2006).

Guiding Conceptual Frameworks

Best practice recommendations advocate that investigators conducting implementation research select a framework to guide study design (Proctor, Powell, Baumann, Hamilton, & Santens, 2012). Frameworks have proliferated in the field of implementation science in recent years, and at present there are at least 61 implementation frameworks to choose from (Tabak, Khoong, Chambers, & Brownson, 2012). In this chapter we discuss three conceptual frameworks in some detail. The first is a framework devised by Proctor et al. (2009) that helps to frame the central interventions and outcomes of implementation research. The second and third frameworks are particularly well suited to illuminating the contextual influences that can facilitate or impede the implementation of EBPs (Aarons et al., 2011; Damschroder et al., 2009). (For an in-depth discussion of frameworks, see Chambers [2014]).

Conceptual framework of implementation research in mental health.

Proctor and colleagues (2009) have advanced a conceptual framework for implementation research in mental health. Their framework makes a distinction between two types of intervention: EBPs and the strategies that are used to implement them. Implementation strategies have been defined as the “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice” (Proctor, Powell, & McMillen, 2013). The conceptual framework designed by Proctor and colleagues (2009) also distinguishes between three sets of core outcomes: implementation outcomes, service system outcomes, and client outcomes. Implementation outcomes are “the effects of deliberate and purposive actions to implement new treatments, practices, and services” (Proctor et al., 2011, p. 65). Examples of implementation outcomes include acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, penetration, and sustainment (for more detail, see Proctor & Brownson [2012]; Proctor, Powell, & Feely [2014]; Proctor et al. [2011]). Implementation outcomes can serve as (a) indicators of implementation success, (b) proximal indicators of implementation processes, and (c) key intermediate outcomes in relation to service system or clinical outcomes in treatment effectiveness and quality of care research (Proctor et al., 2011). Service system outcomes are defined by the quality of care standards set forth by the Institute of Medicine: efficiency, safety, effectiveness, equity, patient-centeredness, and timeliness (Institute of Medicine, 2001). It is assumed that improvements in implementation and service system outcomes ultimately will result in better client outcomes in such areas as level of functioning and symptomatology. It would be helpful for every implementation effort to specify the intervention to be implemented, the implementation strategies being employed, and the outcomes that will be assessed; this framework can facilitate that process.
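To make these distinctions concrete, the sketch below shows one way an implementation effort might record its intervention, its implementation strategies, and the three sets of outcomes it will assess. This is a minimal illustration in Python; the class name, field names, and example values are assumptions made here and do not represent a published tool from Proctor and colleagues.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ImplementationPlan:
    """Illustrative specification following the distinctions in Proctor et al."""
    ebp: str                            # the clinical intervention being implemented
    strategies: List[str]               # the implementation strategies employed
    implementation_outcomes: List[str]  # e.g., acceptability, fidelity, penetration
    service_system_outcomes: List[str]  # IOM quality-of-care standards
    client_outcomes: List[str]          # e.g., symptoms, level of functioning

# Hypothetical example values for a single implementation effort.
plan = ImplementationPlan(
    ebp="Trauma-focused cognitive-behavioral therapy",
    strategies=["clinician training", "ongoing consultation", "audit and feedback"],
    implementation_outcomes=["adoption", "fidelity", "sustainment"],
    service_system_outcomes=["timeliness", "effectiveness"],
    client_outcomes=["symptom reduction", "functioning"],
)
print(plan.strategies)
```

Writing the plan down in this structured form forces the distinction the framework calls for: the EBP, the strategies used to implement it, and the outcomes by which each will be judged are specified separately.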

Consolidated Framework for Implementation Research.

In an effort to synthesize the health services implementation literature, Damschroder and colleagues (2009) developed the Consolidated Framework for Implementation Research (CFIR). The CFIR provides “an overarching typology—a list of constructs to promote theory development and verification about what works where and why across multiple contexts” (Damschroder et al., 2009, p. 2). The five major domains of the CFIR are intervention characteristics, the outer setting, the inner setting, characteristics of individuals involved, and the implementation process. Intervention characteristics are the features of an intervention being implemented in a particular setting, and they involve consideration of factors such as the need to adapt an existing intervention to improve fit. The outer setting includes the economic, political, and social context within which an organization exists, including client needs and resources, cosmopolitanism, peer pressure, and external policies and incentives. The inner setting, or organizational level, includes the structural, political, and cultural processes through which the implementation proceeds. The inner setting consists of tangible and intangible structural characteristics, networks and communication, culture, implementation climate, and readiness for implementation that can interrelate and influence implementation. The relationship between the inner and outer settings is dynamic and involves considerable overlap because of the hierarchical structures of most health care organizations and the interrelationships within and between other organizations. The fourth domain focuses on the individuals implementing the intervention, and includes the impact of individual choices and their influence on implementation. The final domain is the implementation process itself, which involves planning, engaging relevant stakeholders, executing the plan, and reflecting on and evaluating the effort.

Exploration, Preparation, Implementation, and Sustainment (EPIS) model.

Underscoring the importance of attending to the inner and outer contexts, Aarons and colleagues (2011) have developed a multilevel ecological model that describes factors thought to have an impact on implementation during four distinct stages: Exploration, Adoption/Preparation, Implementation, and Sustainment (the EPIS model). This model was specifically intended to inform implementation in public service sectors (e.g., publicly funded mental health, child welfare, alcohol, and drug systems). Contextual aspects of the inner setting (i.e., internal to the provider organization) and the outer setting (i.e., external to the provider organization, at the system level) may vary in their prominence and manifestation within each of the EPIS model’s phases. The use of a framework like the CFIR or EPIS can facilitate consideration of common contextual factors that influence the success or failure of dissemination and implementation efforts.

We now consider contextual influences in greater detail, highlighting specific examples and considerations from the extant literature.

Contextual Influences on Dissemination and Implementation

To date, dissemination and implementation efforts have generally been carried out under the assumption that the factor that most commonly hampers implementation of EBPs in the community is limited provider knowledge and skill. As a result, training is often used as the primary strategy to increase the successful use of EBPs in community settings (Beidas, Edmunds, Marcus, & Kendall, 2012). There is, however, a growing consensus that training is a necessary but insufficient implementation strategy (Herschell, Kolko, Baumann, & Davis, 2010), given that most training strategies neglect the critical role of therapists’ context, which likely affects implementation of EBPs regardless of the knowledge or skills therapists attain (Beidas & Kendall, 2010). Many providers who have received training in EBPs do not use them any more than non-EBPs (Stark, Arora, & Funk, 2011), which may reflect a lack of attention to broader contextual factors within dissemination and implementation efforts. The following sections outline a number of contextual factors at the system, organizational, and provider levels, as outlined in ecological frameworks such as the CFIR and EPIS, that may serve as mediators and moderators of implementation effectiveness (Lee & Mittman, 2012).

The System Level

System-level factors refer to characteristics of the system within which implementation operates (e.g., community mental health clinics nested within a public mental health system). System-level factors that may influence implementation research include cosmopolitanism, peer pressure, and external policies and incentives (Damschroder et al., 2009). Cosmopolitanism is the extent to which an organization is connected to other organizations (Damschroder et al., 2009). Organizations that support such connections and boundary spanning may implement EBPs more quickly due to earlier awareness and assimilation of innovations and/or competition (Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004). Information sharing and a shared vision across organizations, or social capital, may also be important in EBP implementation (Damschroder et al., 2009; Greenhalgh et al., 2004). Peer pressure may also play a role in influencing implementation, such that organizations that are late adopters in systems may be more motivated to implement EBPs (Damschroder et al., 2009).

External strategies to increase implementation may include a range of factors such as policy or regulatory authority, external mandates for EBP use, practice guidelines or recommendations, and pay-for-performance (Damschroder et al., 2009; Mendel, Meredith, Schoenbaum, Sherbourne, & Wells, 2008). Barriers may exist when policies or mandates for EBP use are enacted without incentives or other changes to the system to support their adoption, integration, and sustainment. Although scant, the literature on funding and financing EBPs in health care indicates that funding plays an important role in obtaining training in EBPs and in their implementation and sustainment (Bond et al., 2014; Isett et al., 2007; Magnabosco, 2006; Massatti, Sweeney, Panzano, & Roth, 2008; Swain, Whitley, McHugo, & Drake, 2010). Incentives may be useful and may include financial incentives as well as the benefits associated with an improved work environment or the reputation associated with EBP use (Mendel et al., 2008).

Systems that mandate EBP use or that require participation in EBP training initiatives may benefit from identifying barriers in their system and experimenting with strategies to encourage therapist completion of training requirements and integration of EBPs into the system (Raghavan, Bright, & Shadoin, 2008). Systems that are committed to integrating EBPs into the standard of care may benefit from considering approaches such as revamping progress notes to be more consistent with EBPs and reimbursing agencies for lost wages as their trainees participate in EBP training (Benjamin, Taylor, Goodin, & Creed, 2014). Reimbursement of EBP delivery at a higher rate than usual care may also be considered. Research that examines the relative value of such system-level implementation strategies is needed.

The City of Philadelphia’s Department of Behavioral Health and Intellectual disAbility Services (DBHIDS) is an example of a system that has committed to EBP use and has instituted policies and practices directed toward transforming the public system into one in which EBPs are the standard of care (see also Beidas et al., 2013). For example, the Evidence-Based Practice & Innovation Center (EPIC), a task force of DBHIDS leadership and academic partners, was established to coordinate and support the shift to a more evidence-based system. Pilot initiatives to roll out various EBPs (e.g., cognitive therapy, trauma-focused cognitive-behavioral therapy, and dialectical behavior therapy) in Philadelphia have been funded and are now being evaluated. Additionally, DBHIDS is experimenting with incentives such as enhanced reimbursement rates for providers implementing EBPs. Systems like Philadelphia’s DBHIDS represent unique and important opportunities to study the dissemination and implementation of EBPs in action.

Organizational-Level Considerations

A growing body of research has emerged to suggest that organizational factors are important contextual characteristics that affect the delivery of mental health services for youth in community settings (Glisson & James, 2002; Glisson et al., 2010; Hoagwood, Burns, Kiser, Ringeisen, & Schoenwald, 2001). This research draws on the organizational literature, which suggests the importance of an individual’s social context in influencing one’s attitudes, beliefs, and subsequent behavior around adoption of innovation (Glisson, 2002; Rogers, 2003). In the case of youth mental health services, organizations within which treatment is delivered include settings such as community mental health clinics, schools, and child welfare agencies (i.e., organizational social context; Williams & Glisson, 2014). Three organizational factors (organizational culture, organizational climate, and implementation climate) are proximal indicators of important service delivery variables.

Organizational culture refers to shared employee perceptions around “the behavioral expectations and norms that characterize the way work is done in an organization” (Glisson, Dukes, & Green, 2006, p. 858), whereas organizational climate refers to shared employee perceptions around “the psychological impact of their work environment on their own personal well-being” (Williams & Glisson, 2014, p. 64). Organizational culture and climate have been associated with attitudes toward EBPs (Aarons et al., 2012), provider turnover (Glisson & James, 2002; Glisson, Schoenwald, et al., 2008), quality of services (Glisson & Hemmelgarn, 1998), sustainment of newly adopted practices (Glisson, Schoenwald, et al., 2008), and youth mental health outcomes (Glisson & Green, 2011; Glisson & Hemmelgarn, 1998). The gold standard for assessing organizational culture and climate is the Organizational Social Context (OSC) measure, a quantitative instrument developed over the past 37 years (Glisson, 1978; Glisson, Landsverk, et al., 2008).

Implementation climate refers to “targeted employees’ shared summary perceptions of the extent to which their use of a specific innovation is rewarded, supported and expected within their organization” (Klein & Sorra, 1996, p. 1060). Implementation climate is receiving considerable attention as a factor that warrants investigation. The organizational literature suggests that a strong implementation climate is associated with increased use of innovations because it ensures that employees are adequately skilled in their use. Such a climate also provides incentives for the use of the innovations and removes barriers to their use (Klein & Sorra, 1996). Alternatively, provider perceptions of a lack of organizational support to implement an EBP, or a lack of support to allocate time to training activities, may contribute to decreased EBP use (Essock et al., 2003; Garland, Bickman, & Chorpita, 2010). Preliminary research has found implementation climate to be associated with EBP fidelity (Dingfelder, Mandell, Stahmer, & Marcus, 2011). At present, measures of implementation climate specific to health and social service settings are being developed and validated (Aarons, Ehrhart, & Dlugosz, 2012; Jacobs, Weiner, & Bunger, 2014; Weiner, Belden, Bergmire, & Johnston, 2011). This research will likely generate important evidence regarding this construct’s relationship to implementation, service system, and clinical outcomes (Proctor et al., 2011).
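Because implementation climate is defined as a shared, organization-level perception, measuring it typically involves aggregating individual survey responses within organizations. The sketch below illustrates that aggregation step in Python; the item names, the 1-to-5 response scale, and the data are invented here for illustration and are not drawn from the validated measures cited above.

```python
import pandas as pd

# Hypothetical survey data: three climate items per respondent, rated 1-5.
responses = pd.DataFrame({
    "org_id":    ["A", "A", "A", "B", "B", "B"],
    "rewarded":  [4, 5, 4, 2, 3, 2],   # "EBP use is rewarded here"
    "supported": [5, 4, 4, 2, 2, 3],   # "EBP use is supported here"
    "expected":  [4, 4, 5, 3, 2, 2],   # "EBP use is expected here"
})

items = ["rewarded", "supported", "expected"]
responses["climate"] = responses[items].mean(axis=1)  # per-respondent score

# Organization-level climate: the mean across respondents, with the
# within-organization standard deviation as a crude check that the
# perceptions are actually shared (low spread = more agreement).
org_climate = responses.groupby("org_id")["climate"].agg(["mean", "std"])
print(org_climate)
```

In practice, researchers apply psychometrically validated agreement and reliability indices before treating such aggregates as organization-level constructs; the point here is only that “shared perceptions” imply aggregation plus a check on within-organization agreement.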

Individual Providers

A number of key characteristics of individuals involved with implementation have been hypothesized to be important barriers or facilitators to the adoption of innovative interventions. These include the person’s knowledge and beliefs about the intervention, self-efficacy, individual stage of change, and individual identification with the organization, among other personal attributes (Damschroder et al., 2009). More recently, and relevant to the discussion of provider adoption of EBPs, the Integrated Model (Ajzen, 1991; Grant & Wrzesniewski, 2010; Henshaw & Freedman-Doan, 2009; Kiviniemi, Bennett, Zaiter, & Marshall, 2011) has been advanced to illuminate why some people engage in particular behaviors while others do not. The Integrated Model identifies intention to perform a behavior as a direct determinant of behavior change. Intentions are primarily influenced by three of the characteristics cited above: personal attitudes, normative pressure, and a sense of self-efficacy/control. Williams and Glisson (2014) propose that organizational culture and climate affect the development of provider EBP intentions and the presence or absence of barriers to acting on those intentions. Organizational barriers are hypothesized to moderate the relationship between EBP intentions and behaviors, which represents a promising avenue for future research; a simple analytic sketch of such a moderation test follows.
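To make the moderation hypothesis concrete, the sketch below fits a regression with an intention-by-barriers interaction term on simulated data. The variable names and the simulated effect sizes are assumptions made here for illustration only; this is not an analysis from Williams and Glisson (2014) or any other study cited in this chapter.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300

# Hypothetical provider-level variables (standardized scores).
intention = rng.normal(size=n)   # strength of EBP intentions
barriers = rng.normal(size=n)    # perceived organizational barriers

# Simulate a weaker intention-behavior link when barriers are high.
ebp_use = 0.5 * intention - 0.3 * intention * barriers + rng.normal(size=n)

df = pd.DataFrame({"ebp_use": ebp_use, "intention": intention, "barriers": barriers})

# "intention * barriers" expands to both main effects plus their interaction;
# the interaction coefficient carries the moderation signal.
model = smf.ols("ebp_use ~ intention * barriers", data=df).fit()
print(model.summary().tables[1])
```

A reliably negative intention:barriers coefficient in such a model would indicate that the intention-behavior link weakens as organizational barriers increase.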

In the paragraphs that follow, we review the evidence base around some of the key individual characteristics affecting the adoption of new treatment strategies. Knowledge and beliefs about a given intervention, including factors such as individual attitudes and values, enthusiasm, and skill, have been highlighted in the literature as important factors in implementation (Ajzen, 1991; Damschroder et al., 2009; Rogers, 2003). Knowledge can vary considerably across EBP implementations. For example, in schools there is often a great deal of variability in who participates in EBP implementation: providers may be master’s-level social workers, guidance counselors, or even teachers. Those with little or no previous mental health background may need training in therapeutic underpinnings, such as rapport building. More seasoned mental health providers may be ready to move into EBP training more quickly, though they may also have less motivation to change their practice.

Providers may have considerable experience with the EBP being implemented, may have no experience with it, or may have been trained in an antithetical theoretical orientation. A single training cohort may therefore include providers with a broad range of knowledge about EBPs. This variability presents clear challenges to those planning implementation strategies.

Beliefs about EBPs can also be important factors to assess and plan for in implementation work. Providers may be concerned that EBPs will require them to be directive or structured in ways that are detrimental to building rapport and being responsive to client needs, despite empirical evidence to the contrary (Garland et al., 2010). Barriers to acceptance of an innovation may also stem from providers’ perceptions of an EBP as challenging their autonomy or substituting for their clinical judgment (Essock et al., 2003; Garland et al., 2010). Attitudes toward change and innovation can affect EBP implementation by influencing providers’ adherence and skill (Beidas et al., 2013), their decisions to adopt interventions, and their commitment to actual implementation and use of the EBP (Aarons, 2004, 2005; Aarons, Cafri, Lugo, & Sawitzky, 2012; Candel & Pennings, 1999; Frambach & Schillewaert, 2005; Rogers, 2003). Some of the barriers created by providers’ lack of acceptance of an innovation may be addressed by correcting misperceptions about EBPs early in the implementation process (Garland et al., 2010). For example, positive attitudes have been shown to be related to training that leads to a more refined understanding of what an EBP is (Lim, Nakamura, Higa-McMillan, Shimabukuro, & Slavin, 2012). However, positive attitudes toward EBPs are not sufficient to produce behavior change. Providers may also need to be taught to be “evidence consumers” so that they can learn how to find relevant research, evaluate its merit, and use it to inform their clinical care (Beidas, Ditty, Downey, & Edmunds, 2014; Spring, 2007).

For our purposes, self-efficacy refers to a person’s belief in his or her ability to succeed in implementing a given EBP (Bandura, 1977), and it may be a key construct in implementation success. Individual stage of change and perceptions of the organization and one’s relationship to that organization have also been discussed as important individual-level considerations (Damschroder et al., 2009). Many community providers are confident in the usual care they provide, which can represent a barrier to changing practice (Garland et al., 2010; Garland, Kruse, & Aarons, 2003). Finally, Damschroder and colleagues discuss the importance of other personal attributes that may present barriers to implementation, including personality, intellect, competence, and experience. For example, with regard to experience, trainee-level providers in EBP implementation programs may still be developing their therapeutic orientation and style, and thus may be more open to learning a new EBP (Aarons, 2004). They may also have more time to devote to mastering the intervention. However, more experienced providers, who have extensive institutional knowledge, can be invaluable collaborators for researchers considering how best to implement an EBP in a new setting. In addition, seasoned clinicians may have established relationships with key stakeholders and with their colleagues, which can be an advantage when collaborative partnerships with researchers are established. In any work with practicing mental health providers, attention to these individual-level characteristics from early in the planning process and throughout implementation may be key to navigating barriers. Collaborative relationships with providers and the establishment of partnerships early on can maximize the likelihood of success (Chambers & Azrin, 2013; Southam-Gerow, Hourigan, & Allin, 2009).

Additional Barriers

Intervention characteristics and client factors can also be barriers to EBP implementation. Intervention characteristics identified by Damschroder and colleagues (2009) that warrant consideration in implementation science work include the intervention source (e.g., was the EBP externally or internally developed?), the strength and quality of the evidence base, relative advantages of an intervention over alternative forms of treatment, adaptability to meet local needs, trialability (i.e., the opportunity to try out the intervention on a small scale within the organization), perceived complexity, perceptions of excellence in design quality and packaging, and cost (for a relatively comprehensive list of additional intervention characteristics, see Grol, Bosch, Hulscher, Eccles, & Wensing, 2007). The EPIS model also posits the importance of innovation–context fit (Aarons et al., 2011). Scheirer (2013) has proposed a helpful framework of six different intervention types that vary in complexity and scope, including: “(1) those implemented by individual providers; (2) programs requiring coordination among multiple staff; (3) new policies, procedures, or technologies; (4) capacity or infrastructure building; (5) community partnerships or collaborations; and (6) broad-scale system change” (p. 1). Different types of interventions may be more or less difficult to implement, and thus require unique considerations with regard to the strategies needed to implement and sustain them in a given organization. Understanding the organization’s resources, allegiances, preferences, and relative strengths at the outset can guide the research team in choosing an intervention that is more likely to be well-received and successful. Again, the importance of establishing a strong community–academic partnership early on cannot be overstated (Chambers & Azrin, 2013). By laying the groundwork for a collaborative implementation effort early, interventions can be strategically chosen to maximize fit with a specific organization.

Concerns are also often raised that clients in “real world” settings, like community mental health clinics, do not resemble those included in intervention trials (Baker-Ericzen, Hurlburt, Brookman-Frazee, Jenkins, & Hough, 2009; Garland et al., 2010; Southam-Gerow, Chorpita, Miller, & Gleacher, 2008). Some evidence suggests that discrepancies in client characteristics do exist. For example, youth with anxiety disorders from community treatment settings have been shown to have more externalizing comorbidities (Southam-Gerow, Weisz, & Kendall, 2003). Additionally, youth served in community clinics are more likely to be from single-parent and low-income families and to experience more life stressors than those treated in research clinics (Beidas, Suarez, et al., 2012; Ehrenreich-May et al., 2011; Southam-Gerow et al., 2008; Southam-Gerow et al., 2003). Providers may also believe that the research base lacks the breadth and complexity to be relevant to their practice and that today’s EBPs may not have the staying power to warrant the investment of time and resources required to adopt a new practice (Essock et al., 2003). As discussed by Garland and colleagues (2010), the degree to which any differences between clients in research trials and community care settings actually moderate intervention effectiveness is unknown. Finally, clients and their families may be concerned that EBPs will lead to less individualized treatment, that the researchers who design EBPs may not value the same outcomes they do, and that the allocation of resources to EBPs may detract from other family-based or client-driven resources (Essock et al., 2003).

Implementation Strategies for Overcoming Contextual Barriers

Implementation strategies are “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice” (Proctor et al., 2013). Implementation may involve a single component or discrete strategy (e.g., disseminating treatment guidelines, reminders, audit and feedback); however, most implementation efforts involve multifaceted approaches that combine multiple discrete strategies (Powell et al., 2012). In fact, many scholars have emphasized the need to use multiple strategies that are carefully selected to address multilayer barriers (Flanagan, Ramanujam, & Doebbeling, 2009; Mittman, 2012; Novins, Green, Legha, & Aarons, 2013; Powell et al., 2012; Solberg et al., 2000). Multilevel strategies can be “built” for individual implementation efforts by combining discrete strategy components, many of which have been listed in published taxonomies (Cochrane Effective Practice and Organisation of Care Group, 2002; Mazza et al., 2013; Michie et al., 2013; Powell et al., 2012). Additionally, there are a number of multifaceted strategies that have been branded. We provide several examples here, including the Availability, Responsiveness, and Continuity (ARC) intervention; the Community Development Team model; and the Interagency Collaborative Team Model.

Availability, Responsiveness, and Continuity Model

The ARC model, developed by Glisson and colleagues (Glisson et al., 2006; Glisson et al., 2012; Glisson, Hemmelgarn, Green, & Williams, 2013; Glisson et al., 2010; Glisson & Schoenwald, 2005), is an organizational intervention that is particularly well suited to addressing contextual barriers at the organizational and interorganizational levels. It was designed to improve organizational culture, climate, efficiency, and effectiveness via a change agent who works extensively with organizations to overcome barriers to adopting EBPs, to develop new service strategies, and to design effective work processes (Glisson et al., 2006). Twelve manual-guided intervention components are delivered across three stages (Glisson et al., 2010). The first stage, the collaboration stage, involves (1) supporting the organizational leadership’s use of the ARC model and principles to communicate a clear vision for change; (2) cultivating personal relationships with administrators, service providers, opinion leaders, and stakeholders to promote communication, information sharing, and problem solving; and (3) accessing or developing networks among administrators, service providers, and stakeholders. The participation stage involves (4) building teamwork within work units to bolster participation, information sharing, and support; (5) providing information and training about the ARC model, state and federal policies, best practices, and data management strategies to guide and support improvement efforts; (6) establishing a feedback system to provide performance data to involved stakeholders; (7) implementing participatory decision making; and (8) resolving conflicts at the interpersonal, intraorganizational, and interorganizational levels. In the third stage, innovation is promoted by (9) developing goal-setting procedures; (10) using continuous quality improvement techniques to change policies and practices; (11) participating in job redesign (i.e., revising responsibilities and broadening skills) to eliminate service barriers; and (12) ensuring self-regulation and stabilization (i.e., sustainability) by facilitating the independent use of the previous components (Glisson et al., 2006). Although each of the twelve components of the ARC model could be viewed as an implementation strategy in and of itself, Palinkas and Soydan (2012) point out that the model actually “… builds an implementation strategy by studying, understanding, and operationalizing organizational and interorganizational factors in each given implementation context” (p. 70). The ARC model has been tested in three randomized controlled trials and has been found to create more positive organizational social contexts (Glisson, Landsverk, et al., 2008), to reduce staff turnover, to support the implementation of an EBP, and to improve clinical outcomes for youth (Glisson et al., 2010, 2012, 2013).

Community Development Team Model (CDT)

The CDT model was developed by the California Institute for Mental Health. Like the ARC model, the CDT model uses consultants who work with teams and treatment developers to promote motivation, engagement, commitment, persistence, and competence in service of supporting fidelity to and sustainment of an EBP (Chamberlain et al., 2008). The model involves seven core processes. Reflecting the importance of intervention characteristics (Grol et al., 2007; Rogers, 2003), the first core process is to promote commitment and fidelity to an EBP by ensuring that it has a relative advantage over existing interventions and that it is a good fit for the organization or system. The next several core processes involve planning for implementation, assessing and addressing barriers (at any level), and providing technical assistance. Contextual factors at the organizational and provider levels are also addressed by a core process focusing on procedural skill development to enhance the organizational, management, and personnel skills needed to generate and execute the implementation plan. Finally, at the interorganizational level, the development of peer-to-peer networks promotes engagement, commitment, and learning by reducing the risk of implementing novel practices, allowing for mutual assistance, and promoting a healthy level of competition between organizations (Chamberlain et al., 2008; Sosna & Marsenich, 2006). The CDT model has been used to support the implementation of several EBPs, such as Multidimensional Treatment Foster Care, Functional Family Therapy, and Aggression Replacement Training (Sosna & Marsenich, 2006), and empirical evaluations of the model are forthcoming (Chamberlain et al., 2008; Saldana & Chamberlain, 2012).

Interagency Collaborative Team Model

Scaling up EBPs across multiple organizations or service systems is extremely challenging given the scope of such change efforts and the substantial variations at the organizational, team, and individual levels that make it more or less difficult to implement EBPs. The Interagency Collaborative Team model (Hurlburt et al., 2014) was designed to support the implementation and scale-up of EBPs in large geographic areas. The model involves a number of structured steps designed to overcome contextual barriers by generating the types of structural and process supports needed to implement and sustain interventions. The process begins by identifying and convening stakeholders with interests in a shared improvement effort (e.g., preventing child neglect). This group generally includes funders, administrators, and service-delivery organizations. The next step is to seek the expertise required to address the central question of the improvement effort, and to generate as much data as possible about potential EBP options. Once a commitment is made to pursue the implementation of a given EBP, interagency seed teams are developed. Seed teams intentionally include employees from a number of different organizations in order to build broad investment, commitment, and communication pertaining to the change effort. Seed teams are responsible for (1) learning the EBP, (2) conducting the initial delivery of the EBP, (3) training local EBP practitioners, (4) liaising with EBP developers, (5) monitoring and providing feedback regarding the quality of EBP delivery, (6) communicating a commitment to quality EBP delivery, and (7) communicating implementation progress to all stakeholders. Moreover, the seed teams train additional interagency training teams that will deliver the EBP, provide feedback to the seed team, and share information with one another about implementation challenges and progress. Eventually, there is a phased reduction in EBP developer involvement as seed and interagency teams continue to deepen their expertise in the EBP (Hurlburt et al., 2014). The Interagency Collaborative Team model is particularly promising in that it is inherently participatory and strengthens collaborative ties between organizations. Indeed, the “cross-organizational membership on the seed team contributes to ensuring a continuing locus of expertise available to all organizations within the Interagency Collaborative Team partnership, reducing the kinds of expertise loss that regularly occur within individual organizations due to staff turnover and organizational changes” (Hurlburt et al., 2014, p. 4). Perhaps most important, the model provides a much-needed opportunity for organizations to learn from one another, mitigating potential barriers associated with implementing EBPs without the benefit of role models (Powell, Hausmann-Stabile, & McMillen, 2013).

Commonalities and Differences

These three models have a number of features in common. Each has been used in mental health and social service settings, and all three are multilevel (e.g., addressing client, team, organizational, and system levels) and multifaceted (involving multiple discrete strategies). Moreover, all three models include opportunities for the assessment of potential implementation barriers and for the flexible application of discrete implementation strategies to address those barriers. They also all involve mechanisms to generate partnerships between relevant stakeholders, which are increasingly recognized as essential to the implementation process (Chambers & Azrin, 2013). Several differences are also important to note, particularly with regard to their intended purposes. The Community Development Team and Interagency Collaborative Team models have been used to scale up individual EBPs and explicitly reference the inclusion of EBP developers in the implementation process. Conversely, the ARC model has been used for intensive work with a small number of organizations, both as a more general organizational improvement strategy (Glisson et al., 2012; Glisson et al., 2013) and as a means of implementing a specific EBP (Glisson et al., 2010). The ARC model focuses most heavily on changing intraorganizational processes, whereas interorganizational processes play a more central role in the Community Development Team and Interagency Collaborative Team models. Consequently, ARC may be more useful than the other two models in supporting innovation, including the ongoing use of successive new EBPs as they become available. Finally, the models differ in terms of evidentiary support, as some of them have only recently been developed. At present the ARC model has the most robust support for its effectiveness (Glisson et al., 2010; Glisson et al., 2012; Glisson et al., 2013).

Conclusions and Future Directions

In an effort to bridge the gap between research and practice, the field of implementation science has emerged. A plethora of conceptual frameworks have been developed to advance implementation research, including the framework advanced by Proctor et al. (2009) to frame the central interventions and outcomes of such research. In addition, conceptual frameworks such as the CFIR (Damschroder et al., 2009) and EPIS (Aarons et al., 2011) can be especially useful in highlighting contextual influences that can facilitate or impede EBP implementation. We highlighted a range of contextual factors that may be important in the dissemination and implementation of EBPs at the system, organizational, and provider levels. Key system-level barriers may include external policies, incentives, and peer pressure, whereas organizational-level factors that are important in implementation include both organizational culture and climate and implementation climate. Barriers may also exist at the level of the individual provider, including provider attitudes, knowledge, and self-efficacy. It is also important to consider client-level factors and characteristics of the intervention itself that may represent barriers to EBP implementation. A number of implementation strategies have been developed that may be useful in overcoming these common contextual barriers, including the ARC intervention, the Community Development Team model, and the Interagency Collaborative Team Model.

The field of implementation science is relatively young. Work therefore remains to integrate and advance a field that until recently has been fragmented and lacking in consistency, especially with regard to terminology and measurement. Future research should focus on refining our understanding of contextual barriers to implementation, evaluating the relative value of discrete implementation strategies (e.g., comparing types of incentives), and advancing and validating comprehensive implementation strategies.

References

Aarons, G. A. (2004). Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research, 6, 61–74.Find this resource:

Aarons, G. A. (2005). Measuring provider attitudes toward evidence-based practice: Consideration of organizational context and individual differences. Child and Adolescent Psychiatric Clinics of North America, 14, 255–271. doi: 10.1016/j.chc.2004.04.008Find this resource:

Aarons, G. A., Cafri, G., Lugo, L., & Sawitzky, A. (2012). Expanding the domains of attitudes towards evidence-based practice: The Evidence Based Practice Attitude Scale-50. Administration & Policy in Mental Health & Mental Health Services Research, 39, 331–340.Find this resource:

Aarons, G. A., Ehrhart, M., & Dlugosz, L. (March, 2012). Implementation climate and leadership for evidence-based practice implementation: Development of two new scales. Presented at the NIH Conference on the Science of Dissemination and Implementation, Bethesda, MD.Find this resource:

Aarons, G. A., Glisson, C., Green, P. D., Hoagwood, K., Kelleher, K. J., & Landsverk, J. A. (2012). The Research Network on Youth Mental Health: The organizational social context of mental health services and clinician attitudes toward evidence-based practice: A United States national study. Implementation Science, 7, 1–15.Find this resource:

Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health & Mental Health Services Research, 38, 4–23.Find this resource:

Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179–211.Find this resource:

Baker-Ericzen, M. J., Hurlburt, M. S., Bookman-Frazee, L., Jenkins, M. M., & Hough, R. L. (2009). Comparing child, parent, and family characteristics in usual care and empirically supported treatment research samples for children with disruptive behavior disorders. Journal of Emotional and Behavioral Disorders, 18, 82–99. doi: 10.1177/1063426609336956Find this resource:

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84, 191–215.Find this resource:

Beidas, R. S., Aarons, G., Barg, F., Evans, A., Hadley, T., Hoagwood, K., … Mandel, D. S. (2013). Policy to implementation: Evidence-based practice in community mental health—Study protocol. Implementation Science, 8, 1–9.Find this resource:

Beidas, R. S., Ditty, M., Downey, M. M., & Edmunds, J. (2014). Professional evidence-based practice with children and adolescents. In E. S. Sburlati, H. J. Lyneham, C. A. Schniering, & R. M. Rapee (Eds.), Evidence-based treatment of child and adolescent anxiety and depressive disorders: A competencies-based approach. West Sussex, UK: Wiley-Blackwell.Find this resource:

Beidas, R. S., Edmunds, J., Ditty, M., Watkins, J., Walsh, L., Marcus, S., & Kendall, P. C. (2013). Are inner context factors related to implementation outcomes in cognitive-behavioral therapy for youth anxiety? Administration & Policy in Mental Health & Mental Health Services Research, 41, 788–799. doi: 10.1007/s10488–013–0529-xFind this resource:

Beidas, R. S., Edmunds, J. M., Marcus, S. C., & Kendall, P. C. (2012). Training and consultation to promote implementation of an empirically supported treatment: A randomized trial. Psychiatric Services, 63, 660–665.Find this resource:

Beidas, R. S., & Kendall, P. C. (2010). Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice, 17, 1–30.Find this resource:

Beidas, R. S., Suarez, L., Simpson, D., Read, K., Wei, C., Connolly, S., & Kendall, P. C. (2012). Contextual factors and anxiety in minority and European American youth presenting for treatment across two urban university clinics. Journal of Anxiety Disorders, 26, 544–554. doi: 10.1016/j.janxdis.2012.02.008Find this resource:

Benjamin, C. L., Taylor, K. P., Goodin, S. M., & Creed, T. A. (2014). Dissemination and implementation of Cognitive Therapy for depression in schools. In R. S. Beidas & P. C. Kendall (Eds.), Dissemination and implementation of evidence-based practices in child and adolescent mental health (pp. 277–293). New York, NY: Oxford University Press.Find this resource:

Bond, G. R., Drake, R. E., McHugo, G. J., Peterson, A. E., Jones, A. M., & Williams, J. (2014). Long-term sustainability of evidence-based practices in community mental health agencies. Administration & Policy in Mental Health & Mental Health Services Research, 41, 228–236. doi: 10.1007/s10488–012–0461–5Find this resource:

Butler, A. C., Chapman, J. E., Forman, E. M., & Beck, A. T. (2006). The empirical status of cognitive-behavioral therapy: A review of meta-analyses. Clinical Psychology Review, 26, 17–31.Find this resource:

Candel, M., & Pennings, J. (1999). Attitude-based models for binary choices: A test for choices involving an innovation. Journal of Economic Psychology, 20, 547–569. doi: 10.1016/S0167–4870(99)00024–0Find this resource:

Chamberlain, P., Brown, C. H., Saldana, L., Reid, J., Wang, W., Marsenich, L., …Bouwman, G. (2008). Engaging and recruiting counties in an experiment on implementing evidence-based practice in California. Administration & Policy in Mental Health & Mental Health Services Research, 35, 250–260. doi: 10.1007/s10488–008–0167-xFind this resource:

Chambers, D. A. (2014). Guiding theory for dissemination and implementation research: A reflection on models used in research and practice. In R. S. Beidas & P. C. Kendall (Eds.), Dissemination and implementation of evidence-based practices in child and adolescent mental health (pp. 9–21). New York, NY: Oxford University Press.Find this resource:

Chambers, D. A., & Azrin, S. T. (2013). Partnership: A fundamental component of dissemination and implementation research. Psychiatric Services, 64, 509–511.Find this resource:

Chambless, D. L., & Hollon, S. D. (1998). Defining empirically supported therapies. Journal of Consulting and Clinical Psychology, 66, 7–18.Find this resource:

Chorpita, B. F., Daleiden, E. L., Ebesutani, C., Young, J., Becker, K. D., Nakamura, B. J., … Starace, N. (2011). Evidence-based treatment of children and adolescents: An updated review of indicators of efficacy and effectiveness. Clinical Psychology: Science & Practice, 18, 154–172.Find this resource:

Cochrane Effective Practice and Organisation of Care Group (2002). Data collection checklist (pp. 1–30).Find this resource:

Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, 1–15.Find this resource:

Dingfelder, H. E., Mandell, D. S., Stahmer, A. C., & Marcus, S. C. (May, 2011). Classroom climate, program fidelity, & outcomes for students with autism. Poster presented at the International Meeting for Autism Research (IMFAR), San Diego, CA.Find this resource:

Eccles, M. P., & Mittman, B. S. (2006). Welcome to implementation science. Implementation Science, 1, 1–3. doi: 10.1186/1748–5908–1-1Find this resource:

Ehrenreich-May, J., Southam-Gerow, M. A., Hourigan, S. E., Wright, L. R., Pincus, D. B., & Weisz, J. R. (2011). Characteristics of anxious and depressed youth seen in two different clinical contexts. Administration & Policy in Mental Health & Mental Health Services Research, 38, 398–411. doi: 10.1007/s10488–010–0328–6Find this resource:

Essock, S. M., Goldman, H. H., Van Tosh, L., Anthony, W. A., Appell, C. R., … Drake, R. E. (2003). Evidence-based practices: Setting the context and responding to concerns. Psychiatric Clinics of North America, 26, 919–938.Find this resource:

Flanagan, M. E., Ramanujam, R., & Doebbeling, B. N. (2009). The effect of provider-and workflow-focused strategies for guideline implementation on provider acceptance. Implementation Science, 4, 1–10.Find this resource:

Frambach, R. T., & Schillewaert, N. (2005). Organizational innovation adoption: A multi-level framework of determinants and opportunities for future research. Journal of Business Research, 55, 163–176. doi: 10.1016/S0148–2963(00)00152–1Find this resource:

Garland, A. F., Bickman, L., & Chorpita, B. F. (2010). Change what? Identifying quality improvement targets by investigating usual mental health care. Administration & Policy in Mental Health & Mental Health Services Research, 37, 15–26. doi: 10.1007/s10488–010–0279-yFind this resource:

Garland, A. F., Kruse, M., & Aarons, G. A. (2003). Clinicians and outcome measurement: What’s the use? Journal of Behavioral Health Services and Research, 30, 393–405.Find this resource:

Glisson, C. (1978). Dependence of technological routinization on structural variables in human service organizations. Administrative Science Quarterly, 23, 383–395.Find this resource:

Glisson, C. (2002). The organizational context of children’s mental health services. Clinical Child and Family Psychology Review, 5, 233–253.Find this resource:

Glisson, C., Dukes, D., & Green, P. (2006). The effects of the ARC organizational intervention on caseworker turnover, climate, and culture in children’s service systems. Child Abuse & Neglect, 30, 855–880.Find this resource:

Glisson, C., & Green, P. (2011). Organizational climate, services, and outcomes in child welfare systems. Child Abuse & Neglect, 35, 582–591.Find this resource:

Glisson, C., & Hemmelgarn, A. (1998). The effects of organizational climate and interorganizational coordination on the quality and outcomes of children’s service systems. Child Abuse & Neglect, 22, 401–421.Find this resource:

Glisson, C., Hemmelgarn, A., Green, P., Dukes, D., Atkinson, S., & Williams, N. J. (2012). Randomized trial of the Availability, Responsiveness, and Continuity (ARC) organizational intervention with community-based mental health programs and clinicians serving youth. Journal of the American Academy of Child & Adolescent Psychiatry, 51, 780–787. doi: 10.1016/j.jaac.2012.05.010Find this resource:

Glisson, C., Hemmelgarn, A., Green, P., & Williams, N. J. (2013). Randomized trial of the Availability, Responsiveness and Continuity (ARC) organizational intervention for improving youth outcomes in community mental health programs. Journal of the American Academy of Child & Adolescent Psychiatry, 52, 493–500. doi: 10.1016/j.jaac.2013.02.005Find this resource:

Glisson, C., & James, L. R. (2002). The cross-level effects of culture and climate in human service teams. Journal of Organizational Behavior, 23, 767–794.Find this resource:

Glisson, C., Landsverk, J. A., Schoenwald, S. K., Kelleher, K. J., Hoagwood, K. E., Mayberg, S., & Green, P. (2008). The Research Network on Youth Mental Health: Assessing the Organizational Social Context (OSC) of mental health services: Implications for research and practice. Administration & Policy in Mental Health & Mental Health Services Research, 35, 98–113.Find this resource:

Glisson, C., Schoenwald, S., Hemmelgarn, A., Green, P., Dukes, D., Armstrong, K., & Chapman, J. (2010). Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology, 78, 537–550.Find this resource:

Glisson, C., & Schoenwald, S. K. (2005). The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Mental Health Services Research, 7, 243–259. doi: 10.1007/s11020–005–7456–1Find this resource:

Glisson, C., Schoenwald, S. K., Kelleher, K. J., Landsverk, J. A., Hoagwood, K. E., Mayberg, S., & Green, P. (2008). The Research Network on Youth Mental Health: Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate and service structure. Administration & Policy in Mental Health & Mental Health Services Research, 35, 124–133.Find this resource:

Grant, A., & Wrzesniewski, A. (2010). I won’t let you down … or will I? Core self-evaluations, other-orientation, anticipated guilt and gratitude, and job performance. Journal of Applied Psychology, 95, 108–121.Find this resource:

Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. Milbank Quarterly, 82, 581–629.Find this resource:

Grol, R., Bosch, M. C., Hulscher, M. E. J., Eccles, M. P., & Wensing, M. (2007). Planning and studying improvement in patient care: The use of theoretical perspectives. The Milbank Quarterly, 85, 93–138. doi: 10.1111/j.1468–0009.2007.00478.xFind this resource:

Henshaw, E., & Freedman-Doan, C. (2009). Conceptualizing mental health care utilization using the Health Belief Model. Clinical Psychology: Science and Practice, 16, 420–439.Find this resource:

Herschell, A. D., Kolko, D. J., Baumann, B. L., & Davis, A. C. (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30, 448–466.Find this resource:

Hoagwood, K. E., Burns, B. J., Kiser, L., Heather, R., & Schoenwald, S. K. (2001). Evidence-based practice in child and adolescent mental health services. Psychiatric Services, 52, 1179–1189.Find this resource:

Hurlburt, M., Aarons, G. A., Fettes, D., Willging, C., Gunderson, L., & Chaffin, M. J. (2014). Interagency collaborative team model for capacity building to scale-up evidence-based practice. Children and Youth Services Review, 39, 160–168. doi: 10.1016/j.childyouth.2013.10.005Find this resource:

Institute of Medicine. (2001). Crossing the quality chasm: A new health system for the 21st Century. Washington, DC: National Academies Press.Find this resource:

Isett, K. R., Burnam, M. A., Coleman-Beattie, B., Hyde, P. S., Morrissey, J. P., Magnabosco, J., … Goldman, H. H. (2007). The state policy context of implementation issues for evidence-based practices in mental health. Psychiatric Services, 58, 914–921.Find this resource:

Jacobs, S. R., Weiner, B. J., & Bunger, A. C. (2014). Context matters: Measuring implementation climate among individuals and groups. Implementation Science, 9. doi: 10.1186/1748–5908–9-46Find this resource:

Kiviniemi, M., Bennett, A., Zaiter, M., & Marshall, J. (2011). Individual-level factors in colorectal cancer screening: A review of the literature on the relation of individual-level health behavior constructs and screening behavior. Psycho-Oncology, 20, 1023–1033.Find this resource:

Klein, K. T., & Sorra, J. S. (1996). The challenge of innovation implementation. The Academy of Management Review, 21, 1055–1088.Find this resource:

Lee, M. L., & Mittman, B. S. (July, 2012). Quantitative approaches for studying context-dependent, time-varying, adaptable complex social interventions. Paper presented at the Veterans Affairs Health Services Research and Development Service (HSR&D), Los Angeles, CA.Find this resource:

Lim, A., Nakamura, B. J., Higa-McMillan, C. K., Shimabukuro, S., & Slavin, L. (2012). Effects of workshop trainings on evidence-based practice knowledge and attitudes among youth community mental health providers. Behaviour Research and Therapy, 50, 397–406.

Magnabosco, J. L. (2006). Innovations in mental health services implementation: A report on state-level data from the U.S. Evidence-Based Practices Project. Implementation Science, 1, 13. doi: 10.1186/1748-5908-1-13

Massatti, R. R., Sweeney, H. A., Panzano, P. C., & Roth, D. (2008). The de-adoption of Innovative Mental Health Practices (IMHP): Why organizations choose not to sustain an IMHP. Administration & Policy in Mental Health & Mental Health Services Research, 35, 50–65.

Mazza, D., Bairstow, P., Buchan, H., Chakraborty, S. P., Van Hecke, O., Grech, C., & Kunnamo, I. (2013). Refining a taxonomy for guideline implementation: Results of an exercise in abstract classification. Implementation Science, 8, 32. doi: 10.1186/1748-5908-8-32

Mendel, P., Meredith, L. S., Schoenbaum, M., Sherbourne, C. D., & Wells, K. B. (2008). Interventions in organizational and community context: A framework for building evidence on dissemination and implementation in health services research. Administration & Policy in Mental Health & Mental Health Services Research, 35, 21–37.

Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81–95. doi: 10.1007/s12160-013-9486-6

Mittman, B. S. (2012). Implementation science in health care. In R. C. Brownson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (pp. 400–418). New York, NY: Oxford University Press.

New Freedom Commission on Mental Health. (2003). Achieving the promise: Transforming mental health care in America. Final Report. (DHHS Pub. No. SMA-03-3832). Rockville, MD: U.S. Department of Health and Human Services.

Novins, D. K., Green, A. E., Legha, R. K., & Aarons, G. A. (2013). Dissemination and implementation of evidence-based practices for child and adolescent mental health: A systematic review. Journal of the American Academy of Child & Adolescent Psychiatry, 52, 1009–1025. doi: 10.1016/j.jaac.2013.07.012

Palinkas, L. A., & Soydan, H. (2012). Translation and implementation of evidence-based practice. Oxford, UK: Oxford University Press.

Powell, B. J., Hausmann-Stabile, C., & McMillen, J. C. (2013). Mental health clinicians’ experiences of implementing evidence-based treatments. Journal of Evidence-Based Social Work, 10, 396–409.

Powell, B. J., McMillen, J. C., Proctor, E. K., Carpenter, C. R., Griffey, R. T., Bunger, A. C., … York, J. L. (2012). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69, 123–157.

Proctor, E. K., & Brownson, R. C. (2012). Measurement issues in dissemination and implementation research. In R. C. Brownson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (pp. 261–280). New York, NY: Oxford University Press.

Proctor, E. K., Landsverk, J., Aarons, G., Chambers, D., Glisson, C., & Mittman, B. (2009). Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration & Policy in Mental Health & Mental Health Services Research, 36, 24–34. doi: 10.1007/s10488-008-0197-4

Proctor, E. K., Powell, B. J., Baumann, A. A., Hamilton, A. M., & Santens, R. L. (2012). Writing implementation research grant proposals: Ten key ingredients. Implementation Science, 7, 96. doi: 10.1186/1748-5908-7-96

Proctor, E. K., Powell, B. J., & Feely, M. (2014). Measurement in dissemination and implementation science. In R. S. Beidas & P. C. Kendall (Eds.), Dissemination and implementation of evidence-based practices in child and adolescent mental health. New York, NY: Oxford University Press.

Proctor, E. K., Powell, B. J., & McMillen, J. C. (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8, 139. doi: 10.1186/1748-5908-8-139

Proctor, E. K., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G. A., Bunger, A., … Hensley, M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration & Policy in Mental Health & Mental Health Services Research, 38, 65–76. doi: 10.1007/s10488-010-0319-7

Raghavan, R., Bright, C. L., & Shadoin, A. L. (2008). Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implementation Science, 3, 26. doi: 10.1186/1748-5908-3-26

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.

Saldana, L., & Chamberlain, P. (2012). Supporting implementation: The role of community development teams to build infrastructure. American Journal of Community Psychology, 50, 334–346.

Scheirer, M. A. (2013). Linking sustainability research to intervention types. American Journal of Public Health, 103, e76–e80. doi: 10.2105/AJPH.2012.300976

Solberg, L. I., Brekke, M. L., Fazio, C. J., Fowles, J., Jacobsen, D. N., Kottke, T. E., … Rolnick, S. J. (2000). Lessons from experienced guideline implementers: Attend to many factors and use multiple strategies. The Joint Commission Journal on Quality Improvement, 26, 171–188.

Sosna, T., & Marsenich, L. (2006). Community development team model: Supporting the model adherent implementation of programs and practices. Sacramento, CA: California Institute for Mental Health.

Southam-Gerow, M. A., Chorpita, B. F., Miller, L. M., & Gleacher, A. A. (2008). Are children with anxiety disorders self-referred to a university clinic like those from the public mental health system? Administration & Policy in Mental Health & Mental Health Services Research, 35, 168–180.

Southam-Gerow, M. A., Hourigan, S. E., & Allin, R. B. (2009). Adapting evidence-based mental health treatments in community settings: Preliminary results from a partnership approach. Behavior Modification, 33, 82–103.

Southam-Gerow, M. A., Weisz, J. R., & Kendall, P. C. (2003). Youth with anxiety disorders in research and service clinics: Examining client differences and similarities. Journal of Clinical Child and Adolescent Psychology, 32, 375–385.

Spring, B. (2007). Evidence-based practice in clinical psychology: What it is, why it matters, what you need to know. Journal of Clinical Psychology, 63, 611–631.

Stark, K. D., Arora, P., & Funk, C. L. (2011). Training school psychologists to conduct evidence-based treatments for depression. Psychology in the Schools, 48, 272–282.

Stewart, R. E., & Chambless, D. L. (2007). Does psychotherapy research inform treatment decisions in private practice? Journal of Clinical Psychology, 63, 267–281.

Swain, K., Whitley, R., McHugo, G. J., & Drake, R. E. (2010). The sustainability of evidence-based practices in routine mental health agencies. Community Mental Health Journal, 46, 119–129.

Tabak, R. G., Khoong, E. C., Chambers, D. A., & Brownson, R. C. (2012). Bridging research and practice: Models for dissemination and implementation research. American Journal of Preventive Medicine, 43, 337–350. doi: 10.1016/j.amepre.2012.05.024

U.S. Department of Health and Human Services. (1999). Mental health: A report of the Surgeon General. Rockville, MD: National Institute of Mental Health.

Weiner, B. J., Belden, C. M., Bergmire, D. M., & Johnston, M. (2011). The meaning and measurement of implementation climate. Implementation Science, 6, 78. doi: 10.1186/1748-5908-6-78

Weissman, M. M., Verdeli, H., Gameroff, M. J., Bledsoe, S. E., Betts, K., Mufson, L., … Wickramaratne, P. (2006). National survey of psychotherapy training in psychiatry, psychology, and social work. Archives of General Psychiatry, 63, 925–934.

Williams, N. J., & Glisson, C. (2014). The role of organizational culture and climate in the dissemination and implementation of empirically supported treatments for youth. In R. S. Beidas & P. C. Kendall (Eds.), Dissemination and implementation of evidence-based practices in child and adolescent mental health (pp. 61–81). New York, NY: Oxford University Press.