Journal of the Society for Social Work and Research


Subject Area and Category

  • Social Sciences (miscellaneous)
  • Sociology and Political Science

ISSN

1948-822X (electronic), 2334-2315 (print)


The set of journals is ranked according to SJR and divided into four equal groups, four quartiles. Q1 (green) comprises the quarter of the journals with the highest values, Q2 (yellow) the second highest values, Q3 (orange) the third highest values, and Q4 (red) the lowest values.

The SJR is a size-independent prestige indicator that ranks journals by their 'average prestige per article'. It is based on the idea that 'all citations are not created equal'. SJR is a measure of the scientific influence of journals that accounts for both the number of citations received by a journal and the importance or prestige of the journals from which such citations come. It measures the scientific influence of the average article in a journal; it expresses how central to the global scientific discussion an average article of the journal is.

Evolution of the number of published documents. All types of documents are considered, including citable and non-citable documents.

This indicator counts the number of citations received by documents from a journal and divides them by the total number of documents published in that journal. The chart shows the evolution of the average number of times documents published in a journal in the past two, three, and four years have been cited in the current year. The two-year line is equivalent to the Journal Impact Factor™ (Thomson Reuters) metric.

Evolution of the total number of citations and journal self-citations received by a journal's published documents during the three previous years. Journal self-citation is defined as the number of citations from a journal's citing articles to articles published by the same journal.

Evolution of the number of total citations per document and external citations per document (i.e., journal self-citations removed) received by a journal's published documents during the three previous years. External citations are calculated by subtracting the number of self-citations from the total number of citations received by the journal's documents.
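The citation indicators described above reduce to simple per-document arithmetic. A minimal sketch, using hypothetical counts rather than real SCImago data:

```python
# Illustrative only: hypothetical counts, not SCImago's actual computation.

def cites_per_doc(citations: int, documents: int) -> float:
    """Average citations received per document published in a window."""
    return citations / documents if documents else 0.0

# Hypothetical three-year window for a journal
total_citations = 480   # all citations to documents in the window
self_citations = 60     # citations coming from the journal itself
documents = 150         # documents published in the window

# External citations: total minus self-citations
external_citations = total_citations - self_citations

print(round(cites_per_doc(total_citations, documents), 2))     # total cites per doc
print(round(cites_per_doc(external_citations, documents), 2))  # external cites per doc
```

The same division underlies the two-, three-, and four-year lines; only the window over which documents and citations are counted changes.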

International Collaboration accounts for the articles that have been produced by researchers from several countries. The chart shows the ratio of a journal's documents signed by researchers from more than one country; that is, documents including more than one country address.

Not every article in a journal is considered primary research and therefore "citable". This chart shows the ratio of a journal's articles including substantial research (research articles, conference papers, and reviews) in three-year windows vs. those documents other than research articles, reviews, and conference papers.

Ratio of a journal's items, grouped in three-year windows, that have been cited at least once vs. those not cited during the following year.
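Each of the ratio indicators above (international collaboration, citable documents, cited documents) is a simple share of the journal's document count. A hedged sketch with made-up tallies:

```python
# Illustrative only: hypothetical document tallies for one three-year window.

def share(part: int, whole: int) -> float:
    """Fraction of documents with a given property."""
    return part / whole if whole else 0.0

docs_total = 200          # all documents in the window
docs_multi_country = 50   # author addresses from more than one country
docs_citable = 180        # research articles, reviews, conference papers
docs_cited = 120          # cited at least once in the following year

print(share(docs_multi_country, docs_total))  # international collaboration ratio
print(share(docs_citable, docs_total))        # citable vs. non-citable ratio
print(share(docs_cited, docs_total))          # cited vs. uncited ratio
```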

Evolution of the percentage of female authors.

Evolution of the number of documents cited by public policy documents, according to the Overton database.

Evolution of the number of documents related to the Sustainable Development Goals defined by the United Nations. Available from 2018 onwards.

Scimago Journal & Country Rank


Scimago Lab, Copyright 2007-2024. Data Source: Scopus®



Child Care and Early Education Research Connections

Journal of the Society for Social Work & Research

Alternate Title: Journal of the Society for Social Work and Research
ISSN: 2334-2315 (paper), 1948-822X (electronic)
Full Text Availability: https://www.journals.uchicago.edu/toc/jsswr/current
Journal Site: https://www.journals.uchicago.edu/toc/jsswr/current
Peer Reviewed: Yes
Publication Frequency: Quarterly
Priority: Low
Journal ID: 168056
Created On: Tue, 10/24/2023 - 12:00
Updated On: Tue, 10/24/2023 - 12:00
Journal Publisher(s): University of Chicago Press


The Pursuit of Quality for Social Work Practice: Three Generations and Counting

Enola Proctor

Shanti K. Khinduka Distinguished Professor and director of the Center for Mental Health Services Research at Washington University in St. Louis

Social work addresses some of the most complex and intractable human and social problems: poverty, mental illness, addiction, homelessness, and child abuse. Our field may be distinct among professions for its efforts to ameliorate the toughest societal problems, experienced by society’s most vulnerable, while working from under-resourced institutions and settings. Members of our profession are underpaid, and most of our agencies lack the data infrastructure required for rigorous assessment and evaluation.

Moreover, social work confronts these challenges as it is ethically bound to deliver high-quality services. Policy and regulatory requirements increasingly demand that social work deliver and document the effectiveness of the highest quality interventions and restrict reimbursement to those services that are documented as evidence based. Social work’s future, its very survival, depends on our ability to deliver services with a solid base of evidence and to document their effectiveness. In the words of the American Academy of Social Work and Social Welfare ( AASWSW, n.d. ), social work seeks to “champion social progress powered by science.” The research community needs to support practice through innovative and rigorous science that advances the evidence for interventions to address social work’s grand challenges.

My work seeks to improve the quality of social work practice by pursuing answers to three questions:

  • What interventions and services are most effective and thus should be delivered in social work practice?
  • How do we measure the impact of those interventions and services? (That is, what outcomes do our interventions achieve?)
  • How do we implement the highest quality interventions?

This paper describes this work, demonstrates the substantive and methodological progression across the three questions, assesses what we have learned, and forecasts a research agenda for what we still need to learn. Given Aaron Rosen’s role as my PhD mentor and our many years of collaboration, the paper also addresses the role of research mentoring in advancing our profession’s knowledge base.

What Interventions and Services Are Most Effective?

Answering the question “What services are effective?” requires rigorous testing of clearly specified interventions. The first paper I coauthored with Aaron Rosen—“Specifying the Treatment Process: The Basis for Effectiveness Research” ( Rosen & Proctor, 1978 )—provided a framework for evaluating intervention effectiveness. At that time, process and outcomes were jumbled and intertwined concepts. Social work interventions were rarely specified beyond theoretical orientation or level of focus: casework (or direct practice); group work; and macro practice, which included community, agency-level, and policy-focused practice. Moreover, interventions were not named, nor were their components clearly identified. We recognized that gross descriptions of interventions obstruct professional training, preclude fidelity assessment, and prevent accurate tests of effectiveness. Thus, in a series of papers, Rosen and I advocated that social work interventions be specified, clearly labeled, and operationally defined, measured, and tested.

Specifying Interventions

Such specification of interventions is essential to two professional responsibilities: professional education and demonstrating the effectiveness of the field’s interventions. Without specification, interventions cannot be taught. Social work education is all about equipping students with skills to deliver interventions, programs, services, administrative practices, and policies. Teaching interventions requires the ability to name and define them, see them in action, measure their presence (or absence), assess the fidelity with which they are delivered, and give feedback to students on how to increase or refine the associated skills.

To advance testing the effectiveness of social work interventions, we drew distinctions between interventions and outcomes and proposed these two constructs as the foci for effectiveness research. We defined interventions as practitioner behaviors that can be volitionally manipulated by practitioners (used or not, varied in intensity and timing), that are defined in detail, can be reliably measured, and can be linked to specific identified outcomes ( Rosen & Proctor, 1978 ; Rosen & Proctor, 1981 ). This definition foreshadowed the development of treatment manuals, lists of specific evidence-based practices, and calls for monitoring intervention fidelity. Recognizing the variety of intervention types, and to advance their more precise definition and measurement, we proposed that interventions be distinguished in terms of their complexity. Interventive responses comprise discrete or single responses, such as affirmation, expression of empathy, or positive reinforcement. Interventive strategies comprise several different actions that are, together, linked to a designated outcome, such as motivational interviewing. Most complex are interventive programs , which are a variety of intervention actions organized and integrated as a total treatment package; collaborative care for depression or assertive community treatment are examples. To strengthen the professional knowledge base, we also called for social work effectiveness research to begin testing the optimal dose and sequencing of intervention components in relation to attainment of desired outcomes.

Advancing Intervention Effectiveness Research

Our “specifying paper” also was motivated by the paucity of literature at that time on actual social work interventions. Our literature review of 13 major social work journals over 5 years of published research revealed that only 15% of published social work research addressed interventions. About a third of studies described social problems, and about half explored factors associated with the problem ( Rosen, Proctor, & Staudt, 2003 ). Most troubling was our finding that only 3% of articles described the intervention or its components in sufficient detail for replication in either research or practice. Later, Fraser (2004) found intervention research to comprise only about one fourth of empirical studies in social work. Fortunately, our situation has improved. Intervention research is more frequent in social work publications, thanks largely to the publication policies of the Journal of the Society for Social Work and Research and Research on Social Work Practice .

Research Priorities

Social work faces important and formidable challenges as it advances research on intervention effectiveness. The practitioner who searches the literature or various intervention lists can find more than 500 named practices whose evidence from rigorous trials passes the bar to qualify as evidence-based practices. However, our profession still lacks any organized compendium or taxonomy of interventions that are employed in or found to be effective for social work practice. Existing lists of evidence-based practices, although necessary, are insufficient for social work for several reasons. First, as a 2015 National Academies Institute of Medicine (IOM) report—“Psychosocial Interventions for Mental and Substance Use Disorders: A Framework for Establishing Evidence-Based Standards” ( IOM, 2015 )—concluded, too few evidence-based practices have been found to be appropriate for low-resource settings or acceptable to minority groups. Second, existing interventions do not adequately reflect the breadth of social work practice. We have too few evidence-based interventions that can inform effective community organization, case management, referral practice, resource development, administrative practice, or policy. Noting that there is far less literature on evidence-based practices relevant to organizational, community, and policy practice, a social work task force responding to the 2015 IOM report recommended that this gap be a target of our educational and research efforts ( National Task Force on Evidence-Based Practice in Social Work, 2016 ). And finally, our field—along with other professions that deliver psychosocial interventions—lacks the kinds of procedure codes that can identify the specific interventions we deliver. Documenting social work activities in agency records is increasingly essential for quality assurance and third-party reimbursement.

Future Directions: Research to Advance Evidence on Interventions

Social work has critically important research needs. Our field needs to advance the evidence base on what interventions work for social work populations, practices, and settings. Responding to the 2015 IOM report, the National Task Force on Evidence-Based Practice in Social Work (2016) identified as a social work priority the development and testing of evidence-based practices relevant to organizational, community, and policy practice. As we advance our intervention effectiveness research, we must respond to the challenge of determining the key mechanisms of change ( National Institute of Mental Health, 2016 ) and identify key modifiable components of packaged interventions ( Rosen & Proctor, 1978 ). We need to explore the optimal dosage, ordering, or adapted bundling of intervention elements and advance robust, feasible ways to measure and increase fidelity ( Jaccard, 2016 ). We also need to conduct research on which interventions are most appropriate, acceptable, and effective with various client groups ( Zayas, 2003 ; Videka, 2003 ).

Documenting the Impact of Interventions: Specifying and Measuring Outcomes

Outcomes are key to documenting the impact of social work interventions. My 1978 “specifying” paper with Rosen emphasized that the effectiveness of social work practice could not be adequately evaluated without clear specification and measurement of various types of outcomes. In that paper, we argued that the profession cannot rely only on an assertion of effectiveness. The field must also calibrate, calculate, and communicate its impact.

The nursing profession’s highly successful campaign, based on outcomes research, positioned that field to claim that “nurses save lives.” Nurse staffing ratios were associated with in-hospital and 30-day mortality, independent of patient characteristics, hospital characteristics, or medical treatment ( Person et al., 2004 ). In contrast, social work has often described—sometimes advertised—itself as the low-cost profession. The claim of “cheapest service” may have some strategic advantage in turf competition with other professions. But social work can do better. Our research base can and should demonstrate the value of our work by naming and quantifying the outcomes—the added value of social work interventions.

As a start to this work—a beginning step in compiling evidence about the impact of social work interventions—our team set out to identify the outcomes associated with social work practice. We felt that identifying and naming outcomes is essential for conveying what social work is about. Moreover, outcomes should serve as the focus for evaluating the effectiveness of social work interventions.

We produced two taxonomies of outcomes reflected in published evaluations of social work interventions ( Proctor, Rosen, & Rhee, 2002 ; Rosen, Proctor, & Staudt, 2003 ). They included such outcomes as change in clients’ social functioning, resource procurement, problem or symptom reduction, and safety. They exemplify the importance of naming and measuring what our profession can contribute to society. Although social work’s growing body of effectiveness research typically reports outcomes of the interventions being tested, the literature has not, in the intervening 20 years, addressed the collective set of outcomes for our field.

Fortunately, the Grand Challenges for Social Work (AASWSW, n.d.) now provide a framework for communicating social work’s goals. They reflect social work’s added value: improving individual and family well-being, strengthening social fabric, and helping to create a more just society. The Grand Challenges for Social Work include ensuring healthy development for all youth, closing the health gap, stopping family violence, advancing long and productive lives, eradicating social isolation, ending homelessness, creating social responses to a changing environment, harnessing technology for social good, promoting smart decarceration, reducing extreme economic inequality, building financial capability for all, and achieving equal opportunity and justice ( AASWSW, n.d. ).

These important goals appropriately reflect much of what we are all about in social work, and our entire field has been galvanized—energized by the power of these grand challenges. However, the grand challenges require setting specific benchmarks—targets that reflect how far our professional actions can expect to take us, or in some areas, how far we have come in meeting the challenge.

For the past decade, care delivery systems and payment reforms have required measures for tracking performance. Quality measures have become critical tools for all service providers and organizations ( IOM, 2015 ). The IOM defines quality of care as “the degree to which … services for individuals and populations increase the likelihood of desired … outcomes and are consistent with current professional knowledge” ( Lohr, 1990 , p. 21). Quality measures are important at multiple levels of service delivery: at the client level, at the practitioner level, at the organization level, and at the policy level. The National Quality Forum has established five criteria for quality measures: They should address (a) the most important, (b) the most scientifically valid, (c) the most feasible or least burdensome, (d) the most usable, and (e) the most harmonious set of measures ( IOM, 2015 ). Quality measures have been advanced by accrediting groups (e.g., the Joint Commission and the National Committee for Quality Assurance), professional societies, and federal agencies, including the U.S. Department of Health and Human Services. However, quality measures are lacking for key areas of social work practice, including mental health and substance-use treatment. And of the 55 nationally endorsed measures related to mental health and substance use, only two address a psychosocial intervention. Measures used for accreditation and certification purposes often reflect structural capabilities of organizations and their resource use, not the infrastructure required to deliver high-quality services ( IOM, 2015 ). I am not aware of any quality measure developed by our own professional societies or agreed upon across our field.

Future Directions: Research on Quality Monitoring and Measure Development

Although social work as a field lacks a strong tradition of measuring and assessing quality ( Megivern et al., 2007 ; McMillen et al., 2005 ; Proctor, Powell, & McMillen, 2012 ), social work’s role in the quality workforce is becoming better understood ( McMillen & Raffol, 2016 ). The small number of established and endorsed quality measures reflects both limitations in the evidence for effective interventions and challenges in obtaining the detailed information necessary to support quality measurement ( IOM, 2015 ). According to the National Task Force on Evidence-Based Practice in Social Work (2016) , developing quality measures to capture use of evidence-based interventions is essential for the survival of social work practice in many settings. The task force recommends that social work organizations develop relevant and viable quality measures and that social workers actively influence the implementation of quality measures in their practice settings.

How to Implement Evidence-Based Care

A third and more recent focus of my work addresses this question: How do we implement evidence-based care in agencies and communities? Despite our progress in developing proven interventions, most clients—whether served by social workers or other providers—do not receive evidence-based care. A growing number of studies are assessing the extent to which clients—in specific settings or communities—receive evidence-based interventions. Kohl, Schurer, and Bellamy (2009) examined quality in a core area of social work: training for parents at risk for child maltreatment. The team examined the parent services and their level of empirical support in community agencies, staffed largely by master’s-level social workers. Of 35 identified treatment programs offered to families, only 11% were “well-established empirically supported interventions,” with another 20% containing some hallmarks of empirically supported interventions ( Kohl et al., 2009 ). This study reveals a sizable implementation gap, with most of the programs delivered lacking scientific validation.

Similar quality gaps are apparent in other settings where social workers deliver services. Studies show that only 19.3% of school mental health professionals and 36.8% of community mental health professionals working in Virginia’s schools and community mental health centers report using any evidence-based substance-abuse prevention programs ( Evans, Koch, Brady, Meszaros, & Sadler, 2013 ). In mental health, where social workers have long delivered the bulk of services, only 40% to 50% of people with mental disorders receive any treatment ( Kessler, Chiu, Demler, Merikangas, & Walters, 2005 ; Merikangas et al., 2011 ), and of those receiving treatment, a fraction receive what could be considered “quality” treatment ( Wang, Demler, & Kessler, 2002 ; Wang et al., 2005 ). These and other studies indicate that, despite progress in developing proven interventions, most clients do not receive evidence-based care. In light of the growth of evidence-based practice, this fact is troubling evidence that testing interventions and publishing the findings is not sufficient to improve quality.

So, how do we get these interventions in place? What is needed to enable social workers to deliver, and clients to receive, high-quality care? In addition to developing and testing evidence-based interventions, what else is needed to improve the quality of social work practice? My work has focused on advancing quality of services through two paths.

Making Effective Interventions Accessible to Providers: Intervention Reviews and Taxonomies

First, we have advocated that research evidence be synthesized and made available to front-line practitioners. In a research-active field where new knowledge is constantly produced, practitioners should not be expected to rely on journal publications alone for information about effective approaches to achieve desired outcomes. Mastering a rapidly expanding professional evidence base has been characterized as a nearly unachievable challenge for practitioners ( Greenfield, 2017 ). Reviews should critique and clarify the intervention’s effectiveness as tested in specific settings, populations, and contexts, answering the question, “What works where, and with whom?” Even more valuable are studies of comparative effectiveness—those that answer, “Which intervention approach works better, where, and when?”

Taxonomies of clearly and consistently labeled interventions will enhance their accessibility and the usefulness of research reports and systematic reviews. A prerequisite is the consistent naming of interventions. A persistent challenge is the wide variation in names or labels for interventive procedures and programs. Our professional activities are the basis for our societal sanction, and they must be capable of being accurately labeled and documented if we are to describe what our profession “does” to advance social welfare. Increasingly, and in short order, that documentation will be in electronic records that are scrutinized by third parties for purposes of reimbursement and assessment of value toward outcome attainment.

How should intervention research and reviews be organized? Currently, several websites provide lists of evidence-based practices, some with links, citations, or information about dissemination and implementation organizations that provide training and facilitation to adopters. Practitioners and administrators find such lists helpful but often note the challenge in determining which are most appropriate for their needs. In the words of one agency leader, “The drug companies are great at presenting [intervention information] in a very easy form to use. We don’t have people coming and saying, ‘Ah, let me tell you about the best evidence-based practice for cognitive behavioral therapy for depression,’” ( Proctor et al., 2007 , p. 483). We have called for the field to devise decision aids for practitioners to enhance access to the best available empirical knowledge about interventions ( Proctor et al., 2002 ; Proctor & Rosen, 2008 ; Rosen et al., 2003 ). We proposed that intervention taxonomies be organized around outcomes pursued in social work practice, and we developed such a taxonomy based on eight domains of outcomes—those most frequently tested in social work journals. Given the field’s progress in identifying its grand challenges, its associated outcomes could well serve as the organizing focus, with research-tested interventions listed for each challenge. Compiling the interventions, programs, and services that are shown—through research—to help achieve one of the challenges would surely advance our field.

We further urged profession-wide efforts to develop social work practice guidelines from intervention taxonomies ( Rosen et al., 2003 ). Practice guidelines are systematically compiled, critiqued, and organized statements about the effectiveness of interventions that are organized in a way to help practitioners select and use the most effective and appropriate approaches for addressing client problems and pursuing desired outcomes.

At that time, we proposed that our published taxonomy of social work interventions could provide a beginning architecture for social work guidelines ( Rosen et al., 2003 ). In 2000, we organized a conference for thought leaders in social work practice. This talented group wrestled with and formulated recommendations for tackling the professional, research, and training requisites to developing social work practice guidelines that would enable practitioners to access and apply the best available knowledge about interventions ( Rosen et al., 2003 ). Fifteen years later, however, the need remains for social work to synthesize its intervention research. Psychology and psychiatry, along with most fields of medical practice, have developed practice guidelines. Although their acceptance and adherence are fraught with challenges, guidelines make evidence more accessible and enable quality monitoring. Yet, guidelines still do not exist for social work.

The 2015 IOM report, “Psychosocial Interventions for Mental and Substance Use Disorders: A Framework for Establishing Evidence-Based Standards,” includes a conclusion that information on the effectiveness of psychosocial interventions is not routinely available to service consumers, providers, and payers, nor is it synthesized. That 2015 IOM report called for systematic reviews to inform clinical guidelines for psychosocial interventions. This report defined psychosocial interventions broadly, encompassing “interpersonal or informational activities, techniques, or strategies that target biological, behavioral, cognitive, emotional, interpersonal, social, or environmental factors with the aim of reducing symptoms and improving functioning or well-being” ( IOM, 2015 , p. 5). These interventions are social work’s domain; they are delivered in the very settings where social workers dominate (behavioral health, schools, criminal justice, child welfare, and immigrant services); and they encompass populations across the entire lifespan within all sociodemographic groups and vulnerable populations. Accordingly, the National Task Force on Evidence Based Practice in Social Work (2016) has recommended the conduct of more systematic reviews of the evidence supporting social work interventions.

If systematic reviews are to lead to guidelines for evidence-based psychosocial interventions, social work needs to be at the table, and social work research must provide the foundation. Whether social work develops its own guidelines or helps lead the development of profession-independent guidelines as recommended by the IOM committee, guidelines need to be detailed enough to guide practice. That is, they need to be accompanied by treatment manuals and informed by research that details the effect of moderator variables and contextual factors reflecting diverse clientele, social determinants of health, and setting resource challenges. The IOM report “Clinical Practice Guidelines We Can Trust” sets criteria for guideline development processes ( IOM, 2011 ). Moreover, social work systematic reviews of research and any associated evidence-based guidelines need to be organized around meaningful taxonomies.

Advancing the Science of Implementation

As a second path to ensuring the delivery of high-quality care, my research has focused on advancing the science of implementation. Implementation research seeks to inform how to deliver evidence-based interventions, programs, and policies in real-world settings so their benefits can be realized and sustained. The ultimate aim of implementation research is building a base of evidence about the most effective processes and strategies for improving service delivery. Implementation research builds upon effectiveness research and then seeks to discover how to use specific implementation strategies to move those interventions into specific settings, extending their availability, reach, and benefits to clients and communities. Accordingly, implementation strategies must address the challenges of the service system (e.g., specialty mental health, schools, criminal justice system, health settings), the practice setting (e.g., community agency, national employee assistance programs, office-based practice), and the human capital challenge of staff training and support.

In an approach that echoes themes in an early paper, “Specifying the Treatment Process: The Basis for Effectiveness Research” ( Rosen & Proctor, 1978 ), my work once again tackled the challenge of specifying a heretofore vague process—this time, not the intervention process, but the implementation process. As a first step, our team developed a taxonomy of implementation outcomes ( Proctor et al., 2011 ), which enables a direct test of whether or not a given intervention is adopted and delivered. Although other types of research overlook this distinct type of outcome, implementation science makes it a central focus. Explicit examination of implementation outcomes is key to an important research distinction. Often, evaluations yield disappointing results about an intervention, showing that the expected and desired outcomes are not attained. This might mean that the intervention was not effective. However, just as likely, it could mean that the intervention was not actually delivered, or it was not delivered with fidelity. Implementation outcomes help identify the roadblocks on the way to intervention adoption and delivery.

Our 2011 taxonomy of implementation outcomes (Proctor et al., 2011) became the framework for two national repositories of measures for implementation research: the Seattle Implementation Research Collaborative (Lewis et al., 2015) and the National Institutes of Health GEM measures database (Rabin et al., 2012). These repositories seek to harmonize measures and increase the rigor of measurement in implementation science.

We have also developed taxonomies of implementation strategies (Powell et al., 2012; Powell et al., 2015; Waltz et al., 2014, 2015). Implementation strategies are interventions for system change—how organizations, communities, and providers can learn to deliver new and more effective practices (Powell et al., 2012).

A conversation with a key practice leader stimulated my interest in implementation strategies. Shortly after our school endorsed an MSW curriculum emphasizing evidence-based practices, a pioneering CEO of a major social service agency in St. Louis met with me and asked,

Enola Proctor, I get the importance of delivering evidence-based practices. My organization delivers over 20 programs and interventions, and I believe only a handful of them are really evidence based. I want to decrease our provision of ineffective care, and increase our delivery of evidence-based practices. But how? What are the evidence-based ways I, as an agency director, can transform my agency so that we can deliver evidence-based practices?

That agency director was asking a question of how. He was asking for evidence-based implementation strategies. Moving effective programs and practices into routine care settings requires the skillful use of implementation strategies, defined as systematic “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice into routine service” (Proctor et al., 2013, p. 2).

This question has shaped my work for the past 15 years, as well as the research priorities of several funding agencies, including the National Institutes of Health, the Agency for Healthcare Research and Quality, the Patient-Centered Outcomes Research Institute, and the World Health Organization. Indeed, a National Institutes of Health program announcement—Dissemination and Implementation Research in Health (National Institutes of Health, 2016)—identified the discovery of effective implementation strategies as a primary purpose of implementation science. The implementation science literature cannot yet answer that important question, but we are making progress.

To identify implementation strategies, our teams first turned to the literature—a literature that we found to be scattered across a wide range of journals and disciplines. Most articles were not empirical, and they used widely differing terms to characterize implementation strategies. We conducted a structured literature review to generate a common nomenclature and a taxonomy of implementation strategies. That review yielded 68 distinct implementation strategies, which fell into six groupings: planning, educating, financing, restructuring, managing quality, and attending to policy context (Powell et al., 2012).

Our team refined that compilation, using Delphi techniques and concept mapping to develop conceptually distinct categories of implementation strategies (Powell et al., 2015; Waltz et al., 2014). The refined compilation of 73 discrete implementation strategies was then further organized into nine clusters:

  • changing agency infrastructure,
  • using financial strategies,
  • supporting clinicians,
  • providing interactive assistance,
  • training and educating stakeholders,
  • adapting and tailoring interventions to context,
  • developing stakeholder relationships,
  • using evaluative and iterative strategies, and
  • engaging consumers.

These taxonomies of implementation strategies position the field for more robust research on implementation processes. The language used to describe implementation strategies has not yet “gelled” and has been described as a “Tower of Babel” (McKibbon et al., 2010). Therefore, we also developed guidelines for reporting the components of strategies (Proctor et al., 2013) so that researchers and implementers would have more behaviorally specific information about what a strategy is, who enacts it, when, and for how long. The value of such reporting guidelines is illustrated in the work of Gold and colleagues (2016).

What have we learned through our own program of research on implementation strategies—the “how to” of improving practice? First, we have identified, from practice-based evidence, the implementation strategies used most often. Using novel activity logs to track implementation strategies, Bunger and colleagues (2017) found that strategies such as quality improvement tools, using data experts, providing supervision, and sending clinical reminders were frequently used to facilitate delivery of behavioral health interventions within a child-welfare setting and were perceived by agency leadership as contributing to project success.

Second, reflecting the complexity of quality improvement processes, we have learned that there is no magic bullet (Powell, Proctor, & Glass, 2013). Our study of U.S. Department of Veterans Affairs clinics working to implement evidence-based hepatitis C treatment found that implementers used an average of 25 (plus or minus 14) different implementation strategies (Rogal et al., 2017). Moreover, the number of implementation strategies used was positively associated with the number of new treatment starts. These findings suggest that implementing new interventions requires considerable effort and resources.

To advance our understanding of the effectiveness of implementation strategies, our teams have conducted a systematic review (Powell et al., 2013), tested specific strategies, and captured practice-based evidence from on-the-ground implementers. Testing the effectiveness of implementation strategies has been identified as a top research priority by the Institute of Medicine (IOM, 2009). In work with Charles Glisson in St. Louis, our randomized clinical trial involving 15 agencies found that an organization-focused intervention—the availability, responsiveness, and continuity (ARC) model—improved agency culture and climate, stimulated more clinicians to enroll in evidence-based-practice training, and boosted the clinical effect sizes of various evidence-based practices (Glisson, Williams, Hemmelgarn, Proctor, & Green, 2016a, 2016b). And in a hospital critical care unit, the implementation strategies of developing a team, selecting and using champions, holding provider education sessions, and using audit and feedback helped increase team adherence to phlebotomy guidelines (Steffen et al., in press).

We are also learning about the value of different strategies. Experts in implementation science and implementation practice identified as most important the strategies of “use evaluative and iterative approaches” and “train and educate stakeholders.” Reported as less helpful were strategies such as “access new funding streams” and “remind clinicians of practices to use” (Waltz et al., 2015). Successful implementers in Veterans Affairs clinics relied more heavily on strategies such as “change physical structures and equipment” and “facilitate relay of clinical data to providers” than did less successful implementers (Rogal et al., 2017).

Many strategies have yet to be investigated empirically, as has the role of dissemination and implementation organizations—organizations that promote, provide information about, provide training in, and scale up specific treatments. Most evidence-based practices used in behavioral health, including most of those listed on the Substance Abuse and Mental Health Services Administration’s National Registry of Evidence-based Programs and Practices, are disseminated and distributed by such organizations. Unlike drugs and devices, psychosocial interventions have no Food and Drug Administration-like delivery system. Kreuter and Casey (2012) urge better understanding and use of the intervention “delivery system,” or mechanisms to bring treatment discoveries to the attention of practitioners and into use in practice settings.

Implementation strategies have been shown to boost clinical effectiveness (Glisson et al., 2010), reduce staff turnover (Aarons, Sommerfeld, Hecht, Silovsky, & Chaffin, 2009), and help reduce disparities in care (Balicer et al., 2015).

Future directions: Research on implementation strategies

My work in implementation science has helped build intellectual capital for the rapidly growing field of dissemination and implementation science, leading teams to distinguish and clearly define key constructs, develop taxonomies, and stimulate more systematic work to advance the conceptual, linguistic, and methodological clarity of the field. Yet we continue to lack understanding of many issues. What strategies are used in usual implementation practice, by whom, and for which empirically supported interventions? What strategies are effective in which organizational and policy contexts? Which strategies are effective in attaining which specific implementation outcomes? For example, are the strategies that are effective for initial adoption also effective for scale-up, spread, and sustained use of interventions? Social workers have the skill set for roles as implementation facilitators, and refining packages of implementation strategies that are effective in social service and behavioral health settings could boost the visibility, scale, and impact of our work.

The Third Generation and Counting

Social work faces grand, often daunting challenges. We need to develop a more robust base of evidence about the effectiveness of interventions and make that evidence more relevant, accessible, and applicable to social work practitioners, whether they work in communities, agencies, policy arenas, or a host of novel settings. We need to advance measurement-based care so our value as a field is recognized. We need to know how to bring proven interventions to scale for population-level impact. We need to discover ways to build capacity of social service agencies and the communities in which they reside. And we need to learn how to sustain advances in care once we achieve them (Proctor et al., 2015). Our challenges are indeed grand, far outstripping our resources.

So how dare we speak of a quality quest? Does it not seem audacious to seek the highest standards in caring for the most vulnerable, especially in an era when we face a new political climate that threatens vulnerable groups and promises to strip resources from health and social services? Members of our profession are underpaid, and most of our agencies lack the data infrastructure required for assessment and evaluation. Quality may be an audacious goal, but as social workers we can pursue no less. By virtue of our code of ethics, our commitment to equity, and our skills in intervening on multiple levels of systems and communities, social workers are ideally suited for advancing quality.

Who will conduct the needed research? Who will pioneer its translation into improved practice? Social work practice can be only as strong as its research base; the responsibility for developing that base, and hence improving practice, is lodged within social work research.

If my greatest challenge is pursuing this quest, my greatest joy is in mentoring the next generation for this work. My research mentoring has always been guided by the view that the ultimate purpose of research in the helping professions is the production and systematization of knowledge for use by practitioners (Rosen & Proctor, 1978). For 27 years, the National Institute of Mental Health has supported training in mental health services research based in the Center for Mental Health Services Research (Hasche, Perron, & Proctor, 2009; Proctor & McMillen, 2008). With my colleague John Landsverk, I am now in my sixth year of leading the Implementation Research Institute, a training program for implementation science supported by the National Institute of Mental Health (Proctor et al., 2013). We have trained more than 50 social work, psychology, anthropology, and physician researchers in implementation science for mental health. With three more cohorts to go, we are working to assess what works in research training for implementation science. Using bibliometric analysis, we have learned that intensive training and mentoring increase research productivity in the form of published papers and grants that address how to implement evidence-based care in mental health and addictions. And through social network analysis, we have learned that every “dose” of mentoring increases scholarly collaboration measured two years later (Luke, Baumann, Carothers, Landsverk, & Proctor, 2016).

As his student, I was privileged to learn lessons in mentoring from Aaron Rosen. He treated his students as colleagues, he invited them in to work on the most challenging of questions, and he pursued his work with joy. When he treated me as a colleague, I felt empowered. When he invited me to work with him on the field’s most vexing challenges, I felt inspired. And as he worked with joy, I learned that work pursued with joy doesn’t feel like work at all. And now the third, fourth, and fifth generations of social work researchers are pursuing tough challenges and the quality quest for social work practice. May seasoned and junior researchers work collegially and with joy, tackling the profession’s toughest research challenges, including the quest for high-quality social work services.

Acknowledgments

Preparation of this paper was supported by IRI (5R25MH0809160), Washington University ICTS (2UL1 TR000448-08), Center for Mental Health Services Research, Washington University in St. Louis, and the Center for Dissemination and Implementation, Institute for Public Health, Washington University in St. Louis.

This invited article is based on the 2017 Aaron Rosen Lecture presented by Enola Proctor at the Society for Social Work and Research 21st Annual Conference—“Ensure Healthy Development for All Youth”—held January 11–15, 2017, in New Orleans, LA. The annual Aaron Rosen Lecture features distinguished scholars who have accumulated a body of significant and innovative scholarship relevant to practice, the research base for practice, or effective utilization of research in practice.

  • Aarons GA, Sommerfeld DH, Hecht DB, Silovsky JF, Chaffin MJ. The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology. 2009;77(2):270–280. https://doi.org/10.1037/a0013223
  • American Academy of Social Work and Social Welfare (AASWSW). Grand challenges for social work. (n.d.). Retrieved from http://aaswsw.org/grand-challenges-initiative/
  • Balicer RD, Hoshen M, Cohen-Stavi C, Shohat-Spitzer S, Kay C, Bitterman H, Shadmi E. Sustained reduction in health disparities achieved through targeted quality improvement: One-year follow-up on a three-year intervention. Health Services Research. 2015;50:1891–1909. http://dx.doi.org/10.1111/1475-6773.12300
  • Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, Shea C. Tracking implementation strategies: A description of a practical approach and early findings. Health Research Policy and Systems. 2017;15(15):1–12. https://doi.org/10.1186/s12961-017-0175-y
  • Evans SW, Koch JR, Brady C, Meszaros P, Sadler J. Community and school mental health professionals’ knowledge and use of evidence based substance use prevention programs. Administration and Policy in Mental Health and Mental Health Services Research. 2013;40(4):319–330. https://doi.org/10.1007/s10488-012-0422-z
  • Fraser MW. Intervention research in social work: Recent advances and continuing challenges. Research on Social Work Practice. 2004;14(3):210–222. https://doi.org/10.1177/1049731503262150
  • Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Armstrong KS, Chapman JE. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology. 2010;78(4):537–550. https://doi.org/10.1037/a0019160
  • Glisson C, Williams NJ, Hemmelgarn A, Proctor EK, Green P. Increasing clinicians’ EBT exploration and preparation behavior in youth mental health services by changing organizational culture with ARC. Behaviour Research and Therapy. 2016a;76:40–46. https://doi.org/10.1016/j.brat.2015.11.008
  • Glisson C, Williams NJ, Hemmelgarn A, Proctor EK, Green P. Aligning organizational priorities with ARC to improve youth mental health service outcomes. Journal of Consulting and Clinical Psychology. 2016b;84(8):713–725. https://doi.org/10.1037/ccp0000107
  • Gold R, Bunce AE, Cohen DJ, Hollombe C, Nelson CA, Proctor EK, DeVoe JE. Reporting on the strategies needed to implement proven interventions: An example from a “real-world” cross-setting implementation study. Mayo Clinic Proceedings. 2016;91(8):1074–1083. https://doi.org/10.1016/j.mayocp.2016.03.014
  • Greenfield S. Clinical practice guidelines: Expanded use and misuse. Journal of the American Medical Association. 2017;317(6):594–595. https://doi.org/10.1001/jama.2016.19969
  • Hasche L, Perron B, Proctor E. Making time for dissertation grants: Strategies for social work students and educators. Research on Social Work Practice. 2009;19(3):340–350. https://doi.org/10.1177/1049731508321559
  • Institute of Medicine (IOM), Committee on Comparative Effectiveness Research Prioritization. Initial national priorities for comparative effectiveness research. Washington, DC: The National Academies Press; 2009.
  • Institute of Medicine (IOM). Clinical practice guidelines we can trust. Washington, DC: The National Academies Press; 2011.
  • Institute of Medicine (IOM). Psychosocial interventions for mental and substance use disorders: A framework for establishing evidence-based standards. Washington, DC: The National Academies Press; 2015. https://doi.org/10.17226/19013
  • Jaccard J. The prevention of problem behaviors in adolescents and young adults: Perspectives on theory and practice. Journal of the Society for Social Work and Research. 2016;7(4):585–613. https://doi.org/10.1086/689354
  • Kessler RC, Chiu WT, Demler O, Merikangas KR, Walters EE. Prevalence, severity, and comorbidity of 12-month DSM-IV disorders in the National Comorbidity Survey Replication. Archives of General Psychiatry. 2005;62(6):617–627. https://doi.org/10.1001/archpsyc.62.6.617
  • Kohl PL, Schurer J, Bellamy JL. The state of parent training: Program offerings and empirical support. Families in Society: The Journal of Contemporary Social Services. 2009;90(3):248–254. http://dx.doi.org/10.1606/1044-3894.3894
  • Kreuter MW, Casey CM. Enhancing dissemination through marketing and distribution systems: A vision for public health. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: Translating science to practice. New York, NY: Oxford University Press; 2012.
  • Lewis CC, Stanick CF, Martinez RG, Weiner BJ, Kim M, Barwick M, Comtois KA. The Society for Implementation Research Collaboration instrument review project: A methodology to promote rigorous evaluation. Implementation Science. 2015;10(2):1–18. https://doi.org/10.1186/s13012-014-0193-x
  • Lohr KN. Medicare: A strategy for quality assurance. Vol. I. Washington, DC: National Academies Press; 1990.
  • Luke D, Baumann A, Carothers B, Landsverk J, Proctor EK. Forging a link between mentoring and collaboration: A new training model for implementation science. Implementation Science. 2016;11(137):1–12. http://dx.doi.org/10.1186/s13012-016-0499-y
  • McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Straus SS. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: A Tower of Babel? Implementation Science. 2010;5(16). https://doi.org/10.1186/1748-5908-5-16
  • McMillen JC, Proctor EK, Megivern D, Striley CW, Cabassa LJ, Munson MR, Dickey B. Quality of care in the social services: Research agenda and methods. Social Work Research. 2005;29(3):181–191. https://doi.org/10.1093/swr/29.3.181
  • McMillen JC, Raffol M. Characterizing the quality workforce in private U.S. child and family behavioral health agencies. Administration and Policy in Mental Health and Mental Health Services Research. 2016;43(5):750–759. https://doi.org/10.1007/s10488-015-0667-4
  • Megivern DA, McMillen JC, Proctor EK, Striley CW, Cabassa LJ, Munson MR. Quality of care: Expanding the social work dialogue. Social Work. 2007;52(2):115–124. https://dx.doi.org/10.1093/sw/52.2.115
  • Merikangas KR, He J, Burstein M, Swendsen J, Avenevoli S, Case B, Olfson M. Service utilization for lifetime mental disorders in U.S. adolescents: Results of the National Comorbidity Survey Adolescent Supplement (NCS-A). Journal of the American Academy of Child and Adolescent Psychiatry. 2011;50(1):32–45. https://doi.org/10.1016/j.jaac.2010.10.006
  • National Institute of Mental Health. Psychosocial research at NIMH: A primer. 2016. Retrieved from https://www.nimh.nih.gov/research-priorities/psychosocial-research-at-nimh-a-primer.shtml
  • National Institutes of Health. Dissemination and implementation research in health (R01). 2016, September 14. Retrieved from https://archives.nih.gov/asites/grants/09-14-2016/grants/guide/pa-files/PAR-16-238.html
  • National Task Force on Evidence-Based Practice in Social Work. Unpublished recommendations to the Social Work Leadership Roundtable. 2016.
  • Person SD, Allison JJ, Kiefe CI, Weaver MT, Williams OD, Centor RM, Weissman NW. Nurse staffing and mortality for Medicare patients with acute myocardial infarction. Medical Care. 2004;42(1):4–12. https://doi.org/10.1097/01.mlr.0000102369.67404.b0
  • Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, York JL. A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review. 2012;69(2):123–157. https://dx.doi.org/10.1177/1077558711430690
  • Powell BJ, Proctor EK, Glass JE. A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice. 2013;24(2):192–212. https://doi.org/10.1177/1049731513505778
  • Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Kirchner JE. A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science. 2015;10(21):1–14. https://doi.org/10.1186/s13012-015-0209-1
  • Proctor EK, Knudsen KJ, Fedoravicius N, Hovmand P, Rosen A, Perron B. Implementation of evidence-based practice in community behavioral health: Agency director perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2007;34(5):479–488. https://doi.org/10.1007/s10488-007-0129-8
  • Proctor EK, Landsverk J, Baumann AA, Mittman BS, Aarons GA, Brownson RC, Chambers D. The Implementation Research Institute: Training mental health implementation researchers in the United States. Implementation Science. 2013;8(105):1–12. https://doi.org/10.1186/1748-5908-8-105
  • Proctor EK, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, Padek M. Sustainability of evidence-based healthcare: Research agenda, methodological advances, and infrastructure support. Implementation Science. 2015;10(88):1–13. https://doi.org/10.1186/s13012-015-0274-5
  • Proctor EK, McMillen JC. Quality of care. In: Mizrahi T, Davis L, editors. Encyclopedia of Social Work. 20th ed. Washington, DC, and New York, NY: NASW Press and Oxford University Press; 2008. http://dx.doi.org/10.1093/acrefore/9780199975839.013.33
  • Proctor EK, Powell BJ, McMillen JC. Implementation strategies: Recommendations for specifying and reporting. Implementation Science. 2013;8(139):1–11. https://doi.org/10.1186/1748-5908-8-139
  • Proctor EK, Rosen A. From knowledge production to implementation: Research challenges and imperatives. Research on Social Work Practice. 2008;18(4):285–291. https://doi.org/10.1177/1049731507302263
  • Proctor EK, Rosen A, Rhee C. Outcomes in social work practice. Social Work Research & Evaluation. 2002;3(2):109–125.
  • Proctor EK, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Hensley M. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(2):65–76. https://doi.org/10.1007/s10488-010-0319-7
  • Rabin BA, Purcell P, Naveed S, Moser RP, Henton MD, Proctor EK, Glasgow RE. Advancing the application, quality and harmonization of implementation science measures. Implementation Science. 2012;7(119):1–11. https://doi.org/10.1186/1748-5908-7-119
  • Rogal SS, Yakovchenko V, Waltz TJ, Powell BJ, Kirchner JE, Proctor EK, Chinman MJ. The association between implementation strategy use and the uptake of hepatitis C treatment in a national sample. Implementation Science. 2017;12(60). http://doi.org/10.1186/s13012-017-0588-6
  • Rosen A, Proctor EK. Specifying the treatment process: The basis for effectiveness research. Journal of Social Service Research. 1978;2(1):25–43. https://doi.org/10.1300/J079v02n01_04
  • Rosen A, Proctor EK. Distinctions between treatment outcomes and their implications for treatment evaluation. Journal of Consulting and Clinical Psychology. 1981;49(3):418–425. http://dx.doi.org/10.1037/0022-006X.49.3.418
  • Rosen A, Proctor EK, Staudt M. Targets of change and interventions in social work: An empirically based prototype for developing practice guidelines. Research on Social Work Practice. 2003;13(2):208–233. https://dx.doi.org/10.1177/1049731502250496
  • Steffen K, Doctor A, Hoerr J, Gill J, Markham C, Riley S, Spinella P. Controlling phlebotomy volume diminishes PICU transfusion: Implementation processes and impact. Pediatrics (in press).
  • Videka L. Accounting for variability in client, population, and setting characteristics: Moderators of intervention effectiveness. In: Rosen A, Proctor EK, editors. Developing practice guidelines for social work intervention: Issues, methods, and research agenda. New York, NY: Columbia University Press; 2003. pp. 169–192.
  • Waltz TJ, Powell BJ, Chinman MJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. Expert Recommendations for Implementing Change (ERIC): Protocol for a mixed methods study. Implementation Science. 2014;9(39):1–12. https://doi.org/10.1186/1748-5908-9-39
  • Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, Kirchner JE. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science. 2015;10(109):1–8. https://doi.org/10.1186/s13012-015-0295-0
  • Wang PS, Lane M, Olfson M, Pincus HA, Wells KB, Kessler RC. Twelve-month use of mental health services in the United States: Results from the National Comorbidity Survey Replication. Archives of General Psychiatry. 2005;62(6):629–640. https://doi.org/10.1001/archpsyc.62.6.629
  • Wang PS, Demler O, Kessler RC. Adequacy of treatment for serious mental illness in the United States. American Journal of Public Health. 2002;92(1):92–98. https://doi.org/10.2105/AJPH.92.1.92
  • Zayas L. Service delivery factors in the development of practice guidelines. In: Rosen A, Proctor EK, editors. Developing practice guidelines for social work intervention: Issues, methods, and research agenda. New York, NY: Columbia University Press; 2003. pp. 169–192. https://doi.org/10.7312/rose12310-010

Journal of the Society for Social Work and Research - WoS Journal Info

Log in using your username and password

  • Search More Search for this keyword Advanced search
  • Latest content
  • Current issue
  • For authors
  • New editors
  • BMJ Journals More You are viewing from: Google Indexer

You are here

  • Online First
  • Active workplace design: current gaps and future pathways
  • Article Text
  • Article info
  • Citation Tools
  • Rapid Responses
  • Article metrics

Download PDF

  • http://orcid.org/0000-0001-9384-5456 Mohammad Javad Koohsari 1 , 2 , 3 ,
  • Andrew T Kaczynski 4 ,
  • Akitomo Yasunaga 5 ,
  • Tomoya Hanibuchi 6 ,
  • Tomoki Nakaya 7 ,
  • Gavin R McCormack 8 ,
  • Koichiro Oka 2
  • 1 School of Advanced Science and Technology , Japan Advanced Institute of Science and Technology , Nomi , Japan
  • 2 Faculty of Sport Sciences , Waseda University , Tokorozawa , Japan
  • 3 School of Exercise and Nutrition Sciences , Deakin University , Geelong , Victoria , Australia
  • 4 Arnold School of Public Health , University of South Carolina , Columbia , South Carolina , USA
  • 5 Faculty of Health Sciences , Aomori University of Health and Welfare , Aomori , Japan
  • 6 Graduate School of Letters , Kyoto University , Kyoto , Japan
  • 7 Graduate School of Environmental Studies , Tohoku University , Sendai , Japan
  • 8 Department of Community Health Sciences , University of Calgary , Calgary , Alberta , Canada
  • Correspondence to Dr Mohammad Javad Koohsari, Japan Advanced Institute of Science and Technology, Nomi, Japan; koohsari{at}jaist.ac.jp

https://doi.org/10.1136/bjsports-2024-108146

Statistics from Altmetric.com

Request permissions.

If you wish to reuse any or all of this article please use the link below which will take you to the Copyright Clearance Center’s RightsLink service. You will be able to get a quick price and instant permission to reuse the content in many different ways.

  • Public health
  • Sedentary Behavior
  • Health promotion

Introduction

Insufficient physical activity and excessive sitting time among office-based workers have been linked to various health risks and economic consequences. While health promotion interventions are important, the role of workplace design in encouraging active behaviours is increasingly recognised. However, significant gaps exist in knowledge about how workplace design influences these behaviours. This paper identifies the need to investigate the interactive effects of workplace norms and culture and the role of building layouts on workers’ behaviours, as well as the need for more accurate behavioural measures. Bridging these gaps is crucial for designing workplace interventions and promoting active, healthy and productive work environments.

Workplace design: encouraging movement in workplace settings

Existing gaps and future directions

Interactive effects of workplace social environments

Workplace social environments such as norms and culture can significantly influence sedentary behaviours among office-based workers 4 and can affect how workplace design shapes workers’ behaviour. Most previous studies have tested the effects of workplace design on employees’ active and sedentary behaviours within Western contexts, 5 leaving a gap in understanding how these relationships vary in other geographical settings with distinct workplace norms and cultures. For instance, in a workplace where extended sitting is a cultural norm, employees may still predominantly engage in sedentary behaviour even when activity-promoting features are available. Conversely, an activity-promoting environment might help mitigate norms favouring sitting, or even produce multiplicative positive effects in contexts where activity in the workplace is already customary.

Conducting studies across varied geographical settings is necessary to identify similarities and differences in the impact of workplace norms and design on workers’ active and sedentary behaviours. Cross-cultural studies can shed light on the generalisability of findings and help develop customised interventions that address specific norms and cultural challenges. Future research can also employ mixed methods to gain a more thorough understanding of the complex interplay between workplace design, norms and culture, and employees’ behaviour.

Additionally, the rise of home and hybrid working arrangements suggests that office social norms could extend to home work environments. For example, a culture of regular stretch breaks in the office might encourage similar practices at home, influencing physical activity behaviours remotely. Understanding the detailed relationship between workplace design, norms and employee behaviour is critical for developing targeted, contextually relevant interventions that promote active workplace environments.

Precision in tracking workplace behaviours

Accurately measuring employees’ active and sitting behaviours and identifying the ‘locations’ where these behaviours occur is essential to understand their relationships with workplace design attributes. Global positioning systems (GPS) have been commonly used in combination with accelerometer devices to measure and spatially track people’s active and sedentary behaviour in outdoor environments, such as neighbourhoods and cities. 6 Nevertheless, GPS signals have limited accuracy or can be disrupted within indoor environments, resulting in less precise location data.

An indoor positioning system (IPS) can address the limitations of GPS in indoor environments. 7 IPS is a wayfinding technology that uses existing low-cost WiFi and Bluetooth infrastructure to provide precise locations of individuals inside buildings. IPS can be integrated with activity-tracking wearable devices, such as accelerometers, pedometers and heart rate monitors, as well as traditional methods like behavioural mapping. This integration allows for the collection of employees’ location data, movement patterns, activity intensities and other biometric data within workplaces. Additionally, the synergy between IPS and wearable devices can effectively differentiate between occupational and leisure physical activities in workplaces. This distinction is key to better understanding the health paradox of the differing health effects of these two types of physical activity. 8

Furthermore, the growth of artificial intelligence (AI) presents a unique opportunity to employ geospatial AI (GeoAI) in workplace environment and health research. GeoAI techniques integrate innovations in spatial sciences with AI, particularly deep learning. 9 The joint application of IPS and GeoAI would provide precise location data of individuals within the workplace while harnessing the power of spatial analysis. GeoAI can analyse workers’ movement patterns derived from IPS in combination with geospatial layers such as spatial layouts, access to common places, and light conditions. For instance, a GeoAI model trained on tracking data of people’s movements in various indoor environments could predict movement patterns and estimate employees’ sedentary behaviour from a planned indoor layout alone. Such analysis allows for identifying hotspots, that is, areas within the workplace where active and sedentary behaviours are prevalent.
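As a rough illustration of the IPS-wearable integration described above (the data format, zone names, and one-sample-per-minute assumption here are hypothetical, not taken from the paper), fusing IPS location fixes with accelerometer-derived intensity labels reduces to aggregating sedentary time per indoor zone:

```python
from collections import defaultdict

# Hypothetical fused records: each sample pairs an IPS location fix
# (a workplace zone) with an accelerometer-derived intensity label.
# We assume one sample per minute for simplicity.
samples = [
    {"minute": 0, "zone": "workstation-A", "intensity": "sedentary"},
    {"minute": 1, "zone": "workstation-A", "intensity": "sedentary"},
    {"minute": 2, "zone": "corridor", "intensity": "active"},
    {"minute": 3, "zone": "break-room", "intensity": "active"},
    {"minute": 4, "zone": "workstation-A", "intensity": "sedentary"},
]

def sedentary_minutes_by_zone(samples):
    """Aggregate sedentary time per zone to reveal within-building hotspots."""
    totals = defaultdict(int)
    for s in samples:
        if s["intensity"] == "sedentary":
            totals[s["zone"]] += 1  # each sample represents one minute
    return dict(totals)

print(sedentary_minutes_by_zone(samples))  # {'workstation-A': 3}
```

In a real deployment the zone label would come from the IPS trilateration layer and the intensity label from cut-points applied to accelerometer counts; the aggregation step, however, would look much like this.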

Beyond individual design elements: exploring the influence of building layout on workplace behaviour

Most previous studies have primarily examined individual design elements but have failed to consider how the overall spatial layout influences movement and behaviour. Building layout encompasses the spatial arrangement of building elements, such as walls, doors, windows and access ways, and plays a fundamental role in defining the functionality of interior spaces. Once a building layout has been established, making substantial alterations becomes challenging or, in some cases, impossible. Therefore, designing (and, where feasible, retrofitting) building interiors to promote health is imperative, yet it remains unclear which workplace layouts are most supportive of workers’ active behaviours.

The urban design theory of space syntax has the potential to partially address this gap in knowledge. Space syntax uses a set of graph-based estimators to quantify spatial layouts. 10 It offers a framework to investigate the impact of building layout factors, such as workstation arrangement, common area location, and space accessibility, on workers’ movement patterns and behaviours. It goes beyond isolated design elements and considers the spatial configuration as a whole ( figure 1 ). Additionally, more research on ‘how’ people use and perceive their workspaces could complement the space syntax evaluations of building design.
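For illustration only (the layout and space names below are a made-up sketch, not the authors’ analysis), space syntax’s graph-based estimators can be approximated with standard graph operations: connectivity is a node’s degree, and integration relates inversely to mean shortest-path depth from a space to all others:

```python
from collections import deque

# Hypothetical workplace layout as an adjacency graph: nodes are spaces
# (axial lines, in space syntax terms); an edge means two spaces connect.
layout = {
    "entrance": ["corridor"],
    "corridor": ["entrance", "open-plan", "break-room", "meeting-room"],
    "open-plan": ["corridor", "break-room"],
    "break-room": ["corridor", "open-plan"],
    "meeting-room": ["corridor"],
}

def connectivity(graph, node):
    """Space syntax 'connectivity': number of spaces directly linked."""
    return len(graph[node])

def mean_depth(graph, origin):
    """Average shortest-path distance from origin to every other space.

    Lower mean depth indicates a more 'integrated' space; integration is
    the space syntax measure most often linked to movement flows.
    """
    depths = {origin: 0}
    queue = deque([origin])
    while queue:  # breadth-first search over the layout graph
        current = queue.popleft()
        for nbr in graph[current]:
            if nbr not in depths:
                depths[nbr] = depths[current] + 1
                queue.append(nbr)
    others = [d for node, d in depths.items() if node != origin]
    return sum(others) / len(others)

print(connectivity(layout, "corridor"))  # 4
print(mean_depth(layout, "corridor"))    # 1.0 (most integrated space)
print(mean_depth(layout, "entrance"))    # 1.75 (more segregated)
```

Real space syntax software additionally derives the axial map itself and normalises depth into integration values, but the graph reasoning is the same: a corridor linking many spaces has high connectivity and low mean depth, which is where movement, and potentially activity, concentrates.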


Figure 1 Space syntax examines building layouts as a whole, using graph theory: (A) a schematic workplace layout; (B) space syntax axial lines (i.e., the longest and fewest lines traversing all spaces) of the layout; (C) the connectivity of all spaces based on graph theory.

Conclusions

Future research should investigate the interactive effects of workplace norms and culture on behaviour and conduct cross-cultural studies to identify similarities and differences. Innovative measurement methods can also be employed to accurately measure behaviours and locations where those behaviours occur within workplaces. Additionally, exploring the influence of spatial layout, and using the urban design theory of space syntax, can offer valuable insights into the design of work environments that facilitate workers’ engagement in active behaviours.

Ethics statements

Patient consent for publication.

Not applicable.

Ethics approval


Contributors MJK conceived the idea and wrote the initial draft of the manuscript. All authors contributed to the writing and assisted with the analysis and interpretation. All authors have read and approved the final manuscript and agree with the order of the presentation of authors.

Funding MJK is supported by the JSPS KAKENHI (grant 23K09701). KO is supported by the JSPS Grants-in-Aid for Scientific Research program (grant 20H04113).

Competing interests None declared. In particular, none of the authors has a financial interest in the Space Syntax Limited company.

Provenance and peer review Not commissioned; externally peer reviewed.

Editors: B. De Stavola and M. Elliott

About the journal

The Journal of the Royal Statistical Society, Series A (Statistics in Society), publishes papers that demonstrate how statistical thinking, design and analyses play a vital role in life and benefit society.



SSWR — Society for Social Work and Research

Secure Online Signup/Renewal Form

Currently, the SSWR Membership forms are being updated and will be available as soon as possible.

Please contact us if you have any questions or if you want to join/renew ASAP.

Phone: 703-352-SSWR (7797)
Fax: 703-359-7562
Email: [email protected]



COMMENTS

  1. Journal of the Society for Social Work and Research

    ABOUT THE JOURNAL Frequency: 4 issues/year ISSN: 2334-2315 E-ISSN: 1948-822X 2022 CiteScore*: 1.9 Ranked #500 out of 1,415 "Sociology and Political Science" journals. Founded in 2009, the Journal of the Society for Social Work and Research (JSSWR) is the flagship publication of the Society for Social Work and Research (SSWR), a freestanding organization founded in 1994 to advance social ...

  2. Journal of the Society for Social Work and Research

    Journal of the Society for Social Work and Research. Editor in Chief: Todd I. Herrenkohl Published for the Society for Social Work and Research. ALL ISSUES. 2020s. 2024. ... Author Guidelines for Reporting Scale Development and Validation Results in the Journal of the Society for Social Work and Research. Peter Cabrera-Nguyen; Vol. 1, ...

  3. SSWR

    RESOURCES. 4/16/2024: 2025 SSWR Awards: Calls for Nominations Now Open! Deadline: June 30, 2024. March is Social Work Month: Empowering Social Workers. 3/1/2024: Abstract Submission Site Now Open! Submission Deadline: April 15, 2024. 1/5/2024: SSWR Strategic Plan 2024-2028: Learn about our new strategic plan set to inform how we address complex ...

  4. Journal of the Society for Social Work and Research

    The Journal of the Society for Social Work and Research ( JSSWR) is a peer-reviewed publication dedicated to presenting innovative, rigorous original research on social problems, programs, and policies. By creating a venue for research reports, systematic reviews, and methodological studies, JSSWR seeks to strengthen social work research and ...

  5. Journal of the Society for Social Work and Research

    The Journal of the Society for Social Work and Research is a peer-reviewed publication dedicated to presenting innovative, rigorous original research on social problems, intervention programs, and policies. By creating a venue for the timely dissemination of empirical findings and advances in research methods, JSSWR seeks to strengthen the ...

  6. About SSWR

    About SSWR. The Society for Social Work and Research was founded in 1994 as a free-standing organization dedicated to the advancement of social work research. SSWR works collaboratively with a number of other organizations that are committed to improving support for research among social workers. Our members include faculty in schools of social ...

  7. 2025 Conference Home

    Overview of the SSWR 2025 Annual Conference. The SSWR Annual Conference offers a scientific program that reflects a broad range of research interests. From workshops on the latest quantitative and qualitative research methodologies to symposia featuring studies in child welfare, aging, mental health, welfare reform, substance abuse, and HIV/AIDS.

  8. Journal of Social Work: Sage Journals

    The Journal of Social Work is a forum for the publication, dissemination and debate of key ideas and research in social work. The journal aims to advance theoretical understanding, shape policy, and inform practice, and welcomes submissions from all … | View full journal description. This journal is a member of the Committee on Publication ...

  9. Critical Qualitative Inquiry: One Attempt to Actualize a

    Critical qualitative scholar-activists' work toward greater justice is always situated within the neo-liberal structures of the academy and society: even as we strive to reconceptualize research, we exist within its current conceptualizations that instantiate and reify the very oppressions we seek to disrupt.

  10. Journal of the Society for Social Work & Research

    This project is supported by the Administration for Children and Families (ACF) of the United States Department of Health and Human Services (HHS) as part of a 5-year financial assistance award (Grant No. 90YE250) totaling $3,953,308, with 100 percent funded by ACF/HHS.

  11. The Pursuit of Quality for Social Work Practice: Three Generations and

    Our literature review of 13 major social work journals over 5 years of published research revealed that only 15% of published social work research addressed interventions. About a third of studies described social problems, and about half explored factors associated with the problem ( Rosen, Proctor, & Staudt, 2003 ).

  12. Social Work

    1948-1955 •. 1933-1947 •. Social Work is the premier journal of the social work profession. Widely read by practitioners, faculty, and students, it is the official journal of NASW and is provided to all members as a membership benefit. Social Work is dedicated to improving practice and advancing knowledge in social work and social welfare.

  13. Journal of the Society for Social Work and Research

    * In order to submit a manuscript to this journal, please read the guidelines for authors in the journal's homepage. ... » Journal of the Society for Social Work and Research. Abbreviation: J SOC SOC WORK RES ISSN: 2334-2315 eISSN: 1948-822X Category / Quartile: SOCIAL WORK - SSCI(Q4)

  14. Social Work Research

    Explore a collection of highly cited articles from the NASW journals published in 2020 and 2021. Read now. An official journal of the National Association of Social Workers. Publishes exemplary research to advance the development of knowledge and inform social.

  15. Social Work Research: A 40-Year Perspective and a ...

    Feldman starts his review with the 1991 groundbreaking report by the National Institute of Mental Health's Task Force on Social Work Research. Back then, the study chair was David Austin who for many years advocated for high-quality social work research. The report essentially suggested that social work lacks relevant and/or high-quality research.

  16. SSWR Awards

    Journal of the Society for Social Work and Research (JSSWR) JSSWR Early Career Reviewer Program: Request for Applications; Membership Menu Toggle. ... The Society for Social Work and Research (SSWR), incorporated in 1994, is a non‐profit, professional membership organization. SSWR supports social workers, social welfare professionals, social ...

  17. Journal of the Society for Social Work and Research

    The Impact IF 2022 of Journal of the Society for Social Work and Research is 1.80, which is computed in 2023 as per its definition. Journal of the Society for Social Work and Research IF is increased by a factor of 0.6 and approximate percentage change is 50% when compared to preceding year 2021, which shows a rising trend. The impact IF, also denoted as Journal impact score (JIS), of an ...

  18. Journal of the Society for Social Work and Research

    During the most recent 2021 edition, 93.59% of publications had an unrecognized affiliation. Out of the publications with recognized affiliations, 40.00% were posted by at least one author from the top 10 institutions publishing in the journal. Another 0.00% included authors affiliated with research institutions from the top 11-20 affiliations. . Institutions from the 21-50 range included 40. ...

  19. Involving Service Users in Social Work Education, Research and Policy

    Extract. This edited text, part of the Research in Social Work series, includes nineteen chapters: one providing an introduction, ten presenting collaborative models in social work education; five on collaborative models in research and policy; and three reflective chapters exploring key themes that have emerged throughout. What is particularly engaging and unique about this book is how it ...

  20. Children's Participation in Child Welfare: A Systematic Review of

    The British Journal of Social Work, Volume 54, Issue 3, April 2024, Pages 1092-1108, ... Whilst research in the social work field may have been to some extent sluggish to explore children's participation, it has increased more recently in both quality and quantity with a growing number of scholars paying greater attention to this field of ...

  21. In the Echoes of Tomorrow: The Intersection of Social Work and

    In the research findings, students point out that AI may lead to ethical violations in terms of privacy concerns, the possibility of data misuse, etc. Students emphasize the risk of increased unemployment with the integration of AI into social work. Future research could investigate the long-term impact of integrating AI into social work ...


  23. News participation is declining: Evidence from 46 ...

    This is one of the reasons the rise of digital media, and later social media platforms and messaging application platforms more specifically - all of which can make participation easier and more accessible to a greater number of citizens - gave rise to early optimism around the promise of the 'participatory culture' (Jenkins, 2006), 'participatory politics' (Loader and Mercea, 2011 ...

  24. Journal of the Society for Social Work and Research

    ABSTRACTING AND INDEXING. Articles that appear in the Journal of the Society for Social Work and Research are indexed in the following abstracting and indexing services: Ulrich's Periodicals Directory (Print) Ulrichsweb (Online) Clarivate Analytics. Essential Science Indicators. Web of Science (Social Sciences Citation Index)

  25. 2024 Conference Home

    Journal of the Society for Social Work and Research (JSSWR) JSSWR Early Career Reviewer Program: Request for Applications; Membership Menu Toggle. ... The Society for Social Work and Research (SSWR), incorporated in 1994, is a non‐profit, professional membership organization. SSWR supports social workers, social welfare professionals, social ...

