• Open access
  • Published: 30 November 2018

A review protocol on research partnerships: a Coordinated Multicenter Team approach

  • Femke Hoekstra   ORCID: orcid.org/0000-0002-0068-652X 1, 12,
  • Kelly J. Mrklas 2, 3,
  • Kathryn M. Sibley 4,
  • Tram Nguyen 5, 6,
  • Mathew Vis-Dunbar 7,
  • Christine J. Neilson 8,
  • Leah K. Crockett 4, 9,
  • Heather L. Gainforth 1, 12 &
  • Ian D. Graham 10, 11

Systematic Reviews, volume 7, Article number: 217 (2018)


Research partnership approaches, in which researchers and stakeholders work together collaboratively on a research project, are an important component of research, knowledge translation, and implementation. Despite their growing use, a comprehensive understanding of the principles, strategies, outcomes, and impacts of different types of research partnerships is lacking. Generating high-quality evidence in this area is challenging due to the breadth and diversity of relevant literature. We established a Coordinated Multicenter Team approach to identify and synthesize the partnership literature and better understand the evidence base. This review protocol outlines an innovative approach to locating, reviewing, and synthesizing the literature on research partnerships.

Five reviews pertaining to research partnerships are proposed. The Coordinated Multicenter Team developed a consensus-driven conceptual framework to guide the reviews. First, a review of reviews will comparatively describe and synthesize key domains (principles, strategies, outcomes, and impacts) for different research partnership approaches, within and beyond health (e.g., integrated knowledge translation, participatory action research). After identifying commonly used search terminology, three complementary scoping reviews will describe and synthesize these domains in the health research partnership literature. Finally, an umbrella review will amalgamate and reflect on the collective findings and identify research gaps and future directions. We will develop a collaborative review methodology, comprising search strategy efficiencies, terminology standardization, and the division of screening, extraction, and synthesis to optimize feasibility and literature capture. A series of synthesis and scoping manuscripts will emerge from this Coordinated Multicenter Team approach.

Comprehensively describing and differentiating research partnership terminology and its domains will address well-documented gaps in the literature. These efforts will contribute to and improve the quality, conduct, and reporting of the research partnership literature. The collaborative review methodology will help identify and establish common terms, leverage efficiencies (e.g., expertise, experience, search and protocol design, resources), and optimize research feasibility and quality. Our approach allows for enhanced scope and inclusivity of all research user groups and domains, thereby contributing uniquely to the literature. This multicenter, efficiency- and quality-focused approach may inspire researchers across the globe to address the similar challenges that exist in this rapidly expanding field.

Research partnership approaches, in which researchers and stakeholders work together collaboratively on a research project, are an important component of research, knowledge translation, and implementation [ 1 , 2 , 3 , 4 ]. These approaches are becoming increasingly popular, as efforts to ensconce stakeholder engagement within health care research, implementation, and improvement work converge, and are prioritized by health care systems, research funders, government, and other organizations [ 4 ]. In particular, the active integration of patients and patient-identified priorities into the research process [ 5 , 6 , 7 ] has become much more frequent and, in many cases, is now a mandated expectation of research teams [ 5 , 8 , 9 , 10 ]. Research partnership approaches align well with efforts to enhance participant empowerment [ 11 ], elevate disenfranchised voices [ 12 ], and engage in real-world solution finding [ 13 ] to improve research relevance and impact [ 14 , 15 , 16 , 17 ].

Over the last half-century or more, research partnership approaches have evolved within multiple research domains. These approaches share important similarities but can also be differentiated in key ways (e.g., integrated knowledge translation (IKT), participatory research, co-production, participatory action research (PAR), engaged scholarship, Mode 2 knowledge production) [14, 15, 17, 18, 19]. This variability of approach and terminology presents considerable challenges for synthesis research in the field of research partnerships, particularly in the sub-field of IKT. The dispersion and variation of the relevant literature is daunting, both scientifically and logistically, and in many cases precludes attempts at more exhaustive reviews [3, 15, 20, 21]. Research partnership terminology [22, 23, 24] and definitions [15, 25] vary significantly by discipline and are still actively evolving, making IKT and other research partnership approaches difficult to capture conceptually [1, 2, 3, 21, 25, 26, 27, 28, 29, 30, 31, 32, 33].

IKT (see Footnote 1), in particular, has been compared and contrasted with other types of research partnership approaches. To illustrate, Salsberg and Merati [19], Salsberg [18], Jull and colleagues [15], and Bowen [34] highlight important comparisons between IKT and participatory health research, between IKT and PAR, between IKT and community-based participatory research (CBPR), and between engaged scholarship and participatory research, respectively. However, syntheses conducted in this area to date highlight considerable limitations and challenges, such as the use of scope control techniques and the amenability of reported data for extraction and synthesis [1, 2, 21, 35, 36, 37].

In preparing this protocol, we were unable to identify a single synthesis that located, described, compared, or evaluated the literature pertaining to both IKT and related partnership research approaches within the health domain, or beyond. We identified a single review examining empirically evaluated IKT studies [ 21 ], and several syntheses focused on individual types of partnerships and relevant domains [ 29 , 37 , 38 , 39 ]. No synthesis that comparatively described principles, strategies, outcomes, and impacts in different types of research partnership approaches was identified. The implications of these findings are significant. The co-existence of multiple, potentially relevant evidence domains is well-recognized; yet, this evidence remains largely disconnected and is often viewed superficially, or within disciplines alone. We believe that extending the comparative analysis to other domains (principles, strategies, outcomes, impacts) may help researchers more deliberately apply and rigorously evaluate research partnership approaches in future. Comparative analytics examining how and why IKT and other research partnership approaches work, the key domains (principles, strategies, outcomes, impacts), and the contextual conditions under which these approaches function may allow more deliberate and efficient stakeholder engagement [ 1 , 21 ] and would represent a major step forward in the design, conduct, assessment, and impact of IKT and other research partnership approaches, in real-world settings.

This protocol describes the work plan of a newly established Coordinated Multicenter Team, focused on optimizing the quality and efficiency of IKT and other research partnership syntheses. The team has a specific interest in IKT [ 20 , 40 , 41 ] and applied this lens in the design of the proposed studies. Using a collaborative approach, we will build consensus strategies to address common challenges (e.g., terminology, definitions, conceptual similarities/differences, evidence volume and dispersion, logistics/resource and feasibility issues) faced by researchers attempting to synthesize the research partnership literature, including the sub-field of IKT. The three main aims of this study are to:

Systematically scope the literature and comparatively describe and synthesize principles, strategies, outcomes, and impacts reported in different types of research partnership approaches within and beyond health;

Describe and synthesize the principles, strategies, outcomes, and impacts and the accompanying research methods and tools reported in different types of health research partnership studies; and

Amalgamate and reflect on the collective findings and identify research gaps and future directions.

This review protocol describes a Coordinated Multicenter Team approach to reviewing and synthesizing the key domains in different types of research partnership approaches. This work will contribute to broadening and deepening the evidence base for research partnerships and practice.

Coordinated Multicenter Team

A Coordinated Multicenter Team approach will be used to plan, execute, assess, and report the proposed research syntheses. The team is geographically distributed and comprises clustered, multicenter teams working on complementary themes and projects. This approach is intended to create resource and time efficiencies, increase productivity and effectiveness, and enhance methodological, logistical, and reporting quality within this area of the research literature. The Coordinated Multicenter Team currently comprises nine individuals (KJM, FH, KMS, TN, MVD, CJN, LKC, IDG, HG) across six academic and healthcare centers (University of Calgary, Alberta Health Services, University of British Columbia Okanagan, University of Manitoba, Ottawa Hospital Research Institute, University of Ottawa). Our work is embedded within an international integrated knowledge translation research network [42] established to systematically study and advance what is known and documented about IKT.

Engagement of stakeholders in the proposed studies

Our Coordinated Multicenter Team will both study and employ an IKT approach in the proposed research [ 43 ]. Team members work with several stakeholder groups who have a vested interest in improving the science of IKT, including patients, in particular. A steering committee, consisting of a diverse representation of stakeholders (e.g., patients, policy and decision-makers, healthcare professionals, researchers), will be established for each individual review (see Appendix 1 ). Committee members will be actively involved in reviews according to their needs and preferences and according to the specific needs of each review [ 44 , 45 , 46 ]. At a minimum, the Coordinated Multicenter Team will engage its stakeholders in the following research phases:

Conceptual design and formulation of the research questions

Before starting data extraction

Data analysis, interpretation, and dissemination of results

Study design

Scoping practices outlined by Arksey, O’Malley, and other colleagues [47, 48, 49, 50, 51] guide our work to identify and describe the research questions; identify and select studies; and abstract, collate, synthesize, and validate findings. Given the diversity of terminology and the dispersion of this literature, we will synthesize in three steps (Fig. 1). First, we will start broadly by conducting a review of reviews to comparatively describe and synthesize key domains (principles, strategies, outcomes, and impacts) for different research partnership approaches, within and beyond health (step 1). In this first step, we will identify the research partnership terminology and research scope in different practice domains in order to optimize our search strategies for subsequent steps. Second, using a more refined set of search terms informed by the review of reviews, we will conduct several scoping reviews to describe and synthesize each key domain further in the health research partnership literature (step 2). Finally, we will amalgamate and reflect on the findings of all reviews conducted in the previous steps, using an umbrella review, to draw overarching conclusions, describe our collaborative approach and future directions, and contribute to the research agenda (step 3). A series of scoping and synthesis manuscripts will emerge: a review of reviews (1a), three scoping reviews (2a–c), and one overarching umbrella review (3a).

Fig. 1 The three steps of the Coordinated Multicenter Team approach

The planning, execution, evaluation, and reporting of all reviews and findings will be guided by the Cochrane Handbook for Systematic Reviews of Interventions [52], the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) [53], the Preferred Reporting Items for Systematic Review and Meta-Analysis Equity extension (PRISMA-E) [54], the emerging PRISMA extension for Scoping Reviews (PRISMA-ScR) [55], and recent guidance from Pollock and colleagues for the conduct of overviews of reviews [56]. Details about compliance with the PRISMA-P guidelines are provided in Additional file 1.

Guiding conceptual framework

As part of protocol planning, the Coordinated Multicenter Team developed a consensus-based conceptual framework to guide its work (Fig. 2). Three authors (KJM, FH, HG) developed a first draft of the guiding framework based on early research questions and the PICOS for each individual review. The content of the framework was discussed with all members of the Coordinated Multicenter Team at several team meetings and revised iteratively until consensus was reached. The framework defines the topic of interest, describes key domains of research partnerships, and captures the overall intended scope and outcomes of the Coordinated Multicenter Team research agenda. It includes four key domains (principles, strategies, outcomes, impacts), and each of these domains will be assessed in terms of its research methods, methodologies, and/or tools. Finally, both functionally and conceptually, we anticipate that the nature of each of the proposed domains will be heavily influenced by context.

Fig. 2 The guiding conceptual framework. All reviews will center on the principles, strategies, outcomes, and impacts of research partnerships. These four domains will be assessed in terms of their research methods, methodologies, and/or tools

For the purpose of this review, we will use the following operational terms and definitions:

Research partnerships : “individuals, groups, or organizations engaged in collaborative research activity involving at least one researcher (e.g., individual affiliated with an academic institution) and any stakeholder actively engaged in any part of the research process (e.g., decision or policy maker, health care administrator or leader, community agency, charities, network, patients etc.)” [ 1 , 57 ]. Examples of research partnership approaches include, but are not limited to, IKT, participatory research, and participatory action research.

Principles : “fundamental norms, rules, or values that represent what is desirable and positive for a person, group, organization, or community and help it in determining the rightfulness or wrongfulness of its actions. Principles are more basic than policy and objectives and are meant to govern both” [ 58 ].

Strategies : “observable actions designed to achieve an outcome” [ 59 ].

Outcomes : “a planned, a priori assessment described in the study methods that is used to determine a change in status as a result of interventions, can be measured or assessed as a component of the study, and is not something of futuristic benefit”. (Adapted from the University of Waterloo Research Ethics—Definition of Outcome) [ 60 ].

Impact : “identifiable benefit to, or positive influence on, the economy, society, public services, health, the environment, quality of life, or academia” [ 61 ].

Method : “the techniques or procedures used to gather and analyze data related to some research question or hypothesis” [ 62 ].

Methodology : “the strategy, plan of action, process, or design lying behind the choice and use of particular methods and linking the choice and use of methods to the desired outcomes” [ 62 ].

Tools : “an instrument (e.g., survey, measures, assessments, questionnaire, inventory, checklist, metrics, indicators, list of factors, subscales, or similar) that can be used to assess/evaluate the elements or domains of an IKT or health research partnership” [ 57 ].

Context : defined as “the physical, organizational, institutional, and legislative structures that enable and constrain, and resource and realize, people and procedures” [ 63 ].

Facilitators : “single or multilevel factors that are positively associated with or enhance IKT or research partnership and/or its definition, conceptualization, establishment, or conduct, design, assessment, or impact” [ 57 ].

Barriers : “single or multilevel factors that are negatively associated with or hinder IKT or research partnership and/or its definition, conceptualization, establishment, or conduct, design, assessment, or impact” [ 57 ].

Research questions

Several groups of research questions will guide our research (see Tables  1 and 2 ). The primary research question for the review of reviews 1a is as follows:

1a) What differences and similarities can be identified in reported principles , strategies , outcomes , and impacts among different health and non-health research partnership approaches?

The primary research questions for scoping reviews 2a–c are as follows:

2a) What principles and strategies are used to guide the different types of health research partnerships?

2b) What are the reported outcomes and impacts of the different types of health research partnerships, and what are the available measurement tools for assessing the outcomes and impact ?

2c) What research methodologies and methods have been used to explicitly study or evaluate the partnering process underpinning health research partnerships?

The primary research questions for the overarching umbrella review (3a) are as follows:

3a) What do we currently know about principles , strategies , outcomes , and impacts in the context of research partnership approaches? What are the research gaps in the literature on research partnership approaches? What are the next steps that should be taken in the field of research partnerships?

Secondary research questions for each individual scoping review are described in Appendix 2 .

Step 1: Review of reviews

Search strategy

In consultation with our collaborating academic librarians (MVD, CJN), the Coordinated Multicenter Team developed a search strategy centered on capturing the following key concepts: partnership research, participatory research, knowledge translation, and knowledge transfer. We opted not to use controlled vocabularies, given that preliminary inquiries confirmed poor capture of this literature by traditional health indexing; relying on controlled vocabulary would adversely affect precision and inflate retrieval [22, 23, 64].

There is great diversity in the terminologies used to express concepts associated with different types of research partnership. However, we will work on the assumption that review papers, as they synthesize existing knowledge, will use standardized terminology in their titles, abstracts, and keywords, allowing for a less complex, but still comprehensive strategy. The search will be piloted in four health databases (MEDLINE, Embase, CINAHL, and PsycINFO) to evaluate scope and feasibility. Appendix 3 describes an example of the search strategy in MEDLINE. Further refinement of this strategy will be based on our findings during an initial screening process. The refined search strategy will then be used to search a wider range of disciplines. Ultimately, the review will describe the terminology required for our subsequent reviews and allow us to identify a list of “gold standard” articles, ensuring scoping review search strategies are effective. The final search strategies for all individual databases will be available via the Open Science Framework [ 65 , 66 ].
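To make the intended structure of such a keyword-only strategy concrete, the following Python sketch assembles a free-text query from the four key concepts named above. The synonyms and Ovid-style field tags shown are illustrative assumptions only; the piloted MEDLINE strategy appears in Appendix 3, and the final strategies will be posted to the Open Science Framework.

```python
# Illustrative sketch: the synonyms and Ovid-style field tags below are assumptions
# for demonstration, not the piloted strategy (Appendix 3) or the final OSF versions.
concept_blocks = {
    "partnership research": ['"partnership research"', '"research partnership*"'],
    "participatory research": ['"participatory research"', '"participatory action research"'],
    "knowledge translation": ['"knowledge translation"', '"integrated knowledge translation"'],
    "knowledge transfer": ['"knowledge transfer"'],
}

def concept_to_block(terms, fields="ti,ab,kw"):
    """OR together the synonyms for one concept, restricted to title/abstract/keyword fields."""
    return "(" + " OR ".join(f"{term}.{fields}." for term in terms) + ")"

# No controlled vocabulary is used: the four concept blocks are simply OR'd together.
query = " OR ".join(concept_to_block(terms) for terms in concept_blocks.values())
print(query)
```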

Electronic data sources

We will search for review papers within and beyond the health domain using the following electronic databases: MEDLINE, Embase, CINAHL, PsycINFO, ERIC, Education Source, Social Services Abstracts, Sociological Abstracts, Sociology Database, Applied Social Sciences Index and Abstracts, Web of Science Core Collection, and JSTOR.

Screening process and data extraction

The search will be executed by our academic librarian (MVD), and the results managed using EndNote™ X7.5.3. De-duplication of the search results will be performed according to Bramer’s method [67]. De-duplicated results will be imported into Rayyan, a web-based tool designed to facilitate the screening process of literature reviews [68]. Prior to screening, we will select a random sample (5%) of citations for calibration screening. Two members of the Coordinated Multicenter Team (FH, KJM) will review the same set of citations independently. We will calculate inter-rater agreement using the kappa statistic and begin the screening process once a kappa ≥ 0.6 is achieved. Where discrepancies arise, they will be discussed and resolved by consensus or, failing agreement, referred to a third team member for a final decision.
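As a minimal sketch of this calibration check (assuming screening decisions are recorded as simple include/exclude labels; the 5% sample and the kappa ≥ 0.6 threshold are taken from the protocol, while the example decisions below are hypothetical), Cohen’s kappa for the two screeners could be computed as follows:

```python
def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' labels over the same calibration sample."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    labels = set(rater_a) | set(rater_b)
    # Chance agreement estimated from each rater's marginal label frequencies
    expected = sum((rater_a.count(label) / n) * (rater_b.count(label) / n) for label in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical decisions on the 5% calibration sample of citations
screener_1 = ["include", "exclude", "exclude", "include", "exclude", "exclude"]
screener_2 = ["include", "exclude", "include", "include", "exclude", "exclude"]

kappa = cohen_kappa(screener_1, screener_2)
print(f"kappa = {kappa:.2f}")
if kappa >= 0.6:
    print("Agreement threshold met: full screening can begin.")
else:
    print("Below threshold: discuss discrepancies and repeat calibration.")
```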

The screening process will be conducted in three separate rounds. In the first round, both members will screen all citations by title only, independently and in duplicate. Citations included by at least one team member will pass title screening and will be stored in a new database for the next screening round. In the second round, both members will screen all citations by title and abstract, independently and in duplicate, guided by the following eligibility criteria: included articles must (a) describe a literature overview of research partnerships according to our definition, (b) describe a systematic search of the literature including search terms and databases, and (c) be published in the English language. Articles that (a) do not meet our definition of research partnership and/or (b) describe a review of a method or tool rather than a literature overview will be excluded, and the reasons for exclusion categorized. The development of search strategies for each subsequent scoping review will be informed by the classifications formulated by this review (step 2). The third round will involve gathering full-text versions of all citations meeting the eligibility criteria. Using the previously described screening calibration process, the same two team members will screen full-text review papers independently and in duplicate, discussing discrepancies to consensus or, failing consensus, referring them to a third team member for a final decision. Full-text screening will then be performed based on the specific eligibility criteria aligned with our research questions (see Table 1 and Appendix 2 for more details). Once a final set of eligible review papers is generated, data extraction from full-text review papers will proceed independently and in duplicate, using a pre-tested data extraction tool in MS Excel. The extractable data (e.g., principles, strategies, outcomes, impacts) will be summarized for the different types of research partnership approaches (e.g., IKT, CBPR, PAR). Strategies to determine the risk of bias and the methodological quality of the included review papers will be developed using published guidance [56, 69, 70].
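The extraction tool itself will be pre-tested by the team; as a hypothetical illustration of how records could be structured around the guiding framework domains (all field names below are assumptions, not the final form), a spreadsheet-compatible layout might look like this:

```python
import csv
from dataclasses import dataclass, field, asdict
from typing import List

# Hypothetical record structure mirroring the guiding framework domains;
# the team's pre-tested MS Excel extraction form defines the actual fields.
@dataclass
class ExtractionRecord:
    citation_id: str
    partnership_approach: str                        # e.g., IKT, CBPR, PAR
    principles: List[str] = field(default_factory=list)
    strategies: List[str] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)
    impacts: List[str] = field(default_factory=list)
    methods_or_tools: List[str] = field(default_factory=list)

def write_extraction_sheet(records, path="extraction.csv"):
    """Flatten list fields so duplicate extractions can be compared side by side in a spreadsheet."""
    rows = [{key: "; ".join(value) if isinstance(value, list) else value
             for key, value in asdict(record).items()} for record in records]
    if not rows:
        return
    with open(path, "w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=list(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
```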

Step 2: Scoping reviews

Building on the findings from the review of reviews (step 1), the Coordinated Multicenter Team will refine the health research partnership search strategy. The review of reviews will be used to identify relevant terminology and definitions used in different types of health research partnership, allowing for the development of a high-quality, evidence-informed search strategy for each scoping review. The review of reviews will also supplement the creation of a “gold standard” list of articles to test subsequent searches against. This approach will require intense collaborative effort and multiple strategy alignments; we anticipate the output will generate a comprehensive, well-defined body of literature amenable to multiple reviews that are focused on specific aspects of different types of health research partnerships. The search process will be facilitated by the team’s academic librarian (MVD). For each scoping review (step 2, reviews 2a–c), three individual search strategies will be developed, to ensure reproducibility and feasibility. Search strategies will consist of two parts: (1) an overarching segment to identify different types of health research partnerships and (2) protocol-specific part(s) identifying and/or modifying the focus of each scoping review as per domains of the guiding framework (e.g., principles, strategies, outcomes, impacts). The first part of each search strategy will be the same for every scoping review (identification of research partnerships search terms), and the second part will be customized to match review-specific research questions. Strategies will be piloted in MEDLINE to determine scope and feasibility limitations and anticipate resource requirements. To optimize search quality and comprehensiveness and to refine the balance between search sensitivity and scope feasibility, draft search strategies will be scrutinized by a second academic librarian using the Peer Review of Electronic Search Strategies (PRESS) checklist [ 71 , 72 ]. The team will review and consider the suggestions and make final edits to the strategy as necessary. All final search strategies for the individual databases will be available through Open Science Framework [ 66 ].
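To illustrate the two-part structure described above, the sketch below combines a shared partnership block with review-specific blocks; all term blocks are placeholder assumptions, and the actual strategies will be PRESS-reviewed and shared via the Open Science Framework.

```python
# Placeholder term blocks for illustration only; the real strategies will be
# PRESS-reviewed and posted to the Open Science Framework.
partnership_block = ('("research partnership*" OR "integrated knowledge translation" '
                     'OR "participatory research" OR "co-production")')

review_specific_blocks = {
    "2a principles and strategies": "(principle* OR strateg*)",
    "2b outcomes, impacts, and tools": '(outcome* OR impact* OR "measurement tool*")',
    "2c methods and methodologies": "(method* OR methodolog* OR evaluat*)",
}

# Part 1 (shared partnership terms) AND part 2 (review-specific focus)
search_strategies = {review: f"{partnership_block} AND {focus}"
                     for review, focus in review_specific_blocks.items()}

for review, query in search_strategies.items():
    print(f"{review}:\n  {query}\n")
```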

All scoping reviews will search for articles using the following four electronic health databases: MEDLINE, Embase, CINAHL, and PsycINFO. Decisions regarding additional refinements to data sources (e.g., time span, grey literature) will be specific to each scoping review and informed by the review of reviews findings.

Screening process

The Coordinated Multicenter Team search will be executed by an academic librarian (MVD), and results will be managed and de-duplicated using EndNote™ X7.5.3, as described previously. Title and abstract screening will be performed for each scoping review separately using a pre-tested MS Excel screening tool. Screening calibration will be undertaken in two stages, first by title and abstract and then in full text, using the methods described previously (step 1), and will be carried out independently and in duplicate by two team members. Discrepancies will be discussed and resolved by screener consensus or referred to a third investigator for final resolution.

To maximize quality and ensure comparability within a very large volume of literature, we formulated general eligibility criteria for all scoping reviews for use with title and abstract level screening. We will combine these general eligibility criteria with review-specific criteria. We will include citations that involve research partnerships in the health domain. We will exclude articles that do not meet the definition of research partnership. For all excluded studies, we will track primary reasons for exclusion. After title and abstract screening, each scoping review team will proceed with the full-text screening process. Once a final set of eligible papers is generated, data extraction from full-text papers will proceed.
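As a minimal sketch of how primary exclusion reasons could be tracked and summarized per review (the citation identifiers and reason labels are hypothetical examples, not the team’s final coding scheme):

```python
from collections import Counter

# Hypothetical exclusion log: (citation id, primary reason for exclusion)
exclusions = [
    ("cit_0012", "does not meet the research partnership definition"),
    ("cit_0045", "outside the health domain"),
    ("cit_0102", "does not meet the research partnership definition"),
]

# Summarize the primary reasons so they can be reported for each scoping review
reason_counts = Counter(reason for _, reason in exclusions)
for reason, count in reason_counts.most_common():
    print(f"{reason}: {count}")
```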

Screening calibration within each scoping review will be undertaken at each screening level as described previously. A priori agreement on common terms and definitions was achieved and will be applied during all study levels.

Aspects related to study records (risk of bias) and data (synthesis, meta-bias, confidence in cumulative evidence) will be tailored to each individual scoping review paper. Details about these aspects will be described in the individual review papers.

Step 3: Overarching umbrella review

Finally, two researchers (HG and KS) will synthesize and aggregate findings from the review of reviews and the three scoping reviews using an umbrella review, in collaboration with the review leads (FH, KJM, TN). In accordance with published guidelines for developing, conducting, and reporting umbrella reviews [73], this review will synthesize findings from the multiple reviews into one accessible and usable document [74].

This review protocol outlines our Coordinated Multicenter Team approach to reviewing and synthesizing research partnerships using an innovative and collaborative review methodology. Our approach will result in a series of review manuscripts describing specific aspects of different types of research partnerships and attempt to address the documented gaps, with a specific focus and interest in IKT [ 20 , 40 , 41 ]. By documenting our Coordinated Multicenter Team approach, we hope to provide guidance to and inspire researchers in the same or other fields to tackle evidence bases that are challenged by scope, terminology, dispersion, and volume.

Our Coordinated Multicenter Team approach centers on three key aspects. First, we will optimize research quality by sharing knowledge and expertise among all team members. Our team currently consists of nine individuals with different backgrounds and expertise (e.g., KT, implementation, IKT, behavioral science, research partnerships, knowledge synthesis), working in six different organizations across Canada. This provides a unique opportunity to learn collaboratively and raise the quality and integrity of multiple reviews. All papers published by the Coordinated Multicenter Team will use common terms and language based on our consensus-driven, literature-based guiding framework and its related terms, definitions, and reporting criteria [21], unless otherwise noted (Fig. 2). The Coordinated Multicenter Team papers may be used as a template for future research to conduct studies and report on different types of research partnership (e.g., we know of at least two similarly structured systematic reviews that will cascade from this first proposed set of reviews). Our search strategies will be publicly available [65, 66], giving other researchers the opportunity to use our searches and build upon our work in refining terms and in locating and describing the nature of the evidence base for IKT and other types of research partnership approaches. In this way, we hope that our work will enhance research quality and transparency in the field of IKT and other research partnership approaches by creating a common language for reporting and planning.

Second, we will increase capacity by maximizing synthesis team efficiency in all stages of the review processes (e.g., search strategy development, screening process, procedural alignments for screening, extraction, dissemination). For example, we will use our findings from the review of reviews to develop an overarching research partnership-focused search strategy that will be applied in all our scoping reviews. We expect that a search strategy built upon terms and definitions spanning the breadth of the research partnership literature will yield more focused results and improve feasibility through well-justified search strategy controls. Syntheses can pose significant time, resource, and volume challenges [75] in fields with a high diversity of terminology, procedures, and literature dispersion. Our efficiency- and quality-focused collaborative review methodology offers potential strategies to overcome these challenges and may therefore contribute to the literature on review methodologies addressing efficiency and quality improvement [76, 77, 78]. Moreover, this approach allows for an enhanced scope for each review and enhances inclusivity of all research user groups and domains, thus contributing uniquely to the literature and reducing the potential for duplication of effort.

Third, we hope to maximize the impact of our work by ensuring that our projects are relevant and usable to a broad audience by using an IKT approach tailored to each individual review project [ 43 , 44 , 45 , 46 ]. We will establish steering committees consisting of a diverse group of stakeholders for each individual review and engage them throughout the review processes ( Appendix 1 ). Moreover, we will reflect on our own IKT approach and will share the lessons learned in the overarching umbrella review. Our Coordinated Multicenter Team approach will, therefore, meet the needs of our partners and ensure that both researchers and stakeholders can benefit from our work.

In summary, our protocol paper provides a methodological design template for future researchers to construct their own reviews or research. It contributes to the methodological refinement of review processes using multi-site collaborative teams, in which design, workflow, scientific and logistical strategy, and other efficiencies are leveraged to optimize research quality. Ultimately, we hope our efforts will contribute to and improve the quality, conduct, and reporting of the research partnership literature. Our Coordinated Multicenter Team approach may inspire researchers across the globe to address the similar challenges that exist in this rapidly expanding field.

Footnote 1: We operationalize IKT using a recent iteration of an earlier definition [79] as:

“…a way of approaching research to increase the chances that the results will be applicable to the population under study. [IKT] is a paradigm shift that focuses on engagement with end users and the context in which they work. Essentially, it is a collaborative way of conducting research that involves researchers and knowledge-users, sometimes from multiple communities (e.g. clinicians, managers, policy makers, patients, [among others]) working together as partners in the research process.” (Graham, Tetroe & McLean, 2014, p.11) [80].

Abbreviations

  • CBPR: Community-based participatory research
  • IKT: Integrated knowledge translation
  • PAR: Participatory action research

Drahota A, Meza RD, Brikho B, Naaf M, Estabillo JA, Gomez ED, Vejnoska SF, Dufek S, Stahmer AC, Aarons GA. Community-academic partnerships: a systematic review of the state of the literature and recommendations for future research. Milbank Q. 2016;94:163–214.


Camden C, Shikako-Thomas K, Nguyen T, Graham E, Thomas A, Sprung J, Morris C, Russell DJ. Engaging stakeholders in rehabilitation research: a scoping review of strategies used in partnerships and evaluation of impacts. Disability & Rehabilitation. 2015;37:1390–400. https://doi.org/10.3109/09638288.2014.963705 .

Jagosh J, Macaulay AC, Pluye P, Salsberg J, Bush PL, Henderson J, Greenhalgh T. Uncovering the benefits of participatory research: implications of a realist review for health research and practice. Milbank Quarterly. 2012;90:311–46.

Goodman MS, Sanders Thompson VL. The science of stakeholder engagement in research: classification, implementation and evaluation. Translational Behavioral Medicine. 2017;7:486–91. https://doi.org/10.1007/s13142-017-0495-z .


Canadian Institutes of Health Research. Strategy for Patient-Oriented Research (SPOR). Ottawa: Canadian Institutes of Health Research; 2018 [cited 2018 March 7]. Available from: http://www.cihr-irsc.gc.ca/e/41204.html


Sofolahan-Oladeinde Y, Mullins CD, Baquet CR. Using community-based participatory research in patient-centered outcomes research to address health disparities in under-represented communities. Journal of Comparative Effectiveness Research. 2015;4:515.

World Health Organization. World Health Organization Alma Ata Declaration. Geneva: World Health Organization; 1978.

Patient-Centered Outcomes Research Institute (PCORI). Engagement: influencing the culture of research. Washington: PCORI; 2018 [March 21, 2018]. Available from: https://www.pcori.org/engagement .

Alberta Innovates Health Solutions (AIHS). Funding opportunities and programs. Edmonton: Alberta Innovates Health Solutions; 2018 [cited 2018 March 20, 2018]. Available from: http://www.aihealthsolutions.ca/funding/health-research-funding/ .

Auckland S. BRC Guidance: involving users in research. London: Guy's and Thomas' NHS Foundation Trust; 2010.

World Health Organization. Ninth futures forum on health systems governance and public participation. Copenhagen: World Health Organization; 2006.

O'Mara-Eves A, Brunton G, McDaid D, Oliver S, Kavanaugh J, Jamal F, Matosevic T, Harden A, Thomas J. Community engagement to reduce inequalities in health: a systematic review, meta-analysis and economic analysis. Public Health Research. 2013;1:4. https://doi.org/10.3310/phr01040 .

Mockford C, Staniszewska S, Griffiths F, Herron-Marx S. The impact of patient and public involvement on UK NHS health care: a systematic review. International Journal of Quality in Health Care. 2012;24:28–38.

Bowen S, Graham ID. Backwards design or looking sideways? Knowledge translation in the real world: comment on “a call for a backward design to knowledge translation”. International Journal of Health Policy and Management. 2015;4:545–7.

Jull J, Giles A, Graham ID. Community-based participatory research and integrated knowledge translation: advancing the co-creation of knowledge. Implement Sci. 2017;12:1–9.

Brett J, Staniszewska S, Mockford C, Herron-Marx S, Hughes J, Tysall C, Suleman R. Mapping the impact of patient and public involvement on health and social care research: a systematic review. Health Expect. 2012;17:637–50.

Rycroft-Malone J, Burton CR, Bucknall T, Graham ID, Hutchinson AM, Stacey D. Collaboration and co-production of knowledge in healthcare: opportunities and challenges. International Journal of Health Policy and Management. 2016;5:221–3.

Salsberg J, Macaulay AC, Parry D. Chapter 2: Guide to integrated knowledge translation research: researcher and knowledge-user collaboration in health research. In: Graham ID, Tetroe JM, Pearson A, editors. Turning knowledge into action: practical guidance on how to do integrated knowledge translation research. Lippincott-Joanna Briggs Institute Synthesis Science in Healthcare Series: Book 21. Philadelphia: Lippincott Williams & Wilkins; 2014.

Salsberg J, Merati N. Participatory health research in North America: from community engagement to evidence-informed practice. In: Kongats K, Michael TW, editors. Participatory health research: voices from around the world (in press). San Francisco: Springer; 2018.

Gagliardi AR, Kothari A, Graham ID. Research agenda for integrated knowledge translation (IKT) in healthcare: what we know and do not yet know. J Epidemiol Community Health. 2017;71:105–6. https://doi.org/10.1136/jech-2016-207743 .


Gagliardi A, Berta W, Kothari A, Boyko J, Urquhart R. Integrated knowledge translation (iKT) in health care: a scoping review. Implement Sci. 2016;11:1–12. https://doi.org/10.1186/s13012-016-0399-1 .

McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Straus SE. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of literature in 2006: a tower of babel? Implement Sci. 2010;5:16. https://doi.org/10.1186/1748-5908-5-16 .

McKibbon KA, Lokker C, Wilczynski NL, Haynes RB, Ciliska D, Dobbins M, Davis DA, Straus SE. Search filters can find some but not all knowledge translation articles in MEDLINE: an analytic survey. J Clin Epidemiol. 2012;65:651–9.

Graham ID, Tetroe JM, Robinson N, Grimshaw J, and the International Funders Study Research Group. An international study of health research funding agencies' support and promotion of knowledge translation. Academy of Health Annual Research Meeting; Boston; 2005.

Granner ML, Sharpe PA. Evaluating community coalition characteristics and functioning: a summary of measurement tools. Health Education Research - Theory and Practice. 2004;19:514–32.


Israel BA, Schulz AJ, Parker EA, Becker AB. Review of community-based research: assessing partnership approaches to improve public health. Annu Rev Public Health. 1998;19:173–202.

Zakocs RE, Edwards EM. What explains community coalition effectiveness? A review of the literature. Am J Prev Med. 2006;30:351–61.

Seaton CL, Holm N, Bottorff JL, Jones-Bricker M, Errey S, Caperchione CM, Lamont S, Johnson ST, Healy T. Factors that impact the success of interorganizational health promotion collaborations: a scoping review. Am J Health Promot. 2017:1–15.

Voorberg WH, Bekkers VJ, Tummers LG. A systematic review of co-creation and co-production: embarking on the social innovation journey. Public Management Review. 2015;17:1333–57.

Walters SJ, Stern C, Robertson-Malt S. The measurement of collaboration within healthcare settings: a systematic review of measurement properties of instruments. JBI Database of Systematic Reviews and Implementation Reports. 2016:138–97. https://doi.org/10.11124/JBISRIR-2016-2159 .

LeClercq T, Hammedi W, Poncin I. Ten years of value cocreation: an integrative review. Rech Appl Mark. 2016;31:26–60.

Stolp S, Bottorff JL, Seaton CL, Jones-Bricker M, Oliffe JL, Johnson ST, Errey S, Medhurst K, Lamont S. Measurement and evaluation practices of factors that contribute to effective health promotion collaboration functioning: a scoping review. Evaluation and Program Planning. 2016;61:38–44.

Gradinger G, Britten N, Wyatt K, Froggatt K, Gibson A, Jacoby A, Lobban F, Mayes D, Snape D, Rawcliffe T, Popay J. Values associated with public involvement in health and social care research: a narrative review. Health Expect. 2013;18:661–75.

Bowen S. The relationship between engaged scholarship, knowledge translation and participatory research. In: Liamputtong GHAP, editor. Participatory qualitative research methodologies in health. Los Angeles: SAGE; 2015. p. 183–99.


Greenhalgh T, Jackson C, Shaw S, Janamian T. Achieving research impact through co-creation in community-based health services: literature review and case study. Milbank Q. 2016;94:392–429.

Smith KE, Bambra C, Joyce KE, Perkins N, Hunter DJ, Blenkinsopp EA. Partners in health? A systematic review of the impact of organizational partnerships on public health outcomes in England between 1997-2008. J Public Health. 2009;31:210–21.

Roussos ST, Fawcett SB. A review of collaborative partnerships as a strategy for improving community health. Annu Rev Public Health. 2000;21:369–402.

Varda D, Shoup J, Miller S. A systematic review of collaboration and network research in the public affairs literature: implications for public health practice and research. Research and Practice. 2012;102:564–71.

Rycroft-Malone J, Burton C, Wilkinson J, Harvey G, McCormack B, Baker R, Dopson S, Graham I, Staniszewska S, Thompson C, Ariss S, Melville-Richards L, Williams L. Collective action for knowledge mobilisation: a realist evaluation of the Collaborations for Leadership in Applied Health Research and Care. Southampton: NIHR Journals Library; 2015.

Kothari A, McCutcheon C, Graham ID, for the iKT Research Network. Defining integrated knowledge translation and moving forward: a response to recent commentaries. International Journal of Health Policy and Management. 2017;6:1–2.

Graham ID, Kothari A, McCutcheon C, and the Integrated Knowledge Translation Research Network Project Leads. Moving knowledge into action for more effective practice, programmes and policy: protocol for a research programme on integrated knowledge translation. Implementation Science. 2018;13. https://doi.org/10.1186/s13012-017-0700-y .

Graham ID. Moving knowledge into action for more effective practice, programs and policy: A research program focusing on integrated knowledge translation (Foundation Scheme: 2014 1st Live Pilot). [Grant Application]. In press 2015.

Canadian Institutes of Health Research (CIHR). Guide to knowledge translation planning at CIHR: integrated and end-of-grant approaches. Ottawa: Her Majesty the Queen In Right of Canada; 2012.

Domecq JP, Prutsky G, Elriayah T, Wang Z, Habhan M, Shippee N, Brito JP, Boehmer K, Hasan R, Firwana B, Erwin P, Eton D, Sloan J, Montori V, Asi N, Abu Dabrh AM, Hassan Murad M. Patient engagement in research: a systematic review. BMC Health Serv Res. 2014;14:1–9. https://doi.org/10.1186/1472-6963-14-89.

Hyde C, Dunn KM, Higginbottom A, Chew-Graham CA. Process and impact of patient involvement in a systematic review of shared decision making in primary care consultations. Health Expect. 2017;20:298–308. https://doi.org/10.1111/hex.12458 .

Harris J, Croot L, Thompson J, Springett J. How stakeholder participation can contribute to systematic reviews of complex interventions. J Epidemiol Community Health. 2015;70:207–14. https://doi.org/10.1136/jech-2015-205701 .

Arksey H, O'Malley L. Scoping studies: towards a methodological framework. International Journal of Social Research Methodology: Theory and Practice. 2005;8:19–32.

Daudt HM, van Mossel C, Scott SJ. Enhancing the scoping study methodology: a large, inter-professional team's experience with Arksey and O'Malley's framework. BMC Med Res Methodol. 2013;13:1–9.

Levac D, Colquhoun H, O’Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5:1–9.

Colquhoun HI, Levac D, O'Brien KK, Straus S, Tricco AC, Perrier L, Kastner M, Moher D. Scoping reviews: time for clarity in definition, methods and reporting. J Clin Epidemiol. 2014;67:1291–4. https://doi.org/10.1016/j.jclinepi.2014.03.013.

Tricco AC, Lillie E, Zarin W, O'Brien K, Colquhoun H, Kastner M, Levac D, Ng C, Pearson Sharpe J, Wilson K, Kenny M, Warren R, Wilson C, Stelfox HT, Straus SE. A scoping review on the conduct and reporting of scoping reviews. BMC Med Res Methodol. 2016;16:1–10. https://doi.org/10.1186/s12874-016-0116-4 .

Higgins JP, Green S. Cochrane handbook for systematic reviews of interventions. 2011 [updated March 2011; cited 2015 November 5]. Available from: http://handbook.cochrane.org.

Moher D, Shamseer L, Clarke M, Ghersi D, Liberati A, Petticrew M, Shekelle P, Stewart LA, and PRISMA-P Group. Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) 2015 Statement. Systematic Reviews. 2015;4:1–9. https://doi.org/10.1186/2046-4053-4-1 .

Welch V, Petticrew M, Tugwell P, Moher D, O'Neill J, Waters E, White H, and the PRISMA-Equity Bellagio group. PRISMA-Equity 2012 extension: reporting guidelines for systematic reviews with a focus on health equity. PLOS Medicine. 2012;9:e1001333. https://doi.org/10.1371/journal.pmed.1001333.

Tricco AC. Preferred Reporting Items for Systematic Reviews and Meta-Analysis extension for Scoping Reviews (PRISMA-ScR). Toronto, Canada; 2015 [cited 2015 September 11]. Available from: http://www.prisma-statement.org/Extensions/InDevelopment.aspx.

Pollock M, Fernandes RM, Becker LA, Featherstone R, Hartling L. What guidance is available for researchers conducting overviews of reviews of healthcare interventions? A scoping review and qualitative metasummary. Systematic Reviews. 2016;5:1–15.

Mrklas KJ. A scoping review of available tools for assessing integrated knowledge translation research or health research partnership impact. [PhD Dissertation]: University of Calgary; 2017.

Business Dictionary. Definition of principles: business dictionary; 2017 [cited 2017 retrieved December 5, 2017]. Available from: http://www.businessdictionary.com/definition/principles.html .

Oxford Dictionary. 2018. Definition for "strategies".

University of Waterloo. Research ethics: definition of a health outcome. Waterloo, ON: University of Waterloo; 2018 [cited 2018 March 7]. Available from: https://uwaterloo.ca/research/office-research-ethics/research-human-participants/pre-submission-and-training/human-research-guidelines-and-policies-alphabetical-list/definition-health-outcome .

Higher Education Funding Council for England: Research Excellence Framework 2014. Assessment framework and guidance on submissions 2011. Bristol, UK2014 [cited 14 Nov 2017]. Available from: http://www.ref.ac.uk/2014/media/ref/content/pub/assessmentframeworkandguidanceonsubmissions/GOS%20including%20addendum.pdf .

Crotty M. Introduction: the research process. In: Crotty M, editor. The foundations of social research: meaning and perspective in the research process. London: Sage; 1998. p. 1–17.

May C, Finch T, Mair F, Ballini L, Dowrick C, Eccles M, Gask L, MacFarlane A, Murray E, Rapley T, Rogers A, Treweek S, Wallace P, Anderson G, Burns J, Heaven B. Understanding the implementation of complex interventions in health care: the normalization process model. BMC Health Serv Res. 2007;7:1–7.

Lokker C, McKibbon KA, Wilczynski NI, Haynes RB, Ciliska D, Dobbins M, Davis DA, Straus SE. Finding knowledge translation articles in CINAHL. Studies in Health Technology & Informatics. 2010;160:1179–83.

Foster ED, Deardorff A. Open Science Framework (OSF). Journal of the Medical Library Association (JMLA). 2017;105:203–6. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5370619/pdf/jmla-105-203.pdf .

Hoekstra F, Mrklas KJ, Sibley K, Nguyen T, Vis-Dunbar M, Neilson C, Crockett L, Gainforth H, Graham ID. Understanding collaborative approaches to research: a synthesis of the research partnership literature; 2018. https://doi.org/10.17605/OSF.IO/GVR7Y .


Bramer WM, Giustini D, de Jonge GB, Holland L, Bekhuis T. De-duplication of database search results for systematic reviews in endnote. Journal of the Medical Library Association (JMLA). 2016;104:240–3. https://doi.org/10.3163/1536-5050.104.3.014 .

Ouzzani M, Hammady H, Fedorowicz Z, Elmagarmid A. Rayyan - a web and mobile app for systematic reviews. Systematic Reviews. 2016;5. https://doi.org/10.1186/s13643-016-0384-4 .

Ballard M, Montgomery P. Risk of bias in overviews of reviews: a scoping review of methodological guidance and four-item checklist. Res Synth Methods. 2017;8:92–108. https://doi.org/10.1002/jrsm.1229 .

Pollock A, Campbell P, Brunton G, Hunt H, Estcourt L. Selecting and implementing overview methods: implications from five exemplar overviews. Systematic Reviews. 2017;6:145. https://doi.org/10.1186/s13643-017-0534-3 .

McGowan J, Sampson M, Lefebvre C. An evidence based checklist for the peer review of electronic search strategies (PRESS EBC). Evidence Based Library and Information Practice. 2010;5:149–54. https://doi.org/10.18438/B8SG8R .

McGowan J, Sampson M, Salzwedel D, Cogo E, Foerster V, Lefebvre C. Guideline statement: PRESS peer review of electronic search strategies 2015 guideline statement. J Clin Epidemiol. 2016;75:40–6.

Aromataris E, Fernandez R, Godfrey CM, Holly C, Khalil K, Tungpunkom P. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach. International Journal of Evidence-Based Healthcare. 2015;13:132–40. https://doi.org/10.1097/XEB.0000000000000055 .

Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Inf Libr J. 2009;26:91–108. https://doi.org/10.1111/j.1471-1842.2009.00848.x.

Petticrew M, Roberts H. Systematic reviews in the social sciences: a practical guide. Malden: Blackwell Publishing Co.; 2006.

Petticrew M. Time to rethink the systematic review catechism? Moving from 'what works' to 'what happens'. Systematic Reviews. 2015;4:1–6. https://doi.org/10.1186/s13643-015-0027-1 .

Uttley L, Montgomery P. The influence of the team in conducting a systematic review. Systematic Reviews. 2017;6:1–4. https://doi.org/10.1186/s13643-017-0548-x .

Tsertsvadze A, Chen Y, Moher D, Sutcliffe P, McCarthy N. How to conduct systematic reviews more expeditiously? Systematic Reviews. 2015;4:1–6. https://doi.org/10.1186/s13643-015-0147-7 .

Canadian Institutes of Health Research. Knowledge translation at the CIHR. Ottawa: Canadian Institutes of Health Research; 2000 [cited 2018 March 8]. Available from: http://www.cihr-irsc.gc.ca/e/29418.html#2 .

Graham ID, Tetroe JM, McLean RK. Chapter 1: Some Basics of Integrated Knowledge Translation Research. In: Graham ID, Tetroe JM, Pearson A, editors. Turning knowledge into action: practical guidance on how to do integrated knowledge translation research. Philadelphia: Lippincott Williams & Wilkins; 2014. p. 196.


Acknowledgements

This research is supported in part by the IKTR Network (Canadian Institutes of Health Research Foundation Grant FDN #143237), a Canadian Institutes of Health Research Project Grant (iKT Project grant 156372), a Michael Smith Foundation for Health Research Scholar Award (Scholar Award #16910), and by the International Collaboration on Repair Discoveries (F17-01540). KMS is supported by a Canada Research Chair in Integrated Knowledge Translation in Rehabilitation Sciences. TN holds a Postdoctoral Fellowship from the Canadian Institutes of Health Research (2016–2019). This manuscript is an amalgamation of research proposals funded by different funding bodies. None of the funding bodies had a role in writing this manuscript or will have a role in the collection, analysis, or interpretation of the data.

Availability of data and materials

Not applicable. However, the final search strategies for the individual databases that will be executed for each individual review will be available through the Open Science Framework. Other relevant information about the data screening and data extraction processes will be published as a supplementary material to future publications and/or will be available through the Open Science Framework.

Author information

Femke Hoekstra, Kelly J. Mrklas, Heather L. Gainforth and Ian D. Graham contributed equally to this work.

Authors and Affiliations

School of Health and Exercise Sciences, University of British Columbia Okanagan, Kelowna, BC, Canada

Femke Hoekstra & Heather L. Gainforth

Strategic Clinical Networks™, System Innovation and Programs, Alberta Health Services, Calgary, AB, Canada

Kelly J. Mrklas

Department of Community Health Sciences, Cumming School of Medicine, University of Calgary, Calgary, AB, Canada

Department of Community Health Sciences, Max Rady College of Medicine, University of Manitoba, Winnipeg, Manitoba, Canada

Kathryn M. Sibley & Leah K. Crockett

School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada

Tram Nguyen

CanChild Centre for Childhood Disability Research, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada

Library, University of British Columbia Okanagan, Kelowna, BC, Canada

Mathew Vis-Dunbar

University of Manitoba, Winnipeg, MB, Canada

Christine J. Neilson

George and Fay Yee Centre for Healthcare Innovation, University of Manitoba, Winnipeg, Manitoba, Canada

Leah K. Crockett

Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada

Ian D. Graham

School of Epidemiology and Public Health, University of Ottawa, Ottawa, ON, Canada

International Collaboration on Repair Discoveries (ICORD), University of British Columbia, Vancouver, BC, Canada


Contributions

KM co-led the protocol conceptualization, study design, coordination, the drafting and editing of the manuscript, and the development of the conceptual framework and participated in obtaining funding for the study. FH co-led the protocol conceptualization, study design, coordination, the drafting and editing of the manuscript, and the development of the conceptual framework. KS was involved with the protocol conceptualization, provided critical input into the study design and manuscript drafts, and participated in obtaining funding for the study. TN was involved with the protocol conceptualization and provided critical input into the study design and manuscript drafts. MVD was involved with the study and conceptual design of the search strategies and provided critical input into the study design and manuscript drafts. CJN was involved with the conceptual design of the search strategies, provided advice in the early protocol design stages, and provided input into manuscript drafts. LC provided feedback on protocol concepts and manuscript drafts. HG oversaw the protocol conceptualization and the study design, provided critical input into the study design and manuscript drafts and the development of the conceptual framework, and participated in obtaining funding for the study. IDG oversaw the protocol conceptualization, provided feedback on the study design and manuscript drafts, and participated in obtaining funding for the study. All authors have read and approved the final manuscript.

Corresponding authors

Correspondence to Femke Hoekstra or Heather L. Gainforth.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1.

PRISMA-P Checklist (DOCX 22 kb)

Appendix 1—Steering committees

A steering committee, consisting of a diverse representation of stakeholders (e.g., patients, policy and decision-makers, healthcare professionals, researchers), will be established for each individual review. This appendix provides an overview of the stakeholders who will be members of the steering committees.

The following partners will take part in one of the steering committees:

Spinal Cord Injury Ontario

Spinal Cord Injury BC

Ontario SCI Solutions Alliance

North American Spinal Cord Injury Consumer Consortium

Miami Project

Rick Hansen Institute

Research Manitoba

International Collaboration on Repair Discoveries

Spinal Cord Injury Canada

Michael Smith Foundation for Health Research

Mrklas, KJ—PhD Supervisory Committee, Department of Community Health Sciences, Cumming School of Medicine, University of Calgary

Clinicians from Hamilton Health Sciences, Toronto Rehabilitation Institute—University Health Network, Seine River Physiotherapy

Patient research partners from the BC SUPPORT Unit, University of Calgary

Appendix 2—Compendium of Review Protocols

Review of reviews 1a—Comparing and synthesizing research partnership approaches

This review of reviews aims to identify differences and similarities in the principles, strategies, outcomes, and impacts reported for different health and non-health research partnership approaches, in order to improve differentiation among these approaches.

Primary research question

What differences and similarities can be identified in reported principles, strategies, outcomes, and impacts among different health and non-health research partnership approaches?

Secondary research questions

What kind of health and non-health research partnership approaches can be distinguished?

What definitions are used for research partnerships?

What principles, strategies, outcomes, and impacts are reported in different health and non-health research partnership approaches?

What are the differences and similarities between research partnerships within and beyond the health domain?

What theories, models, and frameworks are used to guide review papers on health and non-health research partnership approaches?

Scoping review 2a—Principles and strategies

This scoping review aims to identify key principles and strategies for different health research partnerships.

What principles and strategies are used to guide the different types of health research partnerships?

How can the identified strategies be linked to the identified principles?

What theories, models, and frameworks are used to guide the different types of health research partnerships? How do they relate to the guiding principles and strategies?

How do the guiding principles and strategies differ between stages in the research process (e.g., data collection, data analysis, interpretation of findings, disseminating of findings)?

How do the guiding principles and strategies differ between settings?

How do the guiding principles and strategies differ between populations?

How do the guiding principles and strategies differ between groups of research users (e.g., patients, practitioners, decision-makers, health organizations)?

What facilitators and barriers are associated with the guiding principles and strategies?

Scoping review 2b—Outcomes, impacts, and tools

This scoping review aims to establish the scope, nature, and location of the global literature pertaining to the reported outcomes and impacts of different health research partnerships and outcomes or impact assessment tools.

Primary research questions

What are the reported outcomes and impacts of the different types of health research partnerships?

What are the available measurement tools for assessing the outcomes and impacts of the different types of health research partnerships?

How are the different health research partnership outcomes or impact (and/or related concepts) defined and described?

What are the key characteristics and constructs of the different health research partnership outcomes or impact assessment tools?

What are the documented barriers, facilitators, and/or other documented influences on the development, assessment, or use of the different health research partnership outcome or impact assessment tools?

What emergent gaps exist in health research partnership outcome or impact assessment?

To what extent is the nature and scope of the evidence base in this area amenable to systematic review?

What future research questions arise in the literature on health research partnership outcomes or impact assessment?

Scoping review 2c—Research methodologies and methods

This scoping review aims to describe the research methodologies and methods used to study or evaluate the partnering process underpinning health research partnerships.

What research methodologies and methods have been used to explicitly study or evaluate the partnering process underpinning health research partnerships?

How is integrated knowledge translation (IKT) defined in studies?

To what extent has the IKT approach been explicitly studied/evaluated in published literature?

In what contexts has the IKT approach been explicitly studied/evaluated?

Review 3—Overarching umbrella review

The final paper aims to synthesize the findings from the four reviews in terms of the guiding conceptual framework.

What do we currently know about principles, strategies, outcomes, and impacts in the context of research partnership approaches?

What are the research gaps in the literature on research partnership approaches?

What are the next steps that should be taken in the field of research partnerships?

Appendix 3—Search strategy

The example search strategy for the review of reviews in MEDLINE.

  • The search was run on January 29, 2018

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.


About this article

Cite this article.

Hoekstra, F., Mrklas, K.J., Sibley, K.M. et al. A review protocol on research partnerships: a Coordinated Multicenter Team approach. Syst Rev 7 , 217 (2018). https://doi.org/10.1186/s13643-018-0879-2

Download citation

Received : 19 June 2018

Accepted : 07 November 2018

Published : 30 November 2018

DOI : https://doi.org/10.1186/s13643-018-0879-2


Keywords

  • Knowledge synthesis
  • Multicenter study
  • Collaborative research partnerships
  • Stakeholder engagement
  • Research principles and strategies
  • Research outcomes and impact



The Power of Partnerships: How to maximise the impact of research for development

Published on 30 March 2020

Part 1. Key Qualities of Research-Policy Partnerships

“The Impact Initiative has really helped explore some of the practical opportunities and approaches for research policy partnerships to make a stronger contribution to development policy and practice. This has important implications for UK funded researchers seeking lasting and beneficial impacts for developing countries.” Mark Claydon Smith, Deputy Director of International Development, UK Research and Innovation (UKRI)

1.1  Background

Globally, increasing attention is being given to developing cutting-edge research in collaboration with those most likely to use or benefit from it. In July 2019, the Department for International Development (DFID), the Economic and Social Research Council (ESRC), UK Research and Innovation, the UK Collaborative on Development Research and the Impact Initiative jointly convened a knowledge-exchange event in London. The event built on IDS Bulletin 50.1 ‘Exploring Research–Policy Partnerships in International Development’, prepared by the Impact Initiative. It brought together a diverse group of more than 20 ESRC-DFID-funded researchers and practitioners from five countries, working on education, disability rights, child poverty, conflict and health reform, inviting them to reflect on their experiences of partnership for development that spanned research, policy and practice. Their case studies and a literature review of development partnership theory were the basis for a set of three inter-related partnership qualities. The case studies and commentaries below are featured in the above-mentioned IDS Bulletin.

“The ESRC-DFID Strategic Partnership has been successful in demonstrating what approaches [to partnership] are effective, being sharply focused on the combination of relevance and academic rigour with targeted, well-planned research uptake methods.” Diana Dalton, Former Deputy Director of DFID’s Research and Evidence Division (Excerpt from the Foreword in IDS Bulletin 50.1)

1.2  Defining partnerships for policy change in development

Analysis of partnerships for international development has tended to focus on interactions between donors in the global North and national partners in Southern contexts, whether government, non-governmental organisations (NGOs) or research institutions. This is a crucial field of study, particularly as the movement to ‘decolonise development’ gains momentum. However, our focus was specifically on partnerships between research producers and users in international development settings, given this is an area that has received more limited attention. A framework for such partnerships aimed at achieving impact must consider the dynamics of real-world policy engagement. Useful here are concepts of mutual interdependence that maximise benefits for all. This means mutual commitment to the objectives of the collaboration and a strategy that is compatible with each actor’s mission, values and goals.

1.3  How do inter-sector partnerships maximise impact?

Various perspectives on how research influences policy and practice affect our understanding of the value and ideal design of research–policy partnerships. Concepts range from traditional linear relationships between new knowledge and policy innovation, derived from the natural sciences, to complex interactive models intertwining science and society. Donors and researchers have largely settled on three to four core definitions or modes of impact. These definitions are useful when thinking about the impact of effective research–policy partnerships beyond academia.

Impact Initiative Wheel of Impact

Source: Georgalakis and Rose (2019: 2)

1.4  Is partnership between specialists enough?

Connections between research producers and users, and productive relationships between key individuals and institutions, are important, but on their own are inadequate. Policymakers operate in environments full of uncertainty, making decisions based on ambiguous information. Advocates of evidence-informed policy need to simplify complex problems and frame information to meet policymakers’ demands.

1.5  What are the key qualities of effective research–policy partnerships?

We have identified three inter-related qualities of effective partnerships, testing them with donors, researchers, and civil society and government partners.

Research–policy partnerships analytical framework

Source: Georgalakis and Rose (2019: 10)

Bounded mutuality

Key to successful partnerships is a common understanding of a given problem, and compatible values which underpin collaboration even if partners have different mandates. In research–policy partnerships, this occurs where evidence supply and demand converge. For example, a shared agenda for improving education or health systems may cement relations between policy advisors, who must deliver viable recommendations to decision makers, and practitioners and researchers hoping to inform programme design.

Nonetheless, partnerships are bounded by differences in organisational cultures, priorities and accountability. Partners seek to recognise these differences, sometimes compromising and playing to one another’s strengths. This pragmatic approach need not undermine core values. There are risks, but acknowledging that partners have different objectives and interests can be liberating, providing opportunities for learning and greater leverage.

The Shifting In/equality Dynamics in Ethiopia (SIDERA) project studied the effect of inter-communal violence on pastoralists in south-west Ethiopia. Divisions existed between the Ethiopian research team’s objective of producing rigorous research that empowered communities and the government’s assumptions about the underlying causes of conflict. The research team found that decision makers were not oblivious to the plight of the very poor pastoralist communities. But they assumed that the communities had brought problems on themselves and a state-led remedy was necessary. The team engaged regional officials through informal networks, changing their attitudes. One official remarked: ‘I used to see revenge as the sole reason for pastoralists conflicts but now I understand the main reason is economic.’

The team reflected on the challenges of South–South research and government partnerships, writing: “Government calls on researchers to contribute to national development on various occasions, presuming that ‘national priorities’ and ‘reality’ are uncontested, and that researchers will naturally subscribe to the state’s conceptualisations […] A serious and fruitful partnership between academics and policymakers needs to navigate this contradiction in view of reality and the objective of research”.

Source: Mulugeta et al. (2019: 112, 99–120)

Sustained interactivity

Sustained means building engagement from the outset of the research process and beyond. The most successful research partnerships continue after projects have ended. Partners see value in working together and look for opportunities for longer-term collaboration. Attempts at sustained interactivity may be politically charged. As well as formal structures, such as advisory groups, more iterative and evolving approaches to partnership are essential to building trust.

This approach recognises that research-policy processes are relational and messy, not technical or ordered. The conventional language of supply of evidence by researchers and demand for this evidence by policy actors masks the blurring of boundaries between research producers and users, and how learning and influence go both ways. Traditional policy cycle concepts have been discredited, and social and relational models of research engagement require ongoing interaction among a fluctuating group of partners and boundary partners outside the core partnership.

PRARI, a health-focused social policy research project in partnership with the Southern African Development Community (SADC), focused on regional approaches to reducing poverty. Researchers, public officials and NGOs collectively developed a toolkit to track pro-poor regional health policy. This deliberative process differed greatly from using non-academic partners as vehicles to disseminate results. The partnership pre-dated the project and continued after it ended. A SADC official described how the process helped them: ‘think more analytically about the purposes of […] regional-level action on health.’

Source: Yeates, Moeti and Luwabelwa (2019: 131, 121–142)

Policy adaptability

Adaptability refers to how partnerships identify key influencing spaces and re-frame evidence for specific audiences, adapting to changes in political or social contexts. Diversity among the active members of the partnership informs engagement strategies and even the research process itself, tailoring it to particular contexts. Collaboration with boundary partners, such as policy advocates, or other brokers, such as the media, is essential because they may incorporate evidence into their own campaigns and priorities. Intermediaries expand partnerships’ reach and legitimacy. The ability of research collaborations to provide coherent responses to perceived policy dilemmas resides in more than just the rigour of their research and the inclusivity of their partnership.

Using donors, implementation agencies and policy actors as brokers allows access to otherwise closed policy spaces. Adaptability is not just about fast and furious engagement in live policy processes. It is also necessary for longer-term agenda setting, which may mean deviating from envisaged pathways as new information affects how evidence is understood and used.

The DFID Education Team’s reflections on ESRC-DFID Raising Learning Outcomes projects in Uganda and India set out the central importance of embedding partnerships in-country. For research–policy partnerships to achieve impact, they argued, there is a need for ‘supercommunicators’. Depending on the context, this may be an NGO that can pilot innovations and deliver policy advocacy, as in the case of DFID’s collaboration with Ugandan NGO Mango Tree. The team discussed the dynamics of government–donor and government–researcher relations in effective knowledge exchange. The role of donors themselves as knowledge brokers in a research–policy partnership can be crucial.

Source: Hinton, Bronwin and Savage (2019: 43–64)

“Education researchers understand policy and policymakers have got their heads around research” Richard Clarke, DFID’s Director General for Policy, Research and Humanitarian

Part 2. Making research-policy partnerships work: Funders’ and researchers’ recommendations

At the partnerships framework launch, a diverse group of donors, ESRC-DFID-funded researchers, policy partners and civil society organisations discussed how to use and improve the framework, and its implications for their work. Their conclusions are summarised below.

2.1  Deploy the framework at the design stage of a research process to increase partnership viability

The inception phase of a research programme often entails stakeholder mapping and political economy analysis. The three qualities described above could be used to explore how the partnership relates to wider contextual analysis. Research partnerships are as much a product of social and political norms as the research area being addressed. An inclusive and participatory approach builds trust and mutual understanding around:

  • Exploiting differences in expertise and networks.
  • Clarifying roles and responsibilities.
  • Developing processes to continually exchange ideas.
  • Sustaining longer-term relations.
  • Framing evidence for policy audiences.
  • Identifying boundary partners.

Participants at the launch event agreed that research partnerships can be closely associated with co-production of research. In some cases participatory methodologies are relevant depending on the research objectives and approach. Meaningful engagement with partners may be viewed as an end in itself, empowering those who often lack a voice in international research studies and achieving ‘cognitive justice’. From this perspective, research is development, not for development. Partnership is a democratic tool that promotes equity and inclusivity.

Other valid and epistemically robust approaches to research also exist, including ones that envision a clearer division of labour between different partners in the pathways to impact process and a valid role for ‘independent research’. Such approaches still recognise the importance of engaging with policy actors throughout the research process including, for example, to frame relevant research questions, ensure appropriate interpretation of the data, and promote dissemination of the findings.

“If the question isn’t being asked in the right way, it’s unlikely to be impactful. We often think of the research as the key to impact, but it’s actually as much about whether anyone is asking the right question.” Mike Aaronson, Chair, Global Challenges Research Fund (GCRF)

2.2  Be honest about power dynamics and build trust

Donors and researchers with whom we shared the framework (presented above) agreed with the general view that equitable partnerships are desirable and morally imperative but, as the framework suggests, are not always a necessary pre-condition for innovative research and societal relevance. Tensions, trade-offs and compromises that occur when research and policy come together may still lead to progressive change. Trust is important, though, and some researchers urged caution in partnering with policy actors. Despite converging agendas, each partner is governed by a separate mandate. Other participants mentioned the importance of not underestimating researchers’ power. Government ministers might fear what research says, and challenging dominant policy narratives can place enormous pressure on partnerships, so researchers need to be mindful of this when presenting their evidence.

“Knowledge is always partial and even rigorous social science is always contestable. It takes long term partnerships and mutual respect to reduce the risk that evidence will be thrown out for being perceived as irrelevant or not useful.” Melissa Leach, Director, Institute of Development Studies

“There are undoubtedly tensions between conducting rigorous research that can take 5-10 years, and the change in policy direction brought on by often much shorter political cycles.” Diana Dalton, Former Deputy Director of the Research and Evidence Division, DFID (Excerpt from the Foreword in IDS Bulletin 50.1)

“Evidence is political and who produces that evidence is political… How do you build from sustained interactivity to systemic change in how evidence is produced and engaged with?” Kate Newman, Head of Research, Evidence and Learning, Christian Aid

“In Kenya – too often, researchers want to mobilise persons with disabilities so they can collect information but without any plans for their further involvement in the research or dissemination.” Anderson Gitonga Kiraithev, United Disabled Persons of Kenya

2.3  Identify boundary partners and work with brokers

The movement to work across scientific disciplines and sectors acknowledges that tackling global challenges requires new forms of research–policy partnership. In global health, for example, there have been attempts to overcome barriers between researchers and policymakers by building multidisciplinary teams of academics, practitioners and government officials.

Although close relations between research and policy actors may be key to success, participants also discussed the importance of relationships with boundary partners and research intermediaries, such as the media, NGOs and civil society. These boundary partners can be mobilised at key moments and may be in a position to engage with audiences beyond the core partners’ reach. Their perceived legitimacy and ability to frame research in non-academic terms can raise awareness and build support for changes in policy direction.

Long-standing partnerships with policy actors, who are often mid-level civil servants and advisors, do not make it any less important to construct compelling policy-friendly narratives and identify key influencing opportunities in the political sphere. Also necessary are good timing, policy-relevant research, the ability to contextualise research evidence for live policy issues, and appropriately positioned individuals. Mutual agendas and close working relationships do not automatically generate these qualities, which therefore deserve special attention.

“In our case, it was us the practitioners that went to the policymakers in a changing political context where research was just beginning to be engaged in democratic conversations. If the doors are closed, we have to use the windows to get into their offices. We managed to convince the Ministry of Women, Children and Youth to convene a meeting to engage the researchers and young people with the policymakers.” Anannia Admassu, Director, CHADET

2.4  Health-checking existing collaborations

Partnerships change over time: much is taken for granted or not discussed; initial strategies may not be revisited; partners may have very different perspectives on the partnership. Using the framework as part of project learning and adaptation, it is possible to explore these issues non-judgementally, to rekindle initial enthusiasm and optimise partnerships.

“We all have different power, so then the question is how we use our power in the most effective ways” Charlotte Watts, DFID Chief Scientific Advisor

“Timing is important, implementation is time bound. But we must not forget the issue of poor research, not all research is excellent. Recommendations are often not feasible, not based upon the real situation, or they can be far too general.” Mushtaque Chowdhury, Vice Chair, BRAC

2.5  Enable mutuality, interactivity and adaptability

Research collaborations are a product of pre-existing relationships and power dynamics, shaped by theories of change, perceived mutual benefits, and a competitive funding environment in which there is a rush to secure solid partners deemed to be a good fit. The framework has implications for design and assessment of research calls. Support for interdisciplinarity is already strong, but more could be done to bring together ‘odd bedfellows’ who have little experience of working together or encourage prospective partners to bridge different networks and spheres of influence. Sustained interactivity requires support for iterative planning and ongoing communication; policy adaptability needs flexibility in planning and resourcing activities – this is as much about procurement as research strategy.

“Procurement departments in donor organisations need to take a good hard look at this partnerships framework. It has important lessons for how contracts are set up and managed” Louise Shaxson, Director of Digital Societies – ODI

“We’ve talked a lot about the politics of policy today, but not of the politics of research funding – this plays a big role in framing research partnerships… There is a danger to focusing on partnerships as bounded entities, rather than seeing research collaboration as part of a more complex knowledge ecosystem. If we want to make practice more equitable, we need to examine participation and make changes across the system.” Jude Fransman, Open University

Part 3. Conclusions

The framework for research–policy partnerships presented here is shaped by an understanding of evidence-into-policy processes as fundamentally social and interactive, underpinned by political context, social norms and power. All three partnership qualities (Bounded Mutuality, Sustained Interactivity and Policy Adaptability) are found in the case studies from the ESRC-DFID-funded research. Although there is evidence that these qualities have brought about desired changes in terms of evidence use, capacity, knowledge and relationships, the comparative strength of the qualities in specific partnerships also suggests that even more could be achieved if they were more deeply rooted.

We propose that using this framework at the research design stage could increase partnerships’ viability by taking into account the importance of mutuality, interactivity and policy adaptability from the outset. We hope others will seek to validate this concept with existing methodologies and literature, and apply variations of it to their own work.

To find out more about the framework and how to use it in your own work, go to IDS Bulletin 50.1: ‘Exploring Research–Policy Partnerships in International Development’

Further reading

Baker, A.; Crossman, S.; Mitchell, I.; Tyskerud, Y. and Warwick, R. (2018) How the UK Spends its Aid Budget, London

Dalton, D. (2019a) ‘Foreword’, IDS Bulletin 50.1: ix

Georgalakis, J. and Rose, P. (2019) ‘Exploring Research–Policy Partnerships in International Development’, IDS Bulletin 50.1

Hinton, R.; Bronwin, R. and Savage, L. (2019) ‘Pathways to Impact: Insights from Research Partnerships in Uganda and India’, IDS Bulletin 50.1: 43–64

Mulugeta, M.F.; Gebresenbet, F.; Tariku, Y. and Nettir, E. (2019) ‘Fundamental Challenges in Academic–Government Partnership in Conflict Research in the Pastoral Lowlands of Ethiopia’, IDS Bulletin 50.1: 99–120

UK Government (2018) The Allocation of Funding for Research and Innovation, London: Department for Business, Energy and Industrial Strategy

Yeates, N.; Moeti, T. and Luwabelwa, M. (2019) ‘Regional Research–Policy Partnerships for Health Equity and Inclusive Development: Reflections on Opportunities and Challenges from a Southern African Perspective’, IDS Bulletin 50.1: 121–142


Cite this publication

Georgalakis, J. (2020) The Power of Partnerships: How to Maximise the Impact of Research for Development, IDS Digital Essay, Brighton: IDS

James Georgalakis

Director of Evidence and Impact


  • Open access
  • Published: 17 November 2022

Exploring the “how” in research partnerships with young partners by experience: lessons learned in six projects from Canada, the Netherlands, and the United Kingdom

  • Linda Nguyen 1 , 2 , 3 ,
  • Bente van Oort 4 , 5 ,
  • Hanae Davis 3 ,
  • Eline van der Meulen 5 ,
  • Claire Dawe-McCord 6 , 7 ,
  • Anita Franklin 8 ,
  • Jan Willem Gorter 1 , 2 , 9 , 10 ,
  • Christopher Morris 11 &
  • Marjolijn Ketelaar 9 , 12  

Research Involvement and Engagement volume  8 , Article number:  62 ( 2022 ) Cite this article

3114 Accesses

7 Citations

10 Altmetric

Metrics details

Involvement of young partners by experience in research is on the rise and becoming expected practice. However, literature on how to promote equitable and meaningful involvement of young people is scarce. The purpose of this paper is to describe and reflect on different approaches to partnership between researchers and young partners by experience, based on six research projects conducted in Canada, the Netherlands, and the United Kingdom.

From six exemplar research projects, at least one researcher and one young partner by experience were asked to collaboratively (1) describe the project; (2) summarise the values and practicalities of the project; and (3) reflect on their partnership. Thematic analysis was applied to the findings from these reflective exercises, which included meeting summaries, recordings, and notes.

All projects shared similar values, including mutual respect between all team members. Young partners were offered a variety of opportunities and approaches to being involved, for example in recruiting participants, co-analysing or (co-)presenting results. Supports were provided to the teams in a variety of ways, including organizing accessible meetings and having dedicated facilitators. Regular and proactive communication was encouraged through asynchronous modes of communication, reference documents, and a personal approach by facilitators. Facilitators aimed to tailor involvement to the needs of all team members by continuously discussing their preferred roles in the project. While most projects did not offer formal research training, various learning and skill development opportunities were provided throughout, including presenting skills and advocacy training.

With this paper, we demonstrated the value of reflection, and we invite others to reflect on their partnerships and share their lessons learned. Our recommendations for involvement of young people in research are: (1) Remember that it is okay to not know what the partnership might look like and there is no single recipe of how to partner; (2) Take the time to invest in partnerships; (3) Provide ongoing opportunities to reflect on partnerships; (4) Consider how to balance the power dynamics; and (5) Consider how to incorporate diversity in the background of young partners in research.

Plain English summary

In more and more projects, researchers and young people are working together in partnership, but there is little guidance about how to organize this partnership. In this paper, we share what partnerships in six projects from Canada, the Netherlands, and the United Kingdom looked like, so that others can be inspired. To do so, a researcher and a young partner from each project were asked to together: (1) describe their project, (2) summarize the practical details about the collaboration and (3) think about things that went well or could be improved. We found that all projects held the same beliefs about what is important in partnerships, like having respect for each other. Young people could work on parts of the project they liked in a way that worked for them. They were supported by staff, could join meetings and were appreciated for their work. Clear communication during and in-between meetings was helpful. Youth were often asked about the role they wanted in the project. While there was often no formal training on how to do research, there were many opportunities to learn. We offer five recommendations to researchers and young people who want to partner together: (1) It is okay to not know what the partnership will look like and there is no single recipe of how to partner; (2) Take your time; (3) Discuss how the partnership is going; (4) Think about who is doing what and why; (5) Consider the diversity of young partners. We hope others will share their experiences.


Introduction

In recent years, partnership between researchers and young people throughout the research project has been increasing and is becoming an expected practice. Young people are often experts by experience, a term that can refer to any adolescent or young adult with lived health or other experience in the context under study. This paper focuses on the involvement in research projects of young people who have a disability or a chronic health condition or are a family member of an individual with a chronic condition (e.g., siblings). Increasingly, researchers have responded to the right of young people by experience to express their views freely in matters that concern them [ 1 ] by involving them as partners in health-related research and directly consulting them for their ideas and opinions [ 2 , 3 , 4 , 5 ]. Despite the clear value of, and young people’s right to, involvement in research, there is limited information in the literature about how to enhance and thereby ensure their equitable involvement.

There are many terms to describe the involvement of experts by experience, which are often used interchangeably, including patient and public involvement, patient engagement, engagement, authentic stakeholder engagement, involvement and participation [ 6 ]. In this paper, we refer to the definition by the National Institute for Health Research in the United Kingdom, which defines involvement as a situation in which members of the public are actively involved in research studies (as co-applicants, advisory or steering members, joint grant holders) and inform research priorities as well as research development and conduct [ 7 ].

The involvement of young people in research can bring mutual benefits to young people, the research itself, and to researchers [ 8 , 9 ]. Young people involved in research projects may gain new knowledge and skills [ 10 , 11 , 12 , 13 ]. They may also have opportunities to develop or broaden a social network [ 10 , 13 ] and increase their independence, confidence and self-esteem [ 10 , 11 , 14 ]. Positive impact on the research itself has been reported too, including enhanced relevance of the research, access to hard-to-reach youth, and further application of the research [ 15 , 16 ]. The results of the research projects may produce outcomes enhanced by partnership, such as more user-friendly products and/or stronger calls to action and advocacy for future research [ 16 ]. Researchers have also reported personal benefits in partnering with young people, including increased motivation, satisfaction, and understanding about the perspectives of young people and end-users to enhance data collection and data analysis methods [ 10 , 17 , 18 ].

Given the value of engaging young people in research, it is important to think about the various aspects and stages of research and the ways in which young people may want to be involved. A scoping review by Van Schelven and colleagues [ 19 ] reported on the varying levels of involvement of young people with a chronic condition in health research projects [ 19 ]. The scoping review identified that the involvement of young people is a continuum of activities, and that the ways young people can influence the research project range from being informed to being a decision maker [ 19 ]. The studies included in the scoping review involved young people in one part of the research project [ 11 , 12 , 20 , 21 ], but there is increasing consensus that there should be opportunities to involve young people throughout the entire research process, such as developing and prioritizing research questions, contributing to the study design, increasing the accessibility of recruitment and advertisements, offering unique perspectives for data analysis, and improving the use and applicability of the research findings [ 7 , 10 , 15 , 16 , 20 , 22 , 23 ].

Many publications have reported on the value of “why” we involve young people in research and “what” activities they have been involved in. However, there continues to be underreporting about how to involve young people in the various stages of research [ 6 , 19 , 24 ]. The literature currently offers insufficient detail about the context and mechanisms of the specific strategies used to organize partnerships with young people in research [ 24 ]. Reflections from research teams who engage with experts by experience can be beneficial to share with others who wish to form similar partnerships in research. In work conducted by Liabo et al. [ 25 ], researchers and public advisors from three involvement groups comprising adults and parents participated in a reflective exercise in which they were asked about their perspectives on good practices for involvement identified from the literature, as well as challenges and what works well for their communities. Good practices for involvement from the literature were summarized as a framework, including an iterative process of evaluating the values and practicalities of involvement [ 25 ]. The reflective framework proposed by Liabo and colleagues enables teams to reflect on their values, as well as the practicalities of how to achieve these values [ 25 ].

The purpose of this paper is to illustrate and reflect on different approaches to engaging young people in research, based on research projects conducted in Canada, the Netherlands, and the United Kingdom, including their values, practicalities, and involvement activities, in order to share ‘how’ young people have been and can be involved as partners in research.

There were six different projects that we reflected on as individual case studies: two projects each in Canada, the Netherlands, and the United Kingdom. Some members of the team (LN, BvO, and MK) had conversations about how partnerships with young people were similar and/or different, based on their experiences in partnering with young people and researchers in projects. Based on these conversations, these members discussed the importance of sharing these reflections as a paper. During the initial stages of conceptualizing this paper, they reached out to invite other members who might be interested in partnering in this paper. Young partners from each of these projects were invited to be involved in different roles (for example, as a listener, co-thinker, advisor, or partner, as described in the Involvement Matrix [ 26 ]). An invitation with a guiding Terms of Reference was sent to young people in their native languages of English or Dutch, describing what their involvement might look like for this paper. Young partners who expressed an interest in being involved in this paper were invited to attend the team meetings based on their interest and the role that they would like to have. Team meetings with both researchers and young partners were held roughly bimonthly from October 2020 to January 2022 to discuss how to share our reflections and lessons learned in this paper. Multiple strategies were used to establish a welcoming and inclusive environment, including sending a poll to ask for the availability of team members, sending meeting agendas, having a brief check-in with every member at the beginning of each meeting, recording the team meetings, and sharing a written summary after each meeting.

We use the term “project team” to refer to the case study project team and “author team” to describe our full team of co-authors. All young people in each project team are referred to as “young partners”, a term chosen because they emphasized the importance of highlighting the expertise of their lived experiences. The GRIPP2 short form is an international checklist created to enhance the quality, transparency, and consistency of reporting on patient and public involvement in research [ 27 ], and this checklist was used as a guideline to report on the involvement of young people in the work described in this paper.

Each project team was asked to complete the following reflective exercises as dialogues:

Describe the project, including its aims, location, duration, number of young partners, roles and activities;

Summarise the values and practicalities of the project informed by the framework of involvement from Liabo and colleagues [ 25 ];

Reflect on the partnership between young partners and researchers using guiding questions (adapted from Liabo and colleagues [ 25 ]), including:

What made/makes the group work?

What would we have done differently (and if applicable, moving forward in ongoing projects)?

What recommendations and key messages would we share with others engaging with young partners in research?

The reflections also focused on how the practicalities reinforced the values from all project teams.

To complete these reflective exercises, each team met separately to have a conversation and write a document of key points of values and practicalities that they would like to share. The values are defined as ideals of importance that were held by project teams and included:

inclusivity to provide equal opportunities for young people to be involved;

partnership in which researchers and young people showed respect for each other’s contributions;

purposeful involvement about why young people were involved in the projects;

transparency with open and honest communication;

and value of different kinds of knowledge [ 25 , p. 5].

The practicalities, defined as something that enables the values, included:

support to young partners;

capacity building in which there was co-learning between young partners and researchers with training for both groups;

proportional involvement that was tailored to the needs of the research and young partners, with pragmatic decisions being made;

communication that needs to be proactive;

and involvement throughout the research by young partners [ 25 , p. 5].

The information from each reflective exercise was presented at the author team meetings, with discussions about similarities and differences across case study projects.

During the process of co-writing this paper, a full author team meeting was held with an introduction to writing guidelines, such as the steps involved in writing a scientific paper, author roles, expectations, and timelines [ 28 , 29 , 30 ]. Thematic analysis was applied to the documents from these reflective exercises, which included meeting summaries, recordings, and notes from each project team. The first author (LN) used principles from deductive thematic analysis, using the framework of values and practicalities outlined by Liabo and colleagues [ 25 ], as well as inductive thematic analysis to identify new information that did not fit these categories of values and practicalities [ 31 ]. Information from this analysis was summarized to describe the similarities and differences across project teams in the values, practicalities, and reflections on the partnerships between researchers and young partners. First drafts of the following documents were shared: tables of values and practicalities based on the thematic analysis across all project teams, a summary of similarities and differences across all project teams, and a manuscript. Each project team continued to meet separately to share feedback on these documents, and author team meetings provided an opportunity to discuss the key elements that should be included in the paper, such as key similarities and differences across project teams, as well as key messages to share from our overall partnerships between young partners and researchers.

Across all case study projects, young partners ranged in age from 12 to 38 years and shared their lived experiences during adolescence and young adulthood to inform the projects. Young partners had lived experiences in which they either had a chronic health condition or disability, or had a sibling with a chronic health condition or disability. A brief description of each project is provided in Table 1. Details of the backgrounds of our young partners and the projects are provided in Additional file 1. A summary of values, which were similar across all projects, is presented below. We highlight the practicalities as the focus of this paper to illustrate how young partners have been involved in each project.

Summary of values across all projects

The values identified from the reflective exercises are summarized across all projects. There were equitable opportunities, with considerations about how young people could be involved while recognizing that they might also need to balance their commitments with their health, home, school, and work. Young partners were welcome to join the projects at any point and, in all projects, they could take a step back from the projects and rejoin when they were available. The partnership was built on a foundation of respect for each other’s contributions and roles to work together as a team. In Canada, and specific to the partnerships with young partners in the BEST SIBS Study and READYorNot™ BBD Project, there was a commitment to the partnership between young partners by experience and researchers, who may have already been familiar with and involved in other research projects at the CanChild Centre for Childhood Disability Research [ 32 ] and wanted to continue to build on patient-oriented research projects. Furthermore, there was a vision to enhance patient-oriented research work through the Strategy for Patient-Oriented Research (SPOR) of the Canadian Institutes of Health Research (CIHR) and the CIHR-SPOR Patient-Oriented Research Fellowship Award that was funding the READYorNot™ BBD Project, as well as the doctoral studies and the BEST SIBS Study, respectively [ 33 ].

In all projects, the contributions of young partners were appreciated, which was shown through verbal comments, financial compensation, opportunities to contribute to all aspects of the project, and group bonding activities. There was a purpose for inviting young people to be involved in different aspects of the project, which was communicated from the beginning. All projects valued the different kinds of knowledge being shared by young partners and researchers. The perspectives and expertise shared by young partners were encouraged and taken into consideration in each project. In addition, all projects valued open and honest communication between young partners and researchers to provide clarity about each step of the project. Accessible methods were adopted to ensure that young partners were supported in their involvement. For example, in-person meetings for the CFP Panel were held in a wheelchair accessible building. In the BEST SIBS Study and PiP Project, dietary restrictions were taken into consideration when food was provided. Flexible formats of communication were used based on the preferences of young partners in all projects, such as email, Facebook, WhatsApp, and Zoom. Documents were written in plain language with large text, where possible. However, the role of young partners could sometimes be unclear, with a lack of transparency about how decisions were being made. Project teams reflected on how to improve communication methods and ensure transparency in the decisions being made. Collective reflections by the project teams also suggested the need for more time and space to reflect on the changing involvement of youth throughout the project, as well as on internal changes to the project made for a variety of practical reasons. These reflections could help to evaluate the partnership throughout the project and identify good practices moving forward as a team. Additional file 2 provides further information about our values.

Practicalities and reflections

Involvement throughout the research

Young partners by experience were asked about the different activities that they would like to be involved in. Additional file 3 outlines the involvement of young partners as they reflect on their experiences of partnering in research projects. The activities that young partners have been involved with provide context about the partnership between young partners and researchers in each project. We then further describe our reflections on the practicalities outlined in the framework proposed by Liabo et al. [ 25 ] to describe how young partners and researchers partnered together in each project team. Additional file 4 provides details about our reflections on the practicalities and what we could have done differently in our partnerships.

Preparation phase

During the preparation phase of a research project, young people were involved in a variety of activities, which included reviewing plain language summaries of grant applications by the CFP Youth Panel and by the SibYAC with the BEST SIBS Project. Most projects focused on the research activities needed to start a project, which included developing the research question, providing feedback on questionnaires, identifying study methods, and developing the interview guide and recruitment materials. The PiP Project highlighted how important it was to involve young partners in drafting the recruitment letter for participants in the research: young partners ensured that the language was appropriate and that the letter was appealing to other young people to participate in the study. Similarly, the recruitment materials were co-created in the RIP:STARS and VIPER Projects and the BEST SIBS Study. Some young partners contributed to the co-development of recruitment videos. For example, young partners in the READYorNot™ BBD Project drafted, scripted, and provided testimonial videos for the recruitment videos [ 34 ]. They further refined and launched the recruitment strategy on social media, such as through their personal networks on Facebook and Twitter. Similarly, young partners in the BEST SIBS Study provided testimonials about the importance of the study to encourage individuals to participate in the recruitment video [ 35 ], which they shared in their personal social media networks. In addition to participant recruitment, young partners were also involved with the design of the study; in the BEST SIBS Study, VIPER, RIP:STARS and PiP Projects, young partners and researchers further discussed study methods that would be novel and engage with young people as participants in the study, such as photo elicitation, in which participants could share photographs and describe stories during the interviews [ 36 ]. Young partners of the READYorNot™ BBD Project were involved with the co-development of an App and prepared the e-learning modules to train research assistants who were conducting the study. In the RIP:STARS and VIPER Projects, as the disabled young people were to undertake the whole research project themselves, preparation involved training in research methods and ethics. These examples of activities demonstrate how young people were involved in a variety of ways as part of the team to prepare the study.

Execution phase

During the execution of the study, all projects considered how to involve young people in all activities in which they might express an interest. Young partners in the VIPERS and RIP:STARS Projects undertook all aspects of the research projects, including designing their sample, gathering data by interviewing participants and facilitating workshops with other disabled children, co-developing the analysis framework and conducting data analysis, co-writing the final report, developing policy and practice recommendations, and implementing the evaluation of the project. As co-leaders of the research, they were trained and supported by academics to guide them in producing rigorous research, but final decisions and the execution of the study were delivered by the young people. Similarly, there was an opportunity for young partners in the PiP Project to be involved in the analyses and interpretation of the interviews. Young partners in the BEST SIBS Study piloted the interview guide and had the opportunity to be involved with the analyses of the interviews. A graduate trainee and first author of this paper (LN) learned alongside the partners in the BEST SIBS Study about how to provide training and involve young partners throughout the process of data analysis. LN developed a brief 10-minute tutorial to explain qualitative terms to young partners. There was an iterative process with multiple discussions among the team to understand and interpret the data. The discussions took place in approximately 1-hour meetings. In one discussion, a senior researcher with expertise in qualitative and mixed methods studies was invited to facilitate the session with young partners to elicit their perspectives on how they viewed the data. Drafts of the developing codes, categories, and themes, supported by key quotes, were shared with young partners to ask for their perspectives on whether the information made sense or needed further clarification, and whether there was further information they would have liked to ask participants. Overall, young partners could be guided to share their perspectives on how they interpreted the study data.

Implementation phase

During the implementation and knowledge translation activities of the study, all projects provided opportunities for young partners to be involved in co-presentations at national and international conferences. Young partners of the CFP Youth Panel spoke to key persons in politics, science, and societal organizations to improve the position of young people with disabilities, using the outcomes of the projects of the CFP program. Young partners in the BEST SIBS Study shared their personal stories and motivations for partnering in research to raise awareness about the important roles that siblings have in all aspects, including research. Young partners of the VIPERS Project provided recommendations to central government, local government, strategic managers, and services, while young partners of the RIP:STARS Project presented to stakeholders and responded to government consultations. In addition to co-presentations, young partners from the VIPERS Project, RIP:STARS Project, READYorNot™ BBD Project, PiP Project and the BEST SIBS Study had the opportunity to co-author publications with researchers [ 36 , 37 , 38 , 39 , 40 ]. For projects that had concluded, young partners could inform the next project. For example, the young partners of the VIPERS Project subsequently developed the idea and concept for the research that was later undertaken by young partners in the RIP:STARS Project, which helped to carry on the legacy of the work by the VIPERS.

In each project phase, most teams formed subgroups based on the activities that young people expressed an interest in. The length of time that young partners were involved in the projects is illustrated in Additional file 5. Some projects, for example the PiP Project and VIPERS Project, had the same young partners involved throughout; in other projects, for example the BEST SIBS Study, CFP Youth Panel, READYorNot™ BBD Project, and RIP:STARS Project, new young partners joined or existing young partners stepped back when needed during the project. All teams had researchers and young partners working collaboratively to highlight each other’s strengths during certain project activities and support the interests of all members. The RIP:STARS and VIPERS Projects took a further step by explicitly stating that they operated within the social model of disability, collectively addressing any barriers to ensure that all members had the opportunity to be involved in project activities.

Supports were provided to the teams in a variety of ways, including accessible meetings, compensation, and dedicated staff to support the whole team. Details about the supports offered by teams are provided in Additional file 4. Earlier case study projects conducted before the COVID-19 pandemic, such as the CFP Youth Panel, PiP Project, and VIPERS Project, held in-person meetings. The locations were selected based on their accessibility, including being wheelchair accessible or providing quiet spaces for time out. Other individual needs were also considered; for example, the CFP Youth Panel ensured that a resting room was available and that allergies were taken into account. While teams aimed to anticipate the individual needs of young partners, such as the need for resting rooms, the project teams emphasized that they would have liked more time and resources to accommodate these personal needs.

The format of the meetings was influenced by the context in which the projects began. Projects conducted prior to the COVID-19 pandemic were able to hold a combination of in-person and virtual meetings. Some teams, such as the PiP Project and RIP:STARS, combined in-person meetings with teleconferences (e.g., Skype). While some meetings occurred on a regular basis, the VIPERS and RIP:STARS Projects held full-day in-person meetings, which provided time for socializing and in-depth training about research and delivery of the project. On reflection, these days were scheduled to include defined periods of ‘work’ delivered through creative methods of approximately 45 min, followed by a break, and a lunch break where the team could share food and socialise together. Young people reflected positively on the in-person meetings and wanted more of them. Other projects were conducted during the COVID-19 pandemic and held their meetings only online. For example, meetings in the BEST SIBS Study and READYorNot™ BBD Project took place only through Zoom, with a toll-free number provided so that young people could attend at no additional cost.

Compensation was provided to young people partnering with researchers on the teams. The funds used for compensation came from different sources available in each country. For example, the CFP Youth Panel and PiP Project received funding from FNO, an organization that supports initiatives to increase opportunities for people in the Netherlands. The VIPERS and RIP:STARS Projects received funding from The National Lottery in the United Kingdom. In Canada, the Canadian Institutes of Health Research (CIHR) has a Strategy for Patient-Oriented Research (SPOR); the READYorNot™ BBD Project was awarded funding from this institute with partner funding. The BEST SIBS Study was a doctoral research study, and while there was no funding for partner compensation at the beginning, young partners and the doctoral student (LN, first author of this paper) partnered together to submit grants and received two awards, from CIHR and the CHILD-BRIGHT Network (funded by SPOR).

Details about the funding received by each case study project are described in Additional file 1 .

Compensation took a variety of forms. Some teams offered an annual honorarium with additional compensation for involvement in activities, and the BEST SIBS Study and READYorNot™ BBD Project offered compensation based on guidelines from the CHILD-BRIGHT Network [ 41 ]. For teams that held in-person meetings, travel costs and meals were covered. Some teams incorporated social activities that were covered financially; for example, young partners in the PiP Project attended a museum or cooking workshop. Further details about compensation are provided in Additional file 4. While compensation supported the involvement of young partners in research, we reflected on the importance of asking young people how they wish to be compensated. Young partners described how they preferred to have some of the compensation funds used for team activities. The team activities helped to build rapport between young people and researchers, which was a strength of the teams. Examples of team activities included attending day trips or workshops together, sending e-gift cards to order meals and meet virtually, or sending care packages. Young people appreciated these social activities and would have liked more of them to get to know their team members. Young partners reflected on the importance of remembering to be human, with time for fun and laughs through these activities and even during meetings throughout the partnership.

An important component of building and sustaining partnerships with young people is having dedicated staff to facilitate team activities. Some projects had one or two research coordinators for the team, who could be a dedicated individual hired for the project, researchers, or graduate students. The CFP Youth Panel had a member of the panel who was hired as chair in a part-time role of 8–13 h a week, with support from other members of the team including the program leader and support officer of the CFP program. Some teams reflected on opportunities to have more young people partnered on the projects, but noted that this would require consideration of available resources such as funding for compensation and personnel support.

Communication

Supports for regular and proactive communication were key to ensuring that all team members were informed of project activities. There was clear communication about the steps and purposes of the project and about young partners’ involvement, including invitations to meetings and regular contact between meetings. All teams were proactive in continuing their communication asynchronously outside of meetings, such as by sending regular updates by email and reminders for upcoming meetings. Some young people preferred to receive updates and communications from the research team on other platforms, such as WhatsApp or Facebook Messenger. Private group networking platforms were also used to connect as a team, including a platform called “Notebook” for the PiP Project and a Facebook group for the BEST SIBS Study and READYorNot™ BBD Project. In all projects, young partners felt that the communication platforms were accessible and tailored to their needs. The variety of communication platforms allowed young partners to choose how they would like to be involved and informed about the projects. Young partners appreciated the flexibility to share their experiences, such as by email, during team meetings, or during individual check-in meetings.

In addition to communication platforms, documents could be provided as a reference for young people, and these documents could be a work-in-progress to be revisited throughout the project. For example, young partners in the BEST SIBS Study reflected that they would have liked to have documents including a Terms of Reference that outlined the project description, possible roles and responsibilities to be discussed with young partners, and forms of compensation; Group Rules describing expectations during meetings and on the Facebook group; and an Activity Log describing the types of activities and the hours contributed to them. Young partners in the BEST SIBS Study reflected that information from the Activity Log would be helpful to include on a resume for their professional development. Based on feedback from the research team and young partners, these documents were created later in the project.

A unique aspect of communication was the personal characteristics of the coordinator for each team. For the BEST SIBS Study, the membership consisted exclusively of young adults, including the graduate student researcher who facilitated the group. This composition allowed for a unique vulnerability and openness of conversation during meetings about the sibling experience. For the CFP Youth Panel, young partners could connect and communicate directly with the chair, who acted as a liaison with the other members of the CFP program team. The chair was an experienced young partner who helped to improve communication, as members of the CFP Youth Panel and the chair understood each other and were in similar phases of life. The author team reflected that the coordinators of all projects had personal traits that contributed to successful partnership between researchers and young people: flexibility, openness, patience, willingness to listen, and conscientiousness about everyone’s knowledge base (e.g., explaining and clarifying information while avoiding the use of acronyms). The coordinators made the effort to consistently communicate the value that young people brought to the research projects.

Proportional

Throughout the process of involving young people in research, the teams aimed to meet the needs of all team members, including the young people themselves, while balancing research demands and available resources. All teams had meetings in which young partners had conversations with researchers, project leaders, or facilitators about the partnership; young partners were asked which roles they would like to have in the projects and in various activities. In the VIPERS and RIP:STARS Projects, young partners were co-leaders, and the young people decided together about the level of involvement each person would like to have at each stage of the research cycle. They had different roles, such as undertaking fieldwork or planning a conference. These conversations took place during in-person meetings. In some projects, there was a chair or coordinator whom young people could contact to discuss their level of involvement or changes to it. Some projects also had communication tools that helped to facilitate these conversations. In the PiP Project, the group had regular discussions about roles in next steps and activities. Young partners in the PiP Project later recognized the importance of discussing roles and expectations, which led to the co-development of the Involvement Matrix [ 26 ]. The Involvement Matrix can be used as a conversation tool to discuss the roles that young people would like to have in research (e.g., listener, co-thinker, advisor, partner, or decision-maker) for tasks in the preparation, execution, and implementation stages of a research project [ 26 ]. Other teams, specifically the BEST SIBS Study and READYorNot™ BBD Project, were formed after the development of the Involvement Matrix and used this tool along with other tools to have conversations about the roles of young people on the team. Both the BEST SIBS Study and READYorNot™ BBD Project teams had regular check-in meetings, during which they used the Start, Stop, and Continue activity (i.e., what activities the team should start doing, stop doing, or continue doing) and the Involvement Matrix [ 26 ]; the BEST SIBS Study team additionally used the Patient Engagement Tool from the Ontario Brain Institute to identify examples of how young partners can be involved in research at different stages of the project (e.g., plan the study, recruit and retain participants, do the study, analyze the results, and/or disseminate the results).

Capacity building

Capacity building relates to co-learning between young people and researchers, with opportunities for training and learning by experience. While most projects did not offer formal research training, researchers considered how to explain research concepts to young people, and in all projects different research concepts were explained to young partners. However, some young people, such as the young partners in the PiP Project, reflected that they would have appreciated opportunities for training about research. Young partners in the READYorNot™ BBD Project reflected that they would have liked more conversations with researchers about the terms “patient-oriented research” and “co-design”, to understand how each team member understood these terms and how team members could help each other learn and conduct the project in partnership. Some young people had training prior to joining a research project, specifically around the meaning of patient-oriented research. For example, a young partner in the BEST SIBS Study completed the Family Engagement in Research course offered by McMaster University, CanChild Centre for Childhood Disability Research, and the Kids Brain Health Network [ 42 ]. Some teams offered training if young people expressed an interest in specific aspects of research.

In the CFP Youth Panel, young partners had the opportunity to be trained in specific activities of the project that they were interested in, such as political activities, conversation strategies, and social media and communication workshops. Young partners of the CFP Youth Panel were also offered a year-long program about advocacy, and were supported in activities specific to the research project. In the VIPERS and RIP:STARS Projects, young partners were encouraged to take on more leadership roles and were trained for these roles, including presentation skills, engaging with media, and learning how to budget. Training offered to young partners in the VIPERS and RIP:STARS Projects followed the same pathway and content as a university-level research methods course, but was adapted to be accessible to the individual needs of the group.

In all projects, as young partners gained confidence in their knowledge and skills, they took on leadership roles, such as chairing a subteam in the CFP Youth Panel or co-presenting at international conferences for the BEST SIBS Study, PiP Project, and READYorNot™ BBD Project. Upon reflection with young partners, researchers recognized the value of learning from and together with young partners about topics that were important to address in the research projects. The VIPERS and RIP:STARS Projects took an iterative approach to ensure bidirectional learning between researchers and young partners; this ongoing bidirectional learning ensured that the projects were co-led with young partners. Such reflective practice led the RIP:STARS Project team to theorise and publish on a number of tensions between disabled young people becoming research leaders and dominant ideas about disabled children in a disabling society, such as overprotection, being empowered through engagement within the project yet restricted in other areas of their personal lives, and the emotional impact on disabled young researchers of gathering evidence of a continuing lack of autonomy and rights-based provision for disabled children and young people [ 43 ].

Recommendations

As an author team comprised of both researchers and young partners, we reflected on the strategies we used to partner together. In this section, we present ‘calls to action’ for teams who wish to form partnerships between researchers and young people, based on our experiences, reflections, and lessons learned. While these calls to action are applicable to partnerships with stakeholders more broadly, additional considerations apply when partnering with young people in research [ 19 , 44 ].

Remember that it is okay to not know what the partnership might look like and that there is no single recipe for how to partner with young people There are different models for involving young people in research, as illustrated by our case study project teams. The roles of young people can also change over time. For example, young partners in the CFP Youth Panel were initially involved as advisors to the CFP program and, over time, also became an advocacy and expert group. Young people may have opportunities to be involved as partners from the start and throughout a project, as illustrated by young partners in the PiP Project, VIPERS Project, and RIP:STARS Project. Young people may also be partners on projects through a council comprised of both young people and parents/caregivers along with researchers, for example the PFAC model in the READYorNot™ BBD Project. Young people may also represent their perspectives as part of the family of an individual with a disability, such as the siblings in the SibYAC with the BEST SIBS Study.

The case study project teams all used a variety of strategies to involve young people in research. Some key strategies were similar across all case study project teams, including supports to ensure the accessibility of in-person and virtual meetings, compensation offered in different formats, and coordinators to facilitate successful partnerships between researchers and young people. Communication should be flexible and proactive, offering opportunities for young people to be involved in the projects. Young people could be involved by providing feedback during project team meetings or asynchronously by email. A project might evolve over time, and small groups may form to work on specific initiatives that young people are interested in. These small groups may also provide further opportunities for young people to connect and get to know each other. Communication should also be proportional to the level of involvement that young people would like to have, and conversations are important to discuss roles and expectations with young people. These research projects should provide opportunities for capacity building with co-learning between young people and researchers.

Take the time to invest and build rapport in partnerships Partnerships are an investment in knowledge and skills that have valuable benefits for the research projects, young people, and researchers. It is important to take the time to get to know young people, including their motivations, interests, and personal goals for being involved with the project. The characteristics of the facilitator of each project team were important for building rapport with young partners. The facilitators of all projects had personal traits that supported the building of rapport in the partnerships, including being flexible, open, patient, willing to listen, and conscientious of the strengths and skills of young partners. While compensation was provided, young partners also appreciated social activities to build rapport within the team, such as attending day trips or receiving care packages. Opportunities were provided for young people to be involved at different stages of the research project. Partnerships with young people are an investment, and project teams reflected on the opportunities provided to young partners to receive training for skill development, for example attending workshops about political strategies, conversation strategies, and social media and advocacy. In all projects, the time invested in building the partnerships was a facilitating factor that allowed young partners to gain confidence in their knowledge and skills, which ultimately led them to take on leadership roles.

Provide ongoing opportunities for the team of researchers and young people to reflect on their partnership experiences Young partners and researchers valued the opportunity to reflect on their partnership experiences both during the projects and while writing this paper. In some projects, young people had the opportunity to reflect on and evaluate the partnership, including what was working well, how things could be better, and whether everyone felt fully involved in all aspects of the project at the level they chose at that time. Each case study project team chose when to have these reflections and evaluations, which were often at the end of each meeting or stage of the research cycle. However, many young partners preferred to have regular check-in meetings throughout the project. Tools could also be used to guide these reflections and evaluations. For example, young partners in the BEST SIBS Study and READYorNot™ BBD Project reviewed the Involvement Matrix [ 26 ] to see whether young people would prefer a change in roles or level of involvement. The SibYAC was also provided with the Public and Patient Engagement Evaluation Tool [ 45 ] at the end of each stage of the research cycle, giving them an opportunity to anonymously evaluate their partnership experiences. These reflections provide an opportunity to incorporate feedback to enhance the partnership experience. Young people may also reflect on the impact of their involvement and contributions in research, which may empower them to take the lead on research initiatives. As young partners and researchers on our author team reflected on the partnership experiences after the projects were conducted, we would have liked to have included more opportunities to internally evaluate our partnership experiences, including reflections about what was or was not working well throughout the research projects. As a collective team, we recognize that we are learning as we go: young people and researchers learn from each other and balance reflective exercises with the needs of the project.

Consider how to balance the power dynamics in the partnership Consideration should be given to how to reduce power imbalances between young partners and researchers, including in the decision-making processes of the projects. From the beginning and throughout the project, conversations should be held between young people and researchers, possibly with the help of tools. In the RIP:STARS and VIPERS Projects, there was an expectation from the beginning that power would be shared, which involved the researchers being prepared to give up power and step into not knowing what the project and partnership might look like. Tools have also been co-developed through the Family Engagement in Research course [ 42 ] for self-reflection between young people and researchers to address power imbalances [ 46 ]. While there may be constraints on certain decisions, for example from funding agencies or institutions, other initiatives related to the projects may provide opportunities for young people to feel empowered and become decision-makers [ 43 ]. For example, young partners from the CFP panel took the initiative not only to provide advice for projects but also to suggest new projects that were not covered by the grant applications, including setting up a platform to share experiential stories. Similarly, the SibYAC suggested expanding the website to describe both the project’s work and blogs about their stories as siblings of youth with disabilities, and some young partners of the PiP Project decided to build a website for young persons with cerebral palsy as part of a knowledge translation initiative. As the partnerships developed over time, power shifted and young people began to take on leadership roles for initiatives related to the research projects. Partnership includes making sure that young partners feel valued and respected, and see that their involvement makes a difference to the project. It is important for projects to be open to change in partnership dynamics.

Consider how to incorporate diversity in the backgrounds of young partners involved in research It is important to consider the diversity of young partners who have the opportunity to be involved in research. Some project teams, such as the BEST SIBS Study, READYorNot™ BBD Project, and the RIP:STARS Project, are continuing their partnerships and will have an opportunity to recruit additional partners from diverse backgrounds. For example, young partners with the BEST SIBS Study have considered how to recruit young partners of different genders and ethnicities moving forward. While the recruitment of partners from hard-to-reach populations can be challenging, different strategies can be used, such as working with community organizations to recruit these partners [ 47 ]. Furthermore, an initial partnership could begin with one or two young partners and expand over time, as with the BEST SIBS Study, which began with two young partners and had expanded to six young partners at the time of writing. For young people who have had less experience with partnerships in research, there could be opportunities for mentorship. For example, reflections from the READYorNot™ BBD Project identified that there could be a ‘buddy system’, in which experienced partners are paired with new partners. The experienced partners could answer questions, share resources, or provide feedback and comments about ways that new partners could be involved in the project.

Conclusions

In this paper, as a team comprised of young partners and researchers, we reflected on our experiences of partnering together in research. Across all six case study project teams, the perspectives of young partners were valued and informed different stages of research. Even throughout the process of writing this paper, we prioritized asking young partners to share what they had learned. By engaging in this reflection process using the framework outlined by Liabo et al. [ 25 ], we identified how our partnerships had an impact not only on the research, but also on young partners and researchers.

In each of our case study projects, we identified similarities and differences in partnerships with young partners in Canada, the Netherlands, and the United Kingdom. While all case study project teams had similar values, there were both similar and different approaches in the practicalities of how we implemented our values. During our reflection process, we also recognized how partnerships can change over time both within a single project team and also across project teams.

The case study projects were conducted at different times. Some project teams had already completed their projects, and this reflective process was useful to identify their positive experiences and consider what they might have done differently. Project teams that were continuing their partnerships at the time of writing were able to use the reflection process in this paper, and the learnings from other project teams, to improve certain areas of their own projects moving forward. As young partners continue to be increasingly involved in research, we hope that sharing our lessons learned can benefit current and future research teams. We hope this article is viewed as an invitation to expand and exchange knowledge on how to involve young experts by experience in research.

Availability of data and materials

All data generated or analysed during this study are included in this published article.

Six case study projects:

1) Care and Future Perspective (CFP) with the Youth Panel in the Netherlands with EvdM and BvO;

2) Participation in Perspective (PiP) project in the Netherlands with LS and MK;

3) Voice, Inclusion, Participation, Empowerment, Research (VIPERS) project in England with AF;

4) Research into Practice: Skilled Team with Ambition, Rights and Strength (RIP:STARS) Project in England with AF;

5) BrothErs and Sisters involvement in health care TranSition for youth with Brain-based disabilities (BEST SIBS) Study with the Sibling Youth Advisory Council (SibYAC) in Canada with HD and LN; and

6) READiness in Youth fOR transition Out of pediatric Care Brain-Based Disabilities (READYorNot™ BBD) Project with the Patient and Family Advisory Council (PFAC) in Canada with CDM, LN, and JWG.

Abbreviations

BEST SIBS: BrothErs and Sisters involvement in health care TranSition for youth with Brain-based disabilities Study

CFP: Care and Future Prospects

CIHR: Canadian Institutes of Health Research

PiP: Participation in Perspective Project

READYorNot™ BBD: READiness in Youth fOR transition Out of pediatric Care Brain-Based Disabilities Project

RIP:STARS: Research into Practice: Skilled Team with Ambition, Rights and Strength Project

SPOR: Strategy for Patient-Oriented Research

VIPERS: Voice, Inclusion, Participation, Empowerment, Research Project

UN General Assembly. Convention on the Rights of the Child. 1989;1577:3.

Kirk S. Methodological and ethical issues in conducting qualitative research with children and young people: a literature review. Int J Nurs Stud. 2007;44:1250–60.

Allsop MJ, Holt RJ, Levesley MC, Bhakta B. The engagement of children with disabilities in health-related technology design processes: identifying methodology. Disabil Rehabil Assist Technol. 2010;5:1–13.

Clavering EK, McLaughlin J. Children’s participation in health research: from objects to agents? Child Care Health Dev. 2010;36:603–11.

Lundy L. Children’s rights and educational policy in Europe: the implementation of the United Nations convention on the rights of the child. Oxf Rev Educ. 2012;38:393–411.

Nguyen T, Palisano RJ, Graham I. Perspectives and experiences with engaging youth and families in research. Phys Occup Ther Pediatr. 2019;39:310–23.

Hayes H, Buckland S, Tarpey M. Briefing notes for researchers: involving the public in NHS, public health, and social care research. Eastleigh: INVOLVE; 2012.

Curtin C. Eliciting children’s voices in qualitative research. Am J Occup Ther. 2001;55:295–302.

Barker J, Weller S. “Is it fun?” Developing children centred research methods. Int J Sociol Soc Policy. 2003;23:23–58.

Bailey S, Boddy K, Briscoe S, Morris C. Involving disabled children and young people as partners in research: a systematic review. Child Care Health Dev. 2015;41:505–14.

Bruce SM, Parker AT. Young deafblind adults in action: Becoming self-determined change agents through advocacy. Am Ann Deaf. 2012;157:16–26.

Dedding C. Delen in macht en onmacht [Sharing in power and powerlessness]. Universiteit van Amsterdam; 2009.

Lightfoot J, Sloper P. Having a say in health: involving young people with a chronic illness or physical disability in local health services development. Child Soc. 2003;17:277–90.

Canter KS, Roberts MC. A systematic and quantitative review of interventions to facilitate school reentry for children with chronic health conditions. J Pediatr Psychol. 2012;37:1065–75.

Dew A, Boydell KM. Knowledge translation: bridging the disability research-to-practice gap. Res Pract Intellect Dev Disabil. 2017;4:142–57.

Powers JL, Tiffany JS. Engaging youth in participatory research and evaluation. J Public Heal Manag Pract. 2006;12:79–87.

Dudley L, Gamble C, Preston J, Buck D, Hanley B, Williamson P, et al. What difference does patient and public involvement make and what are its pathways to impact? Qualitative study of patients and researchers from a cohort of randomised clinical trials. PLoS One. 2015;10:e0128817.

Vale CL, Thompson LC, Murphy C, Forcat S, Hanley B. Involvement of consumers in studies run by the medical research council clinical trials unit: results of a survey. Trials. 2012;13:9.

van Schelven F, Boeije H, Mariën V, Rademakers J. Patient and public involvement of young people with a chronic condition in projects in health and social care: a scoping review. Heal Expect. 2020;23:789–801.

Rosen-Reynoso M, Kusminsky M, Gragoudas S, Putney H, Crossman MK, Sinclair J, et al. Youth-based participatory research: lessons learned from a transition research study. Pediatrics. 2010;126(Suppl 3):177-82.

Van Staa A, Jedeloo S, Latour JM, Trappenburg MJ. Exciting but exhausting: experiences with participatory research with chronically ill adolescents. Heal Expect. 2010;13:95–107.

Selby JV, Forsythe L, Sox HC. Stakeholder-driven comparative effectiveness research. JAMA. 2015;314:2235.

Frank L, Basch E, Selby JV. The PCORI perspective on patient-centered outcomes research. JAMA. 2014;312:1513–4.

Staley K. ‘Is it worth doing?’ Measuring the impact of patient and public involvement in research. Res Involv Engagem. 2015;1:1–10.

Liabo K, Boddy K, Bortoli S, Irvine J, Boult H, Fredlund M, et al. Public involvement in health research: what does ‘good’ look like in practice? Res Involv Engagem. 2020;6:11.

Smits D-W, van Meeteren K, Klem M, Alsem M, Ketelaar M. Designing a tool to support patient and public involvement in research projects: the involvement matrix. Res Involv Engagem. 2020;6:30.

Staniszewska S, Brett J, Simera I, Seers K, Mockford C, Goodlad S. GRIPP2 reporting checklists: tools to improve reporting of patient and public involvement in research. Res Involv Engagem. 2017;3:13.

Johnson-Sheehan R, Paine C. Writing Today. 3rd ed. Pearson; 2015.

Perneger TV, Hudelson PM. Writing a research article: advice to beginners. Int J Qual Heal Care. 2004;16:191–2.

Richards DP, Birnie KA, Eubanks K, Lane T, Linkiewich D, Singer L, et al. Guidance on authorship with and acknowledgement of patient partners in patient-oriented research. Res Involv Engagem. 2020;6:38.

Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62:107–15.

Pozniak K, Buchanan F, Cross A, Crowson J, Galuppi B, Grahovac D, et al. Building a culture of engagement at a research centre for childhood disability. Res Involv Engagem. 2021;7:78.

Government of Canada - Canadian Institutes of Health Research. Strategy for patient-oriented research [Internet]. 2021 [cited 2021 Nov 7]. Available from: https://cihr-irsc.gc.ca/e/41204.html .

CHILD-BRIGHT READYorNot™ Project Study Team. READYorNot(TM) Brain-Based Disabilities Project [Internet]. 2021 [cited 2022 Feb 4]. Available from: https://www.child-bright.ca/readyornot .

CanChild. BrothErs and Sisters involvement in health care TranSition for youth wIth Brain-based disabilitieS (BEST SIBS) Study [Internet]. 2020 [cited 2020 Aug 20]. Available from: https://www.canchild.ca/en/research-in-practice/current-studies/brothers-and-sisters-involvement-in-health-care-transition-for-youth-with-brain-based-disabilities-best-sibs-study .

Nguyen L, Jack SM, Di Rezze B, Ketelaar M, Gorter JW. Protocol of the BEST SIBS study: a qualitative case study to investigate the roles and responsibilities of siblings of youth with a neurodisability during health care transition. J Transit Med. 2021;3:1–11.

VIPER. Hear Us Out. Council for Disabled Children. [Internet]. 2014 [cited 2022 May 2]. Available from: http://councilfordisabledchildren.org.uk.testing.effusion3.dh.bytemark.co.uk/help-resources/resources/viper-findings-hear-us-out .

Gastel TCM van, Smits DW, Verheijden J, van de Water JKM. Alles over ons met ons: Hoe jongeren met cerebrale parese actief betrokken werden bij een onderzoeksproject [Everything about us, with us: how young people with cerebral palsy were actively involved in a research project]. Ned Tijdschr voor Revalidatiegeneeskd. 2019;1:31–4.

Smits D-W, Klem M, Ketelaar M. Practical guide: the Involvement Matrix. Involvement of patients in projects and research. Developed in collaboration with BOSK and experts by experience (youths and parents). 2019.

Gorter JW, Amaria K, Kovacs A, Rozenblum R, Thabane L, Galuppi B, et al. CHILD-BRIGHT READYorNot Brain-Based Disabilities Trial: protocol of a randomised controlled trial (RCT) investigating the effectiveness of a patient-facing e-health intervention designed to enhance healthcare transition readiness in youth. BMJ Open. 2021;11:48756.

CHILD-BRIGHT Network. Guidelines for Patient Partner Compensation [Internet]. 2020 [cited 2022 Apr 6]. Available from: https://www.child-bright.ca/compensation-guidelines .

CanChild Centre for Childhood Disability Research. Kids Brain Health Network, McMaster University. Family Engagement in Research Course [Internet]. 2021 [cited 2021 Nov 7]. Available from: https://www.canchild.ca/en/research-in-practice/family-engagement-in-research-course .

Brady G, Franklin A. Challenging dominant notions of participation and protection through a co-led disabled young researcher study. J Child Serv. 2019;14:174–85.

Allemang B, Cullen O, Schraeder K, Pintson K, Dimitropoulos G. Recommendations for youth engagement in Canadian mental health research in the context of COVID-19. J Can Acad Child Adolesc Psychiatry. 2021;30:123–30.

Abelson J, Li K, Wilson G, Shields K, Schneider C, Boesveld S. Supporting quality public and patient engagement in health system organizations: development and usability testing of the public and patient engagement evaluation tool. Heal Expect. 2016;19:817–27.

Champagne M, Demers C, Elias B, Gaudin-Drouelle D. Power Imbalance in Family Engagement in Research: A Self-Reflection Tool for Researchers [Internet]. 2021 [cited 2021 Nov 8]. Available from: https://canchild.ca/system/tenon/assets/attachments/000/003/639/original/What_is_power_Imbalance_in_research__English_Version.pdf .

Gonzalez M, Phoenix M, Saxena S, Cardoso R, Canac-Marquis M, Hales L, et al. Strategies used to engage hard-to-reach populations in childhood disability research: a scoping review. Disabil Rehabil. 2021;43:2815–27.

Acknowledgements

We thank all young partners involved in each case study project team: CFP Panel, PiP Project, BEST SIBS Study, READYorNot™ BBD Project, VIPERS Project, and RIP:STARS Project, who were involved in different roles in this paper. We acknowledge and give special thanks to our young partners, Lauren Sluiter and Jordan Matthews, who provided solicited and unsolicited advice throughout the process of writing this paper.

Authors’ information

LN graduated from the Bachelor of Health Sciences (Honours) Program, Child Health Specialization, at McMaster University (Hamilton, Ontario, Canada) in 2015. She completed her PhD in Rehabilitation Science at CanChild and McMaster University in 2022. Her PhD studies, supervised by JWG, were focused on the role of siblings of a brother or sister with a disability during transition from pediatric to adult healthcare. In 2018, she established the Sibling Youth Advisory Council (SibYAC) that is currently comprised of six young adult siblings who have a sibling with a disability. LN will continue her partnership with the SibYAC during her postdoctoral fellowship.

BvO has a Bachelor’s in Political Sciences and a research Master’s in Global Health from the Athena Institute of the Vrije Universiteit Amsterdam. Aside from her studies, she worked and conducted research at The Netherlands Organisation for Health Research and Development on the topics of (pediatric) rehabilitation research and youth involvement in research. Between 2015 and 2019, Bente was active in the Youth Panel Care and Future Prospect (CFP), first as a member and later as chair. The Youth Panel CFP was founded to advise the CFP program in the Netherlands on which projects to subsidise to help young people with a chronic health condition. The panel continued to expand its work and influence to improve the social position of young people in care, education, employment, sport, and empowerment. The panel has since merged into a foundation, JongPIT, and BvO is part of its Supervisory Board.

EvdM has Bachelor’s degrees in Health & Life Sciences and in Biomedical Sciences. She is an active member of JongPIT, a foundation by and for young people with chronic disabilities that stands for equal rights for all young people with health issues in the Netherlands. JongPIT aims to limit the gap between young people with disabilities and their healthy peers, and to promote youth participation in healthcare, policy and politics, school, and work environments. Aside from JongPIT, she is part of the Dutch patient federation, the Prinses Máxima Center for Childhood Cancer, and Childhood Cancer International. EvdM hopes that she can make a change by participating in and doing research to improve healthcare.

HD has been a member of the Sibling Youth Advisory Council with the BEST SIBS Study since 2019. She has a younger sibling with cerebral palsy and uses her lived experience to contribute to research and community outreach initiatives and to connect with other siblings. She completed her BSc in Psychology, Neuroscience & Behaviour in 2015 and her PhD in Cognitive Psychology in 2020 at McMaster University. During her postdoctoral fellowship at CanChild, she worked with JWG; her research focus was on cognitive functioning in youth with cerebral palsy, to better understand how cognitive control impacts learning and emotion regulation in this population.

CDM has been a youth patient partner for eight years, most recently with the READYorNot™ BBD Project and the CHILD-BRIGHT National Youth Advisory Panel. She is a first year student in the Cumming School of Medicine at the University of Calgary. When she was an undergraduate student in the Bachelor of Health Sciences (Honours) Program at McMaster University, she completed her thesis project on the topic of paediatric to adult healthcare transition policies and practices. Through her lived experience as a young patient with a number of complex medical conditions, CDM brings a unique perspective to the world of patient-oriented research. Her interests include transitions in care, rare diseases, and healthcare policy.

CM is a Professor of Child Health Research at the University of Exeter Medical School. He leads PenCRU: the Peninsula Childhood Disability Research Unit, which undertakes a programme of applied health research aimed at identifying ways to improve the health and wellbeing of disabled children and their families. PenCRU involves families of disabled children as partners in all the activities of the unit through our Family Faculty. PenCRU works in close partnership with families, health and social care professionals and commissioners as the users of our research findings.

JWG is a Pediatric Physiatrist, Professor and Head of Pediatric Rehabilitation, University Medical Center Utrecht (The Netherlands), and Professor of Pediatrics (Part-Time) and an associate member in the School of Rehabilitation Science at McMaster University. He has been an investigator at CanChild since 2008. Jan Willem has training in rehabilitation medicine (physiatry) with a special clinical and research interest in transition services for youth with developmental disabilities. Jan Willem is currently co-leading one of the CHILD-BRIGHT projects (funded by CIHR-SPOR), READYorNot™ BBD Project which is a randomized controlled trial study to test an e-health intervention to improve the transition of care journey for youth with BBD. This project collaborates with multiple stakeholders, including the Patient and Family Advisory Council.

AF is a Professor of Childhood Studies at the University of Portsmouth, UK. Her background is in children’s social work and policy, and her research has mainly focused on disabled children and young people’s voice, rights, participation and protection. Specifically, she has contributed unique insight into the participation of disabled children and young people in decision-making. Studies have also focused on the abuse and protection of disabled children and young people ensuring that their voices and experiences inform the development of policy and practice in this area. AF developed a research methodology for co-leadership with disabled young people, and over the past decade has worked in partnership with groups of disabled young people (the VIPERS and RIP:STARS) to support disabled young people-led evidence generation to facilitate policy and practice change across disabled children’s lives.

MK is an Associate Professor at the Center of Excellence for Rehabilitation Medicine Utrecht, the research institute of the UMC Utrecht Brain Center, and De Hoogstraat Rehabilitation (Utrecht, Netherlands). She was the project leader of the PiP-project described in this paper.

Funding

LN holds the Canadian Institutes of Health Research Patient-Oriented Research Award – Transition to Leadership Stream (TLS 170679) and the Graduate Student Fellowship in Patient-Oriented Research through the CHILD-BRIGHT Network. JWG held the Scotiabank Chair in Child Health Research during the work presented in this article. MK received funding from FNO (project number 100-038), Amsterdam, the Netherlands, during the work presented in this article. The CFP Youth Panel is funded by FNO, and BvO completed an internship on the involvement of youth in research funded by ZonMW. The VIPERS and RIP:STARS were funded by the Big Lottery in the United Kingdom, and the RIP:STARS was funded by The British Academy during the writing of this article. The READYorNot™ BBD Project is funded under the Canadian Institutes of Health Research (CIHR-SCA-145104) Strategy for Patient-Oriented Research initiative, with funding partner support from Montreal Children’s Hospital Foundation, Faculty of Health Sciences of McMaster University, New Brunswick Health Research Foundation, McMaster Children’s Hospital Foundation, and Hamilton Health Sciences.

Author information

Authors and affiliations

School of Rehabilitation Science, McMaster University, Hamilton, ON, Canada

Linda Nguyen & Jan Willem Gorter

CanChild Centre for Childhood Disability Research, McMaster University, Hamilton, ON, Canada

Sibling Youth Advisory Council, Hamilton, ON, Canada

Linda Nguyen & Hanae Davis

The Netherlands Organisation for Health Research and Development, The Hague, The Netherlands

Bente van Oort

Supervisory board of Stichting JongPIT, Amsterdam, The Netherlands

Bente van Oort & Eline van der Meulen

Cumming School of Medicine, University of Calgary, Calgary, AB, Canada

Claire Dawe-McCord

Bachelor of Health Sciences Program, McMaster University, Hamilton, ON, Canada

School of Education and Sociology, University of Portsmouth, Portsmouth, UK

Anita Franklin

UMC Utrecht Brain Center, University Medical Center Utrecht, Utrecht, The Netherlands

Jan Willem Gorter & Marjolijn Ketelaar

Department of Pediatrics, McMaster University, Hamilton, ON, Canada

Jan Willem Gorter

PenCRU (Peninsula Childhood Disability Research Unit), University of Exeter Medical School, University of Exeter, Exeter, UK

Christopher Morris

De Hoogstraat Rehabilitation, Utrecht, The Netherlands

Marjolijn Ketelaar

Contributions

LN, BvO, and MK had initial discussions to conceptualize the idea of this paper. BvO conducted the literature review in the Introduction section. LN led the team, including drafting the meeting agendas and summaries and facilitating the meeting discussions with support from BvO and MK. All authors were involved in completing the reflective exercises and tables. LN conducted the analysis of reflective exercises from all project teams. LN drafted the first version of the manuscript, and all authors reviewed multiple versions of the drafts. Iterative revisions to the manuscript were completed by LN, BvO, and MK. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Marjolijn Ketelaar .

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1. Description of the project and characteristics of young partners involved in the project.

Additional file 2. Framework about the values for public involvement in health research.

Additional file 3. Involvement in activities.

Additional file 4. Framework about the practicalities for public involvement in health research.

Additional file 5. Length of time of involvement in projects.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Nguyen, L., van Oort, B., Davis, H. et al. Exploring the “how” in research partnerships with young partners by experience: lessons learned in six projects from Canada, the Netherlands, and the United Kingdom. Res Involv Engagem 8 , 62 (2022). https://doi.org/10.1186/s40900-022-00400-7

Received : 19 September 2022

Accepted : 04 November 2022

Published : 17 November 2022

DOI : https://doi.org/10.1186/s40900-022-00400-7

Keywords

  • Involvement
  • Partnership
  • Disability research
  • Young people
  • Adolescents and young adults
  • Participatory research
  • Lived experience
  • Decision-making


  • Open access
  • Published: 15 April 2019

How are evidence generation partnerships between researchers and policy-makers enacted in practice? A qualitative interview study

  • Anna Williamson   ORCID: orcid.org/0000-0003-0897-4769 1 , 2 , 3 ,
  • Hannah Tait 1 ,
  • Fadi El Jardali 4 ,
  • Luke Wolfenden 5 , 6 ,
  • Sarah Thackway 7 ,
  • Jessica Stewart 8 ,
  • Lyndal O’Leary 9 &
  • Julie Dixon 10  

Health Research Policy and Systems volume  17 , Article number:  41 ( 2019 ) Cite this article

4674 Accesses

19 Citations

32 Altmetric

Metrics details

Evidence generation partnerships between researchers and policy-makers are a potential method for producing more relevant research with greater potential to impact on policy and practice. Little is known about how such partnerships are enacted in practice, however, or how to increase their effectiveness. We aimed to determine why researchers and policy-makers choose to work together, how they work together, which partnership models are most common, and what the key (1) relationship-based and (2) practical components of successful research partnerships are.

Semi-structured qualitative interviews were conducted with 18 key informants largely based in New South Wales, Australia, who were (1) researchers experienced in working in partnership with policy in health or health-related areas or (2) policy and programme developers and health system decision-makers experienced in working in partnership with researchers. Data was analysed thematically by two researchers.

Researcher-initiated and policy agency-initiated evidence generation partnerships were common. While policy-initiated partnerships were thought to be the most likely to result in impact, researcher-initiated projects were considered important in advancing the science and were favoured by researchers due to greater perceived opportunities to achieve key academic career metrics. Participants acknowledged that levels of collaboration varied widely in research/policy partnerships from minimal to co-production. Co-production was considered a worthy goal by all, conferring a range of benefits, but one that was difficult to achieve in practice. Some participants asserted that the increased time and resources required for effective co-production meant it was best suited to evaluation and implementation projects where the tacit, experiential knowledge of policy-makers provided critical nuance to underpin study design, implementation and analysis. Partnerships that were mutually considered to have produced the desired outcomes were seen to be underpinned by a range of both relationship-based (such as shared aims and goals and trust) and practical factors (such as sound governance and processes).

Conclusions

Our findings highlight the important role of policy-makers in New South Wales in ensuring the relevance of research. There is still much to understand about how to initiate and sustain successful research/policy partnerships, particularly at the highly collaborative end.

Peer Review reports

Bridging the evidence–practice gap in health services has the potential to make a substantial contribution to improving health outcomes and service efficiencies [ 1 , 2 ] and to reducing research waste [ 1 ]. Proposed solutions to bridging this gap originally focussed heavily on strategies to enhance the dissemination of research findings to research users [ 2 , 3 ] or on building the capacity of policy-makers and clinicians to engage with and use research evidence [ 4 ]. More recently, focus has turned to the potential for evidence generation partnerships between researchers and knowledge users, such as policy-makers, community groups, clinicians and consumers, to improve the availability of relevant, timely evidence to help inform decision-making [ 5 , 6 ]. There has been a proliferation of research funding initiatives internationally that require researcher/knowledge user collaboration [ 7 ] and increasing demands on researchers to demonstrate that their work has had an ‘impact’ on policy or practice [ 8 , 9 , 10 ].

A number of research approaches that centre around evidence generation partnerships have been outlined in recent years, including integrated knowledge translation [ 5 , 11 ], participatory action research [ 12 , 13 ] and engaged scholarship [ 14 , 15 ]. Most of these approaches emphasise the co-production of knowledge [ 6 ], whereby researchers and those most likely to use or be affected by the evidence produced, work together to produce evidence. By combining the varied skills and expertise of these groups, it is hypothesised that the resultant research will have greater relevance to knowledge users and will more likely be used in decision-making [ 16 , 17 ].

Much has been published about many of these approaches from a conceptual standpoint [ 18 , 19 ], and a considerable amount of literature has similarly examined how evidence generation partnerships function [ 13 , 20 , 21 ]. Most of this research, however, does not explore evidence generation partnerships specifically between researchers and policy-makers [ 22 , 23 ], focusing instead on partnerships with community groups [ 13 ], clinicians [ 24 ] and others. All of these partnership combinations are likely to share some common features, such as a requirement to take into account the diverse perspectives and needs of all partners; however, the policy-maker/researcher combination is likely to present some unique partnership challenges and opportunities due to the vastly different work environments and pressures that these groups face [ 25 , 26 , 27 ]. Further, while rarely the case with most other common partnership combinations, evidence generation partnerships between policy agencies and researchers may be initiated and funded by the policy agency itself or be funded through external research grants (such projects may be more likely to be researcher-initiated). These different modes of funding and initiation may have important implications for how a partnership functions [ 28 ]; however, this has rarely been explored. Thus, there remains much to be learnt about how research/policy partnerships are enacted in practice [ 29 ] and how best to bring the diffuse skills, needs and priorities of researchers together effectively [ 30 , 31 ].

The current paper is the first step in a broader initiative which aims to develop a set of tools to help facilitate successful research/policy partnerships around evidence generation activities. In this paper, we explore the views and experiences of researchers and policy-makers with extensive expertise in research partnerships regarding why and how researchers and research users work together in New South Wales, Australia, and the factors which underpin successful research partnerships. In particular, we address the following research questions:

Why do researchers and policy-makers choose to work together? What are the perceived benefits?

How do they work together? Which partnership models are most common?

What are the key (1) relationship-based and (2) practical components of successful research partnerships?

Participants and setting

Data was collected between October 2017 and April 2018. We sought to interview participants from two groups, namely (1) researchers experienced in working in partnership with policy in health or health-related areas and (2) policy and programme developers and health system decision-makers (hereafter policy-makers, defined as “someone employed in a policy agency who drafts or writes health policy documents or develops health programs, or who makes or contributes significantly to policy decisions about health services, programs or resourcing” [ 32 ]) experienced in working in partnership with researchers. Participants were purposively identified by an Advisory Committee guiding this project. The role of the Advisory Committee was to provide expert advice on issues relating to the scope, implementation and dissemination of a programme of work centred on partnership research, of which the current paper is a part. The Advisory Committee included an approximately equal mix of researchers and senior policy-makers, selected for their expertise in partnership research and their extensive knowledge of researchers and policy-makers who work in partnered research. Advisory Committee members asked the permission of potential participants before passing on their contact details to the study team.

Participants were eligible to participate in the current study if they were nominated by a member of the Advisory Committee based on the criteria that they (1) were a researcher or policy-maker; (2) worked primarily in health or health-related areas (e.g. social care); (3) had extensive experience in partnering with researchers (for policy-makers) or policy-makers (for researchers) on research projects (partnership on at least five projects); and (4) were employed by a university, policy or programme agency or Local Health District. All participants provided written, informed consent to participate.

Data collection

A semi-structured approach was used to elicit participants’ opinions and experiences regarding the barriers and facilitators to successful evidence generation partnerships amongst researchers and research users, in accordance with best practice guidelines (Additional file 1: Appendix 1) [ 33 ]. Interviews were conducted by AW, a researcher with extensive expertise in conducting research in partnership with policy-makers. Each interview sought information on why the interviewee engaged in partnership research and the perceived risks and benefits, how the interviewee would characterise the types of partnership models they have engaged in (and the advantages and disadvantages of each), the characteristics of successful and unsuccessful partnerships (and the factors associated with each), and indicators and predictors of impact. Interviews were conducted in person or by phone, depending on the preference of the interviewee, and ranged from half an hour to one hour in length. Data was audio recorded before being transcribed by a person with no personal or professional connection to the participants involved in the study.

Data was analysed thematically. Two researchers (AW and HT) independently read all of the transcripts and coded the data to discern themes inductively. No predetermined framework was used to guide analysis. The researchers met regularly to review the draft codes and themes throughout the process until agreement was reached regarding the final version. Synthesised data and emerging themes were reviewed by the Advisory Committee, who participated in sense-making of the emerging data.
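The authors do not report using any analysis software for this process. As a purely illustrative, hypothetical sketch (all excerpt IDs and code labels below are invented, and this is not the authors' procedure), independent code assignments from two coders could be compared programmatically to flag excerpts for discussion at reconciliation meetings:

```python
def compare_codes(coder_a, coder_b):
    """Compare two coders' code assignments per transcript excerpt.

    Each argument maps an excerpt ID to the set of codes that coder applied.
    Returns average per-excerpt agreement (Jaccard) and the excerpts to discuss.
    """
    agreement = {}
    to_discuss = {}
    for excerpt in sorted(set(coder_a) | set(coder_b)):
        a, b = coder_a.get(excerpt, set()), coder_b.get(excerpt, set())
        union = a | b
        agreement[excerpt] = len(a & b) / len(union) if union else 1.0
        if a != b:
            to_discuss[excerpt] = {"only_coder_a": a - b, "only_coder_b": b - a}
    overall = sum(agreement.values()) / len(agreement) if agreement else 1.0
    return overall, to_discuss


# Hypothetical example: two coders label the same three interview excerpts.
coder_a = {"R01-p3": {"impact", "funding"}, "P04-p7": {"access to skills"}, "R02-p1": {"trust"}}
coder_b = {"R01-p3": {"impact"}, "P04-p7": {"access to skills"}, "R02-p1": {"trust", "mutual respect"}}

overall, disagreements = compare_codes(coder_a, coder_b)
print(f"Average per-excerpt agreement: {overall:.2f}")
print("Excerpts to reconcile:", disagreements)
```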

A total of 18 key informants participated in interviews, of whom 7 were primarily researchers and 11 were primarily policy-makers at the time of interview. All participating researchers were employed by universities at the Associate Professor or Professor level and had more than 15 years of experience in public health research. All of the policy-makers who participated were employed at a Manager level or above by government agencies whose work focussed on health or health-related issues (such as social care). Six of the policy-makers had PhDs and/or had previously been employed as researchers and were thus able to bring both a researcher and a policy-maker lens to their analysis of research partnerships. All participants, except for one researcher, were based in New South Wales, Australia. The themes which emerged from the interview with the participant from outside New South Wales were consistent with those which emerged more broadly.

Why do researchers and policy-makers choose to work together?

Three major themes emerged regarding why researchers choose to work with policy, namely (1) increasing the likelihood of research impact; (2) gaining access to sought-after resources; and (3) obtaining funding. The reason most commonly cited by researchers for wanting to partner with policy-makers in conducting research was the belief that the resultant research would be more likely to have an impact on policy and thus contribute to improving health.

“I do research so that I can improve health… the main way in which research impacts health is through policy... Research without any sort of policy engagement will often sit on the shelf and do nothing…” Researcher

Many researchers were also motivated to partner with policy-makers due to the access to otherwise unavailable resources this could facilitate (such as routinely collected data held by agencies or the data required to evaluate large-scale government policies or programmes). Researchers also reported seeking collaborations with policy-makers due to the funds attached to the work, either through completing a tendered project or, preferably, receiving funds to carry out a project the researchers had initiated or co-produced in collaboration with a relevant agency.

Five major themes emerged regarding what policy-makers perceived they gained by partnering with researchers, namely (1) access to additional skills; (2) links to researchers who can be called on for timely, informal advice; (3) access to the networks of their research collaborators; (4) the creation of high quality, relevant evidence; and (5) public, evidence-based support for government decisions. Policy-makers most commonly reported seeking to collaborate with researchers in order to gain access to skills and capacity over and above that already available within their agency.

“…the reason you go to a researcher is that they’ve got the expertise or the resources or the time or whatever in order to be able to collect that information that you need and therefore that’s helpful to you.” Policy-maker

Established collaborations with researchers were also said to create a situation in which policy-makers felt able to contact these researchers when they required fast, expert advice on a relevant issue. Policy-makers also noted that their collaborators often assisted them in making links with researchers outside of the collaboration who possessed additional specific skills or knowledge they needed. Policy-makers were also driven to collaborate with researchers in order to create high quality evidence that was seen as having the credibility to support their work.

Collaborations with researchers were also seen to be helpful when the researcher elected to support government decisions in the public domain, for example, by speaking about the evidence base underpinning policy, programme or health service delivery decisions. While research users did not report requesting assistance of this nature, they appreciated it when provided.

How do researchers and policy-makers collaborate on research projects?

Four major themes emerged in relation to researcher/policy-maker collaboration, namely (1) wide variation in the extent of collaboration; (2) policy agency-initiated partnerships; (3) researcher-initiated partnerships; and (4) co-produced partnerships. Most participants agreed that “someone has to have the initial idea” for a partnership; as such, participants agreed that most partnerships could be categorised as either policy agency initiated, including projects put out for tender by agencies and commissioned work, or researcher initiated, either through a partnered funding application to an external granting body or by a direct approach to an agency to fund a particular piece of work. Once these partnerships began, there was said to be considerable variation in the extent to which the researchers and agency staff involved collaborated, from almost no collaboration through to high levels of collaboration throughout all stages of the research process. This sustained, high-level collaboration was considered by participants to characterise co-production. Policy-makers and researchers tended to nominate research projects centred around evidence reviews and analysis of existing datasets as requiring minimal collaboration, while closer collaboration was considered important when conducting interventions and evaluations.

“So, I see partnerships as a sort of continuum. At one end is that transactional type interaction, through to co-production at the other end.” Policy-maker

However, these partnership categories (policy- or researcher-initiated, and the extent of collaboration) were not seen to be entirely clear-cut, with some researchers and policy-makers explaining that, as some policy agencies move towards embedding researchers within their teams and sometimes funding research centres, the distinction between policy-makers and researchers can become blurred.

A few policy-makers felt that highly collaborative, co-produced research partnerships were always preferable; however, most participants reported that various levels of collaboration could be effective depending on the situation.

“I guess again we don’t come at it from which model works best. It’s more like what are the requirements of the project.” Policy-maker

Policy agency-initiated partnerships

A range of shared benefits (two subthemes), perceived unique risks to researchers (five subthemes), perceived unique benefits to researchers (two subthemes), and unique risks to policy-makers (five subthemes) of policy agency-initiated partnerships were identified (see Table  1 for illustrative quotes). Most researchers and policy-makers alike agreed that policy-makers are best placed to identify the critical issues around what evidence is needed to guide their decision-making. Thus, agency-initiated work was thought by both researchers and policy-makers to often be particularly well targeted for real-world impact.

Small, short-term, policy-initiated projects were rarely seen by researchers as offering opportunities to conduct ‘cutting edge research’, and reportedly often did not generate publications or turn a profit. Researchers reported a willingness to engage in these and other types of agency-initiated projects anyway, as they were seen as an effective way to build a relationship with a policy agency and gain a better understanding of the policy context, in the hope of developing more advantageous collaborations.

On the other hand, most researchers outlined a number of potential risks or costs that were sometimes associated with agency-initiated work. Chief among these was a perceived opportunity cost, with time spent working on agency-initiated research sometimes resulting in fewer traditionally valued research outputs (such as peer-reviewed publications) than time spent on researcher-initiated work. Indeed, policy-makers and researchers alike reported that, as policy-makers often needed evidence around a specific issue, within a specific timeframe and at a specific cost, research methods needed to be determined pragmatically, rather than striving for the most scientifically excellent design. In addition, the evidence produced was sometimes rendered irrelevant by changes in the political environment.

Time was again a concern amongst researchers when the amount of time spent completing a project to an agency’s satisfaction was seen to ‘blow out’. Researchers and some policy-makers reported that this was often due to agencies not being “clear [about] what they want”, resulting in a more iterative process than was anticipated. This challenge was also sometimes seen to be related to agencies not disclosing key information to researchers, limiting their understanding of the context and what was required.

Policy-makers also reported potential risks related to engaging with researchers in agency-initiated research, including not receiving the information or evidence they required, receiving it in poor quality, or receiving it well after the agreed deadline. Another commonly reported barrier was researchers overpromising in order to win a tender but then being unable to deliver on these promises. Frustrations were sometimes said to arise due to researchers ignoring the complexity of the issue being investigated.

Finally, policy-makers reported that, in policy-initiated projects, researchers often failed to adequately synthesise the reported evidence and/or provide recommendations for action. As recommendations for action were reportedly often one of the key outcomes sought by agencies in these partnerships, this was particularly disappointing.

Researcher-initiated partnerships

One shared benefit related to researcher-initiated partnerships was commonly identified by researchers and policy-makers. Policy-makers also identified a range of risks associated with such partnerships (four subthemes). Researchers considered that partnerships they led conferred a range of important benefits (three subthemes), but also some risks (two subthemes) (see Table 2 for illustrative quotes). Researcher-initiated partnerships were seen by policy-makers as an important source of innovation, but not necessarily of major immediate relevance to their work. Despite this, if the policy-makers agreed to partner on a researcher-initiated project, they reported that they did expect to derive benefits from it. Nonetheless, one of the key risks noted for researcher-initiated partnerships was that the evidence produced did not fill an evidence need for the agency, or that it took so long to produce that, by the time it was available, it was no longer relevant.

This risk was seen to be exacerbated by another commonly noted risk of this partnership type, namely that the researchers take over and offer little opportunity for policy-makers to help shape the research agenda. Relatedly, many policy-makers reported previous dissatisfaction with researchers not sharing credit for the work the partnership produced, for example, through co-authorship or shared publicity.

Researchers cited significant benefits associated with researcher-initiated partnerships, including the opportunity to focus on their specific research interests and employ more ambitious study designs, increasing their chances of obtaining peer-reviewed grants and high impact journal articles. The chief risks researchers reported in researcher-initiated partnerships were that the policy-maker partners did not engage (attend meetings, provide information and advice) or did not deliver on promised resources, such as access to particular datasets.

Co-produced partnerships

All participants reported that co-production was a worthy goal and likely to be highly effective when done well (four subthemes). Some regarded it as the most effective partnership model for real world impact (see Table  3 for illustrative quotes). Achieving co-production was also noted to be difficult, with common challenges (two subthemes) and a range of facilitators (four subthemes) identified.

The benefits of co-production were seen to derive in part from the greater levels of engagement between all members of the team that are implicit in this model, driven by mutual aims and interests. This high level of engagement was in turn seen to allow the complementary skills and knowledge of all partners to be optimised. As a result, a core strength of co-production was thought to be its ability to integrate tacit, experiential knowledge with traditional ‘evidence’, resulting in particularly nuanced and relevant outputs. Co-production was also seen to provide opportunities for frequent ‘relevance checks’ to ensure that the research being undertaken remains in line with ‘real world’ priorities.

Nonetheless, while co-production was the goal for many participants, some noted that, thus far, they had not been able to achieve it, with either the policy or the researcher partners dominating partnership projects in practice. Multiple reasons were given for this reported tendency for one group to dominate research partnerships, ranging from one group having lower expectations that the project would benefit them or their agency (resulting in reduced engagement), to the dominating group offering little real opportunity for their partners to contribute to shaping the research agenda. The most frequently mentioned facilitators of co-production were factors that allowed long-term relationships and trust to develop between researchers and policy-makers, namely stability of staff at policy agencies and policy agency-funded research centres.

What does ‘success’ in research partnerships look like?

Seven subthemes were identified in relation to the characteristics of successful partnerships. Key informants most commonly described a successful partnership as one where “all stakeholders are happy with the outcomes”. Expanding further, common perceptions of successful partnerships included all parties having a shared understanding of why they were partnering and what this would involve, the planned deliverables being delivered to a high standard, and relevant, usable evidence being generated. Many participants noted that a successful partnership produces more than any partner could have alone and that it results in mutual gain. Creating evidence that impacts on policy and practice was considered an indicator of success, but because participants recognised that impact is influenced by a complex array of factors, they felt that partnerships could be successful even in its absence.

What are the components of successful research partnerships?

Participants identified a range of relationship-based (six subthemes) and practical facilitators (six subthemes) of successful partnerships as well as three cross-cutting facilitators (Table  4 ).

Shared aims and goals were seen as the fundamental building block of successful partnerships, and something that motivated partners to withstand the difficulties and challenges that can emerge over the course of partnerships.

“So that’s my little tip there. All the energy at the start has to be about where you want to finish and why a partnership is important, because it will also help you during the times when the partnership’s tested, to remind yourself why working together was important in the first place.” Researcher

Participants were also keenly aware of the differences between researchers and policy-makers in relation to their needs and goals. For example, researchers are generally expected to demonstrate their productivity through publications and funded grants, whereas policy-makers need to provide advice or develop policies or programmes around complex issues, often with very little preparation time. Both groups emphasised the necessity of understanding and attempting to accommodate these different needs.

“…it’s the relationship building that’s critical, being prepared to put the time and effort into the relationship building and having the ability to empathise with the perspective of someone who’s trying to run a health service or someone who’s having to make a policy decision…” Researcher

Practical components of successful partnerships

All of the practical components of successful partnerships were seen to be underpinned by the need to take time at the outset of the partnership (and throughout) to ensure that ‘everyone is on the same page’. Practical ways of doing this that were frequently cited included documenting and agreeing on timelines, deliverables and who is responsible for what at the outset (while allowing some flexibility for when changes arise), and agreeing on governance structures and processes and a publication policy (which specifies the potential role of research users in publications). Adequate funding was seen as another core practical component of success, as was all parties delivering as promised. Ongoing engagement, expressed through attendance at meetings, participation in discussions and provision of feedback, for example, was also seen as key. Mutual benefit was cited by all participants as fundamental to partnership success.

“The best partnerships are clearly win/win… Where it becomes one-sided, all the gains being on one side at the expense of the other, that’s where things fall apart because the side that’s not gaining is thinking well, what am I doing here?” Researcher
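The practical components described above lend themselves to being written down in a structured form. The following is a hypothetical sketch only, not drawn from the study or any participant's actual agreement; all field names and example values are invented to illustrate how agreed aims, deliverables, responsibilities, governance arrangements and a publication policy might be recorded at the outset of a partnership:

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class Deliverable:
    description: str
    responsible_party: str   # e.g. "research team" or "policy agency"
    due: date


@dataclass
class PartnershipAgreement:
    aims: list                 # shared aims and goals agreed at the outset
    deliverables: list         # what will be delivered, by whom, and when
    governance: str            # e.g. steering committee terms of reference
    publication_policy: str    # role of research users in publications
    funding_confirmed: bool
    review_meetings: str = "monthly"   # ongoing engagement checkpoint


# Invented example values for illustration only.
agreement = PartnershipAgreement(
    aims=["Evaluate the pilot programme", "Produce actionable recommendations"],
    deliverables=[
        Deliverable("Evidence review", "research team", date(2019, 3, 31)),
        Deliverable("De-identified service data extract", "policy agency", date(2019, 1, 15)),
    ],
    governance="Joint steering committee, co-chaired, meets quarterly",
    publication_policy="Policy partners review drafts and are offered co-authorship",
    funding_confirmed=True,
)
print(f"{len(agreement.deliverables)} deliverables agreed; review meetings: {agreement.review_meetings}")
```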

Our findings suggest that both researcher-initiated and policy agency-initiated evidence generation partnerships were common among researchers and policy-makers in New South Wales. While policy-initiated partnerships were thought to be the most likely to result in impact [ 34 ], researcher-initiated projects were seen to play an important role in advancing the science and often provided researchers with more opportunities to achieve the outputs valued in their profession such as high impact publications. In keeping with the literature, participants acknowledged that levels of collaboration varied widely between research/policy partnerships [ 16 , 20 , 28 , 35 , 36 ].

Co-production, or collaboration across all stages of the research process, was seen by some participants as the ideal model for collaboration, increasing the relevance of the research produced and the likelihood of impact [ 37 , 38 , 39 ]. Others considered that this time- and resource-intensive way of working was not always warranted [ 40 ]. While agreeing aims, goals and deliverables collaboratively was always viewed as essential [ 41 ], these informants suggested that, once this was accomplished, some types of projects, such as analyses of large datasets and evidence reviews, could be completed largely independently by researchers with no loss of quality or relevance. In contrast, co-production was seen to be particularly worthwhile in relation to evaluation and implementation projects, where the tacit, experiential knowledge of policy-makers provided critical nuance to underpin study design, implementation and analysis. Of note, several policy-makers suggested that, while ideally they would like to co-produce research, in practice they had not found this to be possible. Participants suggested that trusting, long-term relationships were generally a precursor to co-production. These were seen to be facilitated by policy agencies having a stable workforce (frequent staff changes in policy agencies are often cited as a barrier to successful evidence generation partnerships in the literature [ 31 , 42 ]) and by policy agency-funded research centres [ 43 ].

Key informants in the current study highlighted a clear distinction between researcher and policy agency-initiated partnerships. In keeping with the literature, policy-makers were thought by both groups to be best placed to identify what evidence is needed to guide real world decision-making [ 5 , 44 , 45 ]. The relevance of the ensuing research and the ability of policy agencies to utilise it meant that policy agency-initiated work was reported to be particularly likely to lead to real-world impact [ 22 , 34 ]. Achieving such impact was the primary motivation reported by researchers for engaging in evidence generation partnerships [ 22 ]. Potential opportunity costs were also noted, including time taken away from producing ‘world class’ research [ 28 , 46 ], challenges agreeing on final outputs [ 28 , 29 ], unexpectedly iterative processes [ 47 ] and the potential for the evidence produced to remain unused due to changes in the policy landscape [ 29 ].

Many of the key frustrations expressed by policy-makers around policy agency-initiated partnerships in the current study have been less commonly reported in the literature. These include researchers overstating what is possible in order to win tenders (leading to disappointment with the final product), choosing to focus only on the elements of a topic that interest them, or oversimplifying things in order to make them ‘doable’ and failing to fully engage with the complexity of an issue. As systems thinking approaches gain momentum in research [ 48 , 49 ], it will be interesting to observe whether this last challenge becomes less common. A key frustration noted by policy-makers was the failure of researchers to provide recommendations for action, one of the main outcomes reportedly sought by agencies when they initiated partnerships with researchers [ 28 , 50 , 51 ]. Despite the noted risks, researchers stated that even small agency-initiated projects were worthwhile as relationship-building and learning exercises [ 27 , 52 ], which might later underpin partnerships they considered more favourable to themselves. Policy-makers, for their part, continued to initiate evidence generation partnerships in order to access additional capacity and expertise to produce high quality, credible evidence to inform their work.

Researchers in the current study placed particular value on researcher-initiated partnerships. These were considered to offer the opportunity to conduct research that was both ‘scientifically excellent’ and likely to have impact. Policy-makers, on the other hand, noted that, while sometimes providing useful evidence, these projects were less often of immediate relevance or importance to them. Unsurprisingly then, the major risk researchers reported in researcher-initiated partnerships was that policy-makers did not engage [ 29 , 53 ]. This suggests that even researchers who aim to conduct research with real world impact, and who are experienced in working in partnership with policy, are often not in alignment with policy agencies in their opinions as to research priorities. This finding is consistent with the large body of literature describing the widespread differences between the research and policy worlds [ 26 , 27 , 37 , 54 ], whereby researchers most often attempt to ‘push’ their research to policy agencies rather than ascertaining when there may be a policy ‘pull’ for evidence and building at least some of their research agenda to align with this [ 55 , 56 ]. One mechanism that governments in New South Wales [ 57 ] and others internationally [ 58 , 59 , 60 , 61 ] employ to help develop long-term collaborations with researchers and increase the production of policy-relevant research is government-funded research centres. While there have as yet been few evaluations of such initiatives, they appear to be a potentially promising method of facilitating productive, lasting partnerships.

Many of the key barriers and facilitators of partnership success or failure noted in the current study are consistent with those reported in the literature. Many of these appeared to speak primarily to a clear need to ensure that the different needs and drivers of all parties were understood, valued and accommodated as much as possible. Thus, participants underlined the foundational need for shared aims and goals, ensuring that all parties were benefiting from the collaboration and maintaining open communication, trust and mutual respect [ 16 , 20 , 21 , 27 , 42 ].

Amongst our key informants, practical tools were also thought to be of great value in supporting a well-functioning partnership; these included having clearly documented and agreed timelines, deliverables, workplans, publication policies and governance structures [ 31 , 53 ]. While some groups appear to have well-developed methods in this area, this appears to be a partnership capacity gap for others. As noted elsewhere [ 29 , 31 ], ongoing engagement, expressed through things such as attendance at meetings, participation in discussions, provision of feedback and presentation of findings as they emerge, was described as critical to keeping the project on track and maintaining relationships.

A strength of the current study is the participation of both researchers and policy-makers who are highly experienced in working in partnership on evidence generation activities, as we expected this group to be able to provide considerable insight into the complexities of research/policy partnerships. By exploring their views broadly, rather than in relation to a particular project, we were able to draw out their key learnings from working in partnership on evidence generation activities. We acknowledge the limitations inherent in the small sample size utilised here, the fact that most of our participants were from New South Wales and that our sample was not randomly selected. Indeed, the experiences of these highly practised individuals may diverge from those of researchers and policy-makers who have less experience of participating in research/policy partnerships, or who work in settings where such partnerships are less common. Although many of our findings align with the existing literature, these limitations mean that our findings may not be broadly generalisable.

There is still much to understand about how to initiate and sustain successful research/policy partnerships, particularly at the highly collaborative co-production end. Our findings highlight the important role of policy-makers in ensuring the relevance of research, mostly in evaluation and implementation projects where co-production appears to be particularly valuable. The next phase of our work will seek to provide a more detailed exploration of the specific practical strategies, including governance approaches, that can be put in place to facilitate strong and effective evidence generation partnerships between researchers and policy-makers.

Macleod MR, Michie S, Roberts I, Dirnagl U, Chalmers I, Ioannidis JPA, et al. Biomedical research: increasing value, reducing waste. Lancet. 2014;383(9912):101–4.


Dobbins M, Hanna SE, Ciliska D, Manske S, Cameron R, Mercer SL, et al. A randomized controlled trial evaluating the impact of knowledge translation and exchange strategies. Implement Sci. 2009;4:61.

Lavis JN, Wilson MG, Grimshaw JM, Haynes RB, Hanna S, Raina P, et al. Effects of an evidence service on health-system policy makers’ use of research evidence: a protocol for a randomised controlled trial. Implementation Sci. 2011;6:51.

CIPHER Investigators. Supporting Policy In health with Research: an Intervention Trial (SPIRIT) – protocol for a stepped wedge trial. BMJ Open. 2014;4(7):e005293.

Graham ID, Logan J, Harrison MB, Straus SE, Tetroe J, Caswell W, et al. Lost in knowledge translation: time for a map? J Contin Educ Health Prof. 2006;26:13–24.

Heaton J, Day J, Britten N. Collaborative research and the co-production of knowledge for practice: an illustrative case study. Implement Sci. 2016;11:20.

McLean RKD, Graham ID, Tetroe JM, Volmink JA. Translating research into action: an international study of the role of research funders. Health Res Policy Syst. 2018;16:44.

Greenhalgh T, Fahy N. Research impact in the community-based health sciences: an analysis of 162 case studies from the 2014 UK Research Excellence Framework. BMC Med. 2015;13:232.

Hanney S, Greenhalgh T, Blatch-Jones A, Glover M, Raftery J. The impact on healthcare, policy and practice from 36 multi-project research programmes: findings from two reviews. Health Res Policy Syst. 2017;15:26.

Newson R, King L, Rychetnik L, Bauman AE, Redman S, Milat AJ, et al. A mixed methods study of the factors that influence whether intervention research has policy and practice impacts: perceptions of Australian researchers. BMJ Open. 2015;5(7):e008153.

Graham ID, Kothari A, McCutcheon C, Integrated Knowledge Translation Research Network Project Leads. Moving knowledge into action for more effective practice, programmes and policy: protocol for a research programme on integrated knowledge translation. Implementation Sci. 2018;13:22.

Baum F, MacDougall C, Smith D. Participatory action research. J Epidemiol Community Health. 2006;60(10):854.

Jagosh J, Bush PL, Salsberg J, Macaulay AC, Greenhalgh T, Wong G, et al. A realist evaluation of community-based participatory research: partnership synergy, trust building and related ripple effects. BMC Public Health. 2015;15:725.

Bowen SJ, Graham ID. From knowledge translation to engaged scholarship: promoting research relevance and utilization. Arch Phys Med Rehabil. 2013;94:S3–8.

Van de Ven AH. Engaged Scholarship: A guide to Organizational and Social Research. New York: Oxford University Press; 2007.


Sibbald SL, Tetroe J, Graham ID. Research funder required research partnerships: a qualitative inquiry. Implementation Sci. 2014;9:176.

Solberg LI, Glasgow RE, Unutzer J, Jaeckels N, Oftedahl G, Beck A, et al. Partnership Implementation Research. Med Care. 2010;48(7):576–82.

Graham ID, Tetroe J. Some theoretical underpinnings of knowledge translation. Acad Emerg Med. 2007;14(11):936–41.

Greenhalgh T, Jackson C, Shaw S, Janamian T. Achieving research impact through co-creation in community-based health services: literature review and case study. Milbank Q. 2016;94(2):392–429.

Corbin JH, Jones J, Barry MM. What makes intersectoral partnerships for health promotion work? A review of the international literature. Health Promot Int. 2018;33(1):4–26.


Rycroft-Malone J, Burton C, Wilkinson J, Harvey G, McCormack B, Baker R, et al. Collective action for knowledge mobilisation: a realist evaluation of the Collaborations for Leadership in Applied Health Research and Care. Health Services and Delivery Research. Southampton: NIHR Journals Library; 2015.

Bowen S, Botting I, Graham ID, Huebner LA. Beyond “two cultures”: guidance for establishing effective researcher/health system partnerships. Int J Health Policy Manag. 2017;6(1):27–42.

Voorberg WH, Bekkers VJJM, Tummers LG. A systematic review of co-creation and co-production: embarking on the social innovation journey. Public Manag Rev. 2015;17(9):1333–57.

Sahs JA, Nicasio AV, Storey JE, Guarnaccia PJ, Lewis-Fernandez R. Developing research collaborations in an academic clinical setting: challenges and lessons learned. Community Ment Health J. 2017;53(6):647–60.

Caplan N. The two-communities theory and knowledge utilization. Am Behav Sci. 1979;22(3):459–70.

Brownson RC, Royer C, Ewing R, McBride TD. Researchers and policymakers: travelers in parallel universes. Am J Prev Med. 2006;30(2):164–72.

Kothari A, MacLean L, Edwards N, Hobbs A. Indicators at the interface: managing policymaker-researcher collaboration. Knowledge Manage Res Pract. 2011;9:203–14.

Martin S. Co-production of social research: strategies for engaged scholarship. Public Money Management. 2010;30:211–8.

Nystrom ME, Karltun J, Keller C, Gare A. Collaborative and partnership research for improvement of health and social services: researchers experiences from 20 projects. Health Res Policy Syst. 2018;16:46.


Kothari A, Wathen CN. Integrated knowledge translation: digging deeper, moving forward. J Epidemiol Community Health. 2017;71:619–23.

Gagliardi AR, Berta W, Kothari A, Boyko J, Urquhart R. Integrated knowledge translation (IKT) in health care: a scoping review. Implementation Sci. 2016;11:38.

Haynes A, Turner T, Redman S, Milat AJ, Moore G. Developing definitions for a knowledge exchange intervention in health policy and program agencies: Reflections on process and value. Int J Soc Res Methodol. 2015;18(2):145–59.

Tracy SJ. Qualitative quality: eight “big-tent” criteria for excellent qualitative research. Qualitative Inquiry. 2010;16(10):837–51.

Kok MO, Gyapong JO, Wolffers I, Ofori-Adjei D, Ruitenberg J. Which health research gets used and why? An empirical analysis of 30 cases. Health Res Policy Syst. 2016;14:36.

Murray L. Deliberative research for deliberative policy making: creating and recreating evidence in transport policy. Social Policy Society. 2011;10(4):459–70.

Ross S, Lavis J, Rodriguez C, Woodside J, Denis J. Partnership experiences: involving decision-makers in the research process. J Health Serv Res Policy. 2003;8(Suppl 2):26–34.

Campbell D, Redman S, Jorm L, Cooke M, Zwi AB, Rychetnik L. Increasing the use of evidence in health policy: Practice and views of policy makers and researchers. Aust New Zealand Health Policy. 2009;6:21.

Innvaer S, Vist G, Trommald M, Oxman A. Health policy-makers’ perceptions of their use of evidence: a systematic review. J Health Serv Res Policy. 2002;7(4):239–44.

Lomas J. Improving Research Dissemination and Uptake in the Health Sector: Beyond the Sound of One Hand Clapping. In: Policy Commentary. Hamilton, ON: McMaster University Centre for Health Economics and Policy Analysis; 1997. Report No.: C97–1.

Gajda R. Utilizing collaboration theory to evaluate strategic alliances. Am J Eval. 2004;25:66–77.

Gagliardi AR, Kothari A, Graham ID. Research agenda for integrated knowledge translation (IKT) in healthcare: what we know and do not yet know. J Epidemiol Community Health. 2017;71(2):105–6.

Hofmeyer A, Scott C, Lagendyk L. Researcher-decision-maker partnerships in health services research: Practical challenges, guiding principles. BMC Health Serv Res. 2012;12:280.

Thackway S, Campbell D, Loppacher T. A long-term, strategic approach to evidence generation and knowledge translation in NSW, Australia. Public Health Res Pract. 2017;27(1):e2711702.

Hegger I, Marks LK, Janssen SWJ, Schuit AJ, Keijsers JFM, van Oers HAM. Research for Policy (R4P): development of a reflection tool for researchers to improve knowledge utilization. Implementation Sci. 2016;11:133.

Keown K, Van Eerd D, Irvin E. Stakeholder engagement opportunities in systematic reviews: Knowledge transfer for policy and practice. J Contin Educ Health Prof. 2008;28(2):67–72.

Golden-Biddle K, Reay T, Petz S, Witt C, Casebeer A, Pablo A, et al. Toward a communicative perspective of collaborating in research: the case of the researcher-decision-maker partnership. J Health Serv Res Policy. 2003;8(Suppl 2):20–5.

Johnson R, Grove A, Clarke A. It’s hard to play ball: a qualitative study of knowledge exchange and silo effects in public health. BMC Health Serv Res. 2018;18:1.

Cherney A, Head B. Supporting the knowledge-to-action process: a systems-thinking approach. Evid Policy. 2011;7(4):471–88.

Huang TTK, Grimm B, Hammond RA. A systems-based typological framework for understanding the sustainability, scalability, and reach of childhood obesity interventions. Children's Health Care. 2011;40(3):253–66.

Lavis J, Davies H, Oxman A, Denis JL, Golden-Biddle K, Ferlie E. Towards systematic reviews that inform health care management and policy-making. J Health Serv Res Policy. 2005;10(Suppl 1):35–48.

Lavis JN, Robertson D, Woodside JM, McLeod CB, Abelson J. How can research organizations more effectively transfer research knowledge to decision makers? Milbank Q. 2003;81(2):221–48 171–2.

Mirzoev TN, Omar MA, Green AT, Bird PK, Lund C, Ofori-Atta A, et al. Policy partnerships-experiences of the Mental Health and Poverty Project in Ghana, South Africa, Uganda and Zambia. Health Res Policy Syst. 2012;10:30.

Jose K, Venn A, Jarman L, Seal J, Teale B, Scott J, et al. Partnering Healthy@Work: an Australian university-government partnership facilitating policy-relevant research. Health Promot Int. 2017;32(6):964–76.

Choi BCK, Pang T, Lin V, Puska P, Sherman G, Goddard M, et al. Can scientists and policy makers work together? J Epidemiol Community Health. 2005;59(8):632.

Kerr EA, Riba M, Udow-Phillips M. Helping health service researchers and policy makers speak the same language. Health Serv Res. 2015;50(1):1–11.

Oxman AD, Vandvik PO, Lavis JN, Fretheim A, Lewin S. SUPPORT Tools for evidence-informed health Policymaking (STP) 2: Improving how your organisation supports the use of research evidence to inform policymaking. Health Res Policy Syst. 2009;7(Suppl 1):S2.

Thackway S, Campbell D, Loppacher T. A long-term, strategic approach to evidence generation and knowledge translation in NSW, Australia. Public Health Res Pract. 2017;27(1):e2711702.

Soper B, Hinrichs S, Drabble S, Yaqub O, Marjanovic S, Hanney S, Nolte E. Delivering the Aims of the Collaborations for Leadership in Applied Health Research and Care: Understanding their Strategies and Contributions. Health Services and Delivery Research. Southampton: NIHR Journals Library; 2015.

Stetler CB, Mittman BS, Francis J. Overview of the VA Quality Enhancement Research Initiative (QUERI) and QUERI theme articles: QUERI Series. Implementation Sci. 2008;3:8.

Heller DJ, Hoffman C, Bindman AB. Supporting the needs of state health policy makers through university partnerships. J Health Polit Policy Law. 2014;39(3):667–77.

Molleman G, Fransen G. Academic collaborative centres for health promotion in the Netherlands: building bridges between research, policy and practice. Fam Pract. 2012;29(Suppl 1):157–62.


Acknowledgements

Our sincere thanks to the key informants who participated in this study and generously shared their experiences and learnings.

Funding

This work was funded through a New South Wales Health EMC Fellowship held by AW. LW is supported by an NHMRC CDF II Fellowship (APP1128348).

Availability of data and materials

The datasets generated and/or analysed during the current study are not publicly available due to privacy concerns but may be available from the corresponding author on reasonable request.

Author information

Authors and Affiliations

The Sax Institute, PO Box K617, Haymarket, NSW, 1240, Australia

Anna Williamson & Hannah Tait

University of Sydney, Sydney, Australia

Anna Williamson

University of New South Wales, Sydney, Australia

American University of Beirut, Beirut, Lebanon

Fadi El Jardali

University of Newcastle, Callaghan, Australia

Luke Wolfenden

Hunter New England Population Health, New Lambton, Australia

New South Wales Health, North Sydney, Australia

Sarah Thackway

Department of Family and Community Services (FACS) Insights, Analysis and Research (FACSIAR), Ashfield, Australia

Jessica Stewart

Western NSW & Far West Local Health Districts, Dubbo, Australia

Lyndal O’Leary

South Eastern Sydney Local Health District (SESLHD), Carringbah, Australia

Julie Dixon


Contributions

AW led this work, including contributing to the study design, conducting and analysing the key informant interviews and drafting the paper. HT analysed the interviews and contributed to drafting the paper. All authors made substantial contributions to the design of the study and the analysis and interpretation of data and were involved in critically revising the manuscript for important intellectual content. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Anna Williamson .

Ethics declarations

Ethics approval and consent to participate

The current study was approved by the University of New South Wales Human Ethics Committee (HC17843).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

Appendix 1. Partnership research key informant interviews: policy, programme or health service deliverers. (DOCX 32 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.


About this article

Cite this article.

Williamson, A., Tait, H., El Jardali, F. et al. How are evidence generation partnerships between researchers and policy-makers enacted in practice? A qualitative interview study. Health Res Policy Sys 17 , 41 (2019). https://doi.org/10.1186/s12961-019-0441-2


Received : 04 November 2018

Accepted : 19 March 2019

Published : 15 April 2019

DOI : https://doi.org/10.1186/s12961-019-0441-2


Keywords
  • Partnership
  • Co-production



John W Gardner Center for Youth and their Communities

Data Use and Inquiry in Research-Practice Partnerships: Four Case Examples


The four case examples presented in this brief are drawn from the Gardner Center's substantial experience conducting rigorous research in research-practice partnerships. The first case describes a partnership approach that enhances a school district's capacity to use integrated longitudinal data to tackle persistent problems of practice and monitor students' development. The second case exemplifies how an equitable research model, grounded in mutualism and sensitive to cultural nuances, can be leveraged to elevate the experience of marginalized communities. The third case furthers knowledge about the implementation process and partnership dynamics within a Promise Neighborhood initiative, specifically as stakeholders negotiate accountability demands with the need for more actionable information. The final case highlights strategies that foster partnership within a national professional learning network that is working to build out-of-school time systems using data to improve programming for underserved youth.

Suggested Citation:

Biag, M., Gerstein, A., Fehrer, K.C., Sanchez, M., and Sipes, L. (2016). Data Use and Inquiry in Research-Practice Partnerships: Four Case Examples. Stanford, CA: John W. Gardner Center for Youth and Their Communities.

Document(s): 

Data Use and Inquiry in RPPs White Paper.pdf



Research Leap

Developing effective research collaborations: Strategies for building successful partnerships

Research collaborations have become increasingly popular in recent years, with more and more researchers recognizing the benefits of working together to achieve common research goals. Collaborative research offers a range of advantages, including increased funding opportunities, access to specialized expertise, and the potential for greater impact and reach of research outcomes. However, building successful partnerships requires careful planning and execution, as well as a willingness to overcome the challenges that can arise when working with others. In this article, we will explore the strategies for developing effective research collaborations and building successful partnerships.

Identify shared research interests and goals:

The first step in building a successful research collaboration is to identify shared research interests and goals. This requires careful consideration of the research areas that are of interest to all parties involved, as well as an understanding of the specific research questions that each party seeks to answer. This may involve conducting a thorough literature review to identify knowledge gaps or areas where additional research is needed. Once shared research interests and goals have been identified, a clear research plan can be developed that outlines the objectives, research methods, and expected outcomes of the collaboration.

Establish clear roles and responsibilities:

In order to avoid confusion and ensure that everyone involved in the research collaboration is working towards the same objectives, it is important to establish clear roles and responsibilities from the outset. This means clearly defining the tasks and responsibilities of each member of the research team, as well as outlining the timelines and milestones for the project. This can help to avoid duplication of effort, reduce the risk of misunderstandings, and ensure that everyone is aware of their contribution to the collaboration.

Foster open communication and collaboration:

Effective research collaborations require open communication and collaboration between all parties involved. This means creating a supportive and inclusive research environment where everyone feels comfortable sharing their ideas and perspectives, and where constructive feedback is encouraged. Regular meetings and check-ins can help to ensure that everyone is on track, and that any issues or concerns are addressed in a timely manner. Collaborative research platforms, such as shared online spaces or project management software, can also help to facilitate communication and collaboration among team members.

Build trust and mutual respect:

Building trust and mutual respect is essential for developing effective research collaborations. This means creating a culture of transparency and honesty, where everyone feels comfortable sharing their thoughts and concerns without fear of judgment or reprisal. It also means respecting the expertise and opinions of other team members, and being willing to compromise and find common ground when differences arise. By building trust and mutual respect, research collaborations can create a strong foundation for success and ensure that all members feel valued and supported.

Manage conflicts and challenges:

Despite the best planning and execution, conflicts and challenges can arise when working on collaborative research projects. These may include differences in research approaches or methodologies, competing priorities or interests, or misunderstandings about roles or responsibilities. Effective conflict management is essential for maintaining the momentum of the collaboration and ensuring that everyone remains focused on achieving the shared research goals. This may involve implementing clear conflict resolution protocols, establishing open lines of communication for addressing concerns, or seeking external mediation when necessary.

Research collaborations offer a range of benefits, from increased funding opportunities to access to specialized expertise and resources. Realising these benefits, however, depends on careful planning, execution, and a willingness to work through the challenges that arise when collaborating. By following the strategies outlined in this article, researchers can develop effective research collaborations that are based on shared research interests and goals, clear roles and responsibilities, open communication and collaboration, trust and mutual respect, and effective conflict management. By doing so, they can increase the impact and reach of their research outcomes and make meaningful contributions to their respective fields.



U.S. Food and Drug Administration


The Digital Variome: Understanding the Implications of Digital Tools on Health

CERSI Collaborators: University of California at San Francisco (UCSF): Andrew Auerbach MD MPH (UCSF, Variome and DOVeS); Benjamin Rosner MD PhD (UCSF, Variome and DOVeS co-PI); Stanford University (Stanford): Matthew Horridge PhD (Stanford, DOVeS)

FDA Collaborators: Center for Devices and Radiological Health (CDRH): Bakul Patel, MS, MBA (Formerly of CDRH); Vinay Pai, PhD; Catherine Bahr; Leeda Rashid, MD, MPH, ABFM; Arti Tandon, PhD; Charlie Yongpravat, PhD; Anindita Saha, PhD

CERSI Subcontractors: Flying Buttress Associates: Jeph Herrin, PhD

CERSI In-Kind Collaborators: Stanford University (Stanford): Mark Musen, PhD (Stanford, DOVeS)

Non-Federal Entity Collaborators: Johnson and Johnson: Karla Childers, MSJ; Paul Coplan, ScD, MBA; Stephen Johnston, MSc

Project Start Date: October 12, 2021; Project End Date: February 28, 2022

Regulatory Science Framework

Charge I “Modernize development and evaluation of FDA-regulated products” and Focus Area “C. Analytical and Computational Methods.”

Regulatory Science Challenge

There is a consistent need to research and develop the methods used to ensure the quality and safety of FDA-regulated products. Research and development in this area helps FDA employ scientifically valid approaches for combining patient input and data from multiple sources. These 'real world' data insights are key to informing regulatory decision-making both for traditionally regulated products (e.g., drugs and devices) and for new and emerging products such as digital health tools. Furthermore, as the FDA considers new products for regulatory approval, the agency may examine whether these new products are “substantially equivalent” to previously approved products. The wave of new digital health products creates regulatory decision challenges that need to be informed by real world data and by data that help identify the degree to which products may be similar.

Project Description and Goals

The Digital Variome project extends work that is ongoing as part of our overarching CERSI project Developing Frameworks and Tools for Integration of Digital Health Tools into Clinical Practice, a national network of leading academic medical centers, researchers, and innovators working to identify how real world measures and data can be used across the types of software used in health, and the eventual data sources required to carry out real world performance measurement and post-market surveillance of digital health tools (DHTs). That network, ADviCE, identified several challenges to DHT adoption: (1) variable definitions of which DHTs are relevant to clinical care delivery; (2) lack of consistent, common terms to describe DHTs during selection; (3) wide variability in how health systems integrate DHTs into practice; and (4) lack of a framework and tools to evaluate DHTs’ real-world performance through post-market surveillance.

The ADviCE project in turn framed the goals of the Variome proposal, which focused on identifying data sources and potential partnerships needed to create a learning health collaboration that might leverage tools such as NEST or resources (e.g., PCORnet, or payor data) to provide data needed to carry out post-market surveillance of DHTs. Few of these data networks or partnerships could gather information needed for DHT post-market surveillance, so investigators turned their attention to tools which would both facilitate efficient specification of DHT characteristics while also being flexibly able to accommodate measures that might vary between DHTs even though applicable to similar patients or health systems.

With this realization, the research team extended their Priority Measurement framework to represent a range of potential metrics applicable to real world performance. Investigators built on their consensus work from Developing Frameworks and Tools for Integration of Digital Health Tools into Clinical Practice to identify specific domains and measures relevant to each broad domain. For example, within the area of Product Performance – Cybersecurity, investigators developed subdomains where metric identification was recognized as a key next step. Not surprisingly, a wide range of potential measures were identified. For example, each of the Measure Concepts for Real World Health might have dozens or even hundreds of patient- or population-specific metrics that are supported by evidence, are broadly used, or both.

This realization led to development of the Digital medicine Outcomes Value Set (DOVeS), as a powerful and flexible approach to classifying digital health tools according to key features and important clinical outcomes identified by our work to this point.

DOVeS was blueprinted in the Protégé ontology editor with input from research collaborators and professional ontologists, so that it permits flexible expansion as outcomes or population definitions change and technology advances. DOVeS was then tested and validated against real DHT and company characteristics to yield a working prototype that facilitates search and display of data using the overall ADviCE/Variome approach. DOVeS has been published on BioPortal and is publicly available for broad use.
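
To illustrate how a published OWL ontology such as DOVeS can be explored programmatically, the minimal Python sketch below loads a local copy with the owlready2 library and searches class labels. The file path and the label query are placeholders rather than references to the actual DOVeS release; the ontology would first need to be downloaded from BioPortal and the terms adjusted to match its real classes.

```python
# Minimal sketch: browsing a locally downloaded OWL ontology with owlready2.
# "/path/to/doves.owl" and the "*readmission*" query are placeholders, not the
# actual DOVeS file name or content -- adjust both to the real release.
from owlready2 import get_ontology

onto = get_ontology("file:///path/to/doves.owl").load()

# List every class defined in the ontology along with its rdfs:label annotations.
for cls in onto.classes():
    print(cls.name, list(cls.label))

# Wildcard search over labels, e.g., for classes describing an outcome of interest.
matches = onto.search(label="*readmission*")
print(matches)
```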

DOVeS has the potential to be scaled up to include a broader and more representative sample of real-world digital health tools and to accommodate new technologies (e.g., large language models (LLMs)), while also being tested for usability and feasibility as a practical framework for use by health systems, vendors, and regulators.

Research Outcomes/Results

There are several outcomes to date associated with the development of the Variome project and DOVeS Ontology. The first is that the DOVeS ontology has been expanded substantially over the course of this support, informed by real world digital health outcomes gleaned from industry and academic experts. The second is that DOVeS has been made publicly available on the BioPortal website so that a community of digital health experts may continue to contribute to it over time. The third is that a prototype user interface overlying DOVeS has been created (only a non-functional wireframe was originally proposed) leading to functional demonstrations that show the power and value of DOVeS in identifying tools based on common outcomes. Fourth, several public presentations of DOVeS have been made. Finally, a peer reviewed publication on the development of DOVeS is forthcoming and will help disseminate awareness of the ontology and its value. In the future, investigators hope to convert the prototype front end user interface into a robust platform capable of supporting regulatory insights as well as health system leader inquiries and decisions about digital health tools.

Research Impacts

This project enhances foundational requirements for regulatory science research by providing the FDA and other stakeholders with a new way to categorize and identify digital health tools based on the outcomes they influence. This is particularly valuable for enabling more appropriate apples-to-apples comparison of digital health tools that influence similar outcomes, which could be useful for "substantial equivalence" assessment as well as both superiority and non-inferiority considerations. The ontology is also particularly valuable for ongoing post-market surveillance.

Publications

No peer-reviewed publications to date; Investigators plan to analyze and publish follow-up study results.

Dr. Auerbach has published invited editorials in JAMA IM on digital health regulation based in part on his experiences with ADviCE.

How to Do Market Research: The Complete Guide

Learn how to do market research with this step-by-step guide, complete with templates, tools and real-world examples.

What are your customers’ needs? How does your product compare to the competition? What are the emerging trends and opportunities in your industry? If these questions keep you up at night, it’s time to conduct market research.

Market research plays a pivotal role in your ability to stay competitive and relevant, helping you anticipate shifts in consumer behavior and industry dynamics. It involves gathering these insights using a wide range of techniques, from surveys and interviews to data analysis and observational studies.

In this guide, we’ll explore why market research is crucial, the various types of market research, the methods used in data collection, and how to effectively conduct market research to drive informed decision-making and success.

What is market research?

Market research is the systematic process of gathering, analyzing and interpreting information about a specific market or industry. The purpose of market research is to offer valuable insight into the preferences and behaviors of your target audience, and anticipate shifts in market trends and the competitive landscape. This information helps you make data-driven decisions, develop effective strategies for your business, and maximize your chances of long-term growth.

Why is market research important? 

By understanding the significance of market research, you can make sure you’re asking the right questions and using the process to your advantage. Some of the benefits of market research include:

  • Informed decision-making: Market research provides you with the data and insights you need to make smart decisions for your business. It helps you identify opportunities, assess risks and tailor your strategies to meet the demands of the market. Without market research, decisions are often based on assumptions or guesswork, leading to costly mistakes.
  • Customer-centric approach: A cornerstone of market research involves developing a deep understanding of customer needs and preferences. This gives you valuable insights into your target audience, helping you develop products, services and marketing campaigns that resonate with your customers.
  • Competitive advantage: By conducting market research, you’ll gain a competitive edge. You’ll be able to identify gaps in the market, analyze competitor strengths and weaknesses, and position your business strategically. This enables you to create unique value propositions, differentiate yourself from competitors, and seize opportunities that others may overlook.
  • Risk mitigation: Market research helps you anticipate market shifts and potential challenges. By identifying threats early, you can proactively adjust your strategies to mitigate risks and respond effectively to changing circumstances. This proactive approach is particularly valuable in volatile industries.
  • Resource optimization: Conducting market research allows organizations to allocate their time, money and resources more efficiently. It ensures that investments are made in areas with the highest potential return on investment, reducing wasted resources and improving overall business performance.
  • Adaptation to market trends: Markets evolve rapidly, driven by technological advancements, cultural shifts and changing consumer attitudes. Market research ensures that you stay ahead of these trends and adapt your offerings accordingly so you can avoid becoming obsolete. 

As you can see, market research empowers businesses to make data-driven decisions, cater to customer needs, outperform competitors, mitigate risks, optimize resources and stay agile in a dynamic marketplace. These benefits make it a huge industry; the global market research services market is expected to grow from $76.37 billion in 2021 to $108.57 billion in 2026. Now, let’s dig into the different types of market research that can help you achieve these benefits.

Types of market research 

  • Qualitative research
  • Quantitative research
  • Exploratory research
  • Descriptive research
  • Causal research
  • Cross-sectional research
  • Longitudinal research

Despite its advantages, 23% of organizations don’t have a clear market research strategy. Part of developing a strategy involves choosing the right type of market research for your business goals. The most commonly used approaches include:

1. Qualitative research

Qualitative research focuses on understanding the underlying motivations, attitudes and perceptions of individuals or groups. It is typically conducted through techniques like in-depth interviews, focus groups and content analysis — methods we’ll discuss further in the sections below. Qualitative research provides rich, nuanced insights that can inform product development, marketing strategies and brand positioning.

2. Quantitative research

Quantitative research, in contrast to qualitative research, involves the collection and analysis of numerical data, often through surveys, experiments and structured questionnaires. This approach allows for statistical analysis and the measurement of trends, making it suitable for large-scale market studies and hypothesis testing. While it’s worthwhile using a mix of qualitative and quantitative research, most businesses prioritize the latter because it is scientific, measurable and easily replicated across different experiments.

3. Exploratory research

Whether you’re conducting qualitative or quantitative research or a mix of both, exploratory research is often the first step. Its primary goal is to help you understand a market or problem so you can gain insights and identify potential issues or opportunities. This type of market research is less structured and is typically conducted through open-ended interviews, focus groups or secondary data analysis. Exploratory research is valuable when entering new markets or exploring new product ideas.

4. Descriptive research

As its name implies, descriptive research seeks to describe a market, population or phenomenon in detail. It involves collecting and summarizing data to answer questions about audience demographics and behaviors, market size, and current trends. Surveys, observational studies and content analysis are common methods used in descriptive research. 

5. Causal research

Causal research aims to establish cause-and-effect relationships between variables. It investigates whether changes in one variable result in changes in another. Experimental designs, A/B testing and regression analysis are common causal research methods. This sheds light on how specific marketing strategies or product changes impact consumer behavior.
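
To make the A/B-testing idea concrete, here is a minimal Python sketch of a two-proportion z-test using statsmodels. The conversion counts are invented illustration data, not results from any real campaign.

```python
# Sketch of a causal A/B test readout: is variant B's conversion rate different from A's?
# Counts below are made-up illustration data.
from statsmodels.stats.proportion import proportions_ztest

conversions = [120, 150]    # conversions observed in variants A and B
visitors = [2400, 2350]     # visitors exposed to variants A and B

z_stat, p_value = proportions_ztest(conversions, visitors, alternative="two-sided")
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the difference in conversion rates is unlikely to be chance alone.
```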

6. Cross-sectional research

Cross-sectional market research involves collecting data from a sample of the population at a single point in time. It is used to analyze differences, relationships or trends among various groups within a population. Cross-sectional studies are helpful for market segmentation, identifying target audiences and assessing market trends at a specific moment.

7. Longitudinal research

Longitudinal research, in contrast to cross-sectional research, collects data from the same subjects over an extended period. This allows for the analysis of trends, changes and developments over time. Longitudinal studies are useful for tracking long-term developments in consumer preferences, brand loyalty and market dynamics.

Each type of market research has its strengths and weaknesses, and the method you choose depends on your specific research goals and the depth of understanding you’re aiming to achieve. In the following sections, we’ll delve into primary and secondary research approaches and specific research methods.

Primary vs. secondary market research

Market research of all types can be broadly categorized into two main approaches: primary research and secondary research. By understanding the differences between these approaches, you can better determine the most appropriate research method for your specific goals.

Primary market research 

Primary research involves the collection of original data straight from the source. Typically, this involves communicating directly with your target audience — through surveys, interviews, focus groups and more — to gather information. Here are some key attributes of primary market research:

  • Customized data: Primary research provides data that is tailored to your research needs. You design a custom research study and gather information specific to your goals.
  • Up-to-date insights: Because primary research involves communicating with customers, the data you collect reflects the most current market conditions and consumer behaviors.
  • Time-consuming and resource-intensive: Despite its advantages, primary research can be labor-intensive and costly, especially when dealing with large sample sizes or complex study designs. Whether you hire a market research consultant, agency or use an in-house team, primary research studies consume a large amount of resources and time.

Secondary market research 

Secondary research, on the other hand, involves analyzing data that has already been compiled by third-party sources, such as online research tools, databases, news sites, industry reports and academic studies.

Here are the main characteristics of secondary market research:

  • Cost-effective: Secondary research is generally more cost-effective than primary research since it doesn’t require building a research plan from scratch. You and your team can look at databases, websites and publications on an ongoing basis, without needing to design a custom experiment or hire a consultant. 
  • Leverages multiple sources: Data tools and software extract data from multiple places across the web, and then consolidate that information within a single platform. This means you’ll get a greater amount of data and a wider scope from secondary research.
  • Quick to access: You can access a wide range of information rapidly — often in seconds — if you’re using online research tools and databases. Because of this, you can act on insights sooner, rather than taking the time to develop an experiment. 

So, when should you use primary vs. secondary research? In practice, many market research projects incorporate both primary and secondary research to take advantage of the strengths of each approach.

One rule of thumb is to focus on secondary research to obtain background information, market trends or industry benchmarks. It is especially valuable for conducting preliminary research, competitor analysis, or when time and budget constraints are tight. Then, if you still have knowledge gaps or need to answer specific questions unique to your business model, use primary research to create a custom experiment. 

Market research methods

  • Surveys and questionnaires
  • Interviews
  • Focus groups
  • Observational research
  • Online research tools
  • Experiments
  • Content analysis
  • Ethnographic research

How do primary and secondary research approaches translate into specific research methods? Let’s take a look at the different ways you can gather data: 

1. Surveys and questionnaires

Surveys and questionnaires are popular methods for collecting structured data from a large number of respondents. They involve a set of predetermined questions that participants answer. Surveys can be conducted through various channels, including online tools, telephone interviews and in-person or online questionnaires. They are useful for gathering quantitative data and assessing customer demographics, opinions, preferences and needs. On average, customer surveys have a 33% response rate, so keep that in mind as you consider your sample size.
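
As a quick, hedged illustration of planning around response rates, the short Python sketch below estimates how many invitations to send for a target number of completed surveys. The 33% figure echoes the benchmark above, and the target of 400 responses is just an example.

```python
# Back-of-the-envelope planning: invitations needed for a target number of completed surveys.
import math

def invitations_needed(target_responses: int, response_rate: float = 0.33) -> int:
    """Return how many invitations to send to expect the target number of responses."""
    return math.ceil(target_responses / response_rate)

print(invitations_needed(400))        # about 1,213 invitations at a 33% response rate
print(invitations_needed(400, 0.20))  # 2,000 invitations if the rate drops to 20%
```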

2. Interviews

Interviews are in-depth conversations with individuals or groups to gather qualitative insights. They can be structured (with predefined questions) or unstructured (with open-ended discussions). Interviews are valuable for exploring complex topics, uncovering motivations and obtaining detailed feedback. 

3. Focus groups

Focus groups, along with in-depth interviews, are among the most common primary research methods. A focus group is a small gathering of participants who discuss a specific topic or product under the guidance of a moderator. These discussions are valuable for primary market research because they reveal insights into consumer attitudes, perceptions and emotions. Focus groups are especially useful for idea generation, concept testing and understanding group dynamics within your target audience.

4. Observational research

Observational research involves observing and recording participant behavior in a natural setting. This method is particularly valuable when studying consumer behavior in physical spaces, such as retail stores or public places. In some types of observational research, participants are aware you’re watching them; in other cases, you discreetly watch consumers without their knowledge, as they use your product. Either way, observational research provides firsthand insights into how people interact with products or environments.

5. Online research tools

You and your team can do your own secondary market research using online tools. These tools include data prospecting platforms and databases, as well as online surveys, social media listening, web analytics and sentiment analysis platforms. They help you gather data from online sources, monitor industry trends, track competitors, understand consumer preferences and keep tabs on online behavior. We’ll talk more about choosing the right market research tools in the sections that follow.

6. Experiments

Market research experiments are controlled tests of variables to determine causal relationships. While experiments are often associated with scientific research, they are also used in market research to assess the impact of specific marketing strategies, product features, or pricing and packaging changes.

7. Content analysis

Content analysis involves the systematic examination of textual, visual or audio content to identify patterns, themes and trends. It’s commonly applied to customer reviews, social media posts and other forms of online content to analyze consumer opinions and sentiments.
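
A very simple, illustrative form of content analysis can be automated with a keyword-frequency pass, as in the Python sketch below. The reviews and theme keywords are invented, and a real study would use a fuller coding frame or NLP pipeline.

```python
# Toy content analysis: count how often candidate themes appear in customer reviews.
from collections import Counter
import re

reviews = [
    "Shipping was slow but the product quality is great",
    "Great quality, terrible customer support",
    "Support resolved my issue quickly and shipping was fast",
]

tokens = Counter()
for review in reviews:
    tokens.update(re.findall(r"[a-z']+", review.lower()))

for theme in ["shipping", "quality", "support"]:
    print(theme, tokens[theme])
```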

8. Ethnographic research

Ethnographic research immerses researchers into the daily lives of consumers to understand their behavior and culture. This method is particularly valuable when studying niche markets or exploring the cultural context of consumer choices.

How to do market research

  • Set clear objectives
  • Identify your target audience
  • Choose your research methods
  • Use the right market research tools
  • Collect data
  • Analyze data 
  • Interpret your findings
  • Identify opportunities and challenges
  • Make informed business decisions
  • Monitor and adapt

Now that you have gained insights into the various market research methods at your disposal, let’s delve into the practical aspects of how to conduct market research effectively. Here’s a quick step-by-step overview, from defining objectives to monitoring market shifts.

1. Set clear objectives

When you set clear and specific goals, you’re essentially creating a compass to guide your research questions and methodology. Start by precisely defining what you want to achieve. Are you launching a new product and want to understand its viability in the market? Are you evaluating customer satisfaction with a product redesign? 

Start by creating SMART goals — objectives that are specific, measurable, achievable, relevant and time-bound. Not only will this clarify your research focus from the outset, but it will also help you track progress and benchmark your success throughout the process. 

You should also consult with key stakeholders and team members to ensure alignment on your research objectives before diving into data collecting. This will help you gain diverse perspectives and insights that will shape your research approach.

2. Identify your target audience

Next, you’ll need to pinpoint your target audience to determine who should be included in your research. Begin by creating detailed buyer personas or stakeholder profiles. Consider demographic factors like age, gender, income and location, but also delve into psychographics, such as interests, values and pain points.

The more specific your target audience, the more accurate and actionable your research will be. Additionally, segment your audience if your research objectives involve studying different groups, such as current customers and potential leads.

If you already have existing customers, you can also hold conversations with them to better understand your target market. From there, you can refine your buyer personas and tailor your research methods accordingly.

3. Choose your research methods

Selecting the right research methods is crucial for gathering high-quality data. Start by considering the nature of your research objectives. If you’re exploring consumer preferences, surveys and interviews can provide valuable insights. For in-depth understanding, focus groups or observational research might be suitable. Consider using a mix of quantitative and qualitative methods to gain a well-rounded perspective. 

You’ll also need to consider your budget. Think about what you can realistically achieve using the time and resources available to you. If you have a fairly generous budget, you may want to try a mix of primary and secondary research approaches. If you’re doing market research for a startup , on the other hand, chances are your budget is somewhat limited. If that’s the case, try addressing your goals with secondary research tools before investing time and effort in a primary research study. 

4. Use the right market research tools

Whether you’re conducting primary or secondary research, you’ll need to choose the right tools. These can help you do anything from sending surveys to customers to monitoring trends and analyzing data. Here are some examples of popular market research tools:

  • Market research software: Crunchbase is a platform that provides best-in-class company data, making it valuable for market research on growing companies and industries. You can use Crunchbase to access trusted, first-party funding data, revenue data, news and firmographics, enabling you to monitor industry trends and understand customer needs.

  • Survey and questionnaire tools: SurveyMonkey is a widely used online survey platform that allows you to create, distribute and analyze surveys. Google Forms is a free tool that lets you create surveys and collect responses through Google Drive.
  • Data analysis software: Microsoft Excel and Google Sheets are useful for conducting statistical analyses. SPSS is a powerful statistical analysis software used for data processing, analysis and reporting.
  • Social listening tools: Brandwatch is a social listening and analytics platform that helps you monitor social media conversations, track sentiment and analyze trends. Mention is a media monitoring tool that allows you to track mentions of your brand, competitors and keywords across various online sources.
  • Data visualization platforms: Tableau is a data visualization tool that helps you create interactive and shareable dashboards and reports. Power BI by Microsoft is a business analytics tool for creating interactive visualizations and reports.

5. Collect data

There’s an infinite amount of data you could be collecting using these tools, so you’ll need to be intentional about going after the data that aligns with your research goals. Implement your chosen research methods, whether it’s distributing surveys, conducting interviews or pulling from secondary research platforms. Pay close attention to data quality and accuracy, and stick to a standardized process to streamline data capture and reduce errors. 

6. Analyze data

Once data is collected, you’ll need to analyze it systematically. Use statistical software or analysis tools to identify patterns, trends and correlations. For qualitative data, employ thematic analysis to extract common themes and insights. Visualize your findings with charts, graphs and tables to make complex data more understandable.

If you’re not proficient in data analysis, consider outsourcing or collaborating with a data analyst who can assist in processing and interpreting your data accurately.
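
As a small, hedged example of this step, the pandas sketch below computes descriptive statistics, a group comparison and a correlation on made-up survey data; in practice you would load your own export (for example with pd.read_csv) and adapt the column names.

```python
# Minimal data-analysis pass over illustrative survey responses with pandas.
import pandas as pd

# Invented stand-in for a real survey export; replace with your own data.
df = pd.DataFrame({
    "segment":      ["new", "new", "returning", "returning", "returning"],
    "satisfaction": [3, 4, 5, 4, 5],          # 1-5 rating
    "spend":        [40, 55, 120, 90, 150],   # dollars per month
})

print(df["satisfaction"].describe())                              # overall distribution
print(df.groupby("segment")[["satisfaction", "spend"]].mean())    # pattern by segment
print(df["satisfaction"].corr(df["spend"]))                       # first hint at a relationship
```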

7. Interpret your findings

Interpreting your market research findings involves understanding what the data means in the context of your objectives. Are there significant trends that uncover the answers to your initial research questions? Consider the implications of your findings on your business strategy. It’s essential to move beyond raw data and extract actionable insights that inform decision-making.

Hold a cross-functional meeting or workshop with relevant team members to collectively interpret the findings. Different perspectives can lead to more comprehensive insights and innovative solutions.

8. Identify opportunities and challenges

Use your research findings to identify potential growth opportunities and challenges within your market. What segments of your audience are underserved or overlooked? Are there emerging trends you can capitalize on? Conversely, what obstacles or competitors could hinder your progress?

Lay out this information in a clear and organized way by conducting a SWOT analysis, which stands for strengths, weaknesses, opportunities and threats. Jot down notes for each of these areas to provide a structured overview of gaps and hurdles in the market.

9. Make informed business decisions

Market research is only valuable if it leads to informed decisions for your company. Based on your insights, devise actionable strategies and initiatives that align with your research objectives. Whether it’s refining your product, targeting new customer segments or adjusting pricing, ensure your decisions are rooted in the data.

At this point, it’s also crucial to keep your team aligned and accountable. Create an action plan that outlines specific steps, responsibilities and timelines for implementing the recommendations derived from your research. 

10. Monitor and adapt

Market research isn’t a one-time activity; it’s an ongoing process. Continuously monitor market conditions, customer behaviors and industry trends. Set up mechanisms to collect real-time data and feedback. As you gather new information, be prepared to adapt your strategies and tactics accordingly. Regularly revisiting your research ensures your business remains agile and reflects changing market dynamics and consumer preferences.

Online market research sources

As you go through the steps above, you’ll want to turn to trusted, reputable sources to gather your data. Here’s a list to get you started:

  • Crunchbase: As mentioned above, Crunchbase is an online platform with an extensive dataset, allowing you to access in-depth insights on market trends, consumer behavior and competitive analysis. You can also customize your search options to tailor your research to specific industries, geographic regions or customer personas.

  • Academic databases: Academic databases, such as ProQuest and JSTOR, are treasure troves of scholarly research papers, studies and academic journals. They offer in-depth analyses of various subjects, including market trends, consumer preferences and industry-specific insights. Researchers can access a wealth of peer-reviewed publications to gain a deeper understanding of their research topics.
  • Government and NGO databases: Government agencies, nongovernmental organizations and other institutions frequently maintain databases containing valuable economic, demographic and industry-related data. These sources offer credible statistics and reports on a wide range of topics, making them essential for market researchers. Examples include the U.S. Census Bureau, the Bureau of Labor Statistics and the Pew Research Center.
  • Industry reports: Industry reports and market studies are comprehensive documents prepared by research firms, industry associations and consulting companies. They provide in-depth insights into specific markets, including market size, trends, competitive analysis and consumer behavior. You can find this information by looking at relevant industry association databases; examples include the American Marketing Association and the National Retail Federation.
  • Social media and online communities: Social media platforms like LinkedIn or Twitter (X), forums such as Reddit and Quora, and review platforms such as G2 can provide real-time insights into consumer sentiment, opinions and trends.

Market research examples

At this point, you have market research tools and data sources — but how do you act on the data you gather? Let’s go over some real-world examples that illustrate the practical application of market research across various industries. These examples showcase how market research can lead to smart decision-making and successful business decisions.

Example 1: Apple’s iPhone launch

Apple’s iconic iPhone launch in 2007 serves as a prime example of market research driving product innovation in tech. Before the iPhone’s release, Apple conducted extensive market research to understand consumer preferences, pain points and unmet needs in the mobile phone industry. This research led to the development of a touchscreen smartphone with a user-friendly interface, addressing consumer demands for a more intuitive and versatile device. The result was a revolutionary product that disrupted the market and redefined the smartphone industry.

Example 2: McDonald’s global expansion

McDonald’s successful global expansion strategy demonstrates the importance of market research when expanding into new territories. Before entering a new market, McDonald’s conducts thorough research to understand local tastes, preferences and cultural nuances. This research informs menu customization, marketing strategies and store design. For instance, in India, McDonald’s offers a menu tailored to local preferences, including vegetarian options. This market-specific approach has enabled McDonald’s to adapt and thrive in diverse global markets.

Example 3: Organic and sustainable farming

The shift toward organic and sustainable farming practices in the food industry is driven by market research that indicates increased consumer demand for healthier and environmentally friendly food options. As a result, food producers and retailers invest in sustainable sourcing and organic product lines — such as with these sustainable seafood startups — to align with this shift in consumer values. 

The bottom line? Market research has multiple use cases and is a critical practice for any industry. Whether it’s launching groundbreaking products, entering new markets or responding to changing consumer preferences, you can use market research to shape successful strategies and outcomes.

Market research templates

You finally have a strong understanding of how to do market research and apply it in the real world. Before we wrap up, here are some market research templates that you can use as a starting point for your projects:

  • Smartsheet competitive analysis templates: These spreadsheets can serve as a framework for gathering information about the competitive landscape and obtaining valuable lessons to apply to your business strategy.
  • SurveyMonkey product survey template: Customize the questions on this survey based on what you want to learn from your target customers.
  • HubSpot templates: HubSpot offers a wide range of free templates you can use for market research, business planning and more.
  • SCORE templates: SCORE is a nonprofit organization that provides templates for business plans, market analysis and financial projections.
  • SBA.gov: The U.S. Small Business Administration offers templates for every aspect of your business, including market research, and is particularly valuable for new startups.

Strengthen your business with market research

When conducted effectively, market research is like a guiding star. Equipped with the right tools and techniques, you can uncover valuable insights, stay competitive, foster innovation and navigate the complexities of your industry.

Throughout this guide, we’ve discussed the definition of market research, different research methods, and how to conduct it effectively. We’ve also explored various types of market research and shared practical insights and templates for getting started. 

Now, it’s time to start the research process. Trust in data, listen to the market and make informed decisions that guide your company toward lasting success.

  • Open access
  • Published: 23 May 2024

30 years of youth system of care lessons learned – a qualitative study of Hawaiʻi’s partnership with the Substance Abuse and Mental Health Services Administration

  • Kelsie H. Okamura 1 , 2 ,
  • David Jackson 2 , 4 ,
  • Danielle L. Carreira Ching 1 ,
  • Da Eun Suh 2 ,
  • Tia L. R. Hartsock 3 ,
  • Puanani J. Hee 4 &
  • Scott K. Shimabukuro 4  

BMC Health Services Research volume  24 , Article number:  658 ( 2024 ) Cite this article

60 Accesses

Metrics details

The Hawaiʻi State Department of Health, Child and Adolescent Mental Health Division (CAMHD) has maintained a longstanding partnership with the Substance Abuse and Mental Health Services Administration (SAMHSA) to enhance the capacity and quality of community-based mental health services. The current study explored CAMHD’s history of SAMHSA system of care (SOC) awards and identified common themes, lessons learned, and recommendations for future funding.

Employing a two-phase qualitative approach, the study first conducted content analysis on seven final project reports, identifying themes and lessons learned based on SOC values and principles. Subsequently, interviews were conducted with 11 system leaders in grant projects and SOC award projects within the state. All data from project reports and interview transcripts were independently coded and analyzed using rapid qualitative analysis techniques.

Content validation and interview coding unveiled two content themes, interagency collaboration and youth and family voice, as areas that required long-term and consistent efforts across multiple projects. In addition, two general process themes, connection and continuity, emerged as essential approaches to system improvement work. The first emphasizes the importance of fostering connections in family, community, and culture, as well as within workforce members and child-serving agencies. The second highlights the importance of nurturing continuity throughout the system, from interagency collaboration to individual treatment.

Conclusions

The study provides deeper understanding of system of care evaluations, offering guidance to enhance and innovate youth mental health systems. The findings suggest that aligning state policies with federal guidelines and implementing longer funding mechanisms may alleviate administrative burdens.

Peer Review reports

Youth are disproportionately impacted by mental health disorders, with average rates higher than those of adults in the United States [ 1 ]. This begins early, with one in six children aged two to eight years diagnosed with a mental, behavioral, or developmental disorder, and persists over time, with one in five youth experiencing a severe mental health disorder at some point in their life [ 1 , 2 , 3 ]. At the end of 2021, the U.S. Surgeon General declared a youth mental health crisis, noting that rates of emergency room visits for suspected suicide attempts had increased in some demographics by more than 50% compared to the same time period in 2019 [ 4 ]. Despite the large and increasing need for services, alarming gaps have been found in access to care, and it is estimated that half of youth will not receive adequate treatment, which is detrimental to healthy growth and development into adulthood [ 5 ]. Large barriers to youth mental health care occur at the organizational and community levels, where differing priorities across child-serving agencies may contribute to lower rates of youth access to services [ 6 ].

The system of care (SOC) approach was developed in the 1980s as a strategy to address siloed child-serving agencies through an integrated and principle-driven approach to tiered services for youth with social, emotional, and behavioral difficulties [ 7 ]. The SOC core values, informed by the Child and Adolescent Service System Program principles [ 8 ], are that services should be: (a) family and youth driven, (b) community-based, and (c) culturally and linguistically competent. These values are operationalized through guiding principles such as interagency collaboration, care coordination, and partnerships with families and youth [ 7 ]. The SOC approach applies principles to help guide coordinated efforts to support youth whose services intersect multiple child-service agencies (e.g., mental health, judiciary, education, child welfare). Several cross-site studies have evaluated youth SOC efforts over time with differential operational definitions of SOC values and principles [ 7 , 8 , 9 , 10 , 11 ]. Each study indicated the importance of sustainability planning at the outset and aligning infrastructure and service development to meet local system requirements. For example, Brashears and colleagues noted that having interagency involvement in developing and implementing shared administrative processes was a common challenge [ 9 ]. Moreover, fiscal crises, leadership turnover, and methodological concerns for assessing long-term sustainment were noted as barriers in the SOC approach. Indeed, the SOC approach requires commitment and financial resources to succeed.

In 1992, the United States federal government signed into public law the establishment of the Substance Abuse and Mental Health Services Administration (SAMHSA; cf. Congressional public law 102–321) given the disconnect between youth and families’ need for services, the SOC approach, and the variable federal financial priorities. The SAMHSA goal was to support substance abuse and mental health prevention and intervention in the United States through the establishment of a federal funding authority operated under the Department of Health and Human Services. Within SAMHSA, there are three major centers that currently fund prevention and intervention services. The Center for Mental Health Services supports the development of services for adults with serious mental illnesses and youth with serious emotional disturbances through the administration and oversight of SOC expansion awards, cooperative agreements, and mental health services block grant programs (i.e., a discretionary fund to help prevent and treat mental health disorders). The Center for Substance Abuse Prevention develops comprehensive prevention systems through national leadership in policy and programs, promoting effective prevention practices and applying prevention knowledge. Their goals are to build supportive workplaces, schools, and communities, drug-free and crime-free neighborhoods, and positive connections with friends and family. Similarly, the Center for Substance Abuse Treatment seeks to improve and expand existing substance abuse treatment and recovery services. This center administers the Substance Abuse Prevention and Treatment Block Grant Program and supports the free treatment referral service to link clients to community-based substance use disorder treatment. SAMHSA operates an annual budget of over ten billion dollars, with $225 million dedicated to children’s mental health and SOC initiatives in 2024 [ 12 ].

The Hawaiʻi State Department of Health Child and Adolescent Mental Health Division (CAMHD) is the state’s Medicaid behavioral health carveout and the primary agency responsible for developing and administering clinical services for approximately 2,000 youth each year. The CAMHD provides care coordination and clinical oversight at seven regional Family Guidance Centers statewide and delivers in-home (e.g., intensive in-home, Multisystemic therapy) and out-of-home (e.g., transitional family home, community-based residential, hospital-based residential) services through 17 community-based contracted agencies. A centralized state office oversees all administrative, clinical, and performance functions including annual reporting of youth served and clinical outcomes (see https://health.hawaii.gov/camhd/annual-reports/ ). The CAMHD has a longstanding history of SAMHSA SOC expansion awards beginning in 1994 and continuing to the present in an almost unbroken succession [ 13 ]. These developments began shortly after a class-action lawsuit was brought against the state (Felix v. Waihee, 1993), when Hawaiʻi was ranked among the lowest in the nation for youth mental health services [ 14 ]. The settlement, referred to as the Felix Consent Decree, resulted in federal oversight that lasted from 1994 to 2004 [ 15 , 16 ]. The federal decree mandated and oversaw the development of a statewide SOC, and in many ways complemented the goals of SAMHSA SOC expansion awards that overlapped with federal oversight and continued for two more decades. The various SOC awards operationalized SOC principles and ranged from filling in gaps within the service continuum to enhancing existing services through trauma-informed care, wraparound care coordination, and improved knowledge management systems.

The purpose of this study is to examine the Hawaiʻi State CAMHD system’s SAMHSA SOC award history to identify common themes, lessons learned, and recommendations for future funding. The first goal was to understand the development and evolution of SOC values and principles (e.g., youth and family voice) within and across each grant. The second goal was to describe and reflect on common themes and lessons learned through the 30 years and seven CAMHD SOC expansion awards. This is the first study to date that examines themes across previous SAMHSA SOC awards from one state’s perspective. There were no a priori hypotheses given the exploratory nature of this study. The intention was to contribute to research and improved practices around effective SOC grant implementation at the federal and state system levels.

This study used a two-phase qualitative approach with (a) content analysis on seven final project reports and (b) key informant interviews with 11 system leaders. Initially, for the final project reports, a matrix template was utilized to summarize data by domains consistent with SAMHSA’s Center for Mental Health Services Infrastructure Development, Prevention, and Mental Health Promotion indicators (e.g., Policy Development, Workforce Development), which would have allowed comparisons across multiple projects and domains. However, after multiple trials to code past project reports into the indicators, the two lead investigators (Okamura, Jackson) opted to use a grounded approach to identifying themes and lessons learned based on SOC values and principles. Initial results from project reports guided the information collected in interviews, which iteratively guided subsequent interviews until saturation and consensus were reached on the final themes.

For the interviews, a purposive sampling strategy was utilized to obtain feedback from system leaders who have had extensive experience within individual grant projects and/or across multiple SOC award projects within the state [ 17 ]. Interview participants included four previous grant project directors and seven system leaders whose roles included regional center chiefs (one who, at the time of data collection, was acting as the statewide chief administrator), clinical supervisors, training specialists, and a performance manager. All interviews were recorded and transcribed, except for one participant who declined to be recorded but whose responses were paraphrased in notes. The lead investigators conducted all interviews. A semi-structured interview was developed and used (see Supplemental File), which evolved during the study to further probe more specific themes that were emerging. Initial interview questions asked participants about what they remembered, lessons learned, and what recommendations they had based on the project. Additional probes were used to obtain their perspectives on areas including the project’s impact on the state’s mental health division and larger system of care, its impact on the specific project’s focus areas, and its impact on the division’s relationship with SAMHSA. In addition, participants were asked about their overall reflections on the SOC awards, thoughts on how they have impacted the system over multiple years, and how they could be best utilized in the future.

All data from project reports and interview transcripts were independently coded by the two lead investigators, who each reviewed every report and transcript. Data were analyzed using rapid qualitative analysis techniques [ 18 ]. Rapid qualitative analysis is well-suited for projects that aim to be completed in one year or less and that do not rely on traditional transcription coding [ 19 ]. For this project, main points from interviews were summarized to provide a quick and accessible “sketch” of the data as data were organized and collected. These sketches were organized into a matrix to allow for quick identification of similarities, differences, and trends in responses [ 20 ]. Therefore, reliability calculations such as kappa or intraclass correlations were not appropriate for this method. This study was deemed exempt and non-human subjects research by the Hawaiʻi State Department of Health Institutional Review Board.
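
For readers unfamiliar with this matrix format, the brief sketch below (in Python with pandas) shows the general informant-by-theme structure; every entry is invented for illustration and is not data from this study.

```python
# Illustrative informant-by-theme matrix of the kind used in rapid qualitative analysis.
# All cell contents are invented placeholders, not study data.
import pandas as pd

matrix = pd.DataFrame(
    {
        "interagency collaboration": ["formal agreements helped", "advisory council was key"],
        "youth and family voice":    ["peer partners valued", "building trust takes time"],
    },
    index=["Informant 01", "Informant 02"],
)

# Reading down a column surfaces similarities, differences, and trends for one theme.
print(matrix["youth and family voice"])
```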

System of care principle development and application

The Hawaiʻi State CAMHD has operated seven SAMHSA SOC awards from 1994 to present day (2024) as detailed in Table  1 . Several project directors served multiple SOC awards which provided continuity. Specifically, Kealahou, Kaeru, and Data to Wisdom projects had the same project director, which helped to infuse trauma-informed care and bridge previous work in youth and parent partner services. There was variation in the project foci with some projects focused on developing SOC infrastructure (e.g., care coordination model) and others also focused on developing services (e.g., adaptive behavioral intervention) within the service array. The ʻOhana Project and Hoʻomohala both set foundations for the CAMHD SOC by establishing care coordination, contracted provider agencies, and building the service array. Kealahou, Laulima, and Kaeru projects continued to build the CAMHD SOC while focusing on targeted populations and specialty services. The Cultures of Engagement in Residential Care focused primarily on residential treatment settings and eliminating the use of seclusion and restraint. The Data to Wisdom grant focused on SOC development to infuse data driven decision making, knowledge management, and trauma-informed systems. Project geographic locations also changed over time from specific areas (e.g., urban Honolulu) to the broader overall statewide system.

System of care award themes

Content validation and interview coding revealed two content and two general process themes across the seven projects. Content themes were defined as areas that required long-term and consistent efforts across multiple projects and grants to develop. Content themes included (a) interagency collaboration and (b) youth and family voice. Process themes were defined as essential approaches to system improvement work. The general process themes reflected various aspects of (c) connection and (d) continuity, with more specific sub-themes within those.

Interagency collaboration

The first topic theme reflected the need for continual building of interagency collaboration across every project (see Fig. 1). From the first CAMHD SAMHSA award, the ʻOhana project, CAMHD coordinated interagency agreements with other child-serving systems such as the Department of Education and Child Welfare Services. These child-serving system partners served as governing and advisory groups for the SOC awards, alongside consistent integration with other direct service provider agencies and academic partnerships to support the SOC. During Kealahou and Laulima, there was an effort to formalize the interagency collaboration through the execution of memoranda of agreement between agencies and targeted strategies to improve system collaboration, such as the multi-agency consent form. The formation of the Hawaiʻi Interagency State Youth Network of Care through revised statute furthered the commitment to interagency collaboration, which CAMHD and project directors have co-chaired. The development of interagency collaboration followed an advisory (e.g., members from other child-serving agencies contributing feedback to project goals and implementation), integration (e.g., formal advisory council and committee established), and leadership (e.g., chairing advisory council, and leading task forces and special projects) pathway for CAMHD. This theme is consistent with SOC values and principles and aligns with the priorities across funding announcements to build and enhance SOCs. Perceptions of key informants also reinforced the idea that interagency collaboration was a critical aspect of SOC development; however, successful collaboration is challenging to achieve (see Table 2).

Fig. 1 Hawaiʻi State CAMHD interagency collaboration development. CAMHD = Hawaiʻi State Child and Adolescent Mental Health Division; CERC = Cultures of Engagement in Residential Care; D2W = Data to Wisdom

Youth and family voice

The second topic theme was youth and family voice, which represents the long road to fully integrating family voice from the system to the client treatment level (see Fig. 2). Parent and youth integration into governing councils and advisory boards to help guide grant activities began with the first award, the ʻOhana project. Eventually, parent and youth peer partner services became integrated at the treatment team level. Several community-based organizations, like Hawaiʻi Youth Helping Youth, supported youth and family voice by identifying and training advocates. These advisory activities continued, with more applied support to individual families and youth occurring in Project Kealahou. During this project, work began on the priority of developing a sustainable infrastructure for youth and parent peer partners supporting individual families. Medicaid reimbursement was pursued for the first time for youth and parent peer partner services, an effort that continued through negotiations to amend the state plan for approximately 12 years. This reimbursement effort continued into the current SOC award, the Data to Wisdom project, with a focus on developing youth peer partner certification as a step toward successful Medicaid reimbursement. Similar to interagency collaboration, the youth and family voice theme progressed from an advisory role (e.g., having youth and families advise grant activities and goals) to informing services (e.g., hiring youth and parent advocates) to pursuing a standalone service (e.g., full integration of youth and parent peer services).

Informant interviews also shed light on the nuances of increasing family voice. New challenges and opportunities emerged alongside greater incorporation of and respect for youth and parent perspectives. One such challenge is building trust across different levels within a treatment team and system of care. Language remains a key moderator of trust building (see Table 2). Indeed, the SOC value of being youth and family driven and the principle of partnership with youth and families were applied differently as youth and parent voice became stronger within the treatment team with the support of peer partners.

Fig. 2 Hawaiʻi State CAMHD youth and family voice development

Complementing the content themes were process themes related to how systems work should be accomplished to be successful, based on the experiences and recommendations of key informants. The first general process theme was encapsulated in the concept of “connection,” as it relates to (1) how services should connect youth to their family, community, and culture, (2) how workforce members should be connected to each other, and (3) how child-serving agencies should be connected to each other. Fostering these connections often goes beyond day-to-day roles and responsibilities and requires additional focused and sustained efforts.

Regarding the connection of youth to their family, community, and culture, one staff member noted a need for community-based interventions (see Table  2 ). Additionally, connection through communication and relationship building among workforce members and creating the structures to maintain relationships was described as important. One informant noted the importance of learning collaboratives in the project which created a shared place to connect and learn.

Finally, similar to the content theme of interagency collaboration being a continual endeavor, informants relayed many thoughts about how the system could connect agencies together to be more successful in the goal of system improvement. One leadership member noted the need for venues where legislators and other organizational leaders can come together regularly to discuss issues (see Table 2). This connection not only built trust and clarified roles, but also created shared responsibility within the SOC so that no one organization or body was making decisions independently of another.

The second general process theme was summarized in the concept of “continuity.” This theme emerged from comments about the importance of efforts such as (1) ensuring the continuity (sustaining) of interagency collaboration, (2) ensuring the continuity of new initiatives, (3) ensuring greater continuity (increased length) of award time periods, (4) ensuring continuity in the CAMHD model of care, (5) ensuring continuity in trauma-informed care, and (6) ensuring continuity in staffing. Overall, it was conveyed that better care for youth requires continuity throughout the system, from interagency collaboration to individual treatment.

First, ensuring continuity of interagency collaboration refers to maintaining formal structures and relationships beyond a single project. For example, the establishment of the Hawaiʻi Interagency State Youth Network of Care secured a platform for tackling issues that crossed agencies and could function independently of the constraints of single award periods (see Table 2).

More broadly, informants described the difficulty of achieving sustainability and the need to ensure the continuity of new initiatives rather than leaving them as “one-off” or pilot projects. Some informants noted that typical award periods are not long enough to develop and sustain successful initiatives. As seen with the youth and family voice topic domain, it takes longer than a single grant to see any sort of transformational or long-term change.

The CAMHD model of care also emerged as a consistent topic throughout the final reports and interviews. The model of care was perceived as a pendulum swinging from a more intensive care coordination model, aligned with system of care values and care coordination principles, to a more “medical model” and managed care. As one person stated, “we need to figure out what is our model…”, and another informant recounted some of the history of the shift from care coordination to a medical model (see Table 2).

Trauma-informed care was a consistent thread in all SOC awards, and the importance of continuity emerged in interviews. Continuity was critical both at the system level, where consistent efforts needed to be made over multiple grant periods to build a more trauma-informed system, as well as the client level, where addressing a youth’s trauma requires time, patience, and consistency of support from the treatment team. As one person noted:

“it takes time to do trauma-informed care” and “you can only move as fast as the individual is able to move.”

Finally, a consistent challenge was in ensuring continuity in staffing. With limited award periods, staff begin to find other opportunities when funding nears the end and positions have not become permanent. Moreover, the start of new awards is typically delayed because of the challenges in establishing new positions and hiring new staff. A leadership informant noted that transitioning grant staff to new grants or from existing grants can cause disruptions to the system and staff morale.

The current study was a review of 30 years and seven awards given to the Hawaiʻi State Child and Adolescent Mental Health Division by the Substance Abuse and Mental Health Services Administration to expand the system of care. Two major topic themes, interagency collaboration and youth and family voice, were identified that aligned with SOC values and principles. Two process themes, connection and continuity, wove throughout other SOC principles such as trauma-informed care. The Hawaiʻi State CAMHD continues to be a leader in SOC expansion despite ongoing administrative and fiscal challenges that are common to other SOC expansion efforts [ 9 , 10 ]. Its dedication to SOC values and principles is evident in the investment of resources to start and close multiple awards, build interagency collaboration, and innovate within and across the child-serving system and its agencies.

Building interagency collaboration is one of the most difficult aspects of system improvement [ 9 ]. The CAMHD has needed to invest resources constantly (e.g., funding, personnel, legislation) to meet its goals. Lessons learned about interagency collaboration range from the development of coordinated interagency agreements with other child-serving agencies (e.g., Department of Education, Child Welfare Services), which provided built-in advisory groups for SOC expansion, to consistent integration with other direct service provider agencies and academic partnerships to support that expansion, and finally to the formalization and strengthening of interagency collaboration through formal agreements and targeted strategies like the universal, multi-agency consent form. Networking within and between child-serving agencies was noted as an important aspect of building interagency collaboration. However, turnover can impact continuity and momentum. Legislation and policies have the potential to sustain collaboration and must be implemented with intention and proper funding to ensure high-quality facilitation informed by equitable methods [ 21 ].

Partnering with youth and families has been a consistent theme in successful efforts to expand systems of care in other states, and the CAMHD has sought to continue developing this area through multiple grants despite ongoing challenges [ 9 , 10 ]. Lessons learned about youth and family voice range from the integration of parents and youth into governing councils and advisory boards, to the identification and training of advocates, to applied support for individual families and youth, including the long and continuing work toward Medicaid reimbursement. Notably, the progression from youth and family voice informing services to becoming a standalone service represents almost two decades of systems work. Systems change is truly a long game, and there have been many efforts to support these changes, including federal legislation and funding priorities (e.g., the SAMHSA Office of Behavioral Health Equity and new funding priorities around marginalized communities). Moreover, updated SAMHSA funding announcements have explicitly called for culturally and linguistically appropriate, evidence-informed, recovery-oriented, trauma-informed care, highlighting the commitment to SOC values and principles.

From these lessons, several areas for future attention emerged. These included considerations of the state and federal policies that often seem at odds with each other. As one informant noted: “we need to look at how the contracts and procurement is done.” This is particularly pertinent to state procurement laws which make it difficult to initially collaborate with and contract providers without a suitable means of paying them for their time, further complicating and delaying the work. A key leader noted:

I think the state system could really benefit from looking at how to support grants better and how to handle rules maybe differently, and procurement differently, and just be, provide more support…I think the state needs a grant office like a, you know, a university would have and they need to help us.

Moreover, establishing new funding accounts, job descriptions, and personnel management policies intersects divisional, departmental, state, and federal bureaucracies that often contribute to lengthy stalls in completing work and spending funds. For example, for the current SOC award, the project director was hired approximately six months after the notice of award because the position needed to be established and associated with a new award and account code, despite the person already holding the project director position for the previous SOC award. A landscape analysis of current federal and state policies on spending, procurement, and community collaboration may help identify better pathways and strategies for executing federal grants within state infrastructure.

Furthermore, mental and behavioral health payment structures require ongoing attention. Stroul and Manteuffel noted that while award sites reported using a range of financing strategies, increasing Medicaid reimbursement was the most frequent strategy [ 11 ]. However, most strategies were not seen as very effective, and the highest effectiveness ratings were for increasing Medicaid funding, increasing state mental health funding, obtaining and coordinating funds with other systems, and redeploying funds to lower-cost service alternatives [ 11 ]. The certification and credentialing processes needed for reimbursement are often time-intensive to develop, and their requirements may not align with health equity and lived experience. For example, in Project Hoʻomohala, a bachelor’s degree was required to hire a peer specialist. This requirement excluded many transition-aged youth with lived experience who were closer in age to the youth being served and who might have built trust and rapport more quickly. Initiatives that compare funding and certification rates and empirically examine the extent to which financing strategies improve service reach are necessary evaluation activities that should be included in SOC awards [ 22 , 23 ].

Programs for targeted populations and complex cases, which allow for flexible scheduling and funding, are also needed. Programs addressing co-occurring mental health conditions, disabilities, and substance use provide holistic care for youth and families. Special populations, such as racial, ethnic, sexual orientation, and gender/sex minorities, who may require adapted interventions, should be a federal and state funding priority. As one interviewee noted:

Girls matter. Treatment for girls needs to be individualized more so than just, I don’t know, some of the EBS [evidence-based services] stuff you know, and I’m not knocking the EBS stuff, that is important. We need more research about girls. And that is a recommendation…The basic need is huge, so I think the lessons learned, we really do need more flexible funding to be able to support girls in their treatment, girls in their homes.

Improving integration into existing structures like home-based care, primary care, and school-based services, as well as integration of informal supports (e.g., youth peer support), requires continued effort to evolve with the changing managed care landscape. Payment and reimbursement strategies to incentivize practice use and improved clinical outcomes should also be considered.

Several recommendations emerged from the current study for operating future SAMHSA SOC awards in CAMHD and other state systems. First, there was enthusiasm for the focus of SOC awards to include more goals around infrastructure development and sustainment and to avoid “stand-alone” services. For example, one informant noted that “It’s kind of a problem if you have a stand-alone service with its own team and it’s going to go away when the grant money is gone.” Indeed, sustainability planning should begin before an infrastructure grant application is written, to ensure there is a clear and sustainable financial plan or objectives for continuing to pursue funding for specific initiatives. Integrating procurement and administrative activities as specific and targeted award objectives, while unconventional, would highlight the disconnect between federal and state procedures and spending priorities. Both state and federal legislators should be aware of funding mechanisms that can operate well in state government and should champion legislation that reduces bureaucracy in favor of the community. For example, including procurement clauses within federal funding announcements that allow the federal government to supersede state laws may aid in the timely execution of contracts using federal funds. Moreover, creating grants management, contracting, and fiscal positions that sit within procurement and administrative offices at the highest department level will be crucial to more timely execution of grant activities. Second, reliance on within-system historical knowledge is fraught with error. Future SOC awards should include evaluation objectives, like this one, to memorialize previous accomplishments, reflect on shared understandings and inconsistencies, and archive important SOC activities in legacy documents. The third recommendation concerns communication within and between SOC awards, including maintaining staff from one project to the next. It is helpful to have ongoing role and responsibility clarification meetings internally and with child-serving system partners to avoid confusion and miscommunication. Learning collaboratives and protected time for project directors to share lessons learned and recommendations would aid in knowledge consolidation between projects. It would also be beneficial to allow multi-year overlap of federal SOC awards to create continuity and retain employees. As one informant noted:

“And we recruited and hired a lot of really great people, and I think that the challenge becomes, as the grant starts to come to a close, or is nearing its end, that you recognize that people may leave because the positions are time limited. So, to the extent that it’s possible to think about positions for those folks, I feel like that is important.”

Trauma-informed care principles are a necessary component of ensuring continuity of care. Trauma-informed care requires active responses in the form of integrating knowledge related to trauma into policies, procedures, and practices as well as careful attention to avoiding re-traumatization and secondary trauma of those involved in systems [ 24 ]. Lessons learned include making changes at the individual and organizational level to ensure that all aspects of care would be both transparent and trauma-informed. As one informant noted:

For example, in meetings, there were things that were being said in, in care coordination meetings, things that were being said that were offensive to the youth peers. And so there was a lot of work to prepare and debrief the youth peers after they were in meetings. We had peers that had previously been in care and saw their, their, their staff that they had worked with in some of our meetings. I mean, and had really negative experiences with them. And so the debriefing and you know, the secondary traumatic stress and the triggering even at the peers was intense. And so there was a lot of work with that that had to be addressed and done.

Guiding principles of trauma-informed care include creating a safety net instilled with trustworthiness and transparency, among others, to build confidence toward motivation for continued engagement [ 25 ]. Moreover, and consistent with interagency collaboration, a trauma-informed child-serving system should create a shared lexicon that speaks within and between agencies to improve navigation for youth and families. One informant noted:

And thinking about the needs of children as complex and they may have needs that span the way government agencies are organized. And so, recognizing that it is on the onus of government or organizations to be set up to better serve families rather than the onus on families to try to navigate really burdensome infrastructure to get the services that they need.

Limitations

The current study is not without limitations. First, the study relied on retrospective accounts from past project final reports and informant interviews. Both sources of information included objective and subjective accounts of previous SAMHSA SOC awards that represent a limited perspective. Moreover, key informants were identified via purposive sampling, which may affect generalizability to other systems. Future research may wish to focus on convergence with multiple sources of objective data, including financial reports, progress indicators, and any other available technical assistance data. Second, the information sources rely heavily on the perspectives of leadership and a small subgroup of CAMHD staff. The extent to which these themes and lessons learned are uniformly understood throughout the various levels and roles within CAMHD and the child-serving system is unclear. Additionally, the initial content coding design intended to rely on SAMHSA Infrastructure Development, Prevention, and Mental Health Promotion indicators (e.g., Policy Development, Workforce Development) to aid in generalizability to other systems. However, the indicator definitions were difficult to operationalize. For example, the Workforce Development categories contain five indicators that measure the number of organizations, communities, people, changes, and consumers or family members that are trained in, credentialed or certified in, implemented, and/or delivered mental health services. These metrics were almost never reported in final reports, and it was unclear how meaningful they were to the system or how well they aligned with SOC values and principles. While this study ultimately chose a grounded approach, future studies may wish to carefully think through key indicators that can be compared within and across systems over time. Despite these limitations, examining SAMHSA SOC awards within one system has the potential to inform how state and federal governments operate funds to support mental health innovation. Additional methods like landscape analysis and policy development could help address the financial and administrative bureaucracy of operating federal funds within a state government.

Federal funding is critical to addressing the youth mental health crisis [ 4 ]. The current study examined system of care expansion trends that represented multimillion-dollar investments and decades of work around interagency collaboration and youth and family voice, as well as attempts to build connection and continuity. It is hoped that the lessons learned will aid other systems and future work in being more evidence-informed. The delays in award progress and spending stemming from incongruencies between state and federal policies observed here are consistent with previous SOC research and anecdotal reports from others involved in SAMHSA and SOC efforts. Targeted state alignment with federal policies and longer funding mechanisms may help ameliorate the administrative burden on systems. That said, SAMHSA SOC expansion awards have the potential to fund innovative work that creates legacy cultures around SOC values and principles.

Data availability

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

CAMHD: Child and Adolescent Mental Health Division

CERC: Cultures of Engagement in Residential Care

D2W: Data to Wisdom

Hawaiʻi Interagency State Youth Network of Care

SAMHSA: Substance Abuse and Mental Health Services Administration

SOC: System of Care

National Alliance on Mental Illness (NAMI). Mental Health by the Numbers: Infographics and Fact Sheets. https://www.nami.org/mhstats (2023). Accessed 15 Nov 2023.

Centers for Disease Control and Prevention (CDC). Children’s Mental Health: Data and Statistics on Children’s Mental Health. https://www.cdc.gov/childrensmentalhealth/data.html (2023). Accessed 15 Nov 2023.

Merikangas KR, He JP, Burstein M, Swanson SA, Avenevoli S, Cui L, Benjet C, Georgiades K, Swendsen J. Lifetime prevalence of mental disorders in US adolescents: results from the National Comorbidity Survey replication–adolescent supplement (NCS-A). J Am Acad Child Adolesc Psychiatry. 2010;49(10):980–9.


US Department of Health and Human Services. Protecting youth mental health: The US Surgeon General’s advisory. https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf (2021). Accessed 15 Nov 2023.

Whitney DG, Peterson MD. US national and state-level prevalence of mental health disorders and disparities of mental health care use in children. JAMA Pediatr. 2019;173(4):389–91.

Garney W, Wilson K, Ajayi KV, Panjwani S, Love SM, Flores S, Garcia K, Esquivel C. Social-ecological barriers to access to healthcare for adolescents: a scoping review. Int J Environ Res Public Health. 2021;18(8):4138.

The Institute for Innovation & Implementation. The Evolution of the System of Care Approach. https://www.cmhnetwork.org/wp-content/uploads/2021/05/The-Evolution-of-the-SOC-Approach-FINAL-5-27-20211.pdf (2021). Accessed 15 Nov 2023.

CASSP Technical Assistance Center. A System of Care for Severely Emotionally Disturbed Children & Youth. https://files.eric.ed.gov/fulltext/ED330167.pdf (1986). Accessed 15 Nov 2023.

Brashears F, Davis C, Katz-Leavy J. Systems of care: the story behind the numbers. Am J Community Psychol. 2012;49(3–4):494–502.


Child, Adolescent and Family Branch, Center for Mental Health Services. Issue Brief: Strategies for Expanding the System of Care Approach. https://www.fredla.org/wp-content/uploads/2016/01/SOC-ExpansionStrategies-Issue-Brief-FINAL.pdf (2011). Accessed 15 Nov 2023.

Stroul BA, Manteuffel BA. The sustainability of systems of care for children’s mental health: lessons learned. J Behav Health Serv Res. 2007;34:237–59.

Substance Abuse and Mental Health Services Administration. Substance Use And Mental Health Services Administration (SAMHSA) Fiscal Year (FY) 2024 Budget Request. https://www.samhsa.gov/sites/default/files/samhsa-fy-2024-cj.pdf (2023). Accessed 23 Apr 2024.

Slavin LA, Suarez E. Insights in public health: Project Kealahou—forging a new pathway for girls in Hawaiʻi's public mental health system. Hawaiʻi J Med Public Health. 2013;72(9):325.


Gochros SL. Reinventing Governance of Hawaii’s Public Mental Health Delivery System: Problems, Options, and Possibilities. Legislative Reference Bureau, 1994. https://lrb.hawaii.gov/wp-content/uploads/1994_ReinventingGovernanceOfHawaiisPublicMentalHealthDeliverySystem.pdf . Accessed 15 Nov 2023.

Chorpita BF, Donkervoet C. Implementation of the Felix Consent Decree in Hawaii: the impact of policy and practice development efforts on service delivery. In: Steele RG, Roberts MC, editors. Handbook of mental health services for children, adolescents, and families. New York, NY: Academic/Plenum; 2005.


Nakamura BJ, Higa-McMillan C, Chorpita BF. Sustaining Hawaii's evidence-based service system in children's mental health. In: McHugh RK, Barlow DH, editors. Dissemination and implementation of evidence-based psychological interventions. New York, NY: Oxford University Press; 2012.

Denzin NK, Lincoln YS, editors. Handbook of qualitative research. 2nd ed. Thousand Oaks, CA: Sage; 2000.

Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516.

Vindrola-Padros C, Johnson GA. Rapid techniques in qualitative research: a critical review of the literature. Qual Health Res. 2020;30(10):1596–604.

Averill JB. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual Health Res. 2002;12(6):855–66.

Martinez RG, Weiner BJ, Meza RD, Dorsey S, Palazzo LG, Matson A, Bain C, Mettert KD, Pullmann MD, Lewis CC. Study protocol: novel methods for implementing measurement-based care with youth in low-resource environments (NIMBLE). Implement Sci Commun. 2023;4(1):1–13.


The Institute for Innovation and Implementation. Medicaid Funding for Family and Youth Peer Support Programs in the United States. https://youthmovenational.org/wp-content/uploads/2021/01/Medicaid-Funded-Youth-and-Family-Peer-Support-Guide-2020.pdf (2020). Accessed 15 Nov 2023.

Dopp AR, Hunter SB, Godley MD, González I, Bongard M, Han B, Cantor J, Hindmarch G, Lindquist K, Wright B, Schlang D. Comparing organization-focused and state-focused financing strategies on provider-level reach of a youth substance use treatment model: a mixed-method study. Implement Sci. 2023;18(1):50.

National Child Traumatic Stress Network (NCTSN). NCTSN Trauma-Informed Organizational Assessment. https://www.nctsn.org/trauma-informed-care/nctsn-trauma-informed-organizational-assessment . Accessed 15 Nov 2023.

U.S. Department of Health and Human Services. Six Guiding Principles to a Trauma Informed Approach. www.cdc.gov/orr/infographics/00_docs/TRAINING_EMERGENCY_RESPONDERS_FINAL_2022.pdf (2022). Accessed 15 Nov 2023.


Acknowledgements

Not applicable.

This work was supported by the Substance Abuse and Mental Health Services Administration (H79 SM082961). Okamura is also supported by the National Institute on Drug Abuse (L60 DA059132) and the National Institute of General Medical Sciences (U54 GM138062).

Author information

Authors and Affiliations

The Baker Center for Children and Families, Harvard Medical School, Cambridge, MA, USA

Kelsie H. Okamura & Danielle L. Carreira Ching

Department of Psychology, University of Hawaiʻi at Mānoa, Honolulu, HI, USA

Kelsie H. Okamura, David Jackson & Da Eun Suh

Hawaiʻi State Office of Wellness and Resilience, Honolulu, HI, USA

Tia L. R. Hartsock

Hawaiʻi State Child and Adolescent Mental Health Division, Honolulu, HI, USA

David Jackson, Puanani J. Hee & Scott K. Shimabukuro


Contributions

KHO and DJ contributed to the design and analysis of the study as well as interpretation of the results. DS and DLCC reviewed project final reports for discrepancies in themes identified in analyses. KHO drafted the first version of the manuscript. TH and PH provided writing support. DLCC created figures for the manuscript. DS reviewed and edited the manuscript. All authors made a significant contribution to the research and the development of the manuscript and approved the final version for publication. SKS supervised the study.

Corresponding author

Correspondence to Kelsie H. Okamura .

Ethics declarations

Ethics approval and consent to participate

The Institutional Review Board of Hawaiʻi State Department of Health waived the need for approval for this study as non-human subject research based on 45 CFR 46.102(l) of the Department of Health and Human Services. The need for informed consent was waived by the Hawaiʻi State Department of Health Institutional Review Board.

Consent for publication

Not applicable

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Okamura, K.H., Jackson, D., Ching, D.L.C. et al. 30 years of youth system of care lessons learned – a qualitative study of Hawaiʻi’s partnership with the Substance Abuse and Mental Health Services Administration. BMC Health Serv Res 24 , 658 (2024). https://doi.org/10.1186/s12913-024-11114-9


Received : 29 January 2024

Accepted : 14 May 2024

Published : 23 May 2024

DOI : https://doi.org/10.1186/s12913-024-11114-9


BMC Health Services Research

ISSN: 1472-6963


May 24, 2024

Purdue, Google announce sustainability and AI research partnership


Purdue University and Google announced a partnership that seeks to develop innovative solutions for eco-friendly industrial buildings. Left to right: Joe Sinfield, civil engineering professor and director of Purdue’s Institute for Innovation Science; Travis Horton, civil engineering professor; Ben Townsend, Google global head of infrastructure and sustainability; Alyssa Wilcox, Purdue senior vice president of partnerships and chief of staff. (Purdue University photo provided)

Collaboration seeks to develop innovative solutions for eco-friendly industrial buildings  

WEST LAFAYETTE, Ind. — Purdue University and Google announced Thursday (May 23) a new collaborative research project aimed at exploring the use of AI to develop innovative solutions for low-carbon industrial building design. The project seeks to leverage AI’s power to explore new materials, technologies and design strategies that can significantly reduce the carbon footprint of industrial buildings, such as data centers, not only across the U.S. but globally.

“Google is committed to using AI to address some of the world’s most pressing challenges, including climate change,” said Ben Townsend, global head of infrastructure and sustainability at Google. “We are excited to partner with Purdue University on this important research project, which has the potential to accelerate the adoption of low-carbon building practices in the industrial sector.”

ADDITIONAL INFORMATION

  • Purdue Computes
  • Purdue University to offer Google Career Certificates for learning in-demand tech skills

Conventional construction processes and physical plant operations of industrial structures pose persistent challenges when builders seek to reduce greenhouse gas emissions. That’s one reason why Purdue and Google are partnering on research efforts to develop new, sustainable building design approaches.

One focus area will be to prioritize the use of low-carbon building materials. The partners said research findings from this collaboration will be shared with the wider research community and industry stakeholders, aiming to boost the adoption of sustainable building practices.

“By combining Purdue’s expertise in engineering and building science with Google’s cutting-edge AI capabilities, we aim to develop innovative solutions that can significantly reduce the carbon footprint of industrial buildings,” said Travis Horton, a professor in Purdue’s Lyles School of Civil Engineering who also holds a courtesy appointment in mechanical engineering. “This collaboration is a testament to our commitment to sustainability and our belief in the transformative potential of AI.” 

AI is a foundational component of the Institute for Physical Artificial Intelligence , a Purdue Computes initiative.

The partners said AI could have a profound impact on industrial construction by enabling innovative applications of low-carbon building materials, which would help reduce operational costs and advance global sustainability goals.

“At Google we are committed to sustainability, and we believe that technology can play a critical role in reducing carbon emissions,” said Townsend, who earned bachelor’s and master’s degrees in civil engineering from Purdue.

Google, which owns and operates data centers all over the world, continually examines its systems, including innovative construction design processes, to reduce the carbon footprint of its facilities and ensure efficient, eco-friendly construction.

“The potential to combine Purdue’s understanding of innovation and design processes with Google’s AI capabilities to enable highly efficient and scalable sustainable design foreshadows the far-reaching promise of AI to help society address its most complex challenges,” said Joe Sinfield, a professor in Purdue’s Lyles School of Civil Engineering and the director of Purdue’s Institute for Innovation Science. 

About Purdue University

Purdue University is a public research institution demonstrating excellence at scale. Ranked among top 10 public universities and with two colleges in the top four in the United States, Purdue discovers and disseminates knowledge with a quality and at a scale second to none. More than 105,000 students study at Purdue across modalities and locations, including nearly 50,000 in person on the West Lafayette campus. Committed to affordability and accessibility, Purdue’s main campus has frozen tuition 13 years in a row. See how Purdue never stops in the persistent pursuit of the next giant leap — including its first comprehensive urban campus in Indianapolis, the new Mitchell E. Daniels, Jr. School of Business, and Purdue Computes — at https://www.purdue.edu/president/strategic-initiatives .

Writer/Media contact: Wes Mills, [email protected]

Source: Travis Horton



The UCLA Center for Health Policy Research (CHPR) is one of the nation's leading health policy research centers and the premier source of health policy information for California.


Parks After Dark Evaluation Brief, May 2024

Summary: In this infographic brief, the UCLA Center for Health Policy Research summarizes information from their evaluation of the 2022 Parks After Dark (PAD) program in Los Angeles County. PAD is a county initiative led by the Department of Parks and Recreation in partnership with other county departments and community-based organizations. PAD programming — including sports, entertainment, activities, and more — was offered for eight weeks on Thursday, Friday, and Saturday evenings at 34 parks between June and August 2023.

Findings: Evaluators found that PAD has made significant progress in achieving its intended goals through the provision of quality recreational programming in a safe and family-friendly environment. Besides ensuring participants’ sense of safety at parks while attending PAD programming, evidence indicates that PAD may have reduced crime in PAD parks and their surrounding areas since its inception in 2010. In addition, PAD encouraged meaningful collaboration between participating county departments and community-based organizations; contributed to participants’ feelings of well-being, family togetherness, and social cohesion; and involved a diverse range of participants in community-driven programming in a meaningful way. PAD may also have reduced the burden of disease for those who engaged in exercise opportunities.

Read the publications:

  • Parks After Dark Evaluation Brief, May 2024 ( English )  
  • Parks After Dark Evaluation Brief, May 2024 ( Spanish )
  • Full Evaluation Report: Parks After Dark Evaluation Report, May 2024  

Previous years:

  • Parks After Dark Evaluation Brief, July 2023  
  • Parks After Dark Evaluation Report, July 2023  
  • Parks After Dark Evaluation Brief, July 2018  
  • Parks After Dark Evaluation Report, July 2018  
  • Parks After Dark Evaluation Brief, May 2017  
  • Parks After Dark Evaluation Report, May 2017  


Dr. Thiagarajan Soundappan leads a class at Navajo Technical University. The university’s NSF-funded partnership with Harvard is creating new research opportunities for students and offering paths to graduate study.

Navajo Technical University partners with NSF center, creating new opportunities in materials research and education

A partnership that began in 2017 between Navajo Technical University (NTU) and the U.S. National Science Foundation Materials Research Science and Engineering Center at Harvard University is providing Navajo students with new opportunities to pursue advanced degrees in science and engineering while helping to address critical issues facing the Navajo Nation. 

The partnership is supported through NSF's Partnerships for Research and Education in Materials (PREM) program , which has been building productive collaborations between minority-serving institutions and research-intensive centers and facilities for nearly 20 years. Associate Professor of Chemistry Thiagarajan Soundappan at NTU leads their partnership with Harvard, which focuses on materials research topics with valuable applications for the Navajo community, like electrochemical sensors and microfluidics that can be used to detect groundwater contaminated with heavy metals from abandoned mines.

"There is no limit to the value that science can produce for a community when it is led by and for the people of that community," says Germano Iannacchione, director of NSF's Division of Materials Research. "NSF's PREM program demonstrates how federal funding can help build new scientific capacity and lasting educational pathways in any community in America."


This article was originally published by Native Science Report. It has been edited for length and style.

  • Read the full story: Building a Stronger Nation with Soft Materials

15 May 2024

‘Quantum internet’ demonstration in cities is most advanced yet

  • Davide Castelvecchi


A quantum network node at Delft University of Technology in the Netherlands. Credit: Marieke de Lorijn for QuTech

Three separate research groups have demonstrated quantum entanglement — in which two or more objects are linked so that they contain the same information even if they are far apart — over several kilometres of existing optical fibres in real urban areas. The feat is a key step towards a future quantum internet , a network that could allow information to be exchanged while encoded in quantum states.

Together, the experiments are “the most advanced demonstrations so far” of the technology needed for a quantum internet, says physicist Tracy Northup at the University of Innsbruck in Austria. Each of the three research teams — based in the United States, China and the Netherlands — was able to connect parts of a network using photons in the optical-fibre-friendly infrared part of the spectrum, which is a “major milestone”, says fellow Innsbruck physicist Simon Baier.


A quantum internet could enable any two users to establish almost unbreakable cryptographic keys to protect sensitive information . But full use of entanglement could do much more, such as connecting separate quantum computers into one larger, more powerful machine. The technology could also enable certain types of scientific experiment, for example by creating networks of optical telescopes that have the resolution of a single dish hundreds of kilometres wide.
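For a rough sense of scale (an illustrative back-of-the-envelope calculation, not a figure from these studies), the diffraction-limited angular resolution of an interferometric array is set by the observing wavelength divided by the baseline:

\[ \theta \approx \frac{\lambda}{B} \;\approx\; \frac{500\ \text{nm}}{100\ \text{km}} \;=\; 5\times 10^{-12}\ \text{rad} \;\approx\; 1\ \mu\text{as}, \]

orders of magnitude finer than any single optical telescope can resolve, which is why linking distant telescopes into one effective dish is so attractive.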

Two of the studies 1 , 2 were published in Nature on 15 May. The third was described last month in a preprint posted on arXiv 3 , which has not yet been peer reviewed.

Impractical environment

Many of the technical steps for building a quantum internet have been demonstrated in the laboratory over the past decade or so. And researchers have shown that they can produce entangled photons using lasers in direct line of sight of each other, either in separate ground locations or on the ground and in space.

But going from the lab to a city environment is “a different beast”, says Ronald Hanson, a physicist who led the Dutch experiment 3 at the Delft University of Technology. To build a large-scale network, researchers agree that it will probably be necessary to use existing optical-fibre technology. The trouble is, quantum information is fragile and cannot be copied; it is often carried by individual photons, rather than by laser pulses that can be detected and then amplified and emitted again. This limits the entangled photons to travelling a few tens of kilometres before losses make the whole thing impractical. “They also are affected by temperature changes throughout the day — and even by wind, if they’re above ground,” says Northup. “That’s why generating entanglement across an actual city is a big deal.”

The three demonstrations each used different kinds of ‘quantum memory’ device to store a qubit, a physical system such as a photon or atom that can be in one of two states — akin to the ‘1’ or ‘0’ of ordinary computer bits — or in a combination, or ‘quantum superposition’, of the two possibilities.
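In standard notation (a textbook illustration, not anything specific to these experiments), a single qubit and a maximally entangled pair of qubits can be written as

\[ |\psi\rangle = \alpha|0\rangle + \beta|1\rangle \quad \text{with } |\alpha|^2 + |\beta|^2 = 1, \qquad |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\big(|00\rangle + |11\rangle\big). \]

Measuring either half of the Bell state |Φ⁺⟩ immediately fixes the outcome for the other, however far apart the two qubits are; distributing and storing such states is what the quantum memories in these experiments are designed to do.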


In one of the Nature studies, led by Pan Jian-Wei at the University of Science and Technology of China (USTC) in Hefei, qubits were encoded in the collective states of clouds of rubidium atoms 1 . The qubits’ quantum states can be set using a single photon, or can be read out by ‘tickling’ the atomic cloud to emit a photon. Pan’s team had such quantum memories set up in three separate labs in the Hefei area. Each lab was connected by optical fibres to a central ‘photonic server’ around 10 kilometres away. Any two of these nodes could be put in an entangled state if the photons from the two atom clouds arrived at the server at exactly the same time.
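Schematically, and glossing over the details of the actual protocol, this kind of heralded entanglement can be pictured as each memory emitting a photon with small amplitude, correlated with its internal state, and the two photon paths being mixed at the server so that a detector click cannot reveal which node it came from:

\[ \text{node } i:\ \sqrt{1-p}\,|g\rangle_i|0\rangle + \sqrt{p}\,|s\rangle_i|1\rangle, \qquad \text{a single click heralds}\quad \tfrac{1}{\sqrt{2}}\big(|s\rangle_1|g\rangle_2 \pm |g\rangle_1|s\rangle_2\big). \]

Here |g⟩ and |s⟩ are generic labels for the two memory states and |0⟩, |1⟩ are photon numbers; the notation is illustrative rather than the authors'. Such a scheme only works if the two photons are indistinguishable at the server, which is why the arrival times must match so precisely.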

By contrast, Hanson and his team established a link between individual nitrogen atoms embedded in small diamond crystals with qubits encoded in the electron states of the nitrogen and in the nuclear states of nearby carbon atoms 3 . Their optical fibre went from the university in Delft through a tortuous 25-kilometre path across the suburbs of The Hague to reach a second laboratory in the city.

In the US experiment, Mikhail Lukin, a physicist at Harvard University in Cambridge, Massachusetts, and his collaborators also used diamond-based devices, but with silicon atoms instead of nitrogen, making use of the quantum states of both an electron and a silicon nucleus 2 . Single atoms are less efficient than atomic ensembles at emitting photons on demand, but they are more versatile, because they can perform rudimentary quantum computations. “Basically, we entangled two small quantum computers,” says Lukin. The two diamond-based devices were in the same building at Harvard, but to mimic the conditions of a metropolitan network, the researchers used an optical fibre that snaked around the local Boston area. “It crosses the Charles River six times,” Lukin says.

Challenges ahead

The entanglement procedure used by the Chinese and the Dutch teams required photons to arrive at a central server with exquisite timing precision, which was one of the main challenges in the experiments. Lukin’s team used a protocol that does not require such fine-tuning: instead of entangling the qubits by getting them to emit photons, the researchers sent one photon to entangle itself with the silicon atom at the first node. The same photon then went around the fibre-optic loop and came back to graze the second silicon atom, thereby entangling it with the first.
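A simplified way to write down this sequential single-photon scheme (an illustrative sketch; the actual Harvard protocol differs in its details) is that the photon first becomes entangled with spin A, the same photon then picks up a correlation with spin B, and measuring the photon in a superposition basis leaves the two spins entangled:

\[ \tfrac{1}{\sqrt{2}}\big(|{\uparrow}\rangle_A|e\rangle + |{\downarrow}\rangle_A|l\rangle\big) \;\longrightarrow\; \tfrac{1}{\sqrt{2}}\big(|{\uparrow}\rangle_A|{\uparrow}\rangle_B|e\rangle + |{\downarrow}\rangle_A|{\downarrow}\rangle_B|l\rangle\big) \;\longrightarrow\; \tfrac{1}{\sqrt{2}}\big(|{\uparrow\uparrow}\rangle \pm |{\downarrow\downarrow}\rangle\big), \]

where |e⟩ and |l⟩ stand for hypothetical early and late time bins of the photon and the final step is a measurement of the photon in the |e⟩ ± |l⟩ basis. One appeal of such a scheme is that no two photons need to arrive anywhere at exactly the same time.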

Pan has calculated that at the current pace of advance, by the end of the decade his team should be able to establish entanglement over 1,000 kilometres of optical fibres using ten or so intermediate nodes, with a procedure called entanglement swapping . (At first, such a link would be very slow, creating perhaps one entanglement per second, he adds.) Pan is the leading researcher for a project using the satellite Micius , which demonstrated the first quantum-enabled communications in space, and he says there are plans for a follow-up mission.

“The step has now really been made out of the lab and into the field,” says Hanson. “It doesn’t mean it’s commercially useful yet, but it’s a big step.”

Nature 629 , 734-735 (2024)

doi: https://doi.org/10.1038/d41586-024-01445-2

Knaut, C. M. et al. Nature 629 , 573–578 (2024).


Liu, J. L. et al. Nature 629 , 579–585 (2024).

Stolk, A. J. et al. Preprint at arXiv https://doi.org/10.48550/arXiv.2404.03723 (2024).



How to use and assess qualitative research methods

Loraine Busetto

1 Department of Neurology, Heidelberg University Hospital, Im Neuenheimer Feld 400, 69120 Heidelberg, Germany

Wolfgang Wick

2 Clinical Cooperation Unit Neuro-Oncology, German Cancer Research Center, Heidelberg, Germany

Christoph Gumbinger

Associated data

Not applicable.

This paper aims to provide an overview of the use and assessment of qualitative research methods in the health sciences. Qualitative research can be defined as the study of the nature of phenomena and is especially appropriate for answering questions of why something is (not) observed, assessing complex multi-component interventions, and focussing on intervention improvement. The most common methods of data collection are document study, (non-) participant observations, semi-structured interviews and focus groups. For data analysis, field-notes and audio-recordings are transcribed into protocols and transcripts, and coded using qualitative data management software. Criteria such as checklists, reflexivity, sampling strategies, piloting, co-coding, member-checking and stakeholder involvement can be used to enhance and assess the quality of the research conducted. Using qualitative in addition to quantitative designs will equip us with better tools to address a greater range of research problems, and to fill in blind spots in current neurological research and practice.

The aim of this paper is to provide an overview of qualitative research methods, including hands-on information on how they can be used, reported and assessed. This article is intended for beginning qualitative researchers in the health sciences as well as experienced quantitative researchers who wish to broaden their understanding of qualitative research.

What is qualitative research?

Qualitative research is defined as “the study of the nature of phenomena”, including “their quality, different manifestations, the context in which they appear or the perspectives from which they can be perceived”, but excluding “their range, frequency and place in an objectively determined chain of cause and effect” [ 1 ]. This formal definition can be complemented with a more pragmatic rule of thumb: qualitative research generally includes data in the form of words rather than numbers [ 2 ].

Why conduct qualitative research?

Because some research questions cannot be answered using (only) quantitative methods. For example, one Australian study addressed the issue of why patients from Aboriginal communities often present late or not at all to specialist services offered by tertiary care hospitals. Using qualitative interviews with patients and staff, it found one of the most significant access barriers to be transportation problems, including some towns and communities simply not having a bus service to the hospital [ 3 ]. A quantitative study could have measured the number of patients over time or even looked at possible explanatory factors – but only those previously known or suspected to be of relevance. To discover reasons for observed patterns, especially the invisible or surprising ones, qualitative designs are needed.

While qualitative research is common in other fields, it is still relatively underrepresented in health services research. The latter field is more traditionally rooted in the evidence-based-medicine paradigm, as seen in "research that involves testing the effectiveness of various strategies to achieve changes in clinical practice, preferably applying randomised controlled trial study designs (...)" [ 4 ]. This focus on quantitative research and specifically randomised controlled trials (RCT) is visible in the idea of a hierarchy of research evidence which assumes that some research designs are objectively better than others, and that choosing a "lesser" design is only acceptable when the better ones are not practically or ethically feasible [ 5 , 6 ]. Others, however, argue that an objective hierarchy does not exist, and that, instead, the research design and methods should be chosen to fit the specific research question at hand – "questions before methods" [ 2 , 7 – 9 ]. This means that even when an RCT is possible, some research problems require a different design that is better suited to addressing them. Arguing in JAMA, Berwick uses the example of rapid response teams in hospitals, which he describes as "a complex, multicomponent intervention – essentially a process of social change" susceptible to a range of different context factors including leadership or organisation history. According to him, "[in] such complex terrain, the RCT is an impoverished way to learn. Critics who use it as a truth standard in this context are incorrect" [ 8 ]. Instead of limiting oneself to RCTs, Berwick recommends embracing a wider range of methods, including qualitative ones, which for "these specific applications, (...) are not compromises in learning how to improve; they are superior" [ 8 ].

Research problems that can be approached particularly well using qualitative methods include assessing complex multi-component interventions or systems (of change), addressing questions beyond “what works”, towards “what works for whom when, how and why”, and focussing on intervention improvement rather than accreditation [ 7 , 9 – 12 ]. Using qualitative methods can also help shed light on the “softer” side of medical treatment. For example, while quantitative trials can measure the costs and benefits of neuro-oncological treatment in terms of survival rates or adverse effects, qualitative research can help provide a better understanding of patient or caregiver stress, visibility of illness or out-of-pocket expenses.

How to conduct qualitative research?

Given that qualitative research is characterised by flexibility, openness and responsivity to context, the steps of data collection and analysis are not as separate and consecutive as they tend to be in quantitative research [ 13 , 14 ]. As Fossey puts it: “sampling, data collection, analysis and interpretation are related to each other in a cyclical (iterative) manner, rather than following one after another in a stepwise approach” [ 15 ]. The researcher can make educated decisions with regard to the choice of methods, how they are implemented, and to which and how many units they are applied [ 13 ]. As shown in Fig. 1, this can involve several back-and-forth steps between data collection and analysis where new insights and experiences can lead to adaptation and expansion of the original plan. Some insights may also necessitate a revision of the research question and/or the research design as a whole. The process ends when saturation is achieved, i.e. when no relevant new information can be found (see also below: sampling and saturation). For reasons of transparency, it is essential for all decisions as well as the underlying reasoning to be well-documented.

Fig. 1: Iterative research process

While it is not always explicitly addressed, qualitative methods reflect a different underlying research paradigm than quantitative research (e.g. constructivism or interpretivism as opposed to positivism). The choice of methods can be based on the respective underlying substantive theory or theoretical framework used by the researcher [ 2 ].

Data collection

The methods of qualitative data collection most commonly used in health research are document study, observations, semi-structured interviews and focus groups [ 1 , 14 , 16 , 17 ].

Document study

Document study (also called document analysis) refers to the researcher’s review of written materials [ 14 ]. These can include personal and non-personal documents such as archives, annual reports, guidelines, policy documents, diaries or letters.

Observations

Observations are particularly useful to gain insights into a certain setting and actual behaviour – as opposed to reported behaviour or opinions [ 13 ]. Qualitative observations can be either participant or non-participant in nature. In participant observations, the observer is part of the observed setting, for example a nurse working in an intensive care unit [ 18 ]. In non-participant observations, the observer is “on the outside looking in”, i.e. present in but not part of the situation, trying not to influence the setting by their presence. Observations can be planned (e.g. for 3 h during the day or night shift) or ad hoc (e.g. as soon as a stroke patient arrives at the emergency room). During the observation, the observer takes notes on everything or certain pre-determined parts of what is happening around them, for example focusing on physician-patient interactions or communication between different professional groups. Written notes can be taken during or after the observations, depending on feasibility (which is usually lower during participant observations) and acceptability (e.g. when the observer is perceived to be judging the observed). Afterwards, these field notes are transcribed into observation protocols. If more than one observer was involved, field notes are taken independently, but notes can be consolidated into one protocol after discussions. Advantages of conducting observations include minimising the distance between the researcher and the researched, the potential discovery of topics that the researcher did not realise were relevant and gaining deeper insights into the real-world dimensions of the research problem at hand [ 18 ].

Semi-structured interviews

Hijmans & Kuyper describe qualitative interviews as “an exchange with an informal character, a conversation with a goal” [ 19 ]. Interviews are used to gain insights into a person’s subjective experiences, opinions and motivations – as opposed to facts or behaviours [ 13 ]. Interviews can be distinguished by the degree to which they are structured (i.e. a questionnaire), open (e.g. free conversation or autobiographical interviews) or semi-structured [ 2 , 13 ]. Semi-structured interviews are characterised by open-ended questions and the use of an interview guide (or topic guide/list) in which the broad areas of interest, sometimes including sub-questions, are defined [ 19 ]. The pre-defined topics in the interview guide can be derived from the literature, previous research or a preliminary method of data collection, e.g. document study or observations. The topic list is usually adapted and improved at the start of the data collection process as the interviewer learns more about the field [ 20 ]. Across interviews the focus on the different (blocks of) questions may differ and some questions may be skipped altogether (e.g. if the interviewee is not able or willing to answer the questions or because of concerns about the total length of the interview) [ 20 ]. Qualitative interviews are usually not conducted in written format, as this impedes the interactive component of the method [ 20 ]. In comparison to written surveys, qualitative interviews have the advantage of being interactive and allowing for unexpected topics to emerge and to be taken up by the researcher. This can also help overcome a provider- or researcher-centred bias often found in written surveys, which, by nature, can only measure what is already known or expected to be of relevance to the researcher. Interviews can be audio- or video-taped, but sometimes it is only feasible or acceptable for the interviewer to take written notes [ 14 , 16 , 20 ].
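To make the structure of such a topic guide more concrete, the sketch below shows how broad areas of interest and optional probe questions might be written down. The topics and questions are purely hypothetical (loosely anticipating the stroke-care example used later in this paper), and the small Python structure is used here only for illustration; in practice a guide is usually a simple text document.

    # Purely hypothetical topic guide: broad blocks of interest with optional
    # open-ended probe questions. This is not a fixed questionnaire.
    topic_guide = {
        "Arrival and first assessment": [
            "Can you walk me through what happens when a stroke patient arrives?",
            "Who is usually involved at this stage?",
        ],
        "Awareness of standard operating procedures": [
            "How do you learn about changes to the SOPs?",
        ],
        "Perceived causes of delay": [],  # open block, explored only as time allows
    }

    # During the interview, blocks may be reordered, expanded or skipped,
    # e.g. when the interviewee cannot or does not want to answer.
    for topic, probes in topic_guide.items():
        print(topic)
        for probe in probes:
            print("  -", probe)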

Focus groups

Focus groups are group interviews to explore participants’ expertise and experiences, including explorations of how and why people behave in certain ways [ 1 ]. Focus groups usually consist of 6–8 people and are led by an experienced moderator following a topic guide or “script” [ 21 ]. They can involve an observer who takes note of the non-verbal aspects of the situation, possibly using an observation guide [ 21 ]. Depending on researchers’ and participants’ preferences, the discussions can be audio- or video-taped and transcribed afterwards [ 21 ]. Focus groups are useful for bringing together homogeneous (to a lesser extent heterogeneous) groups of participants with relevant expertise and experience on a given topic on which they can share detailed information [ 21 ]. Focus groups are a relatively easy, fast and inexpensive method to gain access to information on interactions in a given group, i.e. “the sharing and comparing” among participants [ 21 ]. Disadvantages include less control over the process and less opportunity for each individual to participate. Moreover, focus group moderators need experience, as do those tasked with the analysis of the resulting data. Focus groups can be less appropriate for discussing sensitive topics that participants might be reluctant to disclose in a group setting [ 13 ]. Attention must also be paid to the emergence of “groupthink” as well as possible power dynamics within the group, e.g. when patients are awed or intimidated by health professionals.

Choosing the “right” method

As explained above, the school of thought underlying qualitative research assumes no objective hierarchy of evidence and methods. This means that each choice of single or combined methods has to be based on the research question that needs to be answered and a critical assessment with regard to whether or to what extent the chosen method can accomplish this – i.e. the “fit” between question and method [ 14 ]. It is necessary for these decisions to be documented when they are being made, and to be critically discussed when reporting methods and results.

Let us assume that our research aim is to examine the (clinical) processes around acute endovascular treatment (EVT), from the patient’s arrival at the emergency room to recanalization, with the aim to identify possible causes for delay and/or other causes for sub-optimal treatment outcome. As a first step, we could conduct a document study of the relevant standard operating procedures (SOPs) for this phase of care – are they up-to-date and in line with current guidelines? Do they contain any mistakes, irregularities or uncertainties that could cause delays or other problems? Regardless of the answers to these questions, the results have to be interpreted based on what they are: a written outline of what care processes in this hospital should look like. If we want to know what they actually look like in practice, we can conduct observations of the processes described in the SOPs. These results can (and should) be analysed in themselves, but also in comparison to the results of the document analysis, especially as regards relevant discrepancies. Do the SOPs outline specific tests for which no equipment can be observed or tasks to be performed by specialized nurses who are not present during the observation? It might also be possible that the written SOP is outdated, but the actual care provided is in line with current best practice. In order to find out why these discrepancies exist, it can be useful to conduct interviews. Are the physicians simply not aware of the SOPs (because their existence is limited to the hospital’s intranet) or do they actively disagree with them or does the infrastructure make it impossible to provide the care as described? Another rationale for adding interviews is that some situations (or all of their possible variations for different patient groups or the day, night or weekend shift) cannot practically or ethically be observed. In this case, it is possible to ask those involved to report on their actions – being aware that this is not the same as the actual observation. A senior physician’s or hospital manager’s description of certain situations might differ from a nurse’s or junior physician’s one, maybe because they intentionally misrepresent facts or maybe because different aspects of the process are visible or important to them. In some cases, it can also be relevant to consider to whom the interviewee is disclosing this information – someone they trust, someone they are otherwise not connected to, or someone they suspect or are aware of being in a potentially “dangerous” power relationship to them. Lastly, a focus group could be conducted with representatives of the relevant professional groups to explore how and why exactly they provide care around EVT. The discussion might reveal discrepancies (between SOPs and actual care or between different physicians) and motivations to the researchers as well as to the focus group members that they might not have been aware of themselves. For the focus group to deliver relevant information, attention has to be paid to its composition and conduct, for example, to make sure that all participants feel safe to disclose sensitive or potentially problematic information or that the discussion is not dominated by (senior) physicians only. The resulting combination of data collection methods is shown in Fig.  2 .

An external file that holds a picture, illustration, etc.
Object name is 42466_2020_59_Fig2_HTML.jpg

Possible combination of data collection methods

Attributions for icons: “Book” by Serhii Smirnov, “Interview” by Adrien Coquet, FR, “Magnifying Glass” by anggun, ID, “Business communication” by Vectors Market; all from the Noun Project

The combination of multiple data sources as described for this example can be referred to as “triangulation”, in which multiple measurements are carried out from different angles to achieve a more comprehensive understanding of the phenomenon under study [ 22 , 23 ].

Data analysis

To analyse the data collected through observations, interviews and focus groups, these need to be transcribed into protocols and transcripts (see Fig. 3). Interviews and focus groups can be transcribed verbatim, with or without annotations for behaviour (e.g. laughing, crying, pausing) and with or without phonetic transcription of dialects and filler words, depending on what is expected or known to be relevant for the analysis. In the next step, the protocols and transcripts are coded, that is, marked (or tagged, labelled) with one or more short descriptors of the content of a sentence or paragraph [ 2 , 15 , 23 ]. Jansen describes coding as “connecting the raw data with “theoretical” terms” [ 20 ]. In a more practical sense, coding makes raw data sortable. This makes it possible to extract and examine all segments describing, say, a tele-neurology consultation from multiple data sources (e.g. SOPs, emergency room observations, staff and patient interviews). In a process of synthesis and abstraction, the codes are then grouped, summarised and/or categorised [ 15 , 20 ]. The end product of the coding or analysis process is a descriptive theory of the behavioural pattern under investigation [ 20 ]. The coding process is typically performed using qualitative data management software, the most common packages being NVivo, MaxQDA and Atlas.ti. It should be noted that these are data management tools which support the analysis performed by the researcher(s) [ 14 ].
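As a purely illustrative sketch of what making raw data “sortable” means, the few lines below group hypothetical coded segments from different data sources by code. A real project would do this in dedicated software such as NVivo, MaxQDA or Atlas.ti rather than with ad-hoc scripts; the sources, segments and codes shown are assumptions for illustration only.

    # Illustrative only: hypothetical coded segments from different data sources.
    from collections import defaultdict

    coded_segments = [
        ("SOP document", "Tele-neurology consult must be documented in the chart",
         ["tele-consultation", "documentation"]),
        ("ER observation protocol", "Neurologist phones the stroke centre before CT",
         ["tele-consultation", "communication"]),
        ("Staff interview 03", "We often wait for the consultant to call back",
         ["tele-consultation", "delay"]),
    ]

    # Coding makes the raw data sortable: collect every segment tagged with a code.
    segments_by_code = defaultdict(list)
    for source, segment, codes in coded_segments:
        for code in codes:
            segments_by_code[code].append((source, segment))

    # Extract all segments describing a tele-neurology consultation,
    # across all data sources, for further synthesis and abstraction.
    for source, segment in segments_by_code["tele-consultation"]:
        print(f"{source}: {segment}")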

Fig. 3: From data collection to data analysis
(Icon attributions: see Fig. 2; additionally “Speech to text” by Trevor Dsouza, “Field Notes” by Mike O’Brien, US, “Voice Record” by ProSymbols, US, “Inspection” by Made, AU, and “Cloud” by Graphic Tigers; all from the Noun Project)

How to report qualitative research?

Protocols of qualitative research can be published separately and in advance of the study results. However, the aim is not the same as in RCT protocols, i.e. to pre-define and set in stone the research questions and primary or secondary endpoints. Rather, it is a way to describe the research methods in detail, which might not be possible in the results paper given journals’ word limits. Qualitative research papers are usually longer than their quantitative counterparts to allow for deep understanding and so-called “thick description”. In the methods section, the focus is on transparency of the methods used, including why, how and by whom they were implemented in the specific study setting, so as to enable a discussion of whether and how this may have influenced data collection, analysis and interpretation. The results section usually starts with a paragraph outlining the main findings, followed by more detailed descriptions of, for example, the commonalities, discrepancies or exceptions per category [ 20 ]. Here it is important to support main findings by relevant quotations, which may add information, context, emphasis or real-life examples [ 20 , 23 ]. It is subject to debate in the field whether it is relevant to state the exact number or percentage of respondents supporting a certain statement (e.g. “Five interviewees expressed negative feelings towards XYZ”) [ 21 ].

How to combine qualitative with quantitative research?

Qualitative methods can be combined with other methods in multi- or mixed methods designs, which “[employ] two or more different methods […] within the same study or research program rather than confining the research to one single method” [ 24 ]. Reasons for combining methods can be diverse, including triangulation for corroboration of findings, complementarity for illustration and clarification of results, expansion to extend the breadth and range of the study, explanation of (unexpected) results generated with one method with the help of another, or offsetting the weakness of one method with the strength of another [ 1 , 17 , 24 – 26 ]. The resulting designs can be classified according to when, why and how the different quantitative and/or qualitative data strands are combined. The three most common types of mixed methods designs are the convergent parallel design, the explanatory sequential design and the exploratory sequential design. The designs with examples are shown in Fig. 4.

Fig. 4: Three common mixed methods designs

In the convergent parallel design, a qualitative study is conducted in parallel to and independently of a quantitative study, and the results of both studies are compared and combined at the stage of interpretation of results. Using the above example of EVT provision, this could entail setting up a quantitative EVT registry to measure process times and patient outcomes in parallel to conducting the qualitative research outlined above, and then comparing results. Amongst other things, this would make it possible to assess whether interview respondents’ subjective impressions of patients receiving good care match modified Rankin Scale scores at follow-up, or whether observed delays in care provision are exceptions or the rule when compared to door-to-needle times as documented in the registry. In the explanatory sequential design, a quantitative study is carried out first, followed by a qualitative study to help explain the results from the quantitative study. This would be an appropriate design if the registry alone had revealed relevant delays in door-to-needle times and the qualitative study would be used to understand where and why these occurred, and how they could be improved. In the exploratory sequential design, the qualitative study is carried out first and its results help to inform and build the quantitative study in the next step [ 26 ]. If the qualitative study around EVT provision had shown a high level of dissatisfaction among the staff members involved, a quantitative questionnaire investigating staff satisfaction could be set up in the next step, informed by the qualitative findings on the topics about which dissatisfaction had been expressed. Amongst other things, the questionnaire design would make it possible to widen the reach of the research to more respondents from different (types of) hospitals, regions, countries or settings, and to conduct sub-group analyses for different professional groups.

How to assess qualitative research?

A variety of assessment criteria and lists have been developed for qualitative research, ranging in their focus and comprehensiveness [ 14 , 17 , 27 ]. However, none of these has been elevated to the “gold standard” in the field. In the following, we therefore focus on a set of commonly used assessment criteria that, from a practical standpoint, a researcher can look for when assessing a qualitative research report or paper.

Assessors should check the authors’ use of and adherence to the relevant reporting checklists (e.g. the Standards for Reporting Qualitative Research (SRQR)) to make sure all items relevant to this type of research are addressed [ 23 , 28 ]. Discussing quantitative assessment measures in addition to, or instead of, these qualitative criteria can be a sign of lower quality of the research (paper). Providing and adhering to such a checklist contributes to an important quality criterion for qualitative research, namely transparency [ 15 , 17 , 23 ].

Reflexivity

While methodological transparency and complete reporting are relevant for all types of research, some additional criteria must be taken into account for qualitative research. This includes what is called reflexivity, i.e. sensitivity to the relationship between the researcher and the researched, including how contact was established and maintained, or the background and experience of the researcher(s) involved in data collection and analysis. Depending on the research question and the population to be researched, this can be limited to professional experience, but it may also include gender, age or ethnicity [ 17 , 27 ]. These details are relevant because in qualitative research, as opposed to quantitative research, the researcher as a person cannot be isolated from the research process [ 23 ]. It may influence the conversation when an interviewed patient speaks to an interviewer who is a physician, or when an interviewee is asked to discuss a gynaecological procedure with a male interviewer, and therefore the reader must be made aware of these details [ 19 ].

Sampling and saturation

The aim of qualitative sampling is for all variants of the objects of observation that are deemed relevant for the study to be present in the sample “to see the issue and its meanings from as many angles as possible” [ 1 , 16 , 19 , 20 , 27 ], and to ensure “information-richness” [ 15 ]. An iterative sampling approach is advised, in which data collection (e.g. five interviews) is followed by data analysis, followed by more data collection to find variants that are lacking in the current sample. This process continues until no new (relevant) information can be found and further sampling becomes redundant – which is called saturation [ 1 , 15 ]. In other words: qualitative data collection finds its end point not a priori, but when the research team determines that saturation has been reached [ 29 , 30 ].
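Purely as a conceptual illustration of this iterative logic, the loop below compares the codes emerging from successive (hypothetical) batches of interviews with what is already known and stops when a batch adds nothing new; in a real study, judging saturation remains a substantive decision of the research team, not a mechanical rule.

    # Hypothetical codes emerging from successive batches of five interviews each.
    interview_batches = [
        {"delay", "communication", "tele-consultation"},   # interviews 1-5
        {"delay", "documentation"},                        # interviews 6-10
        {"communication", "documentation"},                # interviews 11-15
    ]

    known_codes = set()
    for batch_number, batch_codes in enumerate(interview_batches, start=1):
        new_codes = batch_codes - known_codes
        if new_codes:
            known_codes |= new_codes        # new variants found: keep sampling
        else:
            # No relevant new information in this batch: saturation reached.
            print(f"Saturation reached after batch {batch_number}")
            break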

This is also the reason why most qualitative studies use deliberate instead of random sampling strategies. This is generally referred to as “purposive sampling”, in which researchers pre-define which types of participants or cases they need to include so as to cover all variations that are expected to be of relevance, based on the literature, previous experience or theory (i.e. theoretical sampling) [ 14 , 20 ]. Other types of purposive sampling include (but are not limited to) maximum variation sampling, critical case sampling or extreme or deviant case sampling [ 2 ]. In the above EVT example, a purposive sample could include all relevant professional groups and/or all relevant stakeholders (patients, relatives) and/or all relevant times of observation (day, night and weekend shift).
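A purposive sampling frame for the hypothetical EVT example could be sketched as the full set of combinations the researchers want covered, rather than a random draw; the professional groups and shifts listed below are assumptions for illustration only.

    # Illustrative purposive sampling frame: cover every combination of
    # professional group and shift that is expected to matter.
    from itertools import product

    professional_groups = ["neurologist", "neuroradiologist", "stroke nurse", "paramedic"]
    shifts = ["day", "night", "weekend"]

    for group, shift in product(professional_groups, shifts):
        print(f"Include at least one {group} observed or interviewed during the {shift} shift")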

Assessors of qualitative research should check whether the considerations underlying the sampling strategy were sound and whether or how researchers tried to adapt and improve their strategies in stepwise or cyclical approaches between data collection and analysis to achieve saturation [ 14 ].

Good qualitative research is iterative in nature, i.e. it goes back and forth between data collection and analysis, revising and improving the approach where necessary. One example of this is the use of pilot interviews, in which different aspects of the interview (especially the interview guide, but also, for example, the site of the interview or whether the interview can be audio-recorded) are tested with a small number of respondents, evaluated and revised [ 19 ]. In doing so, the interviewer learns which wording or types of questions work best, or what the best length of an interview is with patients who have trouble concentrating for an extended time. Of course, the same reasoning applies to observations or focus groups, which can also be piloted.

Ideally, coding should be performed by at least two researchers, especially at the beginning of the coding process, when a common approach must be defined, including the establishment of a useful coding list (or tree) and a shared meaning of individual codes [ 23 ]. Either an initial subset of the transcripts or all of them can be coded independently by the coders and then compared and consolidated after regular discussions in the research team. This is to make sure that codes are applied consistently to the research data.

Member checking

Member checking, also called respondent validation , refers to the practice of checking back with study respondents to see if the research is in line with their views [ 14 , 27 ]. This can happen after data collection or analysis or when first results are available [ 23 ]. For example, interviewees can be provided with (summaries of) their transcripts and asked whether they believe this to be a complete representation of their views or whether they would like to clarify or elaborate on their responses [ 17 ]. Respondents’ feedback on these issues then becomes part of the data collection and analysis [ 27 ].

Stakeholder involvement

In those niches where qualitative approaches have been able to evolve and grow, a new trend has seen the inclusion of patients and their representatives not only as study participants (i.e. “members”, see above) but as consultants to and active participants in the broader research process [ 31 – 33 ]. The underlying assumption is that patients and other stakeholders hold unique perspectives and experiences that add value beyond their own single story, making the research more relevant and beneficial to researchers, study participants and (future) patients alike [ 34 , 35 ]. Using the example of patients on or nearing dialysis, a recent scoping review found that 80% of clinical research did not address the top 10 research priorities identified by patients and caregivers [ 32 , 36 ]. In this sense, the involvement of the relevant stakeholders, especially patients and relatives, is increasingly being seen as a quality indicator in and of itself.

How not to assess qualitative research

The above overview does not include certain items that are routine in assessments of quantitative research. What follows is a non-exhaustive, non-representative, experience-based list of the quantitative criteria often applied to the assessment of qualitative research, as well as an explanation of the limited usefulness of these endeavours.

Protocol adherence

Given the openness and flexibility of qualitative research, it should not be assessed by how well it adheres to pre-determined and fixed strategies – in other words: its rigidity. Instead, the assessor should look for signs of adaptation and refinement based on lessons learned from earlier steps in the research process.

Sample size

For the reasons explained above, qualitative research does not require specific sample sizes, nor does it require that the sample size be determined a priori [ 1 , 14 , 27 , 37 – 39 ]. Sample size can only be a useful quality indicator when related to the research purpose, the chosen methodology and the composition of the sample, i.e. who was included and why.

Randomisation

While some authors argue that randomisation can be used in qualitative research, this is not commonly the case, as neither its feasibility nor its necessity or usefulness has been convincingly established for qualitative research [ 13 , 27 ]. Relevant disadvantages include the negative impact of an overly large sample size as well as the possibility (or probability) of selecting “quiet, uncooperative or inarticulate individuals” [ 17 ]. Qualitative studies do not use control groups, either.

Interrater reliability, variability and other “objectivity checks”

The concept of “interrater reliability” is sometimes used in qualitative research to assess the extent to which the coding of two co-coders overlaps. However, it is not clear what this measure tells us about the quality of the analysis [ 23 ]. This means that such scores can be included in qualitative research reports, preferably with some additional information on what the score means for the analysis, but they are not a requirement. Relatedly, it is not necessary for the quality or “objectivity” of qualitative research that recruitment, data collection and data analysis be carried out by different people. Experience even suggests that it can be better to have the same person or team perform all of these tasks [ 20 ]. First, when researchers introduce themselves during recruitment, this can enhance trust when the interview takes place days or weeks later with the same researcher. Second, when the audio-recording is transcribed for analysis, the researcher who conducted the interviews will usually remember the interviewee and the specific interview situation during data analysis. This can provide additional context for the interpretation of the data, e.g. on whether something might have been meant as a joke [ 18 ].
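If a team nevertheless chooses to report such an agreement score, Cohen's kappa is one common choice. The sketch below uses hypothetical code assignments and the existing scikit-learn helper cohen_kappa_score; in line with the point above, any reported value should be accompanied by an explanation of what it means for the analysis and of how disagreements were resolved.

    # Hypothetical codes assigned by two co-coders to the same ten transcript segments.
    from sklearn.metrics import cohen_kappa_score

    coder_1 = ["delay", "delay", "communication", "documentation", "delay",
               "communication", "delay", "documentation", "communication", "delay"]
    coder_2 = ["delay", "communication", "communication", "documentation", "delay",
               "communication", "delay", "delay", "communication", "delay"]

    kappa = cohen_kappa_score(coder_1, coder_2)
    print(f"Cohen's kappa: {kappa:.2f}")  # report together with how disagreements were handled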

Not being quantitative research

The mere fact that a study is qualitative rather than quantitative should not be used as an assessment criterion irrespective of the research problem at hand. Similarly, qualitative research should not be required to be combined with quantitative research per se – unless mixed methods research is judged to be inherently better than single-method research. In that case, the same criterion should be applied to quantitative studies without a qualitative component.

The main take-away points of this paper are summarised in Table 1. We aimed to show that, if conducted well, qualitative research can answer specific research questions that cannot be adequately answered using (only) quantitative designs. Seeing qualitative and quantitative methods as equal will help us become more aware and critical of the “fit” between the research problem and our chosen methods: I can conduct an RCT to determine the reasons for transportation delays of acute stroke patients – but should I? It also provides us with a greater range of tools to tackle a greater range of research problems more appropriately and successfully, filling in the blind spots on one half of the methodological spectrum to better address the whole complexity of neurological research and practice.

Table 1: Take-away points

Authors’ contributions

LB drafted the manuscript; WW and CG revised the manuscript; all authors approved the final version.

Funding

No external funding.

Competing interests

The authors declare no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
