National Center for Science and Engineering Statistics


The HERD Survey is the primary source of information on research and development expenditures at U.S. colleges and universities.

Survey Info

  • Methodology
  • Data
  • Analysis

The HERD Survey is an annual census of U.S. colleges and universities that expended at least $150,000 in separately accounted-for R&D in the fiscal year. The survey collects information on R&D expenditures by field of research and source of funds and also gathers information on types of research, expenses, and headcounts of R&D personnel.

Areas of Interest

  • Higher Education Research and Development
  • Research and Development

Survey Administration

The FY 2022 survey was conducted by ICF under contract to the National Center for Science and Engineering Statistics.

Survey Details

  • Survey Description (PDF 119 KB)
  • Data Tables (PDF 27.1 MB)

Featured Survey Analysis

R&D Expenditures at U.S. Universities Increased by $8 Billion in FY 2022


HERD Survey Overview

Data highlights: In current dollars, higher education research and development (R&D) grew at an average annual rate of 4.0% since FY 2012.

Figure 1

Roughly $1.5 billion in foreign funds supported R&D at higher education institutions in FY 2022


In the News


Which Colleges Spent the Most Money on Research?


HERD Survey Reveals Top U.S. Universities for R&D Funding In Engineering

Methodology: Survey Description

Survey Overview (FY 2022 Survey Cycle)

The Higher Education Research and Development (HERD) Survey is the primary source of information on separately accounted-for research and development (R&D) expenditures within higher education institutions in the United States and outlying areas.

Data collection authority

The information is solicited under the authority of the National Science Foundation Act of 1950, as amended, and the America COMPETES Reauthorization Act of 2010. The Office of Management and Budget control number is 3145–0100, with an expiration date of 31 July 2025. The survey is sponsored by the National Center for Science and Engineering Statistics (NCSES) within the National Science Foundation (NSF).

Major changes to recent survey cycle

Key Survey Information

Initial survey year

In 2010, the HERD Survey replaced a previous annual collection, the NSF Survey of Research and Development Expenditures at Universities and Colleges (Academic R&D Expenditures Survey), which was conducted from FY 1972 through FY 2009.

Reference period

The academic fiscal year ending in 2022; for most institutions this was 1 July 2021 to 30 June 2022.

Response unit

Establishment; U.S. academic institutions reporting at least $150,000 in R&D expenditures in the previous fiscal year.

Sample or census

Census.

Population size

A total of 900 institutions.

Sample size

The survey was a census of all known eligible universities and colleges.

Key variables

Key variables of interest are listed below.

  • R&D expenditures by field and source of funds (i.e., federal government, state and local government, business, nonprofit, institutional, and other)
  • R&D expenditures funded from foreign sources
  • R&D expenditures within medical schools
  • Clinical trial R&D expenditures (Phases I–III)
  • R&D expenditures by type of R&D (i.e., basic research, applied research, and experimental development)
  • Total and federally funded R&D expenditures passed through to subrecipients or received as a subrecipient
  • Federally funded R&D expenditures by field and federal agency
  • R&D expenditures by cost categories (e.g., salaries, software, equipment, indirect costs)
  • Total and federally funded R&D equipment expenditures by field
  • Headcounts and full-time equivalents of R&D personnel functions (researchers, R&D technicians, and R&D support staff)
  • Institutional characteristics (i.e., highest degree granted, historically Black college or university [HBCU], high Hispanic enrollment [HHE], public or private control)
  • Geographic location within the United States

Survey Design

Target population

Public and private nonprofit postsecondary institutions in the United States, Guam, Puerto Rico, and the U.S. Virgin Islands that granted a bachelor’s degree or higher in any field, expended at least $150,000 in separately accounted-for R&D in FY 2022, and were geographically separate campuses headed by a president, chancellor, or equivalent.

Sampling frame

The survey is a census of all eligible institutions as defined above. The final FY 2022 population comprised 900 academic institutions.

Sample design

Not applicable.

Data Collection and Processing

Data collection

The FY 2022 survey was conducted by ICF under contract to NCSES. Surveys were distributed to designated contacts at each institution. The data collection period was from November 2022 through July 2023. Respondents submitted their data using a Web-based questionnaire. Telephone and e-mail were used for follow-up contacts with respondents.

Data processing

Questionnaires were carefully examined by survey staff upon receipt. Reviews focused on unexplained missing data and explanations provided for changes in reporting patterns. If additional explanations or data revisions were needed, respondents were sent personalized e-mail messages asking them to provide any necessary revisions before the final processing and tabulation of data.

Estimation techniques

Missing values were imputed based on the previous year’s data and the reported data of peer institutions in the current cycle.

Survey Quality Measures

Sampling error

Because the FY 2022 survey was a census of all eligible institutions, there was no sampling error.

Coverage error

Coverage error of large research institutions is minimal because comprehensive lists exist. These institutions are easily identified using the NCSES Survey of Federal Science and Engineering Support to Universities, Colleges, and Nonprofit Institutions. However, institutions with smaller amounts of R&D expenditures have been more difficult to identify because they often do not receive federal funding for R&D.

NCSES annually screens all 4-year and above institutions reporting nonzero amounts of research expenses to the Department of Education Integrated Postsecondary Education Data System (IPEDS) to determine if new institutions qualify for inclusion in the survey.

Nonresponse error

The unit nonresponse rate was 4.0% in FY 2022. Nonresponse rates were less than 5.0% for all but three questions: Question 6, R&D expenditures by type of R&D (basic research, applied research, and experimental development); Question 15, R&D personnel headcount; and Question 16, R&D full-time equivalents (FTEs), which had nonresponse rates of 6.3%, 11.8%, and 29.5%, respectively.

Measurement error

Potential sources of measurement error include incomplete administrative data and differing categories used by institutions to identify R&D.

Data Availability and Comparability

Data availability

Annual data are available for FYs 1972–2022.

Data comparability

When the review for consistency between each year’s data and submissions in prior years reveals discrepancies, it is sometimes necessary to modify prior years’ data. This is especially likely to affect trends for certain institutions that fail to report every year, because current-year data are used to impute prior-year data.

For accurate historical data, use only the most recently released data tables. Individuals wishing to analyze trends other than those in the most recent data tables are encouraged to contact the Survey Manager for more information about comparability of data over time.

Data Products

Publications

NCSES publishes data from this survey annually in detailed tables and analytic reports available at the HERD Survey page. Information from this survey is also included in Science and Engineering Indicators.

Electronic access

Microdata beginning with the FY 2010 survey are available in NCSES’s interactive data tool. Public use files beginning with the FY 1972 survey are available at http://www.nsf.gov/statistics/herd/pub_data.cfm.

Technical Notes

Survey Overview (FY 2022 Survey Cycle)

Purpose. The Higher Education Research and Development (HERD) Survey is the primary source of information on separately accounted-for R&D expenditures within higher education institutions in the United States and outlying areas.

Data collection authority. The information is solicited under the authority of the National Science Foundation Act of 1950, as amended, and the America COMPETES Reauthorization Act of 2010. The Office of Management and Budget control number is 3145-0100, with an expiration date of 31 July 2025.

Survey contractor. ICF.

Survey sponsor. The HERD Survey is sponsored by the National Center for Science and Engineering Statistics (NCSES) within the National Science Foundation.

Frequency. Annual.

Initial survey year. In 2010, the HERD Survey replaced a previous annual collection, the Survey of Research and Development Expenditures at Universities and Colleges (Academic R&D Expenditures Survey), which was conducted from FY 1972 through FY 2009.

Reference period. The academic fiscal year ending in 2022; for most institutions, this was 1 July 2021 to 30 June 2022.

Response unit. Establishment; U.S. academic institutions reporting at least $150,000 in R&D expenditures in the previous fiscal year.

Sample or census. Census.

Population size. A total of 900 institutions.

Sample size. The survey was a census of all known eligible universities and colleges.

Target population. Public and private nonprofit postsecondary institutions in the United States, Guam, Puerto Rico, and the U.S. Virgin Islands that granted a bachelor’s degree or higher in any field; expended at least $150,000 in separately accounted-for R&D in FY 2022; and were geographically separate campuses headed by a president, chancellor, or equivalent. A list of all accredited, degree-granting institutions in the United States (the Higher Education Directory) was obtained from Higher Education Publications (HEP). More information about HEP and its sources can be found at https://hepinc.com/about/.

The survey population was reviewed before data collection began to ensure that each institutional classification was accurate. Characteristics of the schools were reviewed before and during the survey to determine whether changes had occurred (e.g., name; highest degree granted; school openings, closings, or mergers). Table A-1 shows all institution name changes or mergers between the FY 2021 and FY 2022 surveys.

After data collection closed, institutions were reviewed to verify that only those reporting at least $150,000 in separately accounted-for R&D were included in the population. Of the 919 institutions surveyed, 19 completed the survey but reported total R&D expenditures of less than $150,000. These institutions were excluded from the population, and their data are not included in the FY 2022 survey totals. The total and federally funded R&D expenditures for these 19 institutions are listed in table A-2.

Sampling frame. The frame for the FY 2022 HERD Survey included (1) all institutions considered in scope for the FY 2021 survey, (2) institutions that granted a bachelor’s degree or higher and reported an amount greater than $0 for research on the Integrated Postsecondary Education Data System (IPEDS) 2020 Finance Survey, (3) all U.S. service institutions that granted a bachelor’s degree or higher and were not already part of the HERD Survey population, and (4) institutions that granted doctoral degrees but did not report to IPEDS and were not already part of the HERD Survey population. The information in the Higher Education Directory was used to locate institutions meeting the conditions listed in (3) and (4). When FY 2021 R&D expenditures were not known, institutions in the frame were sent a brief questionnaire asking whether the institution had R&D expenditures during FY 2021 and FY 2022 and whether those expenditures were less than $150,000, were $150,000 to $999,999, or were $1 million or more.

The population review screener was sent to 106 institutions. A total of 22 institutions were added to the survey population during the population review. One other institution was added when representatives of university systems contacted data collection staff about a campus that newly qualified for the survey. During data collection, 31 institutions were removed from the population after they indicated that their R&D expenditures were less than $150,000 for FY 2022 or that they did not qualify for the survey for another reason. After accounting for these additions and subtractions, the number of academic institutions in the final population decreased from 908 in FY 2021 to 900 in FY 2022 (table A-3).

Sample design. The FY 1997 survey was the last one conducted as a sample survey. Since FY 1998, the survey has been a census of all known eligible universities and colleges.

Data Collection and Processing Methods

Data collection. The FY 2022 questionnaires were sent by e-mail in November 2022. Respondents could choose to submit a questionnaire downloaded from the Web or use the Web-based data collection system. Every effort was made to maintain close contact with respondents to preserve both the consistency and continuity of the resulting data. Survey data reports for each institution were available on the survey website; these showed comparisons between the current and 2 prior years of data and noted any substantive disparities. Questionnaires were carefully examined for completeness upon receipt. Respondents were sent personalized e-mail messages asking them to provide any necessary revisions before the final processing and tabulation of data. These e-mail messages included a link to the HERD Survey Web-based data collection system, allowing respondents to view and correct their data online.

Respondents were asked to explain significant differences between current-year reporting and established patterns of reporting verified for prior years. They were encouraged to correct prior-year data, if necessary. When respondents updated or amended figures from past years, NCSES made corresponding changes to trend data in the 2022 data tables and to the underlying microdata. For accurate historical data, use only the most recently released data tables.

Mode. Respondents could choose to submit a questionnaire downloaded from the Web or use the Web-based questionnaire. All institutions submitted data using the Web-based questionnaire.

Response rates. By the survey’s closing date in July 2023, forms had been received from 864 universities and colleges out of a population of 900—a response rate of 96.0%. Responses were received from 98.4% of all doctorate-granting institutions. The R&D expenditures reported by these doctoral institutions constituted 99.1% of the estimated national R&D expenditures for FY 2022. Table A-4 displays a detailed breakdown of response rates by survey form and highest degree granted, and table A-5 displays a breakdown of response rates for each survey question.
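The response and nonresponse rates above follow directly from the counts stated in this section. A quick arithmetic check, using only the population and respondent counts given here:

```python
# Figures stated above: 864 responding institutions out of a
# population of 900 eligible universities and colleges in FY 2022.
responses = 864
population = 900

response_rate = 100 * responses / population
nonresponse_rate = 100 * (population - responses) / population

print(f"Response rate: {response_rate:.1f}%")        # Response rate: 96.0%
print(f"Nonresponse rate: {nonresponse_rate:.1f}%")  # Nonresponse rate: 4.0%
```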

Data editing. The HERD Survey was subject to very little editing. Respondents were contacted and asked to resolve possible self-reporting issues themselves. Questionnaires were carefully examined by survey staff upon receipt. Reviews focused on unexplained missing data, expenditures that were outliers compared to those of peer institutions, and explanations provided for changes in reporting patterns. If additional explanations or data revisions were needed, respondents were sent personalized e-mail messages asking them to provide any necessary revisions before the final processing and tabulation of data.

Imputation. Missing values were imputed based on the previous year’s data and the reported data of peer institutions in the current cycle. For the 34 institutions that had not responded by the closing date of the survey and had been included in the FY 2021 HERD Survey population, R&D expenditures were imputed by applying inflator and deflator factors to the prior year’s key totals. The key totals for FY 2022 included total R&D expenditures, federal R&D expenditures, expenditures received as a subrecipient from higher education sources, expenditures received as a subrecipient from non-higher education sources, expenditures passed through to higher education entities, and expenditures passed through to non-higher education entities. Imputation factors were ratios derived from the 2-year-trend data of responding institutions with similar characteristics, including highest degree granted, type of institutional control (public or private), and level of total R&D expenditures. Other values that were not identified as key totals were imputed by applying ratios from the previous year’s data.

For two institutions that were new to the survey population, no past-year data were available. For these institutions, total R&D expenditures were assumed to be $150,000 or $1,000,000 based on the institutions’ responses to the population review screener. Other values were then imputed as a proportion of total R&D expenditures based on the data of institutions with similar characteristics. Data for partial nonresponse were imputed using similar techniques.
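The inflator/deflator ratio method described above can be sketched in a few lines. The peer-group totals and dollar amounts below are invented for illustration; the actual factors are derived from responding institutions matched on highest degree granted, institutional control, and level of total R&D expenditures.

```python
def ratio_impute(prior_value, peer_prior_total, peer_current_total):
    """Impute a current-year value by applying the peer group's
    year-over-year ratio (an inflator if peers grew, a deflator if
    they shrank) to the institution's prior-year value."""
    factor = peer_current_total / peer_prior_total
    return prior_value * factor

# Invented example: peer institutions' combined total R&D grew 5%
# between FY 2021 and FY 2022, so a nonrespondent that reported
# $10 million last year is imputed at $10.5 million this year.
imputed = ratio_impute(10_000_000, 120_000_000, 126_000_000)
print(f"${imputed:,.0f}")  # $10,500,000
```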

Table A-6 through table A-18 present imputed amounts for each applicable survey variable. The dollar amount imputed is displayed, along with the percentage it represents of the national estimate for universities and colleges for a variable. The imputed total R&D was $87.5 million, or 0.1%, of the $97.8 billion in total R&D expenditures (table A-6).

Several surveyed institutions have responded intermittently in past years. For years in which no response was received, data have been imputed as previously described. Although the imputation algorithm accurately reflects national trends, it cannot account for specific trends at individual institutions. For this reason, a re-imputation of institutional data for prior years is also performed. For each institution, previously imputed values from the HERD Survey (FYs 2010–21) were recomputed to ensure that the imputed data are consistent with reporting patterns from the FY 2022 survey. These procedures result in much more consistent reporting trends for individual institutions but have little effect on aggregate figures reflecting national totals. In the data tables, the letter i is used to identify imputed data.

R&D expenditures from unspecified federal agencies (Question 10) and capitalization thresholds for software and equipment (Question 13) were not imputed. Response summaries for these questions can be found in table A-19 and table A-20 .

Weighting. Survey data were not weighted.

Variance estimation. No variance estimation techniques were used.

Sampling error. Because the FY 2022 survey was a census of all eligible institutions, there was no sampling error.

Coverage error. Coverage error of large research institutions is minimal because comprehensive lists exist. These institutions are easily identified using the NCSES Survey of Federal Science and Engineering Support to Universities, Colleges, and Nonprofit Institutions. However, institutions with smaller amounts of R&D expenditures have been more difficult to identify because they often do not receive federal funding for science and engineering (S&E) R&D.

Nonresponse error. Thirty-six universities and colleges did not respond in FY 2022, out of a total of 900 eligible institutions, for a nonresponse rate of 4.0%. Table A-4 displays a detailed breakdown of response rates by survey population and highest degree granted.

Nonresponse rates were less than 5.0% for all but three questions: Question 6, R&D expenditures by type of R&D (basic research, applied research, and experimental development); Question 15, R&D personnel headcount; and Question 16, R&D full-time equivalents (FTEs), which had nonresponse rates of 6.3%, 11.8%, and 29.5%, respectively. Table A-5 displays a breakdown of response rates for each question in each of the two surveys. See the “Imputation” section for how item nonresponse was mitigated. Table A-6 through table A-18 present imputed amounts for each applicable survey variable.

Measurement error. The most likely source of measurement error is institutional records containing categories different from those on the survey. For example, institutions were asked to report all R&D expenditures by field. The NCSES-designed fields do not always translate to an institution’s departmental structure, and adjustments must be made by the institution to complete the survey. Fields were revised for the FY 2016 survey to better reflect the R&D currently being conducted at universities and colleges and make HERD Survey fields more consistent with those used by other NCSES surveys as well as with the National Center for Education Statistics Classification of Instructional Programs (CIP) codes. Details of this change are included in the methodology report and technical notes for the FY 2016 survey. Minor revisions were also made in FY 2020.

Another source of error is the survey’s category of institutionally funded research. The survey requests that institutions report discretionary internal funds used for research. NCSES discovered through debriefings conducted at the conclusion of the FY 2010 survey that there were varying definitions of what should be included on the HERD Survey as institutionally funded research. Some institutions included all expenditures from separate accounts designated for research; others included only internal R&D projects that are competitively awarded and have detailed budgets. A workshop was held in summer 2012 to discuss these differences in definitional interpretation. Based on the findings from the workshop, the FY 2012 survey was modified to clarify that all expenditures designated for research can be included in this category. This includes expenditures for organized research and expenditures of other funds designated for research but not categorized as organized research. A checklist question (Question 1.1) was also added to encourage the inclusion of all eligible expenditures and to determine the full extent of the variations in reporting across institutions. This question has been on the survey since FY 2012. An analysis of Question 1.1 responses from FY 2022 indicated that some institutions still could not report institutionally funded research that was not organized research (6.3% could not report startup packages, bridge funding, or seed funding and 8.1% were unable to identify other departmental funds designated for research). Therefore, survey totals are missing expenditures for R&D that come from multipurpose accounts, and as such, they represent an undercount of the total amount of internal discretionary funding that institutions make available to conduct R&D.

The reporting of unrecovered indirect costs is another known source of error. The survey requests that the total amount of indirect costs associated with a research grant or contract be calculated and reported, including costs that were not reimbursed by the external funding source. The unrecovered indirect cost is calculated by multiplying the institution’s negotiated indirect cost rate by the corresponding base and then subtracting the actual indirect cost recovery, preferably on a project-by-project basis. In FY 2022, 5.1% of respondents reported unrecovered indirect costs as unavailable. Respondents who were unable to provide values were asked to provide information on their nonresponse. Based on the collected information, survey guidance is revised to encourage response.
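The calculation described above reduces to a single expression: the negotiated rate applied to the corresponding base, minus the indirect costs actually recovered. The rate and dollar amounts below are invented for illustration.

```python
def unrecovered_indirect(rate_pct, base, recovered):
    """Unrecovered indirect cost: the negotiated indirect cost rate
    applied to the corresponding base, minus the indirect costs
    actually reimbursed by the external funding source."""
    return rate_pct * base // 100 - recovered

# Invented example: a 55% negotiated rate on a $1,000,000 base, with
# only $400,000 of indirect costs reimbursed by the sponsor.
print(unrecovered_indirect(55, 1_000_000, 400_000))  # 150000
```

As the text notes, institutions are asked to perform this calculation project by project, summing the results across all research grants and contracts.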

The reporting of expenditures from projects that are not R&D is another possible source of error. The R&D definition in the HERD Survey excludes public service and outreach programs, curriculum development (unless included as part of an overall research project), and training grants supporting work on non-research projects. As part of a federal government effort to reduce administrative burdens associated with research grants and contracts, agencies began adopting Research Terms and Conditions (RTC) to be consistent with Uniform Administrative Requirements, Cost Principles, and Audit Requirements for Federal Awards issued by the U.S. Office of Management and Budget (OMB) in the Federal Register [80 FR 61849, October 14, 2015]. In these RTCs, agencies employed a broader definition of research, resulting in many institutions reporting projects as research that do not match HERD Survey definitions. Additionally, in recent years R&D expenditure data have been used more frequently in university and college benchmarking, which may encourage institutions to employ broader definitions of research than those provided on the survey.

It should also be noted that because institutions were asked to include funds passed through to higher education institutions as well as subrecipient funding from higher education institutions, there is double counting included in national and group totals. For example, if Institution A passed $2 million through to Institution B, Institution A’s survey included that $2 million, and Institution B’s survey also included it as subrecipient funding received from Institution A. Overall, institutions reported $4.7 billion in expenditures from subrecipient funding received from other universities in FY 2022 and $4.7 billion in funds passed through to higher education subrecipients in FY 2022. Adjustments are made to R&D totals presented in the NCSES National Patterns of R&D Resources publications (https://ncses.nsf.gov/data-collections/national-patterns/).
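One simplified way to see the effect of this double counting is to subtract, from the sum of institutional totals, the subrecipient funds each institution received from other higher education institutions. The two-institution figures below are invented, mirroring the example in the text; the actual National Patterns adjustments are more involved than this sketch.

```python
# Invented example: Institution A passes $2 million through to
# Institution B, and both report that $2 million in their totals.
institutions = [
    {"name": "A", "total_rd": 50_000_000, "subrecip_from_higher_ed": 0},
    {"name": "B", "total_rd": 20_000_000, "subrecip_from_higher_ed": 2_000_000},
]

gross_total = sum(i["total_rd"] for i in institutions)
# Netting out funds received from other universities removes the overlap.
net_total = gross_total - sum(i["subrecip_from_higher_ed"] for i in institutions)

print(gross_total, net_total)  # 70000000 68000000
```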

Data Comparability (Changes)

Annual data are available for FYs 1972–2022. When the review for consistency between each year’s data and submissions in prior years reveals discrepancies, it is sometimes necessary to modify prior years’ data. This is especially likely to affect trends for certain institutions that fail to report every year, because current-year data are used to impute prior-year data. For accurate historical data, use only the most recently released data tables. Individuals wishing to analyze trends other than those in the most recent data tables are encouraged to contact the Survey Manager for more information about comparability of data over time.

Changes in survey coverage and population. Before FY 2010, the population included only institutions with R&D expenditures and degree programs in S&E fields. Institutions that performed R&D in only non-S&E fields were excluded from the population. Although not a change in the coverage or population, each campus headed by a campus-level president, chancellor, or equivalent began completing a separate survey in 2010 rather than combining its response with the responses of other campuses in the university system. As a result, the overall number of academic institutions in the population increased from 711 in FY 2009 to 742 in FY 2010.

To allow comparison of HERD Survey data across university systems by aggregating member campuses, table 6 lists all institutions in the FY 2022 population by state, institutional control, and system.

Universities and colleges can merge or separate, possibly resulting in large changes in data from previous years.

  • For FY 2015, the Indiana University School of Medicine (IUSM) reporting line was changed from the Chancellor of the Indiana University-Purdue University, Indianapolis (IUPUI) campus to the President of Indiana University. As such, the research expenditures for IUSM are now included in the Indiana University–Bloomington figures, resulting in an increase in total R&D expenditures of approximately $280 million for this campus. IUPUI total R&D expenditures decreased from $324 million in FY 2014 to $56 million in FY 2015.
  • In September 2015, Yeshiva University relinquished control of Albert Einstein College of Medicine to Montefiore Health System. As a result, FY 2016 data for Yeshiva University included only 2 months (July–August 2015) of R&D expenditures from the college of medicine. Albert Einstein College of Medicine reported separately for its entire FY 2016 (January–December 2016). Because of this change, FY 2016 research expenditures for Yeshiva University decreased by $260 million. For FY 2017, Yeshiva University included no expenditures from the college of medicine, and total research expenditures decreased by $43 million.
  • In 2016, the Maryland General Assembly approved legislation to create a strategic partnership between University of Maryland (UMD) College Park and UMD Baltimore. The two schools began reporting as one unit, University of Maryland, in FY 2019. In FY 2018, UMD Baltimore reported over $475 million in total R&D expenditures, and UMD College Park reported over $540 million in total R&D expenditures. In FY 2019, the new combined institution reported $1,096 million in total R&D expenditures.
  • In 2019, University of Tennessee reorganized University of Tennessee, Knoxville, Institute of Agriculture, and University of Tennessee, Knoxville. The two schools began reporting as one unit, University of Tennessee, Knoxville, in FY 2020. In 2019, University of Tennessee, Institute of Agriculture reported over $71 million in total R&D expenditures, and University of Tennessee, Knoxville reported over $240 million in total R&D expenditures. In FY 2020, the new combined institution reported $320 million in total R&D expenditures.
  • In 2021, the University of South Florida consolidated its three separate institutions, University of South Florida, Tampa; University of South Florida, St. Petersburg; and University of South Florida, Sarasota-Manatee, into one singularly accredited university. The three schools began reporting as one unit, University of South Florida, in FY 2021. In FY 2020, University of South Florida, Tampa reported over $333 million in total R&D expenditures, University of South Florida, St. Petersburg reported over $18 million in total R&D expenditures, and University of South Florida, Sarasota-Manatee reported over $2 million in total R&D expenditures. In FY 2021, the new combined institution reported over $405 million in total R&D expenditures.
  • In FY 2021, the University of Colorado Denver and Anschutz Medical Campus began reporting to the survey separately. Prior to FY 2021, the two campuses reported as one unit. In FY 2020, the combined unit reported over $554 million in total R&D expenditures. In FY 2021, University of Colorado Anschutz Medical Campus reported over $562 million in total R&D expenditures and University of Colorado Denver reported over $18 million in total R&D expenditures.

Changes in questionnaire. Tables include data from the Academic R&D Expenditures Survey (FYs 1972–2009) and the HERD Survey (FYs 2010–22). Analysts should be cautious when examining trend data. Although many variables are similar across the two surveys, exact comparisons may be misleading because of the clarification of which funds are to be included in the definition of R&D and the inclusion of non-S&E expenditures. In prior years, the Academic R&D Expenditures Survey collected expenditures for S&E and non-S&E fields separately. Institutions were not always able to provide non-S&E expenditures, and those data were not imputed previously. Also, revisions in 2010 to the instructions on what types of activities are included as R&D may have influenced reported values to varying degrees, depending on the number of clinical trials and training grants at an institution. Specific changes are described below:

  • For the FY 2012 data collection, NCSES modified the survey instructions to clarify what types of institutionally funded activities should be included in reported data. The instructions explained that all expenditures for R&D from an institution’s current operating funds that are separately accounted for should be reported. This includes expenditures separately budgeted for organized research and expenditures of other funds designated for research but not categorized as organized research. The instructions also specified that funds from an institution’s 501(c)3 foundation should be reported under institutionally financed research.
  • For the FY 2013 collection, the instructions were revised to clarify that funds from foreign and U.S. universities and colleges should be reported under All other sources (Question 1, row f). The instructions also specified that gifts designated by donors for research should be included in Question 1, row f.
  • Several changes were made to the FY 2016 questionnaire:
  • Question 16 on the FY 2015 questionnaire, regarding the number of postdocs paid from R&D expenditures, was removed from the survey. Question 2, regarding foreign funding of R&D, was expanded to identify sources of foreign funding. The question now collects R&D expenditures funded by foreign governments, businesses, nonprofit organizations, and higher education (see foreign sources in the “Definitions” section for more information).
  • Questions 9, 11, and 14 include revisions to the fields of R&D that better reflect the R&D currently being conducted at universities and colleges. The revisions make the HERD Survey fields more consistent with those used by other NCSES surveys as well as with the Classification of Instructional Programs (CIP) codes.
  • For the FY 2017 collection, the instructions were revised to clarify that funding from federally funded R&D centers (FFRDCs) should be reported as direct federal funding from the FFRDC’s sponsoring agency.
  • For the FY 2018 collection, the instructions were revised to clarify that expenditures for institution research administration and support (e.g., office of sponsored programs) should be excluded from the institutionally financed research totals.
  • Several changes were made to the FY 2020 questionnaire:
  • Question 15, which asked for a headcount of research personnel, was revised to ask for a headcount by three R&D functions (see “Definitions” for more information), as well as sex, citizenship status, and education level, for those functioning as researchers.
  • Question 16, which asked for the full-time equivalents (FTEs) by three R&D functions, was added to the survey.
  • Institutions were given guidance to include equipment-only R&D awards, such as Major Research Instrumentation grants, throughout the survey.
  • Questions 9, 11, and 14 include revisions to the fields of R&D that better reflect the R&D currently being conducted at universities and colleges. The revisions make the HERD Survey fields more consistent with those used by other NCSES surveys.

Changes in reporting procedures or classification. To reduce the burden for institutions with minimal amounts of R&D expenditures, NCSES introduced a shorter version of the HERD Survey, beginning with the FY 2012 collection. The short-form survey includes four core questions. For the FY 2022 cycle, the short-form population included 263 institutions that reported R&D expenditures between $150,000 and $1 million during FY 2021. The remainder of the institutions (637) received the full version of the survey.

Short-form survey data for FYs 2012–2022 appear only in those tables that specify in their title that the data presented include data from the short-form version of the survey. Data from the short-form survey population are included in the year totals prior to FY 2012, aggregated under “all other surveyed institutions.” The total FY 2022 R&D expenditures reported by institutions in the short-form survey population ($161.5 million) represent 0.2% of the expenditures reported by all institutions ($97.8 billion).
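As a rough illustration only (this is a sketch of the thresholds described above, not NCSES's actual frame-construction logic), the form-assignment rule and the short-form share calculation can be expressed as:

```python
def survey_form(prior_year_rd_dollars):
    """Assign a survey form using the thresholds described above.

    Institutions below $150,000 in separately accounted-for R&D are out
    of scope for the census; those between $150,000 and $1 million get
    the short form; the rest receive the standard (full) questionnaire.
    Illustrative sketch only.
    """
    if prior_year_rd_dollars < 150_000:
        return "out of scope"
    if prior_year_rd_dollars <= 1_000_000:
        return "short form"
    return "standard form"

# Share of total FY 2022 R&D reported by the short-form population,
# using the figures cited above:
short_form_total = 161.5e6   # $161.5 million
all_institutions = 97.8e9    # $97.8 billion
share = short_form_total / all_institutions
print(f"{share:.1%}")  # prints 0.2%
```

The share rounds to the 0.2% figure reported above, underscoring how little of the national total the short-form population represents.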

Definitions

  • Clinical trials. Research studies designed to answer specific questions about the effects of drugs, vaccines, medical devices, tests, treatments, or other therapies for patients. Clinical trials are used to determine safety and effectiveness. Includes Phase I, Phase II, and Phase III clinical trials with human patients but excludes Phase IV clinical trials.
  • Contracts. Legal commitments in which a good or service was provided by the reporting institution and benefited the sponsor. The sponsor specified the deliverables and gained the rights to the results.
  • Federal agency. Any agency of the U.S. government. Expenditures were reported by six specific agency funding sources (the Department of Agriculture; Department of Defense; Department of Energy; Department of Health and Human Services, including the National Institutes of Health; National Aeronautics and Space Administration; and National Science Foundation). Any expenditures funded by other federal agencies were reported under Other. The names of agencies included in the Other category are also requested.
  • Fields of R&D. The 40 fields of R&D reported on are listed on the survey questionnaire. In the data tables, the fields are grouped into 10 major areas: computer and information sciences; engineering; geosciences, atmospheric sciences, and ocean sciences; life sciences; mathematics and statistics; physical sciences; psychology; social sciences; other sciences; and non-science and engineering.
  • Fiscal year. Institution’s financial year.
  • Foreign sources:
  • Foreign government. All levels of foreign government, including national, regional, municipal, or other local government.
  • Business. Foreign for-profit organizations. Projects sponsored by a U.S. location of a foreign company were not considered foreign. Funds from a company’s nonprofit foundation were not reported here; they were reported under Nonprofit organizations.
  • Nonprofit organizations. Foreign nonprofit foundations and organizations, except higher education institutions. Funds from foreign universities were reported under Higher education.
  • Higher education. Foreign colleges and universities and units owned, operated, and controlled by such institutions.
  • All other sources. International governmental organizations located in the United States, such as the United Nations, the World Bank, and the International Monetary Fund, and all other entities sending funds to the United States from a location outside the United States and its territories.
  • Full-time equivalents (FTEs). Calculated as the total working effort spent on research during a specific period divided by the total effort representing a full-time schedule within the same period.
  • Medical schools. A medical school awards MD or DO degrees. Expenditures from projects assigned to the medical school or to research centers that were organizationally part of the medical school were included.
  • Pass-through entity. Organizations that pass through grant or contract funds to subrecipient organizations. Vendor relationships were not included.
  • Research and development (R&D). R&D activity is creative and systematic work undertaken to increase the stock of knowledge—including knowledge of humankind, culture, and society—and to devise new applications of available knowledge. R&D covers three activities: basic research, applied research, and experimental development. R&D does not include public service or outreach programs, curriculum development (unless included as part of an overall research project), or non-research training grants. R&D as measured on this survey does not include capital projects (i.e., construction or renovation of research facilities).
  • R&D expenditures. Expenditures for R&D activities from the institution’s current operating funds that were separately accounted for. For the purposes of the survey, R&D includes expenditures for organized research as defined by 2 CFR Part 200, Appendix III, and expenditures from funds designated for research. Expenditures came from internal or external funding and included recovered and unrecovered indirect costs. Funds passed through to subrecipient organizations were also included. R&D was excluded if it was conducted by university faculty or staff at outside institutions and was not accounted for in the reporting institution’s financial records.
  • R&D functions:
  • Researchers. Professionals engaged in the conception or creation of new knowledge, products, processes, methods, and systems and also in the management of the projects concerned. Researchers contribute more to the creative aspects of R&D, whereas technicians provide technical support.
  • R&D technicians. Persons whose main tasks require technical knowledge and experience in one or more fields of science or engineering, but who contribute to R&D by performing technical tasks such as computer programming, data analysis, ensuring accurate testing, operating lab equipment, and preparing and processing samples under the supervision of researchers.
  • R&D support staff. Employees not directly involved with the conduct of a research project, but who support the researchers and technicians. These employees might include clerical staff, financial and personnel administrators, report writers, patent agents, safety trainers, equipment specialists, and other related employees.
  • Sources of funds:
  • U.S. federal government. Any agency of the U.S. government. Federal funds that were passed through to the reporting institution from another institution were included.
  • State and local government. Any state, county, municipality, or other local government entity in the United States, including state health agencies. State funds that supported R&D at agricultural and other experiment stations were included. Public institutions reported state appropriations restricted for R&D activities in this category.
  • Business. Domestic or foreign for-profit organizations. Funds from a company’s nonprofit foundation were not reported here; they were reported under Nonprofit organizations.
  • Nonprofit organizations. Domestic or foreign nonprofit foundations and organizations, except universities and colleges. Funds from the reporting institution’s 501(c)3 foundation were reported under Institutional funds. Funds from other universities and colleges were reported under All other sources.
  • Institutional funds. Includes institutionally financed research (all R&D funded by the institution from accounts that are only used for research, excluding institution research administration and support), cost sharing (committed), and unrecovered indirect costs (the portion of indirect costs associated with a sponsored project that was not reimbursed by the sponsor in accordance with the institution’s negotiated indirect cost rate).
  • All other sources. Sources not reported in other categories, such as funds from foreign governments, foreign or U.S. universities, and gifts designated by the donors for research.
  • Subrecipient. The subrecipient for an award carries out the work but receives the funds from a pass-through entity rather than directly from the original funding source. Subrecipients tend to be the coauthors of publications, writers of technical reports discussing findings, inventors, and similar. Vendor relationships were not included.
  • Type of cost. R&D expenditures were reported in the following categories:
  • Salaries, wages, and fringe benefits. Includes compensation for all R&D personnel whether full time or part time; temporary or permanent; including salaries, wages, and fringe benefits paid from institution funds and from external support.
  • Software purchases, noncapitalized and capitalized. Includes payments for all software — both purchases of software packages and license fees for systems.
  • Capitalized equipment. Includes payments for movable equipment exceeding the institution’s capitalization threshold, including ancillary costs such as delivery and setup.
  • Pass-throughs to other organizations. See the definition for subrecipient.
  • Other direct costs. Other costs that do not fit into one of the above categories, including (but not limited to) travel, tuition waivers, services such as consulting, computer usage fees, and supplies.
  • Indirect costs. Includes both recovered and unrecovered indirect costs.
  • Type of R&D. R&D expenditures were reported in the following categories:
  • Basic research. Experimental or theoretical work undertaken primarily to acquire new knowledge of the underlying foundations of phenomena and observable facts, without any particular application or use in view.
  • Applied research. Original investigation undertaken to acquire new knowledge. It is directed primarily toward a specific, practical aim or objective.
  • Experimental development. Systematic work, drawing on knowledge gained from research and practical experience and producing additional knowledge, which is directed toward producing new products or processes or toward improving existing products or processes.
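The FTE definition above is a simple ratio of research effort to a full-time schedule over the same period. As an illustrative sketch (the helper function and hour figures are hypothetical, not part of the survey instrument):

```python
def fte(research_effort_hours, full_time_schedule_hours):
    """Full-time equivalent: total working effort spent on research in a
    period divided by the effort representing a full-time schedule in
    the same period, per the FTE definition above."""
    return research_effort_hours / full_time_schedule_hours

# A researcher who spent 1,040 hours on R&D out of a 2,080-hour
# full-time year counts as 0.5 FTE; a full-time researcher counts as 1.0.
print(fte(1040, 2080))  # prints 0.5
print(fte(2080, 2080))  # prints 1.0
```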

Technical Tables

  • Response rates: FY 2022
  • Imputed amounts for higher education R&D expenditures: FY 2022
  • Imputed amounts for federally funded higher education R&D expenditures: FY 2022
  • Response details: FY 2022

Questionnaires

  • View archived questionnaires

Key Data Tables

Recommended data tables

Higher education R&D expenditures reported by all institutions (standard form and short form populations)

Higher education R&D expenditures within standard form population:

  • Institution rankings, by all R&D expenditures
  • Institution rankings, by federally financed R&D expenditures

Data tables:

  • Higher education R&D expenditures by selected area
  • Institution rankings, by nonfederally financed R&D expenditures
  • Institution rankings, by special focus
  • Higher education R&D expenditures in science fields, ranked by FY 2022 total
  • Higher education R&D expenditures in engineering fields, ranked by FY 2022 total
  • Higher education R&D expenditures in non-S&E fields and subfields, ranked by FY 2022 total
  • Federally financed expenditures, ranked by federal agency total, by R&D field: FY 2022
  • Geographic distribution: FYs 2010–22
  • Geographic distribution, by institutional control and institution: FY 2022
  • Higher education R&D expenditures passed through to and received as subrecipients: FY 2022
  • FTEs and R&D personnel at higher education institutions
  • Higher education R&D expenditures in science and engineering fields only: FYs 2010–22
  • Higher education R&D expenditures reported by the short form population

General Notes

This report provides data from the FY 2022 Higher Education Research and Development (HERD) Survey. The survey is an annual census of institutions that expended at least $150,000 in separately accounted-for research and development (R&D) in the fiscal year.

The tables present data on R&D expenditures at higher education institutions across all academic disciplines and include R&D expenditures by institution, R&D field, geographic area, source of funds, type of R&D (basic research, applied research, and experimental development), cost categories (salaries, software, equipment, and indirect costs), and trends over time.
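A common use of the trend tables is computing a compound average annual growth rate between two fiscal years. A minimal sketch, using one hypothetical starting value for illustration only (the FY 2012 figure below is not taken from the tables; the FY 2022 total is the $97.8 billion cited earlier):

```python
def avg_annual_growth(start_value, end_value, years):
    """Compound average annual growth rate between two fiscal years."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical example: if total higher education R&D grew from
# $66.0 billion in FY 2012 (assumed value) to $97.8 billion in FY 2022,
# the average annual rate over the 10 intervening years would be ~4%.
rate = avg_annual_growth(66.0e9, 97.8e9, 10)
print(f"{rate:.1%}")  # prints 4.0%
```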

Acknowledgments and Suggested Citation

Acknowledgments

Michael T. Gibbons of the National Center for Science and Engineering Statistics (NCSES) developed and coordinated this report under the guidance of Amber Levanon Seligson, NCSES Program Director, and the leadership of Emilda B. Rivers, NCSES Director; Christina Freyman, NCSES Deputy Director; and John Finamore, NCSES Chief Statistician. Jock Black (NCSES) reviewed the report.

Under contract to NCSES, ICF conducted the survey and prepared the data. ICF staff members who made significant contributions include Kathryn Harper, Project Director; Madelyn Roeder, Deputy Project Director; Jennifer Greer, Data Management Lead; Sindhura Geda, Data Management Specialist; Bridget Beavers, Data Management Specialist; Alison Celigoi, Data Management Specialist; Cameron Shanton, Data Collection Specialist; Audrey Nankobogo, Data Collection Specialist; Henry LeVee, Data Collection Specialist; and Vladimer Shioshvili, Survey Systems Lead.

NCSES thanks the research-performing academic institutions that provided information for this report.

Suggested Citation

National Center for Science and Engineering Statistics (NCSES). 2023. Higher Education Research and Development: Fiscal Year 2022. NSF 24-308. Alexandria, VA: National Science Foundation. Available at https://ncses.nsf.gov/surveys/higher-education-research-development/2022.


For additional information about this survey or the methodology, contact



National Institutes of Health (NIH) - Turning Discovery into Health


NIH Research Matters

December 20, 2022

2022 Research Highlights — Human Health Advances

Disease prevention, diagnosis, and treatment.

With NIH support, scientists across the United States and around the world conduct wide-ranging research to discover ways to enhance health, lengthen life, and reduce illness and disability. Groundbreaking NIH-funded research often receives top scientific honors. In 2022, these honors included  two NIH-supported scientists who received Nobel Prizes . Here’s just a small sample of the NIH-supported human health advances in 2022. For more health and medical research findings from NIH, visit  NIH Research Matters .



Insights into COVID-19 vaccines

NIH researchers continued to make advances toward understanding the immune system’s complex response to vaccines against SARS-CoV-2, the virus that causes COVID-19. These insights could lead to more effective COVID-19 vaccines. Booster doses of COVID-19 vaccine were shown to elicit neutralizing antibodies against a range of SARS-CoV-2 variants, including the Omicron variant that gained dominance in 2022. Other researchers found the immune cells that make antibodies continue to evolve for months after vaccination , which improves protection over time. Scientists reported that immune cells called T cells play a crucial role in protecting against COVID-19 , too. Studies also showed that COVID-19 vaccines do not reduce fertility nor do they significantly affect menstrual cycle length .


Long COVID symptoms linked to inflammation

The effects of COVID-19 can persist long after initial symptoms fade. The lingering effects, called Long COVID, can include brain fog, fatigue, and dizziness. Scientists found that after infection with the virus that causes COVID-19, prolonged inflammation led to lasting problems, such as lung and kidney damage, in an animal model. Inflammation also affected the brain and correlated with behavioral changes. The results suggest a mechanism to explain the symptoms of Long COVID in people.


Advances for type 1 diabetes 

In type 1 diabetes, the immune system attacks insulin-producing cells in the pancreas. Affected people must depend on insulin treatments to survive. Researchers found that a common blood pressure drug called verapamil could protect insulin-producing cells in the pancreas of people with type 1 diabetes and reduce the need for insulin treatments. Other scientists developed a “ bionic pancreas ” that helped manage blood glucose levels in people with type 1 diabetes better and with less user input than existing methods. Notably, in November 2022 the FDA approved the first-ever treatment to delay the onset of type 1 diabetes, based in part on NIH-supported clinical trials completed the previous year. 


Improved dietary supplement for age-related macular degeneration

Age-related macular degeneration (AMD) is the most common cause of blindness in older Americans. An NIH-funded study 20 years ago showed that a dietary supplement could slow AMD progression, but the safety of one ingredient in the supplement has been questioned. A new supplement formulation replaced the questionable ingredient. A follow-up study showed that the new supplement was safer and better at slowing AMD progression over a 10-year period than the earlier supplement.


Testing ways to encourage exercise

Fewer than 1 in 4 adults in the U.S. get the amount of exercise recommended to maintain health and prevent chronic disease. A large nationwide study identified inexpensive interventions that boosted weekly gym visits by up to 27%. The results point to affordable strategies to help increase the amount of exercise Americans get on a regular basis.


Robotic exoskeleton helps people walk

Researchers created an ankle-worn robotic device, called an exoskeleton, that provides personalized walking assistance under real-world conditions. Compared with walking in normal shoes, the exoskeleton increased walking speed by 9% on average while expending 17% less energy. Robotic exoskeletons could help people who have mobility impairments or physically demanding jobs, such as firefighters or laborers.


Treating opioid use disorder

The class of drugs called opioids is a mainstay of pain treatment, but opioids can lead to dependency or addiction when misused. A study found that people with opioid use disorder who received telehealth services during the COVID-19 pandemic were more likely to keep taking their medications to treat opioid use disorder. They also had a lower risk of overdose. Other scientists found that men in a rural jail who received medications to treat opioid use disorder had a reduced likelihood of being arrested or returning to jail or prison after release. Researchers have also been making progress in finding alternatives to opioids .


Brain stimulation can affect memory in older adults

A noninvasive method that stimulates specific brain regions led to month-long memory improvements in older adults. The approach hints at the potential for a drug-free treatment to reverse or prevent memory loss in the aging population. More research is needed to see if this experimental technique can have longer-lasting effects or help improve memory in people with brain disorders.



MIT News | Massachusetts Institute of Technology


MIT’s top research stories of 2022


The dizzying pace of research and innovation at MIT can make it hard to keep up. To mark the end of the year, MIT News is looking back at 10 of the research stories that generated the most excitement in 2022.

We’ve also rounded up the year’s top MIT community-related stories.

  • Designing a heat engine with no moving parts . In April, engineers at MIT and the National Renewable Energy Laboratory (NREL) designed a heat engine that might someday enable a fully decarbonized power grid. In demonstrations, the engine was able to convert heat to electricity with over 40 percent efficiency — a performance better than that of traditional steam turbines.
  • Creating a lightweight material stronger than steel . In February, MIT chemical engineers used a new polymerization process to form a material that is stronger than steel and as light as plastic, and can be easily manufactured in large quantities. The material could be used as a coating for car parts or as a building material for bridges and other structures.
  • Enabling portable desalination at the push of a button . MIT researchers developed a suitcase-sized device that can remove particles and salts to generate drinking water. Unlike other desalination units that rely on filters, this device uses electrical power to purify the water. It requires less power to operate than a cell phone charger and can be driven by a small solar panel. Just push start.
  • Linking human genes to function . A team of researchers created the first map tying every gene expressed in human cells to its job in the cell. The map, which is available for other scientists to use, makes it easier to study a range of biological questions. The map was created using a CRISPR-based single-cell sequencing method known as Perturb-seq.
  • Improving supercomputing with a new programming language . A team of researchers based mainly at MIT invented a faster and more reliable programming language for high-performance computing. The language, which was tested on a number of small programs, could one day help with a number of deep learning tasks, like image processing.
  • Lifting people out of extreme poverty . A study co-authored by an MIT economist showed that a one-time capital boost (in this case, a cow) helped poor people in rural Bangladesh improve their lives in the long run. The study suggests the very poor are in a poverty trap, in which an initial lack of resources prevents them from improving their circumstances, and implies that large asset transfers are an effective way to reduce global poverty.
  • Helping robots fly . Inspired by fireflies, MIT researchers created tiny actuators that emit light to allow insect-scale robots to communicate. Weighing barely more than a paper clip, the robots are too small to make use of traditional means of sensing and communication. Instead, the actuators that control the robots’ wings light up in different colors and patterns, which could enable them to do things like share their location and call for help.
  • Detecting a radio signal in a far-off galaxy . In July, astronomers at MIT and elsewhere were surprised to find a periodic fast radio burst (FRB) originating billions of light-years from Earth. It is the longest lasting FRB pattern detected to date and is made up of intensely strong radio waves that repeat every 0.2 seconds, similar to a heartbeat. Astronomers suspect the signal is coming from a neutron star.
  • Proposal for a new, low-cost battery design . Researchers at MIT developed a battery made from abundant, inexpensive materials to complement the rise of lithium-ion batteries. The new battery uses aluminum and sulfur as its two electrode materials and a molten salt electrolyte in between. It could be ideal for powering single homes or small to medium sized businesses, producing a few tens of kilowatt-hours of storage capacity.
  • Immigrants as job creators . A study co-authored by an MIT economist found that compared to native-born citizens, immigrants are about 80 percent more likely to found a firm. The study, which looked at registered businesses of all types across the country, suggests that immigrants act more as "job creators" than "job takers" and play outsized roles in high-growth entrepreneurship in the U.S.


Science, Research and Innovation Performance of the EU 2022 report

Description

The Commission has released the 2022 edition of the Science, Research and Innovation Performance (SRIP) report, analysing the EU's innovation performance in a global context. It provides insights into how research and innovation policies can help build an inclusive, sustainable, competitive and resilient Europe by leveraging the essential role of research and innovation as a source of prosperity and as a catalyst for change. The report also highlights how the coronavirus pandemic and Russia's invasion of Ukraine call for Europe to reinforce its preparedness to quickly and adequately react to new, unexpected challenges.


Our Research: Projects, Publications, and More


Annual Results 2022

Engagement Insights: Survey Findings on the Quality of Undergraduate Education


A Message From the Interim Co-Directors

We are excited to continue our brief tenure as Interim Co-Directors of NSSE!

Jillian Kinzie, Ph.D., and Cindy Ann Kilgo, Ph.D.

Welcome to the 2022 edition of Engagement Insights, NSSE’s annual dissemination of selected research findings and institutional stories that have broad relevance to the improvement of undergraduate education.

In the winter months of 2023, this page will present three data-informed treatments of important topics for higher education with special value for institutions that participated in NSSE, FSSE, and BCSSE in 2022. In addition to new findings, we feature related webinars, examples of institutional use, suggestions for institutional assessment, faculty insights, and more.

Annual Results 2022 Topics

Latest findings from NSSE, FSSE, and BCSSE in 2022.


Rebounding Engagement: Has Higher Education Returned to “Normal”?

COVID-19 brought about some noticeable shifts in student engagement in 2020 and 2021. We explore patterns of engagement and student time allocations, focusing on trends from 2019 to 2022. We also use data from the Experiences in Online Learning Topical Module to look at perceptions of effective course structure.


Digging Deeper Into the Quality of High-Impact Practices

High-impact practice participation is widely accepted as a beneficial student experience, but the aspects and implementation of these can vary widely. We use data from the NSSE HIP Quality Topical Module, launched in 2022, to gain a more nuanced understanding of three selected HIPs: internships, service-learning, and research with faculty. Parallel FSSE data also inform some of these findings.


Hot Topics in Higher Education: Mental Well-Being, Affordability, and Transferable Skills

This final story addresses trending issues for colleges and universities. We explore results from a 2022 experimental set on mental wellness, BCSSE data regarding perceptions of cost and affordability, and data from the Development of Transferable Skills Topical Module illustrating areas of skill development and their connections to student engagement.


Did You Know?

67% of seniors and 63% of first-year students felt their family was a substantial source of mental health support.  

32% of beginning college students expect a high amount of difficulty paying for their college expenses.  

3 Areas of skill development featured in the Transferable Skills Topical Module

Read Past Annual Reports


Evidence-Based Improvement in Higher Education

Center for Postsecondary Research, Indiana University School of Education, 201 N. Rose Avenue, Bloomington, IN 47405-1006. Phone: 812.856.5824. Email: [email protected]

Numbers, Facts and Trends Shaping Your World


How Latinas View Hispanic Women’s Current and Future Situation

Teens and Video Games Today

An Early Look at Black Voters’ Views on Biden, Trump and Election 2024

While Black voters remain overwhelmingly Democratic and support Joe Biden over Donald Trump by a wide margin, Biden’s advantage among this group is not as wide as it was four years ago.

Support for legal abortion is widespread in many places, especially in Europe

A Majority of Latinas Face Pressures at Home or at Work

Latest Publications

Black voters are more confident in Biden than Trump when it comes to having the qualities needed to serve another term.

When Online Content Disappears

A quarter of all webpages that existed at one point between 2013 and 2023 are no longer accessible.

More Americans want the journalists they get news from to share their politics than any other personal trait

Most Americans say it is not important that the news they get comes from journalists who share their political views, age, gender or other traits.

Majorities in most of the 27 places around the world surveyed in 2023 and 2024 say abortion should be legal in all or most cases.

Half of Latinas Say Hispanic Women’s Situation Has Improved in the Past Decade and Expect More Gains

Government data shows gains in education, employment and earnings for Hispanic women, but gaps with other groups remain.


  • Politics & Policy

Broad Public Support for Legal Abortion Persists 2 Years After Dobbs

Views are split by political party, but support for legal abortion has risen modestly in both groups since before the 2022 Dobbs decision.

As Biden and Trump seek reelection, who are the oldest – and youngest – current world leaders?

More Than 80% of Americans Believe Elected Officials Don’t Care What People Like Them Think

A Growing Share of Americans Have Little or No Confidence in Netanyahu

What the Data Says About Crime in the U.S.


The Hardships and Dreams of Asian Americans Living in Poverty

What Public K-12 Teachers Want Americans to Know About Teaching

How People in 24 Countries Think Democracy Can Improve

Religious Restrictions Around the World


  • International Affairs

Americans are less likely than others around the world to feel close to people in their country or community

A median of 83% across 24 nations surveyed say they feel close to other people in their country, while 66% of Americans hold this view.

Growing Partisan Divisions Over NATO and Ukraine

58% of Americans see NATO favorably, down 4 points since 2023. Democrats and Republicans are increasingly divided on the alliance and on Ukraine aid.

Americans Remain Critical of China

About eight-in-ten Americans report an unfavorable view of China, and Chinese President Xi Jinping receives similarly negative ratings.

Younger Americans stand out in their views of the Israel-Hamas war

33% of adults under 30 say their sympathies lie either entirely or mostly with the Palestinian people, while 14% say their sympathies lie with the Israeli people.


  • Internet & Technology

Americans’ Views of Technology Companies

Most Americans are wary of social media’s role in politics and its overall impact on the country, and these concerns are ticking up among Democrats. Still, Republicans stand out on several measures, with a majority believing major technology companies are biased toward liberals.

6 facts about Americans and TikTok

62% of U.S. adults under 30 say they use TikTok, compared with 39% of those ages 30 to 49, 24% of those 50 to 64, and 10% of those 65 and older.

Many Americans think generative AI programs should credit the sources they rely on

22% of Americans say they interact with artificial intelligence almost constantly or several times a day. 27% say they do this about once a day or several times a week.


  • Race & Ethnicity

How Hispanic Americans Get Their News

U.S.-born Latinos mostly get their news in English and prefer it in English, while immigrant Latinos have much more varied habits.

Key facts about Asian Americans living in poverty

Burmese (19%) and Hmong Americans (17%) were among the Asian origin groups with the highest poverty rates in 2022.

Latinos’ Views on the Migrant Situation at the U.S.-Mexico Border

U.S. Hispanics are less likely than other Americans to say increasing deportations or a larger wall along the border will help the situation.

Black Americans’ Views on Success in the U.S.

While Black adults define personal and financial success in different ways, most see these measures of success as major sources of pressure in their lives.

5 facts about Black Americans and health care 

More Black Americans say health outcomes for Black people in the United States have improved over the past 20 years than say outcomes have worsened.



U.S. Surveys

Pew Research Center has deep roots in U.S. public opinion research. Launched as a project focused primarily on U.S. policy and politics in the early 1990s, the Center has grown over time to study a wide range of topics vital to explaining America to itself and to the world.


International Surveys

Pew Research Center regularly conducts public opinion surveys in countries outside the United States as part of its ongoing exploration of attitudes, values and behaviors around the globe.


Data Science

Pew Research Center’s Data Labs uses computational methods to complement and expand on the Center’s existing research agenda.


Demographic Research

Pew Research Center tracks social, demographic and economic trends, both domestically and internationally.


Our Experts

“A record 23 million Asian Americans trace their roots to more than 20 countries … and the U.S. Asian population is projected to reach 46 million by 2060.”


Neil G. Ruiz , Head of New Research Initiatives


Methods 101 Videos

Methods 101: Random Sampling

The first video in Pew Research Center’s Methods 101 series helps explain random sampling – a concept that lies at the heart of all probability-based survey research – and why it’s important.

Methods 101: Survey Question Wording

Methods 101: Mode Effects

Methods 101: What Are Nonprobability Surveys?


Signature Reports

  • Race and LGBTQ Issues in K-12 Schools
  • Representative Democracy Remains a Popular Ideal, but People Around the World Are Critical of How It’s Working
  • Americans’ Dismal Views of the Nation’s Politics
  • Measuring Religion in China
  • Diverse Cultures and Shared Experiences Shape Asian American Identities
  • Parenting in America Today

Editor’s Pick

  • Religious ‘Nones’ in America: Who They Are and What They Believe
  • Among Young Adults Without Children, Men Are More Likely Than Women to Say They Want to Be Parents Someday
  • Fewer Young Men Are in College, Especially at 4-Year Schools
  • About 1 in 5 U.S. Teens Who’ve Heard of ChatGPT Have Used It for Schoolwork
  • Women and Political Leadership Ahead of the 2024 Election
  • #BlackLivesMatter Turns 10

  • Immigration & Migration

Migrant encounters at the U.S.-Mexico border hit a record high at the end of 2023

  • How Americans View the Situation at the U.S.-Mexico Border, Its Causes and Consequences
  • What We Know About Unauthorized Immigrants Living in the U.S.
  • Latinos’ Views of and Experiences With the Spanish Language

  • Social Media

  • How Teens and Parents Approach Screen Time
  • 5 Facts About How Americans Use Facebook, Two Decades After Its Launch
  • A Declining Share of Adults, and Few Teens, Support a U.S. TikTok Ban
  • 81% of U.S. Adults – Versus 46% of Teens – Favor Parental Consent for Minors to Use Social Media
  • How Americans View Data Privacy

1615 L St. NW, Suite 800 Washington, DC 20036 USA (+1) 202-419-4300 | Main (+1) 202-857-8562 | Fax (+1) 202-419-4372 |  Media Inquiries


ABOUT PEW RESEARCH CENTER  Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions. It is a subsidiary of  The Pew Charitable Trusts .

Copyright 2024 Pew Research Center


J Korean Med Sci. 2022 Apr 25; 37(16)


A Practical Guide to Writing Quantitative and Qualitative Research Questions and Hypotheses in Scholarly Articles

Edward Barroga

1 Department of General Education, Graduate School of Nursing Science, St. Luke’s International University, Tokyo, Japan.

Glafera Janet Matanguihan

2 Department of Biological Sciences, Messiah University, Mechanicsburg, PA, USA.

ABSTRACT

The development of research questions and the subsequent hypotheses are prerequisites to defining the main research purpose and specific objectives of a study. Consequently, these objectives determine the study design and research outcome. The development of research questions is a process based on knowledge of current trends, cutting-edge studies, and technological advances in the research field. Excellent research questions are focused and require a comprehensive literature search and in-depth understanding of the problem being investigated. Initially, research questions may be written as descriptive questions which could be developed into inferential questions. These questions must be specific and concise to provide a clear foundation for developing hypotheses. Hypotheses are more formal predictions about the research outcomes. These specify the possible results that may or may not be expected regarding the relationship between groups. Thus, research questions and hypotheses clarify the main purpose and specific objectives of the study, which in turn dictate the design of the study, its direction, and outcome. Studies developed from good research questions and hypotheses will have trustworthy outcomes with wide-ranging social and health implications.

INTRODUCTION

Scientific research is usually initiated by posing evidenced-based research questions which are then explicitly restated as hypotheses. 1 , 2 The hypotheses provide directions to guide the study, solutions, explanations, and expected results. 3 , 4 Both research questions and hypotheses are essentially formulated based on conventional theories and real-world processes, which allow the inception of novel studies and the ethical testing of ideas. 5 , 6

It is crucial to have knowledge of both quantitative and qualitative research 2 as both types of research involve writing research questions and hypotheses. 7 However, these crucial elements of research are sometimes overlooked or, if not overlooked, framed without the forethought and meticulous attention they need. Planning and careful consideration are needed when developing quantitative or qualitative research, particularly when conceptualizing research questions and hypotheses. 4

There is a continuing need to support researchers in the creation of innovative research questions and hypotheses, as well as for journal articles that carefully review these elements. 1 When research questions and hypotheses are not carefully thought of, unethical studies and poor outcomes usually ensue. Carefully formulated research questions and hypotheses define well-founded objectives, which in turn determine the appropriate design, course, and outcome of the study. This article then aims to discuss in detail the various aspects of crafting research questions and hypotheses, with the goal of guiding researchers as they develop their own. Examples from the authors and peer-reviewed scientific articles in the healthcare field are provided to illustrate key points.

DEFINITIONS AND RELATIONSHIP OF RESEARCH QUESTIONS AND HYPOTHESES

A research question is what a study aims to answer after data analysis and interpretation. The answer is written in length in the discussion section of the paper. Thus, the research question gives a preview of the different parts and variables of the study meant to address the problem posed in the research question. 1 An excellent research question clarifies the research writing while facilitating understanding of the research topic, objective, scope, and limitations of the study. 5

On the other hand, a research hypothesis is an educated statement of an expected outcome. This statement is based on background research and current knowledge. 8 , 9 The research hypothesis makes a specific prediction about a new phenomenon 10 or a formal statement on the expected relationship between an independent variable and a dependent variable. 3 , 11 It provides a tentative answer to the research question to be tested or explored. 4

Hypotheses employ reasoning to predict a theory-based outcome. 10 These can also be developed from theories by focusing on components of theories that have not yet been observed. 10 The validity of hypotheses is often based on the testability of the prediction made in a reproducible experiment. 8

Conversely, hypotheses can also be rephrased as research questions. Several hypotheses based on existing theories and knowledge may be needed to answer a research question. Developing ethical research questions and hypotheses creates a research design that has logical relationships among variables. These relationships serve as a solid foundation for the conduct of the study. 4 , 11 Haphazardly constructed research questions can result in poorly formulated hypotheses and improper study designs, leading to unreliable results. Thus, the formulations of relevant research questions and verifiable hypotheses are crucial when beginning research. 12

CHARACTERISTICS OF GOOD RESEARCH QUESTIONS AND HYPOTHESES

Excellent research questions are specific and focused. These integrate collective data and observations to confirm or refute the subsequent hypotheses. Well-constructed hypotheses are based on previous reports and verify the research context. These are realistic, in-depth, sufficiently complex, and reproducible. More importantly, these hypotheses can be addressed and tested. 13

There are several characteristics of well-developed hypotheses. Good hypotheses are 1) empirically testable 7 , 10 , 11 , 13 ; 2) backed by preliminary evidence 9 ; 3) testable by ethical research 7 , 9 ; 4) based on original ideas 9 ; 5) have evidenced-based logical reasoning 10 ; and 6) can be predicted. 11 Good hypotheses can infer ethical and positive implications, indicating the presence of a relationship or effect relevant to the research theme. 7 , 11 These are initially developed from a general theory and branch into specific hypotheses by deductive reasoning. In the absence of a theory to base the hypotheses, inductive reasoning based on specific observations or findings form more general hypotheses. 10

TYPES OF RESEARCH QUESTIONS AND HYPOTHESES

Research questions and hypotheses are developed according to the type of research, which can be broadly classified into quantitative and qualitative research. We provide a summary of the types of research questions and hypotheses under quantitative and qualitative research categories in Table 1 .

Research questions in quantitative research

In quantitative research, research questions inquire about the relationships among variables being investigated and are usually framed at the start of the study. These are precise and typically linked to the subject population, dependent and independent variables, and research design. 1 Research questions may also attempt to describe the behavior of a population in relation to one or more variables, or describe the characteristics of variables to be measured ( descriptive research questions ). 1 , 5 , 14 These questions may also aim to discover differences between groups within the context of an outcome variable ( comparative research questions ), 1 , 5 , 14 or elucidate trends and interactions among variables ( relationship research questions ). 1 , 5 We provide examples of descriptive, comparative, and relationship research questions in quantitative research in Table 2 .

Hypotheses in quantitative research

In quantitative research, hypotheses predict the expected relationships among variables. 15 Relationships among variables that can be predicted include 1) between a single dependent variable and a single independent variable ( simple hypothesis ) or 2) between two or more independent and dependent variables ( complex hypothesis ). 4 , 11 Hypotheses may also specify the expected direction to be followed and imply an intellectual commitment to a particular outcome ( directional hypothesis ). 4 On the other hand, hypotheses may not predict the exact direction and are used in the absence of a theory, or when findings contradict previous studies ( non-directional hypothesis ). 4 In addition, hypotheses can 1) define interdependency between variables ( associative hypothesis ), 4 2) propose an effect on the dependent variable from manipulation of the independent variable ( causal hypothesis ), 4 3) state that no relationship exists between two variables ( null hypothesis ), 4 , 11 , 15 4) replace the null hypothesis when it is rejected ( alternative hypothesis ), 15 5) explain the relationship of phenomena to possibly generate a theory ( working hypothesis ), 11 6) involve quantifiable variables that can be tested statistically ( statistical hypothesis ), 11 or 7) express a relationship whose interlinks can be verified logically ( logical hypothesis ). 11 We provide examples of simple, complex, directional, non-directional, associative, causal, null, alternative, working, statistical, and logical hypotheses in quantitative research, as well as the definition of quantitative hypothesis-testing research, in Table 3 .
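The null/alternative pairing above can be made concrete with a small sketch. This is an illustration only: the blood-pressure readings are invented, the `welch_t_statistic` helper is our own, and a real analysis would use a dedicated statistics package and report a p-value alongside the statistic.

```python
# Illustrative sketch (hypothetical data) of a statistical hypothesis:
#   H0 (null): mean systolic blood pressure is the same in both groups.
#   H1 (alternative, non-directional): the group means differ.
import math
import statistics

def welch_t_statistic(a, b):
    """Welch's t statistic for two independent samples."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    standard_error = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / standard_error

treatment = [118, 121, 115, 119, 117, 120]  # hypothetical mm Hg readings
control = [125, 128, 124, 127, 126, 129]

t = welch_t_statistic(treatment, control)
print(round(t, 2))  # a large |t| is evidence against H0
```

Because the hypothesis is non-directional, only the magnitude of the statistic matters; a directional hypothesis would instead predict the sign of the difference in advance.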

Research questions in qualitative research

Unlike research questions in quantitative research, research questions in qualitative research are usually continuously reviewed and reformulated. A central question and associated subquestions are stated more often than hypotheses. 15 The central question broadly explores a complex set of factors surrounding the central phenomenon, aiming to present the varied perspectives of participants. 15

There are varied goals for which qualitative research questions are developed. These questions can function in several ways, such as to 1) identify and describe existing conditions ( contextual research questions ); 2) describe a phenomenon ( descriptive research questions ); 3) assess the effectiveness of existing methods, protocols, theories, or procedures ( evaluation research questions ); 4) examine a phenomenon or analyze the reasons or relationships between subjects or phenomena ( explanatory research questions ); or 5) focus on unknown aspects of a particular topic ( exploratory research questions ). 5 In addition, some qualitative research questions provide new ideas for the development of theories and actions ( generative research questions ) or advance specific ideologies of a position ( ideological research questions ). 1 Other qualitative research questions may build on a body of existing literature and become working guidelines ( ethnographic research questions ). Research questions may also be broadly stated without specific reference to the existing literature or a typology of questions ( phenomenological research questions ), may be directed towards generating a theory of some process ( grounded theory questions ), or may address a description of the case and the emerging themes ( qualitative case study questions ). 15 We provide examples of contextual, descriptive, evaluation, explanatory, exploratory, generative, ideological, ethnographic, phenomenological, grounded theory, and qualitative case study research questions in qualitative research in Table 4 , and the definition of qualitative hypothesis-generating research in Table 5 .

Qualitative studies usually pose at least one central research question and several subquestions starting with How or What . These research questions use exploratory verbs such as explore or describe . These also focus on one central phenomenon of interest, and may mention the participants and research site. 15

Hypotheses in qualitative research

Hypotheses in qualitative research are stated in the form of a clear statement concerning the problem to be investigated. Unlike in quantitative research where hypotheses are usually developed to be tested, qualitative research can lead to both hypothesis-testing and hypothesis-generating outcomes. 2 When studies require both quantitative and qualitative research questions, this suggests an integrative process between both research methods wherein a single mixed-methods research question can be developed. 1

FRAMEWORKS FOR DEVELOPING RESEARCH QUESTIONS AND HYPOTHESES

Research questions followed by hypotheses should be developed before the start of the study. 1 , 12 , 14 It is crucial to develop feasible research questions on a topic that is interesting to both the researcher and the scientific community. This can be achieved by a meticulous review of previous and current studies to establish a novel topic. Specific areas are subsequently focused on to generate ethical research questions. The relevance of the research questions is evaluated in terms of clarity of the resulting data, specificity of the methodology, objectivity of the outcome, depth of the research, and impact of the study. 1 , 5 These aspects constitute the FINER criteria (i.e., Feasible, Interesting, Novel, Ethical, and Relevant). 1 Clarity and effectiveness are achieved if research questions meet the FINER criteria. In addition to the FINER criteria, Ratan et al. described focus, complexity, novelty, feasibility, and measurability for evaluating the effectiveness of research questions. 14

The PICOT and PEO frameworks are also used when developing research questions. 1 The following elements are addressed in these frameworks, PICOT: P-population/patients/problem, I-intervention or indicator being studied, C-comparison group, O-outcome of interest, and T-timeframe of the study; PEO: P-population being studied, E-exposure to preexisting conditions, and O-outcome of interest. 1 Research questions are also considered good if these meet the “FINERMAPS” framework: Feasible, Interesting, Novel, Ethical, Relevant, Manageable, Appropriate, Potential value/publishable, and Systematic. 14
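The PICOT elements listed above can be thought of as slots in a question template. The following sketch assembles them into a single sentence; the `Picot` class and the exact question wording are our own illustration, not a standard library or a published template.

```python
# Minimal sketch of the PICOT framework as a fill-in-the-slots template.
# Field names follow the elements described in the text; the example
# question content is hypothetical.
from dataclasses import dataclass

@dataclass
class Picot:
    population: str    # P: population/patients/problem
    intervention: str  # I: intervention or indicator being studied
    comparison: str    # C: comparison group
    outcome: str       # O: outcome of interest
    timeframe: str     # T: timeframe of the study

    def question(self) -> str:
        """Assemble the five elements into one research-question sentence."""
        return (f"In {self.population}, does {self.intervention}, compared "
                f"with {self.comparison}, change {self.outcome} over "
                f"{self.timeframe}?")

example = Picot(
    population="adults with stage 1 hypertension",
    intervention="a low-sodium diet",
    comparison="their usual diet",
    outcome="systolic blood pressure",
    timeframe="12 weeks",
)
print(example.question())
```

Forcing each element to be named explicitly is the point of the framework: a question that cannot fill one of the slots is usually not yet specific enough to generate a testable hypothesis.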

As we indicated earlier, research questions and hypotheses that are not carefully formulated result in unethical studies or poor outcomes. To illustrate this, we provide some examples of ambiguous research questions and hypotheses that result in unclear and weak research objectives in quantitative research ( Table 6 ) 16 and qualitative research ( Table 7 ), 17 and show how to transform these ambiguous research questions and hypotheses into clear and good statements.

a These statements were composed for comparison and illustrative purposes only.

b These statements are direct quotes from Higashihara and Horiuchi. 16

a This statement is a direct quote from Shimoda et al. 17

The other statements were composed for comparison and illustrative purposes only.

CONSTRUCTING RESEARCH QUESTIONS AND HYPOTHESES

To construct effective research questions and hypotheses, it is very important to 1) clarify the background and 2) identify the research problem at the outset of the research, within a specific timeframe. 9 Then, 3) review or conduct preliminary research to collect all available knowledge about the possible research questions by studying theories and previous studies. 18 Afterwards, 4) construct research questions to investigate the research problem. Identify variables to be accessed from the research questions 4 and make operational definitions of constructs from the research problem and questions. Thereafter, 5) construct specific deductive or inductive predictions in the form of hypotheses. 4 Finally, 6) state the study aims . This general flow for constructing effective research questions and hypotheses prior to conducting research is shown in Fig. 1 .

Fig. 1 (image file: jkms-37-e121-g001.jpg)

Research questions are used more frequently in qualitative research than objectives or hypotheses. 3 These questions seek to discover, understand, explore or describe experiences by asking “What” or “How.” The questions are open-ended to elicit a description rather than to relate variables or compare groups. The questions are continually reviewed, reformulated, and changed during the qualitative study. 3 In quantitative research, by contrast, research questions are used more frequently in survey projects, whereas hypotheses are more common in experiments that compare variables and their relationships.

Hypotheses are constructed based on the variables identified and as an if-then statement, following the template, ‘If a specific action is taken, then a certain outcome is expected.’ At this stage, some ideas regarding expectations from the research to be conducted must be drawn. 18 Then, the variables to be manipulated (independent) and influenced (dependent) are defined. 4 Thereafter, the hypothesis is stated and refined, and reproducible data tailored to the hypothesis are identified, collected, and analyzed. 4 The hypotheses must be testable and specific, 18 and should describe the variables and their relationships, the specific group being studied, and the predicted research outcome. 18 Hypotheses construction involves a testable proposition to be deduced from theory, and independent and dependent variables to be separated and measured separately. 3 Therefore, good hypotheses must be based on good research questions constructed at the start of a study or trial. 12

In summary, research questions are constructed after establishing the background of the study. Hypotheses are then developed based on the research questions. Thus, it is crucial to have excellent research questions to generate superior hypotheses. In turn, these would determine the research objectives and the design of the study, and ultimately, the outcome of the research. 12 Algorithms for building research questions and hypotheses are shown in Fig. 2 for quantitative research and in Fig. 3 for qualitative research.

[Fig. 2]

EXAMPLES OF RESEARCH QUESTIONS FROM PUBLISHED ARTICLES

  • EXAMPLE 1. Descriptive research question (quantitative research)
  • - Presents research variables to be assessed (distinct phenotypes and subphenotypes)
  • “BACKGROUND: Since COVID-19 was identified, its clinical and biological heterogeneity has been recognized. Identifying COVID-19 phenotypes might help guide basic, clinical, and translational research efforts.
  • RESEARCH QUESTION: Does the clinical spectrum of patients with COVID-19 contain distinct phenotypes and subphenotypes?” 19
  • EXAMPLE 2. Relationship research question (quantitative research)
  • - Shows interactions between dependent variable (static postural control) and independent variable (peripheral visual field loss)
  • “Background: Integration of visual, vestibular, and proprioceptive sensations contributes to postural control. People with peripheral visual field loss have serious postural instability. However, the directional specificity of postural stability and sensory reweighting caused by gradual peripheral visual field loss remain unclear.
  • Research question: What are the effects of peripheral visual field loss on static postural control?” 20
  • EXAMPLE 3. Comparative research question (quantitative research)
  • - Clarifies the difference among groups with an outcome variable (patients enrolled in COMPERA with moderate PH or severe PH in COPD) and another group without the outcome variable (patients with idiopathic pulmonary arterial hypertension (IPAH))
  • “BACKGROUND: Pulmonary hypertension (PH) in COPD is a poorly investigated clinical condition.
  • RESEARCH QUESTION: Which factors determine the outcome of PH in COPD?
  • STUDY DESIGN AND METHODS: We analyzed the characteristics and outcome of patients enrolled in the Comparative, Prospective Registry of Newly Initiated Therapies for Pulmonary Hypertension (COMPERA) with moderate or severe PH in COPD as defined during the 6th PH World Symposium who received medical therapy for PH and compared them with patients with idiopathic pulmonary arterial hypertension (IPAH).” 21
  • EXAMPLE 4. Exploratory research question (qualitative research)
  • - Explores areas that have not been fully investigated (perspectives of families and children who receive care in clinic-based child obesity treatment) to have a deeper understanding of the research problem
  • “Problem: Interventions for children with obesity lead to only modest improvements in BMI and long-term outcomes, and data are limited on the perspectives of families of children with obesity in clinic-based treatment. This scoping review seeks to answer the question: What is known about the perspectives of families and children who receive care in clinic-based child obesity treatment? This review aims to explore the scope of perspectives reported by families of children with obesity who have received individualized outpatient clinic-based obesity treatment.” 22
  • EXAMPLE 5. Relationship research question (quantitative research)
  • - Defines interactions between dependent variable (use of ankle strategies) and independent variable (changes in muscle tone)
  • “Background: To maintain an upright standing posture against external disturbances, the human body mainly employs two types of postural control strategies: “ankle strategy” and “hip strategy.” While it has been reported that the magnitude of the disturbance alters the use of postural control strategies, it has not been elucidated how the level of muscle tone, one of the crucial parameters of bodily function, determines the use of each strategy. We have previously confirmed using forward dynamics simulations of human musculoskeletal models that an increased muscle tone promotes the use of ankle strategies. The objective of the present study was to experimentally evaluate a hypothesis: an increased muscle tone promotes the use of ankle strategies. Research question: Do changes in the muscle tone affect the use of ankle strategies?” 23

EXAMPLES OF HYPOTHESES IN PUBLISHED ARTICLES

  • EXAMPLE 1. Working hypothesis (quantitative research)
  • - A hypothesis that is initially accepted for further research to produce a feasible theory
  • “As fever may have benefit in shortening the duration of viral illness, it is plausible to hypothesize that the antipyretic efficacy of ibuprofen may be hindering the benefits of a fever response when taken during the early stages of COVID-19 illness.” 24
  • “In conclusion, it is plausible to hypothesize that the antipyretic efficacy of ibuprofen may be hindering the benefits of a fever response. The difference in perceived safety of these agents in COVID-19 illness could be related to the more potent efficacy to reduce fever with ibuprofen compared to acetaminophen. Compelling data on the benefit of fever warrant further research and review to determine when to treat or withhold ibuprofen for early stage fever for COVID-19 and other related viral illnesses.” 24
  • EXAMPLE 2. Exploratory hypothesis (qualitative research)
  • - Explores particular areas deeper to clarify subjective experience and develop a formal hypothesis potentially testable in a future quantitative approach
  • “We hypothesized that when thinking about a past experience of help-seeking, a self distancing prompt would cause increased help-seeking intentions and more favorable help-seeking outcome expectations.” 25
  • “Conclusion
  • Although a priori hypotheses were not supported, further research is warranted as results indicate the potential for using self-distancing approaches to increasing help-seeking among some people with depressive symptomatology.” 25
  • EXAMPLE 3. Hypothesis-generating research to establish a framework for hypothesis testing (qualitative research)
  • “We hypothesize that compassionate care is beneficial for patients (better outcomes), healthcare systems and payers (lower costs), and healthcare providers (lower burnout). ” 26
  • “Compassionomics is the branch of knowledge and scientific study of the effects of compassionate healthcare. Our main hypotheses are that compassionate healthcare is beneficial for (1) patients, by improving clinical outcomes, (2) healthcare systems and payers, by supporting financial sustainability, and (3) HCPs, by lowering burnout and promoting resilience and well-being. The purpose of this paper is to establish a scientific framework for testing the hypotheses above. If these hypotheses are confirmed through rigorous research, compassionomics will belong in the science of evidence-based medicine, with major implications for all healthcare domains.” 26
  • EXAMPLE 4. Statistical hypothesis (quantitative research)
  • - An assumption is made about the relationship among several population characteristics (gender differences in sociodemographic and clinical characteristics of adults with ADHD). Validity is tested by statistical experiment or analysis (chi-square test, Student’s t-test, and logistic regression analysis)
  • “Our research investigated gender differences in sociodemographic and clinical characteristics of adults with ADHD in a Japanese clinical sample. Due to unique Japanese cultural ideals and expectations of women's behavior that are in opposition to ADHD symptoms, we hypothesized that women with ADHD experience more difficulties and present more dysfunctions than men . We tested the following hypotheses: first, women with ADHD have more comorbidities than men with ADHD; second, women with ADHD experience more social hardships than men, such as having less full-time employment and being more likely to be divorced.” 27
  • “Statistical Analysis
  • (text omitted) Between-gender comparisons were made using the chi-squared test for categorical variables and Student’s t-test for continuous variables…(text omitted). A logistic regression analysis was performed for employment status, marital status, and comorbidity to evaluate the independent effects of gender on these dependent variables.” 27

EXAMPLES OF HYPOTHESIS AS WRITTEN IN PUBLISHED ARTICLES IN RELATION TO OTHER PARTS

  • EXAMPLE 1. Background, hypotheses, and aims are provided
  • “Pregnant women need skilled care during pregnancy and childbirth, but that skilled care is often delayed in some countries …( text omitted ). The focused antenatal care (FANC) model of WHO recommends that nurses provide information or counseling to all pregnant women …( text omitted ). Job aids are visual support materials that provide the right kind of information using graphics and words in a simple and yet effective manner. When nurses are not highly trained or have many work details to attend to, these job aids can serve as a content reminder for the nurses and can be used for educating their patients (Jennings, Yebadokpo, Affo, & Agbogbe, 2010) ( text omitted ). Importantly, additional evidence is needed to confirm how job aids can further improve the quality of ANC counseling by health workers in maternal care …( text omitted )” 28
  • “This has led us to hypothesize that the quality of ANC counseling would be better if supported by job aids. Consequently, a better quality of ANC counseling is expected to produce higher levels of awareness concerning the danger signs of pregnancy and a more favorable impression of the caring behavior of nurses.” 28
  • “This study aimed to examine the differences in the responses of pregnant women to a job aid-supported intervention during ANC visit in terms of 1) their understanding of the danger signs of pregnancy and 2) their impression of the caring behaviors of nurses to pregnant women in rural Tanzania.” 28
  • EXAMPLE 2. Background, hypotheses, and aims are provided
  • “We conducted a two-arm randomized controlled trial (RCT) to evaluate and compare changes in salivary cortisol and oxytocin levels of first-time pregnant women between experimental and control groups. The women in the experimental group touched and held an infant for 30 min (experimental intervention protocol), whereas those in the control group watched a DVD movie of an infant (control intervention protocol). The primary outcome was salivary cortisol level and the secondary outcome was salivary oxytocin level.” 29
  • “We hypothesize that at 30 min after touching and holding an infant, the salivary cortisol level will significantly decrease and the salivary oxytocin level will increase in the experimental group compared with the control group.” 29
  • EXAMPLE 3. Background, aim, and hypothesis are provided
  • “In countries where the maternal mortality ratio remains high, antenatal education to increase Birth Preparedness and Complication Readiness (BPCR) is considered one of the top priorities [1]. BPCR includes birth plans during the antenatal period, such as the birthplace, birth attendant, transportation, health facility for complications, expenses, and birth materials, as well as family coordination to achieve such birth plans. In Tanzania, although increasing, only about half of all pregnant women attend an antenatal clinic more than four times [4]. Moreover, the information provided during antenatal care (ANC) is insufficient. In the resource-poor settings, antenatal group education is a potential approach because of the limited time for individual counseling at antenatal clinics.” 30
  • “This study aimed to evaluate an antenatal group education program among pregnant women and their families with respect to birth-preparedness and maternal and infant outcomes in rural villages of Tanzania.” 30
  • “The study hypothesis was if Tanzanian pregnant women and their families received a family-oriented antenatal group education, they would (1) have a higher level of BPCR, (2) attend antenatal clinic four or more times, (3) give birth in a health facility, (4) have less complications of women at birth, and (5) have less complications and deaths of infants than those who did not receive the education.” 30

Research questions and hypotheses are crucial components of any type of research, whether quantitative or qualitative, and should be developed at the very beginning of the study. Excellent research questions lead to superior hypotheses, which, like a compass, set the direction of research and can often determine the successful conduct of the study. Many research studies have floundered because the development of research questions and subsequent hypotheses was not given the thought and meticulous attention it needed. Developing research questions and hypotheses is an iterative process grounded in extensive knowledge of the literature and an insightful grasp of the knowledge gap. Focused, concise, and specific research questions provide a strong foundation for constructing hypotheses, which serve as formal predictions about the research outcomes. Carefully constructed questions and hypotheses define well-founded objectives that determine the design, course, and outcome of the study, and help avoid unethical studies and poor outcomes.

Disclosure: The authors have no potential conflicts of interest to disclose.

Author Contributions:

  • Conceptualization: Barroga E, Matanguihan GJ.
  • Methodology: Barroga E, Matanguihan GJ.
  • Writing - original draft: Barroga E, Matanguihan GJ.
  • Writing - review & editing: Barroga E, Matanguihan GJ.

National Center for Science and Engineering Statistics


The Survey of State Government Research and Development (R&D) provides comprehensive, uniform statistics regarding the extent of R&D activity performed and funded by departments and agencies in each of the nation's 50 states, the District of Columbia, and Puerto Rico.

Survey Info

  • Methodology
  • Data
  • Analysis

The Survey of State Government Research and Development (R&D) measures the extent of R&D activity performed and funded by the governments of each of the nation’s 50 states, the District of Columbia, and Puerto Rico (collectively, states). By employing consistent, uniform definitions and collection techniques, the survey allows collection of state R&D expenditures data that are comparable nationwide. The survey is a census of state government departments, agencies, commissions, public authorities, and dependent entities with R&D activities.

Areas of Interest

  • Government Funding for Science and Engineering
  • Research and Development

Survey Administration

The survey was funded by the National Center for Science and Engineering Statistics within the National Science Foundation, and data collection was conducted by the Census Bureau.

Survey Details

  • Survey Description (PDF 130 KB)
  • Data Tables (PDF 1.6 MB)

Featured Survey Analysis

State Government Agencies’ Expenditures for R&D Totaled $2.6 Billion in FY 2022, an Increase of 5% from FY 2021.


Survey of State Government R&D Overview

Data Highlights

State government agencies’ research and experimental development (R&D) expenditures reached $2.644 billion in FY 2022.

Figure 1

R&D performed intramurally, that is, by the state agencies themselves, accounted for $697 million in expenditures in FY 2022

Figure 1

Methodology

Survey Description

Survey Overview (FY 2022 Survey Cycle)

The Survey of State Government Research and Development (R&D) is the only source for comprehensive, uniform statistics regarding the extent of R&D activity performed and funded by departments and agencies in each of the nation’s 50 state governments, the government of the District of Columbia, and the government of Puerto Rico.

Data collection authority

The information is solicited under the authority of the National Science Foundation Act of 1950, as amended, and the America COMPETES Reauthorization Act of 2010. It is collected under Office of Management and Budget control number 0607–0933, expiration date 31 July 2023. The survey is conducted by the Census Bureau under Title 13, U.S. Code, § 8(b) for the National Center for Science and Engineering Statistics (NCSES) within the National Science Foundation.

Major changes to recent survey cycle

Key survey information

Initial survey year

FY 2006.

Reference period

State government fiscal year ending in 2022.

Response unit

State government departments, agencies, commissions, public authorities, institutions, and other entities that operate separately or somewhat autonomously from the central state government—but where the state government maintains administrative or fiscal control over their activities—with the capacity to perform or fund R&D; units are collectively referred to as agencies .

Sample or census

Census.

Population size

505 agencies.

Sample size

Not applicable.

Key variables

  • State government department or agency
  • Total expenditures for R&D
  • R&D expenditures by source of funds (federal, state, and other)
  • Expenditures for intramural performance by source of funds
  • Expenditures for intramural performance by type of work (basic research, applied research, and experimental development)
  • Expenditures for extramural performance by source of funds
  • Expenditures for extramural performance by type of performer (academic institutions, companies and individuals, and others)
  • Federal funds for R&D by state and federal agency
  • R&D expenditures by governmental function (agriculture, energy, environment and natural resources, health, transportation, and other)
  • Capital outlays for state government R&D-related facilities
  • R&D personnel and full-time equivalent by type (researchers, technicians, and support staff)

Survey Design

Target population

The target population consists of all state departments, agencies, commissions, and dependent entities that funded R&D activities for state government fiscal years ending in 2022. Several industry-specific state commissions, which are generally chartered by state legislatures but are administered independently, are considered state agencies and included in the survey’s population of interest. Excluded are state-run colleges and universities, which are canvassed as part of NCSES’s Higher Education Research and Development (HERD) Survey. State-run laboratories or experiment stations controlled by state universities are also excluded from the respondent universe, as are any entities determined to be nonprofit or private as defined by the Census Bureau.

Most state fiscal year periods begin 1 July and end the following 30 June. For example, FY 2022 is defined as the state fiscal period beginning on 1 July 2021 and ending on 30 June 2022. There are, however, five exceptions to the 30 June fiscal year end: New York (ends 31 March); Texas (ends 31 August); and Alabama, the District of Columbia, and Michigan (all end 30 September). For comparability, all states, the District of Columbia, and Puerto Rico are surveyed at the same time.
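The fiscal year convention described above can be expressed as a small lookup. The sketch below is purely illustrative (the helper name and the dictionary encoding are not part of the survey methodology); it encodes the default 30 June year end and the five named exceptions:

```python
from datetime import date

# Most states close their fiscal year on 30 June; the five exceptions
# named in the survey description are encoded explicitly.
# (Illustrative helper only, not an NCSES artifact.)
FY_END_EXCEPTIONS = {
    "New York": (3, 31),              # ends 31 March
    "Texas": (8, 31),                 # ends 31 August
    "Alabama": (9, 30),               # ends 30 September
    "District of Columbia": (9, 30),  # ends 30 September
    "Michigan": (9, 30),              # ends 30 September
}

def fiscal_year_end(state: str, fy: int) -> date:
    """Return the closing date of a state's fiscal year `fy`."""
    month, day = FY_END_EXCEPTIONS.get(state, (6, 30))
    return date(fy, month, day)
```

Under this convention, FY 2022 for most states closes on 30 June 2022, while `fiscal_year_end("New York", 2022)` falls on 31 March 2022.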

Sampling frame

The total universe includes all state government-dependent units with the capacity to perform or fund R&D, including those for the District of Columbia and Puerto Rico, as defined by the Census Bureau’s Government Finance and Employment Classification Manual . All units were identified with the aid of a state coordinator who was appointed by the governor of each state, the governor of Puerto Rico, and the mayor of the District of Columbia. For FY 2022, e-mails were sent to the chief of staff for each governor’s office asking them to appoint a state coordinator. For the FY 2022 survey, state coordinators were provided with the list of agencies that were previously identified as having the potential to perform or fund R&D from the FY 2021 survey cycle. In addition, the list included agencies identified from a systematic review of state session laws and additional review of agencies reporting to the Census Bureau’s Census of Governments program by staff members from the Census Bureau and NCSES.

State coordinators were asked to review this list and add agencies that they believed were involved with R&D and were not already identified. State agencies that have reported $0 R&D for the last three survey cycles were marked as inactive for this survey cycle. These will be reactivated for the FY 2023 survey cycle. State coordinators also adjusted the agency universe to remove agencies that have never had any qualifying R&D to report to NCSES, to add any agencies they thought were missing or could possibly have R&D to report, to address organizational changes within their respective states since the previous survey, and to provide updated agency contact information.

Sample design

The Survey of State Government R&D is a census.

Data Collection and Processing

Data collection.

The survey was funded by NCSES. Data collection was conducted by the Census Bureau via an e-mail containing a fillable Portable Document Format (PDF) survey form. The survey was launched in October 2022, and responses were collected through mid-June 2023. The respondent questionnaire consisted of one screening question intended to reduce the burden on agency respondents who did not have qualifying R&D expenditures during FY 2022, seven questions regarding R&D-related expenditures, and two questions regarding counts of employees.

Data processing

Data collected under the survey are subject to automated data correction procedures that combine logical edits incorporated into the survey form with telephone and e-mail follow-up with survey respondents by NCSES and Census Bureau staff to resolve any remaining data anomalies.

Estimation techniques

All state and national totals are summations of reported state agency data.
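Because the survey is a census with no weighting or model-based adjustment, estimation reduces to simple aggregation of agency-level reports. A minimal sketch with hypothetical agency figures (in $ millions; the data here are invented for illustration):

```python
# Hypothetical agency-level R&D figures ($ millions); state and national
# totals are plain sums -- no weighting or imputation model is applied here.
agency_rd = [
    ("State A", "Dept. of Health", 12.0),
    ("State A", "Dept. of Transportation", 5.5),
    ("State B", "Dept. of Agriculture", 8.0),
]

# Sum agency reports up to state totals.
state_totals = {}
for state, _agency, amount in agency_rd:
    state_totals[state] = state_totals.get(state, 0.0) + amount

# The national total is the sum of the state totals.
national_total = sum(state_totals.values())  # 25.5
```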

Survey Quality Measures

Sampling error

Not applicable; the survey is a census, so no sampling error is introduced.

Coverage error

In addition to a Census Bureau review of state session laws to identify agencies with the capacity to fund R&D, NCSES relies on the expertise of an appointed state coordinator to identify state government agencies that have the capacity to perform or fund R&D. State coordinators are also offered the opportunity to review survey responses from their respective state agencies before results are finalized for data release. In cases where the state coordinator declined to cooperate or where some agencies did not respond to the survey, state government R&D activities may be undercounted, despite efforts by NCSES and Census Bureau staff to conduct additional queries and outreach in states that did not appoint a state coordinator. In other instances, the appointed state coordinator could misinterpret the NCSES definition and examples of qualifying R&D activities and not identify all state government-dependent units with the capacity to perform or fund R&D. However, no measures of coverage error are produced.

Nonresponse error

Of the 505 agencies in the survey universe, 502 (99.4%) responded to the survey. Of the 502 respondents, 400 (79.7%) reported having R&D activities in FY 2022. A mathematical imputation method was used to impute for one nonresponding agency known to have R&D in the past, namely New Mexico Department of Cultural Affairs. An agency in Puerto Rico was removed from the survey because it reported expenditures for the entire agency budget rather than an R&D amount. Thus, nonresponse error is minimal.
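The reported rates follow directly from these counts; a quick arithmetic check (not part of the survey's processing):

```python
universe = 505     # agencies in the survey universe
respondents = 502  # agencies that responded
with_rd = 400      # respondents reporting R&D activity in FY 2022

response_rate = round(respondents / universe * 100, 1)  # 99.4
rd_share = round(with_rd / respondents * 100, 1)        # 79.7
```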

Measurement error

All responses, including the initial agency data submissions and final state coordinator reviews, were received via e-mail or phone. Census Bureau staff performed basic logical edit checks and reviewed respondent comments, allowing staff to detect errors and work with respondents to correct them. Despite these efforts, some of the data reported could include expenditures for non-R&D activities, such as commercialization, environmental testing, or routine survey work. Similarly, some state data may also exclude minor R&D expenditure amounts from agencies not surveyed.

Data Availability and Comparability

Data availability.

Data presented in trend tables in this report are from the most recently completed survey cycle. Agency-level data are available beginning with FY 2009. No survey of state governments’ FY 2008 R&D activity was conducted.

Data comparability

References to data prior to FY 2022 should be restricted to those published in this report for three reasons: (1) when completing the current-year survey, survey respondents may revise their prior-year data; (2) state coordinators may identify additional agencies to be canvassed that were not initially surveyed during the prior survey cycle, and many of these agencies will provide prior-year data during the current survey collection cycle; and (3) NCSES reviews data from prior years for consistency with current-year responses and, if necessary, may revise these data in consultation with respondents.

For FYs 1995, 1988, and 1987, data collections of state government R&D were conducted by nonfederal organizations that were supported by NCSES grants. Prior to those efforts, NCSES collected state government R&D data for FYs 1977, 1973, 1972, 1968, 1967, 1965, and 1964 in collaboration with the Census Bureau’s Census of Governments and related programs. Because of differences in the survey populations, in definitions of covered R&D activities, and in collection methods over time, the results of these historical surveys are not comparable with the statistics collected for FY 2006 and subsequent Surveys of State Government R&D.

Data Products

Data from the Survey of State Government R&D are published in NCSES InfoBriefs and data tables available at https://ncses.nsf.gov/surveys/state-government-research-development/ . Data from the Survey of State Government R&D are also used in the annual report National Patterns of R&D and the biennial report Science and Engineering Indicators .

Technical Notes

Purpose. The Survey of State Government Research and Development (R&D) is the only source for comprehensive uniform statistics regarding the extent of R&D activity performed and funded by departments and agencies in each of the nation’s 50 state governments, the government of the District of Columbia, and the government of Puerto Rico.

Data collection authority. The information is solicited under the National Science Foundation (NSF) Act of 1950, as amended; the America COMPETES Reauthorization Act of 2010; and Title 13, U.S. Code, § 8(b). It is collected under Office of Management and Budget control number 0607-0933, expiration date 31 July 2023.

Survey contractor. The Census Bureau, under NSF interagency agreement number NCSE-2203296, collected, processed, and tabulated the statistics in this report.

Survey sponsor. The National Center for Science and Engineering Statistics (NCSES) within NSF.

Frequency. Annual.

Initial survey year. FY 2006.

Reference period. State government fiscal year ending in 2022.

Response unit. State government departments, agencies, commissions, public authorities, institutions, and other entities that operate separately or somewhat autonomously from the central state government—but where the state government maintains administrative or fiscal control over their activities—with the capacity to perform or fund R&D; units are collectively referred to as agencies .

Sample or census. Census.

Population size. The population comprised 505 agencies from the 50 state governments, the District of Columbia, and Puerto Rico with the capacity to perform or fund R&D during FY 2022. This year, agencies that reported $0 in R&D for the last three survey cycles were marked as inactive; these will be reactivated for the FY 2023 survey cycle.

Sample size. Not applicable.

Target population. State government departments, agencies, commissions, public authorities, institutions, and other entities that operate separately or somewhat autonomously from the central state government but where the state government maintains administrative or fiscal control over their activities, as defined by the Census Bureau’s Government Finance and Employment Classification Manual (see chapter 1), and that funded or performed R&D for state government FY 2022. Several industry-specific state commissions, which are generally chartered by state legislatures but are administered independently, are considered state agencies and are included in the survey’s population. State-run colleges and universities, which are canvassed as part of NCSES’s Higher Education Research and Development (HERD) Survey, are excluded from the survey frame. State-run laboratories or experiment stations controlled by state universities are also excluded from the respondent universe, as are any entities determined to be nonprofit or private, as defined by the Census Bureau government classification criteria. However, because agricultural experiment stations in Connecticut are legally organized as state government-dependent agencies and are not affiliated with any university system, they are included in the survey’s population.

Sampling frame. The total universe includes all state government-dependent units, including those for the District of Columbia and Puerto Rico, with the capacity to perform and fund R&D, identified with the aid of a state coordinator who is appointed by the governor of each state. For the FY 2022 survey, state coordinators were provided with a list of agencies that were previously identified from the FY 2021 survey cycle as having the potential to perform or fund R&D. In addition, these lists included agencies identified from a systematic review of state session laws and additional review of agencies reporting to the Census Bureau’s Census of Governments program by Census Bureau and NCSES staff. Coordinators were asked to review this list and add agencies that they believed were involved with R&D and were not already identified. State coordinators also adjusted the agency universe to remove agencies that have never had any qualifying R&D to report to NCSES, to add any agencies that were missing or could have R&D to report, to address organizational changes within their respective states since the previous survey, and to provide updated agency contact information.

Sample design. The Survey of State Government R&D is a census.

Data Collection and Processing Methods

Data collection. For FY 2022, e-mails were sent to the chief of staff for all governors asking them to appoint a state coordinator. Census Bureau staff also reached out to prior state coordinators asking if they would be willing to continue to serve as the coordinators for the FY 2022 cycle. On a flow basis, state coordinators were sent a spreadsheet of agencies and contacts that were surveyed for the FY 2021 Survey of State Government R&D and asked to add agencies that might have some R&D, remove agencies from the survey universe that no longer perform or fund R&D or have been reorganized, and update agency points of contact. Once the state coordinators completed updates to the list of active agencies to be surveyed, they sent introductory e-mails to the agency respondents stating that respondents would be receiving an e-mail with instructions and the survey form to be completed and e-mailed back to the Census Bureau. The state coordinators then sent the updated spreadsheet of agencies and contacts back to the Census Bureau. After the Census Bureau received the updated list, agencies identified as having the potential to perform or fund R&D were e-mailed the survey form. Upon completion by all agencies within a state, the state coordinators were provided with a spreadsheet of agency responses to review the survey results before they were provided to NCSES for final analysis and dissemination.

Mode. State agencies were e-mailed a fillable Portable Document Format (PDF) form with auto summations and edits built in and were asked to complete the form. State coordinators were given a spreadsheet of potential state agencies and contact information to review and revise as necessary to add agencies to be surveyed, remove others from the survey as inactive, or make corrections to agency points of contact.

Response rates. Response rates were calculated for the FY 2022 Survey of State Government R&D and are available in table A-1.

All 50 state governments and the District of Columbia participated in the survey. A total of 31 of 52 state coordinators updated their spreadsheet. However, only 13 of 52 state coordinators officially responded to verify the final aggregate data for their states. For those agencies that did not have a coordinator appointed, final aggregate data files were sent to staff at NCSES for review. Some or all agencies submitted data in those states where the coordinator did not verify data officially. A coordinator was not appointed in Arizona, Georgia, Indiana, Louisiana, Massachusetts, Nevada, New Mexico, New York, Pennsylvania, Puerto Rico, Rhode Island, South Carolina, Tennessee, and West Virginia. Historically, NCSES has partnered with the Puerto Rico Institute of Statistics to collect information on R&D spending from Puerto Rico agencies. The Institute conducts its own survey of R&D activities in the territory and uses the NCSES survey questions for its collections of government agencies and provides the results to NCSES. Although the governor’s office did not appoint a coordinator for this year’s survey, they were able to provide a contact at the Department of Economic Development and Commerce who was able to provide an updated list of agencies and current contacts. Surveys were then sent directly to each agency in Puerto Rico. Responses were received from 9 of the 10 agencies in Puerto Rico that were sent the survey for the FY 2022 cycle.

The final agency response rate was 99.4% (502 of 505 agencies).

The following agencies did not respond to the survey and as a result may contribute to an undercount in the estimated public expenditures for R&D activities:

  • New Mexico Department of Cultural Affairs
  • Vermont Agency of Education
  • Puerto Rico Molecular Sciences Research Center

Of the 502 agencies that responded, 400 (79.7%) reported having some R&D activity in FY 2022.
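The response-rate figures above are simple ratios; as a quick check of the arithmetic, using the counts reported in the text:

```python
# Unit response rates for the FY 2022 Survey of State Government R&D,
# recomputed from the counts reported above.
universe = 505        # agencies in the survey universe
respondents = 502     # agencies that returned the form
with_rd = 400         # respondents reporting some R&D activity

response_rate = respondents / universe
rd_share = with_rd / respondents

print(f"Agency response rate: {response_rate:.1%}")  # → 99.4%
print(f"Share reporting R&D: {rd_share:.1%}")        # → 79.7%
```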

Data editing. Initial agency data submissions were received via fillable forms e-mailed to the Census Bureau analyst. Basic logical edit checks, review of respondent comments, and comparisons of data from previous surveys allowed Census Bureau and NCSES staff to detect data errors and work with respondents to correct them. Census Bureau and NCSES staff also conducted follow-up calls to agencies with major data changes in R&D between FY 2021 and FY 2022 to ensure the accuracy of the survey data. Major data changes are dependent on the state and type of agency. After all the agencies in a state submitted responses, a spreadsheet of aggregated agency data was sent to the coordinators. They were asked to perform a final verification of aggregated agency data.
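The logical edit checks described above flag internal inconsistencies before staff follow up with respondents. A minimal sketch of that kind of check, assuming a simplified record layout (the field names and the 50% threshold are illustrative assumptions, not the survey's actual edit specification):

```python
# A simplified version of the logical edit checks described above: component
# expenditures should sum to the reported total, and a large year-over-year
# change triggers a follow-up flag. Field names and the 50% threshold are
# illustrative assumptions.
def edit_checks(record, prior_total=None, change_threshold=0.5):
    flags = []
    if record["intramural"] + record["extramural"] != record["total"]:
        flags.append("components do not sum to reported total")
    if prior_total and abs(record["total"] - prior_total) / prior_total > change_threshold:
        flags.append("major change from prior year; follow-up needed")
    return flags

# An agency whose parts do not add up gets flagged for respondent follow-up.
print(edit_checks({"total": 900, "intramural": 500, "extramural": 300},
                  prior_total=1000))
# → ['components do not sum to reported total']
```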

Imputation. Data were imputed for one agency this cycle: the New Mexico Department of Cultural Affairs, imputed using the average of prior-year reporting patterns. In addition, the reported total for Midwest Dairy was broken out across the divisions in Arkansas, Illinois, Iowa, Kansas, Minnesota, Nebraska, North Dakota, Oklahoma, and South Dakota based on the percentage each division contributed per the organization’s annual report. All state and national totals are aggregates of reported agency data. Each state government’s organizational structure, laws, and delegation of powers within its purview are unique; no universally applied method of imputation can be used across all state government agencies while accounting for these structural differences. This approach is consistent with basic statistical methods used by the Census Bureau’s Census of Governments, Survey of State Government Finance. Therefore, R&D expenditures may be underestimated in states where agencies failed to respond. When imputation methods are needed, they are handled on an individual, state-by-state basis.
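The Midwest Dairy allocation described above is a simple proration of one reported total across divisions. A minimal sketch; the dollar amount and share values here are hypothetical placeholders (the actual percentages come from the organization's annual report):

```python
# Prorating one reported total across state divisions by annual-report shares.
# The total and the shares below are hypothetical placeholders.
reported_total = 1_000_000   # hypothetical national total, dollars

division_shares = {          # hypothetical division shares (must sum to 1.0)
    "Minnesota": 0.40,
    "Iowa": 0.25,
    "South Dakota": 0.20,
    "Nebraska": 0.15,
}

allocations = {state: round(reported_total * share)
               for state, share in division_shares.items()}

print(allocations["Minnesota"])  # → 400000
```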

Weighting. Not applicable.

Variance estimation. Not applicable.

Sampling error. Not applicable.

Coverage error. In addition to a Census Bureau review of state session laws to identify agencies with the capacity to fund R&D, NCSES utilizes the expertise of an appointed state coordinator to assist in identifying state government agencies that have the capacity to perform or fund R&D. State coordinators are also offered the opportunity to review survey responses from their respective state agencies before results are finalized for data release. In cases where the state coordinator refused to cooperate or where some agencies failed to respond to the survey, it is possible there may be an undercount of state government R&D activities. The undercount may occur despite efforts by NCSES and Census Bureau staff to conduct additional queries and outreach with state agencies that did not appoint a state coordinator. In other instances, the appointed state coordinator could misinterpret the survey definition and examples of qualifying R&D activities and thus fail to identify all state government-dependent units with the capacity to perform or fund R&D. However, no measures of coverage error are produced.

Nonresponse error. Of the 505 agencies in the survey universe, 502 (99.4%) responded to the survey. Of the 502 respondents, 400 (79.7%) reported having R&D activities in FY 2022. A statistical method was used to impute data for one nonresponding agency known to have had R&D in the past, the New Mexico Department of Cultural Affairs. No imputation was used for the other nonresponding agencies because they have historically reported having no R&D. An agency in Puerto Rico was removed from the survey because it reported expenditures for its entire agency budget rather than an R&D amount. Nonresponse error is minimal.

Measurement error. The most common form of nonsampling error in the Survey of State Government R&D is respondents’ interpretation of the survey definition of qualifying R&D activities. To mitigate potential misinterpretations, several steps were taken. NCSES provided a series of examples specific to the types of activities performed or funded by state government agencies in the survey questionnaire’s definitions and examples. All responses, including the initial agency data submissions and final state coordinator verifications, were received via e-mail or phone. Census Bureau staff performed basic logical edit checks and reviewed respondent comments, allowing staff to detect errors and work with state respondents to correct them. Despite these efforts, some of the data reported could include expenditures for non-R&D activities, such as non-R&D salaries, commercialization, environmental testing, or routine survey or monitoring work. Similarly, some state data may also exclude minor R&D expenditure amounts from agencies not surveyed.

Data Comparability

State government R&D totals can display considerable volatility between survey cycles. For example, state agency expenditures are influenced by several national and state-specific factors, and large changes (either increases or decreases) are not unusual, especially for discretionary spending items such as R&D. States often will create special funds to support specific research activities for a limited time. These funds may have a one-time appropriation from the legislature and expire within 2–5 fiscal years; state agencies obligate those funds for specific R&D projects, depending on availability and expiration of funding authority, as well as other program-specific and administrative considerations. Data reported are agency direct expenditures for R&D in a given fiscal year, not obligations. As such, in the case of multiyear grants to extramural performers, an agency’s expenditures for that fiscal year may be greater than its obligations because expenditures may include spending from the previous year’s appropriations, depending on the specific budget authority granted by the legislature. It is likely that some portion of the reported changes reflects measurement and coverage errors. In the case of R&D funds for extramural performers, some agencies were able to report only multiyear obligations rather than single-year expenditures.

The survey asked about state agencies’ expenditures for R&D at the end of FY 2022. Most states and Puerto Rico have a fiscal year that begins 1 July and ends the following 30 June. For example, FY 2022 is the state fiscal period beginning on 1 July 2021 and ending on 30 June 2022. There are, however, five exceptions to the 30 June fiscal year end: New York (ends 31 March); Texas (ends 31 August); and Alabama, the District of Columbia, and Michigan (all end 30 September). For comparability, all states, the District of Columbia, and Puerto Rico are surveyed at the same time.
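The fiscal year boundaries above can be expressed compactly; a minimal sketch, with the five exceptions taken directly from the text (function and variable names are illustrative):

```python
from datetime import date, timedelta

# Fiscal year end dates by state, per the exceptions noted above. Most states,
# the District of Columbia, and Puerto Rico end their fiscal year on 30 June;
# the five exceptions are listed explicitly.
FY_END = {
    "New York": (3, 31),
    "Texas": (8, 31),
    "Alabama": (9, 30),
    "District of Columbia": (9, 30),
    "Michigan": (9, 30),
}

def fiscal_year_span(state, fy):
    """Return the (start, end) calendar dates of fiscal year `fy` for a state."""
    month, day = FY_END.get(state, (6, 30))
    end = date(fy, month, day)
    start = date(fy - 1, month, day) + timedelta(days=1)
    return start, end

print(fiscal_year_span("Ohio", 2022))   # FY 2022: 1 July 2021 through 30 June 2022
print(fiscal_year_span("Texas", 2022))  # FY 2022: 1 September 2021 through 31 August 2022
```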

A state’s R&D priorities may be shaped by the state’s unique legislative and budgeting processes. State budget practices vary considerably due to both political and historical reasons. Nineteen states enact biennial budgets. Of these states, Montana, Nevada, North Dakota, and Texas have both biennial legislative sessions and biennial budgets. The remaining 15 states of Connecticut, Hawaii, Indiana, Kentucky, Maine, Minnesota, Nebraska, New Hampshire, North Carolina, Ohio, Oregon, Virginia, Washington, Wisconsin, and Wyoming hold annual legislative sessions but maintain biennial budgeting. Only North Dakota and Wyoming enact consolidated 2-year budgets; other biennial budget states enact two annual budgets at one time. As such, the nature of a state’s budget priorities for R&D may be determined on a biennial basis in some states; in others, however, it may be determined on an annual basis. In states with biennial budgets, the legislatures will often make supplemental appropriations to the second-year budget, which may result in further changes to the initial funding priorities.

The data exclude R&D expenditures by state governments that did not flow through state agencies’ budgets. The state totals do not include direct appropriations from state legislatures to colleges and universities. Higher education institutions’ expenditures from state government appropriations are available from NCSES’s HERD Survey. For FY 2022, state government agencies reported $1 billion in expenditures used to support R&D performance by academic institutions. A major factor for the difference between totals reported in NCSES’s HERD Survey and the Survey of State Government R&D is that direct appropriations to state-run universities are included in the former but not in the latter. Another likely factor is the exclusion of R&D at agricultural experiment stations from the state survey totals because they are generally associated with land-grant colleges and universities and are canvassed on the HERD Survey.

Direct comparison of state agency expenditures should also be viewed with caution because state governments often reorganize departments and agencies such that some divisions and offices that were part of one agency may be moved to another agency. In other instances, entire departments may be reorganized into newly created departments. Although the FY 2022 Survey of State Government R&D encountered several instances of these organizational changes in several states, the survey itself is not designed to measure specific changes in state government organization. To account for these and other changes in the data between FY 2021 and FY 2022, staff from the Census Bureau and NCSES conducted follow-up calls for some agencies with major data changes, depending on the type of agency and state, to ensure the accuracy of the survey data.

Data specific to state government agencies were first released with the FY 2009 survey results and are also included in the FY 2022 data tables. Specific agency-level data for FY 2006 and FY 2007 are not available.

The current Survey of State Government R&D has been conducted for FY 2006, FY 2007, FY 2009, FYs 2010–11, FYs 2012–13, FYs 2014–15, FY 2016, FY 2017, FY 2018, FY 2019, FY 2020, FY 2021, and FY 2022. (No survey was conducted for state governments for FY 2008.) Data presented in trend tables in this report are from the most recently completed survey cycle. References to prior-year data should be restricted to those published in this report for two reasons: (1) when completing the current year’s survey, survey respondents may revise their prior year’s data, and (2) NCSES reviews data for prior years for consistency with current-year responses and, if necessary, may revise these data in consultation with respondents.

NCSES has collected state government R&D data for FY 1964, FY 1965, FY 1967, FY 1968, FY 1972, FY 1973, and FY 1977 in collaboration with the Census Bureau’s Census of Governments and related programs. For FY 1987, FY 1988, and FY 1995, data collections of state government R&D were conducted by nonfederal organizations that were supported by NSF grants. As a result of differences in the survey populations, in definitions of covered R&D activities, and in collection methods over time, the results of these historical surveys are not comparable with the statistics collected for the FY 2006 and subsequent Surveys of State Government R&D.

Changes in survey coverage and population. Each year, state coordinators update the universe of agencies most likely to have funded or performed R&D based on changes in funding authority, organizational changes within the government, or other initiatives by the legislature. No survey was conducted for state governments for FY 2008. Beginning with the FY 2009 survey cycle, state coordinators were no longer able to overwrite the aggregate R&D data reported by state agencies to correct or modify the state total; any changes or revisions must now be made at the state government agency level.

Changes in questionnaire.

  • FY 2009. The FY 2009 questionnaire was the first to collect state government R&D activities by governmental functions of agriculture, environment and natural resources, health, transportation, and other.
  • FYs 2010 and 2011. The survey was reorganized as a biennial survey and collected 2 fiscal years of data on one questionnaire. In addition, the energy category was added to the list of specific government functions of R&D.
  • FYs 2012 and 2013. A minor change to the instructions in question 1 for extramural performers was made from “R&D done for your department/agency” to “R&D funded by your department/agency” to ensure that all R&D-related projects that the agency funds regardless of the end result (i.e., grants) were properly included.
  • FYs 2014 and 2015. The survey was revised to collect additional details about R&D funding and performance to better align with the Organisation for Economic Co-operation and Development (OECD) 2002 Frascati Manual, the most recent edition available at survey launch. These changes include source of funds for extramural R&D performance supported by federal funds, state funds, or other funds. For all federal funds received for both intramural and extramural R&D, respondents were asked how much was received from specific federal agencies. For intramural R&D performance, respondents were asked how much of federal, state, and other funding was classified as basic research, applied research, or experimental development.
  • FY 2016. The survey reporting period changed from biennial to annual. Questions remained the same, with some minor additions to examples and wording changes. A remarks box was added for respondents to provide comments.
  • FY 2017. The “other” category for Internal Sources of R&D was split into multiple categories: nonfederal government funds, nonprofit organizations, businesses, and higher education institutions.
  • FY 2018. No changes were made.
  • FY 2019. A question asking about the amount of time it takes to complete the survey was added for administrative purposes.
  • FY 2020. Two new questions were added asking for the number of intramural R&D personnel and the number of full-time equivalent employees for researchers, technicians, and support staff. Data from these new questions are not available this year.
  • FY 2021. The survey was restructured to group related questions together. For example, questions on expenditures for R&D performed internally by type of R&D, internal R&D employees, and internal full-time equivalent R&D personnel were moved to follow the internal R&D expenditures question. Similarly, questions on expenditures for R&D performed externally and expenditures for R&D performed externally by type of entity were moved to follow the internal R&D questions. All questions asking for crosscuts on the total R&D expenditures were moved to follow both internal and external R&D-related questions.
  • FY 2022. The “other” category for external performance of R&D was split into multiple categories: nonprofit organizations, other governments (i.e., federal, federally funded research and development centers, other state governments, and local governments), and other performers not elsewhere classified. Expenditures for construction and acquisition of land and facilities were broken out into internal and external components. Neither of these changes was published in the data tables.

Changes in reporting procedures or classification.

  • FY 2018. Online reporting was replaced with a fillable PDF with automated summations and edits built into the survey instrument.
  • FYs 2019–21. No changes were made.
  • FY 2022. Census Bureau staff reached out to Puerto Rico agencies directly with the fillable PDF survey form, and agencies responded to the Census Bureau. Puerto Rico did not report data for FYs 2016 through 2021 but resumed reporting in FY 2022. Between FY 2005 and 2016, the Puerto Rico Institute of Statistics (PRIS) conducted its own survey of R&D. Staff from the Census Bureau and NCSES worked with staff at PRIS to ensure questions relevant to the Survey of State Government R&D were consistent with those asked by the PRIS during their survey. PRIS would send agency responses to the Census Bureau and NCSES for addition into the Survey of State Government R&D. Between FYs 2016 and 2022, PRIS suspended its own survey and Census Bureau staff provided a copy of the fillable PDF survey to the Puerto Rico Coordinator to send to agencies and provide responses back to the Census Bureau. However, agencies did not respond to requests from the coordinator in Puerto Rico during this time. For the FY 2022 survey, the coordinator in Puerto Rico recommended Census Bureau staff reach out to specific agencies identified by the coordinator for survey responses.

Definitions

Applied research. Original investigation undertaken in order to acquire new knowledge. It is, however, directed primarily toward a specific, practical aim or objective.

Basic research . Experimental or theoretical work undertaken primarily to acquire new knowledge of the underlying foundations of phenomena and observable facts, without any particular application or use in view.

Construction and acquisition of facilities used primarily for R&D. The acquisition of, construction of, and major repairs or alterations to structures, works, equipment, facilities, or land for use in R&D activities, including major costs for the construction and purchase of buildings to be used primarily as R&D facilities.

Experimental development. Systematic work, drawing on knowledge gained from research and practical experience and producing additional knowledge, which is directed to producing new products or processes or to improving existing products or processes.

Performers, extramural. Those outside the department or agency who perform R&D under the administrative oversight or control of that department or agency. This may include projects for the department or agency as well as the department’s or agency’s extramural research programs. Extramural performers include the following:

  • Academic institutions. Public or private universities and colleges.
  • Companies and individuals. Performers under contract for research projects or that received grants for research projects.
  • Nonprofit organizations. This would include foundations.
  • Other governments. Government departments and agencies; other departments or agencies within the state; other state governments; and county, city, special district, or regional local governments either within their own state or in other states.
  • Other performers not elsewhere classified.

Performers, intramural. Department’s or agency’s own employees who perform R&D, which includes R&D performed by those employees and services performed by others in support of an internal R&D project (e.g., laboratory testing).

Research and development. Comprise creative and systematic work undertaken in order to increase the stock of knowledge—including knowledge of humankind, culture, and society—and to devise new applications of available knowledge. Sources and examples of R&D funding include the following:

  • Federal government. Grants, contracts, awards, and appropriations from the U.S. government.
  • State. Appropriations from the state legislature, agricultural commodity assessments, bond funds, general funds, restricted funds, revenue funds, state grants, tobacco settlement funds, lottery proceeds, funds from other agencies within the state, and revenue from charges, fees, or fines.
  • Nonprofit organizations. Includes funding from foundations.
  • Nonfederal government. Funding from other state governments, county, city, regional, or other local governments.
  • Businesses. Grants and contracts from companies.
  • Higher education institutions. Funding from public or private universities and colleges.

R&D employee count. Headcount of the department’s or agency’s own employees engaged in intramural R&D, reported in three categories:

  • Researchers. For example, biologists, psychologists, research scientists and engineers, principal investigators, and R&D managers.
  • Technicians. For example, wildlife technicians, lab technicians, and field staff.
  • Support staff. For example, accountants, facilities management staff, grant specialists, and clerical staff.

Full-time equivalent R&D personnel. Calculated as the total hours spent working on R&D during the state government’s fiscal year divided by the number of hours representing a full-time schedule within the same period.
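The FTE definition above is a straightforward ratio; a minimal sketch with illustrative hour figures:

```python
# Full-time equivalent (FTE) R&D personnel, computed as defined above: total
# hours spent on R&D during the fiscal year divided by the hours in a
# full-time schedule for the same period. Hour figures are illustrative.
rd_hours_by_employee = [2080, 1040, 520]  # R&D hours per employee, illustrative
full_time_schedule = 2080                 # hours in one full-time work year

fte = sum(rd_hours_by_employee) / full_time_schedule
print(fte)  # → 1.75
```

So one full-time researcher plus one half-time and one quarter-time employee count as 1.75 FTE, even though the headcount is three.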

Technical Tables

Questionnaires


Data Tables

General Notes

These tables present the results of the FY 2022 Survey of State Government Research and Development (R&D), conducted by the Census Bureau under an interagency agreement with the National Center for Science and Engineering Statistics within the National Science Foundation. The Census Bureau employed a methodology similar to those it uses to collect data from state and local governments in other censuses and surveys.

The survey was distributed to departments, agencies, commissions, public authorities, and other state-run entities within the 50 states, the District of Columbia, and Puerto Rico that were considered likely to conduct or fund R&D activities. (The District of Columbia and Puerto Rico are referred to as states and treated as state equivalents throughout this report.) Respondents that did not have a qualifying R&D activity were not required to complete the questionnaire beyond a screening question. All 50 states, the District of Columbia, and Puerto Rico participated in the survey. Respondents were asked to provide statistics on their total R&D expenditures and amounts for intramural and extramural performers; the amount of intramural R&D devoted to basic research, applied research, and experimental development activities by federal and nonfederal funds; the amount for which the federal government was an original source of funds for both intramural and extramural R&D and from which federal agency the funding came; and the amount of money allocated to particular government functions. An additional question asked for expenditures on construction and acquisition of R&D facilities and major equipment. Additional questions were asked about the number of employees (researchers, technicians, and support staff) and full-time equivalent personnel for intramural R&D.

Acknowledgments and Suggested Citation

Acknowledgments.

Christopher V. Pece of the National Center for Science and Engineering Statistics (NCSES) developed and coordinated this report under the guidance of Amber Levanon Seligson, NCSES Program Director, and under the leadership of Emilda B. Rivers, NCSES Director; Christina Freyman, NCSES Deputy Director; and John Finamore, NCSES Chief Statistician.

Under NCSES interagency agreement with the Census Bureau, Vicki Kuppala and Millicent Grant, under the supervision of Michael Flaherty, Chief, Research, Development, and Innovation Surveys Branch at the Census Bureau, compiled the tables in this report. Elizabeth Willhide and Millicent Grant at the Census Bureau also conducted legal review of state sessions laws and agency founding legislation to update the survey frame.

NCSES thanks the program and budget offices at the state agencies that provided information for this report.

Suggested Citation

National Center for Science and Engineering Statistics (NCSES). 2023. Survey of State Government Research and Development: FY 2022. NSF 24-305. Alexandria, VA: National Science Foundation. Available at https://ncses.nsf.gov/surveys/state-government-research-development/2022.


REF 2021: Quality ratings hit new high in expanded assessment

Four in five outputs judged to be either ‘world-leading’ or ‘internationally excellent’.

  • Share on linkedin
  • Share on mail

REF 2021 submission rules help push quality to new high

The quality of UK scholarship as rated by the Research Excellence Framework has hit a new high following reforms that required universities to submit all research-active staff to the 2021 exercise.

For the first time in the history of the UK’s national audit of research, all staff with a “significant responsibility” for research were entered for assessment – a rule change that resulted in 76,132 academics submitting at least one research output, up 46 per cent from 52,000 in 2014.

Overall, 41 per cent of outputs were deemed world-leading (4*) by assessment panels and 43 per cent judged internationally excellent (3*), which was described as an “exceptional achievement for UK university research” by Steven Hill, director of research at Research England, which runs the REF.

In the 2014 assessment 30 per cent of research got a 4* rating, with 46 per cent judged to be 3*.

Analysis of institutional performance by Times Higher Education now puts the grade point average at UK sector level at 3.16 for outputs, compared with 2.90 in 2014. Scores for research impact have also increased, from 3.24 to 3.35.
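The grade point averages quoted here are weighted means of a submission's quality profile. A minimal sketch of that arithmetic, assuming the conventional weighting in which each star level is multiplied by its percentage share and unclassified work counts as zero; the `ref_gpa` function and the example profile are illustrative, not THE's actual methodology:

```python
def ref_gpa(profile):
    """Weighted mean of a REF-style quality profile.

    `profile` maps a star rating (0 = unclassified, 1-4 for 1*-4*)
    to the percentage of research activity judged at that level;
    the percentages should sum to 100.
    """
    return sum(stars * pct for stars, pct in profile.items()) / 100

# Hypothetical quality profile, for illustration only.
example = {4: 30, 3: 50, 2: 15, 1: 5, 0: 0}
print(round(ref_gpa(example), 2))  # → 3.05
```

On this scheme a submission rated entirely 4* would score the maximum GPA of 4.0, and shifting share from lower to higher star bands raises the average, which is why the jump in 4* outputs lifts the sector-level figure.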

At least 15 per cent of research was considered world-leading in three-quarters of the UK’s universities.

And analysis by THE suggests that institutions outside London have improved their performance the most, with several Russell Group universities from outside the “golden triangle” of Oxford, Cambridge and London making major gains.

REF 2021 results at a glance: see here for the full results table.

The results of the REF will be used to distribute quality-related research funding by the UK’s four higher education funding bodies, the value of which will stand at around £2 billion from 2022-23.

The requirement to submit all research-active staff was introduced following the review of the REF conducted by Lord Stern in 2016 and was designed to reduce institutional “game-playing” over which staff members were submitted.

Outputs in REF 2014 and 2021

The uptick in quality may be driven by universities focusing instead on which of their researchers’ outputs should be submitted, allowing greater flexibility to pick “excellent” scholarship.

In the 2014 exercise, each participating researcher was expected to submit four outputs, but this time the number of outputs can range between one and five, with an average of 2.5 per full-time equivalent researcher expected. In the 2021 exercise, a single output was submitted for 44 per cent of researchers who participated.

University staff had welcomed the new submission rules, which removed the “emotional pressure” caused by deliberations over whether they would be “in or out” of the REF – a decision that often had consequences for future promotions, said David Sweeney, executive chair of Research England. “There is no longer that same pressure on individuals,” he reflected.

However, the rule change has been linked to universities’ decisions to move many staff on to teaching-only contracts in recent years, with the latest data showing that about 20,000 more academics are employed on such terms compared with five years ago.

This change represented a welcome clarification of academics’ roles rather than “game-playing” on behalf of institutions, insisted Mr Sweeney. “If these contracts represent the expectations of institutions and the responsibilities of academics, that is not game-playing, it is transparency,” he said.

David Price, vice-provost (research) at UCL and chair of the REF’s main panel B (physical sciences, engineering and mathematics), agreed that “the REF may have helped in resolving many contractual ambiguities. Game-playing has not been noticeable,” he said.

Dame Jessica Corner, pro vice-chancellor (research and knowledge exchange) at the University of Nottingham, said that “less focus on individuals with the partial separation of outputs from academics has been helpful”.

“That outputs can be returned by institutions where individuals worked if they move jobs has reduced, though not entirely eliminated, the academic transfer market,” she added.

James Wilsdon, Digital Science professor of research policy at the University of Sheffield, agreed. “The large-scale transfers of people between institutions that we saw in the lead-up to REF 2014 have definitely reduced, which is positive,” said Professor Wilsdon, adding that while “choices around inclusion and exclusion of individuals with ‘significant responsibility’ have been complex in some institutions – particularly less research-intensive universities – some have welcomed the clarity that this brought to different roles in terms of research, teaching and hybrid roles.”

“The game-playing, where it occurs, is often more subtle: it’s about the gradual sifting and reordering of what kinds of research, and what kinds of impact, are deemed ‘excellent’,” he explained.

Kieron Flanagan, professor of science and technology policy at the University of Manchester, questioned the extent to which REF game-playing had been eliminated.

“There is bound to have been some of this happening because habits are hard to break – people who run university research have come up in a management system informed by REFs over the past 10 to 20 years – some game-playing is inevitable,” he said.

However, Jane Millar, emerita professor of social policy at the University of Bath, who chaired the social sciences REF panel, believed the Stern review reforms had “worked well”.

“We saw a great diversity in the submissions, from very small to very large, from well-established and new units,” said Professor Millar, who added that there had been “examples of world-leading and internationally excellent quality across the range”.

Interdisciplinary research was also “well presented”, with the Stern reforms encouraging greater links between subjects and “sub-panels whose reach stretched through to design and engineering, physical and/or biological sciences, humanities, biomechanics, and medicine”, added Professor Millar.

With an international review body examining the future of the REF, there has been some speculation that this could be its final incarnation.

But Mr Sweeney said that the REF remained an important tool in justifying the £9 billion or so in open-ended research funding likely to flow to institutions from the exercise.

[email protected]

POSTSCRIPT:

Print headline:  REF submission rules help push quality to new high
