- Open access
- Published: 15 March 2021
Instructor strategies to aid implementation of active learning: a systematic literature review
- Kevin A. Nguyen 1 ,
- Maura Borrego 2 ,
- Cynthia J. Finelli ORCID: orcid.org/0000-0001-9148-1492 3 ,
- Matt DeMonbrun 4 ,
- Caroline Crockett 3 ,
- Sneha Tharayil 2 ,
- Prateek Shekhar 5 ,
- Cynthia Waters 6 &
- Robyn Rosenberg 7
International Journal of STEM Education volume 8 , Article number: 9 ( 2021 )
Despite the evidence supporting the effectiveness of active learning in undergraduate STEM courses, the adoption of active learning has been slow. One barrier to adoption is instructors’ concerns about students’ affective and behavioral responses to active learning, especially student resistance. Numerous education researchers have documented their use of active learning in STEM classrooms. However, there is no research yet that systematically analyzes these studies for strategies to aid implementation of active learning and address students’ affective and behavioral responses. In this paper, we conduct a systematic literature review and identify 29 journal articles and conference papers that researched active learning, examined affective and behavioral student responses, and recommended at least one strategy for implementing active learning. We ask: (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide?
In our review, we noted that most active learning activities involved in-class problem solving within a traditional lecture-based course ( N = 21). We found mostly positive affective and behavioral outcomes for students’ self-reports of learning, participation in the activities, and course satisfaction ( N = 23). From our analysis of the 29 studies, we identified eight strategies to aid implementation of active learning, grouped into three categories. Explanation strategies included providing students with clarifications and reasons for using active learning. Facilitation strategies entailed working with students and ensuring that the activity functions as intended. Planning strategies involved working outside of class to improve the active learning experience.
To increase the adoption of active learning and address students’ responses to active learning, this study provides strategies to support instructors. The eight strategies are listed with evidence from numerous studies within our review on affective and behavioral responses to active learning. Future work should examine instructor strategies and their connection with other affective outcomes, such as identity, interests, and emotions.
Introduction
Prior reviews have established the effectiveness of active learning in undergraduate science, technology, engineering, and math (STEM) courses (e.g., Freeman et al., 2014 ; Lund & Stains, 2015 ; Theobald et al., 2020 ). In this review, we define active learning as classroom-based activities designed to engage students in their learning through answering questions, solving problems, discussing content, or teaching others, individually or in groups (Prince & Felder, 2007 ; Smith, Sheppard, Johnson, & Johnson, 2005 ), and this definition is inclusive of research-based instructional strategies (RBIS, e.g., Dancy, Henderson, & Turpen, 2016 ) and evidence-based instructional practices (EBIPs, e.g., Stains & Vickrey, 2017 ). Past studies show that students perceive active learning as benefitting their learning (Machemer & Crawford, 2007 ; Patrick, Howell, & Wischusen, 2016 ) and increasing their self-efficacy (Stump, Husman, & Corby, 2014 ). Furthermore, the use of active learning in STEM fields has been linked to improvements in student retention and learning, particularly among students from some underrepresented groups (Chi & Wylie, 2014 ; Freeman et al., 2014 ; Prince, 2004 ).
Despite the overwhelming evidence in support of active learning (e.g., Freeman et al., 2014 ), prior research has found that traditional teaching methods such as lecturing are still the dominant mode of instruction in undergraduate STEM courses, and low adoption rates of active learning in undergraduate STEM courses remain a problem (Hora & Ferrare, 2013 ; Stains et al., 2018 ). There are several reasons for these low adoption rates. Some instructors feel unconvinced that the effort required to implement active learning is worthwhile, and as many as 75% of instructors who have attempted specific types of active learning abandon the practice altogether (Froyd, Borrego, Cutler, Henderson, & Prince, 2013 ).
When asked directly about the barriers to adopting active learning, instructors cite a common set of concerns including the lack of preparation or class time (Finelli, Daly, & Richardson, 2014 ; Froyd et al., 2013 ; Henderson & Dancy, 2007 ). Among these concerns, student resistance to active learning is a potential explanation for the low rates of instructor persistence with active learning, and this negative response to active learning has gained increased attention from the academic community (e.g., Owens et al., 2020 ). Of course, students can exhibit both positive and negative responses to active learning (Carlson & Winquist, 2011 ; Henderson, Khan, & Dancy, 2018 ; Oakley, Hanna, Kuzmyn, & Felder, 2007 ), but due to the barrier student resistance can present to instructors, we focus here on negative student responses. Student resistance to active learning may manifest, for example, as lack of student participation and engagement with in-class activities, declining attendance, or poor course evaluations and enrollments (Tolman, Kremling, & Tagg, 2016 ; Winkler & Rybnikova, 2019 ).
We define student resistance to active learning (SRAL) as a negative affective or behavioral student response to active learning (DeMonbrun et al., 2017 ; Weimer, 2002 ; Winkler & Rybnikova, 2019 ). The affective domain, as it relates to active learning, encompasses not only student satisfaction and perceptions of learning but also motivation-related constructs such as value, self-efficacy, and belonging. The behavioral domain relates to participation, putting forth a good effort, and attending class. The affective and behavioral domains differ from much of the prior research on active learning, which centers on measuring cognitive gains in student learning; systematic reviews are readily available on that topic (e.g., Freeman et al., 2014 ; Theobald et al., 2020 ). Schmidt, Rosenberg, and Beymer ( 2018 ) explain the relationship between affective, cognitive, and behavioral domains, asserting all three types of engagement are necessary for science learning, and conclude that “students are unlikely to exert a high degree of behavioral engagement during science learning tasks if they do not also engage deeply with the content affectively and cognitively” (p. 35). Thus, SRAL, as a negative affective and behavioral student response, is a critical but underexplored component of STEM learning.
Recent research on student affective and behavioral responses to active learning has uncovered mechanisms of student resistance. Deslauriers, McCarty, Miller, Callaghan, and Kestin’s ( 2019 ) interviews of physics students revealed that the additional effort required by the novel format of an interactive lecture was the primary source of student resistance. Owens et al. ( 2020 ) identified a similar source of student resistance to their carefully designed biology active learning intervention: students were concerned about the additional effort required and the unfamiliar student-centered format. Deslauriers et al. ( 2019 ) and Owens et al. ( 2020 ) go a step further in citing self-efficacy (Bandura, 1982 ), mindset (Dweck & Leggett, 1988 ), and student engagement (Kuh, 2005 ) literature to explain student resistance. Similarly, Shekhar et al.’s ( 2020 ) review framed negative student responses to active learning in terms of expectancy-value theory (Wigfield & Eccles, 2000 ); students reacted negatively when they did not find active learning useful or worth the time and effort, or when they did not feel competent enough to complete the activities. Shekhar et al. ( 2020 ) also applied expectancy violation theory from physics education research (Gaffney, Gaffney, & Beichner, 2010 ) to explain how students’ initial expectations of a traditional course produced discomfort during active learning activities. To address both theories of student resistance, Shekhar et al. ( 2020 ) suggested that instructors provide scaffolding (Vygotsky, 1978 ) and support for self-directed learning activities. So, while framing the research as SRAL is relatively new, ideas about working with students to actively engage them in their learning are not. Prior literature on active learning in STEM undergraduate settings includes clues and evidence about strategies instructors can employ to reduce SRAL, even if they are not necessarily framed by the authors as such.
Recent interest in student affective and behavioral responses to active learning, including SRAL, is a relatively new development. But, given the discipline-based educational research (DBER) knowledge base around RBIS and EBIP adoption, we need not reinvent the wheel. In this paper, we conduct a systematic review. Systematic reviews are designed to methodically gather and synthesize results from multiple studies to provide a clear overview of a topic, presenting what is known and what is not known (Borrego, Foster, & Froyd, 2014 ). Such clarity informs decisions when designing or funding future research, interventions, and programs. Relevant studies for this paper are scattered across STEM disciplines and in DBER and general education venues, which include journals and conference proceedings. Quantitative, qualitative, and mixed methods approaches have been used to understand student affective and behavioral responses to active learning. Thus, a systematic review is appropriate for this topic given the long history of research on the development of RBIS, EBIPs, and active learning in STEM education; the distribution of primary studies across fields and formats; and the different methods taken to evaluate students’ affective and behavioral responses.
Specifically, we conducted a systematic review to address two interrelated research questions. (1) What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies? (2) What instructor strategies to aid implementation of active learning do the authors of these studies provide? These two questions are linked by our goal of sharing instructor strategies that can either reduce SRAL or encourage positive student affective and behavioral responses. Therefore, the instructor strategies in this review are only from studies that present empirical data of affective and behavioral student response to active learning. The strategies we identify in this review will not be surprising to highly experienced teaching and learning practitioners or researchers. However, this review does provide an important link between these strategies and student resistance, which remains one of the most feared barriers to instructor adoption of RBIS, EBIPs, and other forms of active learning.
Conceptual framework: instructor strategies to reduce resistance
Recent research has identified specific instructor strategies that correlate with reduced SRAL and positive student response in undergraduate STEM education (Finelli et al., 2018 ; Nguyen et al., 2017 ; Tharayil et al., 2018 ). For example, Deslauriers et al. ( 2019 ) suggested that physics students perceive the additional effort required by active learning to be evidence of less effective learning. To address this, the authors included a 20-min lecture about active learning in a subsequent course offering. By the end of that course, 65% of students reported increased enthusiasm for active learning, and 75% said the lecture intervention positively impacted their attitudes toward active learning. Explaining how active learning activities contribute to student learning is just one of many strategies instructors can employ to reduce SRAL (Tharayil et al., 2018 ).
DeMonbrun et al. ( 2017 ) provided a conceptual framework for differentiating instructor strategies, which includes not only explanation strategies (e.g., Deslauriers et al., 2019 ; Tharayil et al., 2018 ) but also facilitation strategies. Explanation strategies involve describing the purpose of the activity (such as how it relates to students’ learning) and the expectations for it. Typically, instructors use explanation strategies before the in-class activity has begun. Facilitation strategies include promoting engagement and keeping the activity running smoothly once it has begun; specific strategies include walking around the classroom or directly encouraging students. We use the existing categories of explanation and facilitation as a conceptual framework to guide our analysis and systematic review.
As a conceptual framework, explanation and facilitation strategies describe ways to aid the implementation of RBIS, EBIP, and other types of active learning. In fact, the work on these types of instructor strategies is related to higher education faculty development, implementation, and institutional change research perspectives (e.g., Borrego, Cutler, Prince, Henderson, & Froyd, 2013 ; Henderson, Beach, & Finkelstein, 2011 ; Kezar, Gehrke, & Elrod, 2015 ). As such, the specific types of strategies reviewed here are geared to assist instructors in moving toward more student-centered teaching methods by addressing their concerns of student resistance.
SRAL is a particular negative form of affective or behavioral student response (DeMonbrun et al., 2017 ; Weimer, 2002 ; Winkler & Rybnikova, 2019 ). Affective and behavioral student responses are conceptualized at the reaction level (Kirkpatrick, 1976 ) of outcomes, which consists of how students feel (affective) and how they conduct themselves within the course (behavioral). Although affective and behavioral student responses to active learning are less frequently reported than cognitive outcomes, prior research suggests a few conceptual constructs within these outcomes.
Affective outcomes consist of students’ feelings, preferences, and satisfaction with the course. Affective outcomes also include students’ self-reports of whether they thought they learned more (or less) during active learning instruction. Some relevant affective outcomes include students’ perceived value or utility of active learning (Shekhar et al., 2020 ; Wigfield & Eccles, 2000 ), their positivity toward or enjoyment of the activities (DeMonbrun et al., 2017 ; Finelli et al., 2018 ), and their self-efficacy or confidence with doing the in-class activity (Bandura, 1982 ).
In contrast, students’ behavioral responses to active learning consist of their actions and practices during active learning. This includes students’ attendance in the class; their participation, engagement, and effort with the activity; and their distraction or off-task behavior (e.g., checking their phones, leaving to use the restroom) during the activity (DeMonbrun et al., 2017 ; Finelli et al., 2018 ; Winkler & Rybnikova, 2019 ).
We conceptualize negative or low scores in either affective or behavioral student outcomes as an indicator of SRAL (DeMonbrun et al., 2017 ; Nguyen et al., 2017 ). For example, a low score in reported course satisfaction would be an example of SRAL. This paper aims to synthesize instructor strategies to aid implementation of active learning from studies that either address SRAL and its negative or low scores or relate instructor strategies to positive or high scores. Therefore, we also conceptualize positive student affective and behavioral outcomes as the absence of SRAL. For ease of categorization in this review, we summarize each study’s affective and behavioral outcomes on active learning as positive, mostly positive, mixed/neutral, mostly negative, or negative.
We conducted a systematic literature review (Borrego et al., 2014 ; Gough, Oliver, & Thomas, 2017 ; Petticrew & Roberts, 2006 ) to identify primary research studies that describe active learning interventions in undergraduate STEM courses, recommend one or more strategies to aid implementation of active learning, and report student response outcomes to active learning.
A systematic review was warranted due to the popularity of active learning and the publication of numerous papers on the topic. Multiple STEM disciplines and research audiences have published journal articles and conference papers on the topic of active learning in the undergraduate STEM classroom. However, it was not immediately clear which studies addressed active learning, affective and behavioral student responses, and strategies to aid implementation of active learning. We used the systematic review process to efficiently gather results of multiple types of studies and create a clear overview of our topic.
Definitions
For clarity, we define several terms in this review. Researchers refer to us, the authors of this manuscript. Authors and instructors wrote the primary studies we reviewed, and we refer to these primary studies as “studies” consistently throughout. We use the term activity or activities to refer to the specific in-class active learning tasks assigned to students. Strategies refer to the instructor strategies used to aid implementation of active learning and address student resistance to active learning (SRAL). Student response includes affective and behavioral responses and outcomes related to active learning. SRAL is an acronym for student resistance to active learning, defined here as a negative affective or behavioral student response. Categories or category refer to a grouping of strategies to aid implementation of active learning, such as explanation or facilitation. Excerpts are quotes from studies, and these excerpts are used as codes and examples of specific strategies.
Study timeline, data collection, and sample selection
From 2015 to 2016, we worked with a research librarian to locate relevant studies and conduct a keyword search within six databases: two multidisciplinary databases (Web of Science and Academic Search Complete), two major engineering and technology indexes (Compendex and Inspec), and two popular education databases (Education Source and Education Resource Information Center). We created inclusion criteria that listed both search strings and study requirements:
Studies must include an in-class active learning intervention. This does not include laboratory classes. The corresponding search string was:
“active learning” or “peer-to-peer” or “small group work” or “problem based learning” or “problem-based learning” or “problem-oriented learning” or “project-based learning” or “project based learning” or “peer instruction” or “inquiry learning” or “cooperative learning” or “collaborative learning” or “student response system” or “personal response system” or “just-in-time teaching” or “just in time teaching” or clickers
Studies must include empirical evidence addressing student response to the active learning intervention. The corresponding search string was:
“affective outcome” or “affective response” or “class evaluation” or “course evaluation” or “student attitudes” or “student behaviors” or “student evaluation” or “student feedback” or “student perception” or “student resistance” or “student response”
Studies must describe a STEM course, as defined by the topic of the course, rather than by the department of the course or the major of the students enrolled (e.g., a business class for mathematics majors would not be included, but a mathematics class for business majors would).
Studies must be conducted in undergraduate courses and must not include K-12, vocational, or graduate education.
Studies must be in English and published between 1990 and 2015 as journal articles or conference papers.
In addition to searching the six databases, we emailed solicitations to U.S. National Science Foundation Improving Undergraduate STEM Education (NSF IUSE) grantees. Between the database searches and email solicitation, we identified 2364 studies after removing duplicates. Most studies were from the database search, as we received just 92 studies from email solicitation (Fig. 1 ).
Fig. 1 PRISMA screening overview, styled after Liberati et al. ( 2009 ) and Passow and Passow ( 2017 )
Next, we followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines for screening studies with our inclusion criteria (Borrego et al., 2014 ; Petticrew & Roberts, 2006 ). From 2016 to 2018, a team of seven researchers conducted two rounds of review in RefWorks: the first round with titles and abstracts only, and the second round with the full text. In both rounds, two researchers independently decided whether each study should be retained based on our inclusion criteria listed above. At the abstract review stage, if the independent coders disagreed, we passed the study on to the full-text screening round. We screened a total of 2364 abstracts, and 746 studies passed the first round of title and abstract screening (see the PRISMA flow chart in Fig. 1 ). If coders still disagreed at the full-text screening round, the seven researchers met to discuss the study, clarified the inclusion criteria as needed to resolve potential future disagreements, and, when necessary, took a majority vote (4 of the 7 researchers) on the inclusion of the study. Because full consensus among all 7 coders was rare, a majority vote was used to finalize the inclusion of certain studies. We resolved these disagreements on a rolling basis, and, depending on the round (abstract or full text), we disagreed about 10–15% of the time on the inclusion of a study. In both rounds of screening, studies were most often excluded because they did not gather novel empirical data or evidence (inclusion criterion #2) or were not conducted in an undergraduate STEM course (inclusion criteria #3 and #4). Only 412 studies met all our final inclusion criteria.
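The screening funnel above reduces 2364 records to 412 included studies over two rounds. As an illustration only (the counts come from the text; the retention rates are derived here, and this script is our sketch, not part of the original study), the arithmetic can be checked directly:

```python
# Screening counts reported in the text (PRISMA-style funnel).
stages = [
    ("identified after de-duplication", 2364),
    ("passed title/abstract screening", 746),
    ("met all full-text inclusion criteria", 412),
]

# Retention rate of each stage relative to the previous stage.
for (_, prev_n), (name, n) in zip(stages, stages[1:]):
    print(f"{name}: {n} ({n / prev_n:.1%} of {prev_n})")
# → passed title/abstract screening: 746 (31.6% of 2364)
# → met all full-text inclusion criteria: 412 (55.2% of 746)
```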
Coding procedure
From 2017 to 2018, a team of five researchers then coded these 412 studies for detailed information. To quickly gather information about all 412 studies and to answer the first part of our research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we developed an online coding form using Google Forms and Google Sheets. The five researchers piloted and refined the coding form over three rounds of pair coding, and 19 studies were used to test and revise early versions of the coding form. The final coding form (Borrego et al., 2018 ) used a mix of multiple choice and free response items regarding study characteristics (bibliographic information, type of publication, location of study), course characteristics (discipline, course level, number of students sampled, and type of active learning), methodology (main type of evidence collected, sample size, and analysis methods), study findings (types of student responses and outcomes), and strategies reported (whether the study explicitly mentioned using strategies to aid implementation of active learning).
In the end, only 29 studies explicitly described strategies to aid implementation of active learning (Fig. 1 ), and we used these 29 studies as the dataset for this study. The main difference between these 29 studies and the other 383 studies was that these 29 studies explicitly described the ways authors implemented active learning in their courses to address SRAL or promote positive student outcomes. Although some readers who are experienced active learning instructors or educational researchers may view pedagogies and strategies as integrated, we found that most papers described active learning methods in terms of student tasks, while advice on strategies, if included, tended to appear separately. We chose not to overinterpret passing mentions of how active learning was implemented as strategies recommended by the authors.
Analysis procedure for coding strategies
To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we closely reviewed the 29 studies to analyze the strategies in more detail. We used Boyatzis’s ( 1998 ) thematic analysis technique to compile all mentions of instructor strategies to aid implementation of active learning and categorize these excerpts into certain strategies. This technique uses both deductive and inductive coding processes (Creswell & Creswell, 2017 ; Jesiek, Mazzurco, Buswell, & Thompson, 2018 ).
In 2018, three researchers reread the 29 studies, marking excerpts related to strategies independently. We found a total of 126 excerpts. The number of excerpts within each study ranged from 1 to 14 excerpts ( M = 4, SD = 3). We then took all the excerpts and pasted each into its own row in a Google Sheet. We examined the entire spreadsheet as a team and grouped similar excerpts together using a deductive coding process. We used the explanation and facilitation conceptual framework (DeMonbrun et al., 2017 ) and placed each excerpt into either category. We also assigned a specific strategy (i.e., describing the purpose of the activity, or encouraging students) from the framework for each excerpt.
However, there were multiple excerpts that did not easily match either category; we set these aside for the inductive coding process. We then reviewed all excerpts without a category and suggested the creation of a new third category, called planning . We based this new category on the idea that the existing explanation and facilitation conceptual framework did not capture strategies that occurred outside of the classroom. We discuss the specific strategies within the planning category in the Results. With a new category in hand, we created a preliminary codebook consisting of explanation, facilitation, and planning categories, and their respective specific strategies.
We then passed the spreadsheet and preliminary codebook to another researcher who had not previously seen the excerpts. The second researcher looked through all the excerpts and assigned categories and strategies, without being able to see the suggestions of the initial three researchers. The second researcher also created their own new strategies and codes, especially when a specific strategy was not presented in the preliminary codebook. All of their new strategies and codes were created within the planning category. The second researcher agreed on assigned categories and implementation strategies for 71% of the total excerpts. A researcher from the initial strategies coding met with the second researcher and discussed all disagreements. The high number of disagreements, 29%, arose from the specific strategies within the new third category, planning. Since the second researcher created new planning strategies, by default these assigned codes would be a disagreement. The two researchers resolved the disagreements by finalizing a codebook with the now full and combined list of planning strategies and the previous explanation and facilitation strategies. Finally, they started the last round of coding, and they coded the excerpts with the final codebook. This time, they worked together in the same coding sessions. Any disagreements were immediately resolved through discussion and updating of final strategy codes. In the end, all 126 excerpts were coded and kept.
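The second-coder check above uses simple percent agreement (matching code assignments divided by total excerpts). A minimal sketch of that calculation, with hypothetical excerpt assignments across our three categories (the seven assignments below are invented for illustration, not the study’s data):

```python
def percent_agreement(codes_a, codes_b):
    """Fraction of excerpts assigned the same code by two coders."""
    if len(codes_a) != len(codes_b):
        raise ValueError("both coders must code the same excerpts")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# Hypothetical assignments for seven excerpts across the three categories.
coder1 = ["explanation", "facilitation", "planning", "planning",
          "explanation", "facilitation", "planning"]
coder2 = ["explanation", "facilitation", "planning", "explanation",
          "explanation", "planning", "planning"]
print(f"agreement: {percent_agreement(coder1, coder2):.0%}")  # → agreement: 71%
```

On this toy data the agreement happens to match the 71% reported above; in the study, the remaining disagreements were then resolved by discussion, as described.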
Characteristics of the primary studies
To answer our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we report the results from our coding and systematic review process. We discuss characteristics of studies within our dataset below and in Table 1 .
Type of publication and research audience
Of the 29 studies, 11 studies were published in conference proceedings, while the remaining 18 studies were journal articles. Examples of journals included the European Journal of Engineering Education , Journal of College Science Teaching , and PRIMUS (Problems, Resources, and Issues in Mathematics Undergraduate Studies).
In terms of research audiences and perspectives, both US and international views were represented. Eighteen studies were from North America, two were from Australia, three were from Asia, and six were from Europe. Full bibliographic information for all 29 studies, including publication type, is included in the Appendix.
Types of courses sampled
Studies sampled different types of undergraduate STEM courses. In terms of course year, most studies sampled first-year courses (13 studies). All four course years were represented (4 second-year, 3 third-year, 2 fourth-year, 7 not reported). With regard to course discipline or major, all major STEM education disciplines were represented. Fourteen studies were conducted in engineering courses, and most major engineering subdisciplines were represented, such as electrical and computer engineering (4 studies), mechanical engineering (3 studies), general engineering courses (3 studies), chemical engineering (2 studies), and civil engineering (1 study). Thirteen studies were conducted in science courses (3 physics/astronomy, 7 biology, 3 chemistry), and 2 studies were conducted in mathematics or statistics courses.
For teaching methods, most studies sampled traditional courses that were primarily lecture-based but included some in-class activities. The most common activity was giving class time for students to do problem solving (PS) (21 studies). Students were instructed to either do problem solving in groups (16 studies) or individually (5 studies) and sometimes both in the same course. Project or problem-based learning (PBL) was the second most frequently reported activity with 8 studies, and the implementation of this teaching method ranged from end of term final projects to an entire project or problem-based course. The third most common activity was using clickers (4 studies) or having class discussions (4 studies).
Research design, methods, and outcomes
The 29 studies used quantitative (10 studies), qualitative (6 studies), or mixed methods (13 studies) research designs. Most studies relied on instructor-created surveys (IS) as their main source of evidence (20 studies). In contrast, only 2 studies used survey instruments with evidence of validity (IEV). Other forms of data collection included institutions’ end of course evaluations (EOC) (10 studies), observations (5 studies), and interviews (4 studies).
Studies reported a variety of different measures for researching students’ affective and behavioral responses to active learning. The most common measure was students’ self-reports of learning (an affective outcome); twenty-one studies measured whether students thought they learned more or less due to the active learning intervention. Other common measures included whether students participated in the activities (16 studies, participation), whether they enjoyed the activities (15 studies, enjoyment), and if students were satisfied with the overall course experience (13 studies, course satisfaction). Most studies included more than one measure. Some studies also measured course attendance (4 studies) and students’ self-efficacy with the activities and relevant STEM disciplines (4 studies).
We found that 23 of the 29 studies reported positive or mostly positive outcomes for students’ affective and behavioral responses to active learning. Only 5 studies reported mixed or neutral outcomes, and only one study reported a negative student response to active learning. We discuss the implications of this scarcity of negative study outcomes and reports of student resistance to active learning (SRAL) in our dataset in the “Discussion” section.
To answer our second research question (What instructor strategies to aid implementation of active learning do the authors of these studies provide?), we provide descriptions, categories, and excerpts of specific strategies found within our systematic literature review.
Explanation strategies
Explanation strategies provide students with clarifications and reasons for using active learning (DeMonbrun et al., 2017). Within the explanation category, we identified two specific strategies: establish expectations and explain the purpose.
Establish expectations
Establishing expectations means setting the tone and routine for active learning at both the course and in-class activity level. Instructors can discuss expectations at the beginning of the semester, at the start of a class session, or right before the activity.
For establishing expectations at the beginning of the semester, studies provide specific ways to ensure students became familiar with active learning as early as possible. This included “introduc[ing] collaborative learning at the beginning of the academic term” (Herkert , 1997 , p. 450) and making sure that “project instructions and the data were posted fairly early in the semester, and the students were made aware that the project was an important part of their assessment” (Krishnan & Nalim, 2009 , p. 5).
McClanahan and McClanahan ( 2002 ) described the importance of explaining how the course will use active learning and purposely using the syllabus to do this:
Set the stage. Create the expectation that students will actively participate in this class. One way to accomplish that is to include a statement in your syllabus about your teaching strategies. For example: I will be using a variety of teaching strategies in this class. Some of these activities may require that you interact with me or other students in class. I hope you will find these methods interesting and engaging and that they enable you to be more successful in this course . In the syllabus, describe the specific learning activities you plan to conduct. These descriptions let the students know what to expect from you as well as what you expect from them (emphasis added, p. 93).
Early on, students see that the course is interactive, and they also see the activities required to be successful in the course.
These studies and excerpts demonstrate the importance of explaining to students how in-class activities relate to course expectations. Instructors using active learning should start the semester with clear expectations for how students should engage with activities.
Explain the purpose
Explaining the purpose includes offering students reasons why certain activities are being used and convincing them of the importance of participating.
One way that studies explained the purpose of the activities was by leveraging and showing assessment data on active learning. For example, Lenz ( 2015 ) dedicated class time to show current students comments from previous students:
I spend the first few weeks reminding them of the research and of the payoff that they will garner and being a very enthusiastic supporter of the [active learning teaching] method. I show them comments I have received from previous classes and I spend a lot of time selling the method (p. 294).
Providing current students comments from previous semesters may help students see the value of active learning. Lake ( 2001 ) also used data from prior course offerings to show students “the positive academic performance results seen in the previous use of active learning” on the first day of class (p. 899).
However, sharing the effectiveness of the activities does not have to be constrained to the beginning of the course. Autin et al. ( 2013 ) used mid-semester test data and comparisons to sell the continued use of active learning to their students. They said to students:
Based on your reflections, I can see that many of you are not comfortable with the format of this class. Many of you said that you would learn better from a traditional lecture. However, this class, as a whole, performed better on the test than my other [lecture] section did. Something seems to be working here (p. 946).
Showing students comparisons between active learning and traditional lecture classes is a powerful way to demonstrate how active learning benefits them.
Explaining the purpose of the activities by sharing course data with students appears to be a useful strategy, as it tells students why active learning is being used and convinces students that active learning is making a difference.
Facilitation strategies
Facilitation strategies ensure continued engagement in the class activities once they have begun, and many of the specific strategies within this category involve working directly with students. We identified two strategies within the facilitation category: approach students and encourage students.
Approach students
Approaching students means engaging with students during the activity. This includes physical proximity and monitoring students, walking around the classroom, and providing students with additional feedback, clarifications, or questions about the activity.
Several studies described how instructors circulated around the classroom to check on the progress of students during an activity. Lenz ( 2015 ) stated this plainly in her study, “While the students work on these problems I walk around the room, listening to their discussions” (p. 284). Armbruster et al. ( 2009 ) described this strategy and noted positive student engagement, “During each group-work exercise the instructor would move throughout the classroom to monitor group progress, and it was rare to find a group that was not seriously engaged in the exercise” (p. 209). Haseeb ( 2011 ) combined moving around the room and approaching students with questions, and they stated, “The instructor moves around from one discussion group to another and listens to their discussions, ask[ing] provoking questions” (p. 276). Certain group-based activities worked better with this strategy, as McClanahan and McClanahan ( 2002 ) explained:
Breaking the class into smaller working groups frees the professor to walk around and interact with students more personally. He or she can respond to student questions, ask additional questions, or chat informally with students about the class (p. 94).
Approaching students not only helps facilitate the activity but also gives the instructor a chance to work with students more closely and receive feedback. By walking around the classroom, instructors ensure that both they and their students continue to engage with the activity.
Encourage students
Encouraging students includes creating a supportive classroom environment, motivating students to do the activity, building respect and rapport with students, demonstrating care, and having a positive demeanor toward students’ success.
Ramsier et al. ( 2003 ) provided a detailed explanation of the importance of building a supportive classroom environment:
Most of this success lies in the process of negotiation and the building of mutual respect within the class, and requires motivation, energy and enthusiasm on behalf of the instructor… Negotiation is the key to making all of this work, and building a sense of community and shared ownership. Learning students’ names is a challenge but a necessary part of our approach. Listening to student needs and wants with regard to test and homework due dates…projects and activities, etc. goes a long way to build the type of relationships within the class that we need in order to maintain and encourage performance (pp. 16–18).
Here, the authors described a few specific strategies for supporting a positive demeanor, such as learning students’ names and listening to student needs and wants, which helped maintain student performance in an active learning classroom.
Another way to build a supportive classroom environment was for instructors to appear more approachable. For example, Bullard and Felder (2007) worked to “give the students a sense of their instructors as somewhat normal and approachable human beings and to help them start to develop a sense of community” (p. 5). As instructors and students become more comfortable working with each other, instructors can work toward easing “frustration and strong emotion among students and step by step develop the students’ acceptance [of active learning]” (Harun, Yusof, Jamaludin, & Hassan, 2012, p. 234). In all, encouraging students and creating a supportive environment appear to be useful strategies to aid implementation of active learning.
Planning strategies
The planning category encompasses strategies that occur outside of class time, distinguishing it from the explanation and facilitation categories. Four strategies fall into this category: design appropriate activities, create group policies, align the course, and review student feedback.
Design appropriate activities
Many studies took into consideration the design of appropriate or suitable activities for their courses. This meant making sure the activity was suitable in terms of time, difficulty, and constraints of the course. Activities were designed to strike a balance between being too difficult and too simple, to be engaging, and to provide opportunities for students to participate.
Li et al. ( 2009 ) explained the importance of outside-of-class planning and considering appropriate projects: “The selection of the projects takes place in pre-course planning. The subjects for projects should be significant and manageable” (p. 491). Haseeb ( 2011 ) further emphasized a balance in design by discussing problems (within problem-based learning) between two parameters, “the problem is deliberately designed to be open-ended and vague in terms of technical details” (p. 275). Armbruster et al. ( 2009 ) expanded on the idea of balanced activities by connecting it to group-work and positive outcomes, and they stated, “The group exercises that elicited the most animated student participation were those that were sufficiently challenging that very few students could solve the problem individually, but at least 50% or more of the groups could solve the problem by working as a team” (p. 209).
Instructors should consider the design of activities outside of class time. Activities should be appropriately challenging yet achievable, so that students remain engaged and continue to participate during class time.
Create group policies
Creating group policies means considering rules when using group activities. This strategy is unique in that it directly addresses a specific subset of activities, group work. These policies included setting team sizes and assigning specific roles to group members.
Studies outlined a few specific approaches for assigning groups. For example, Ramsier et al. ( 2003 ) recommended frequently changing and randomizing groups: “When students enter the room on these days they sit in randomized groups of 3 to 4 students. Randomization helps to build a learning community atmosphere and eliminates cliques” (p. 4). Another strategy in combination with frequent changing of groups was to not allow students to select their own groups. Lehtovuori et al. ( 2013 ) used this to avoid problems of freeriding and group dysfunction:
For example, group division is an issue to be aware of...An easy and safe solution is to draw lots to assign the groups and to change them often. This way nobody needs to suffer from a dysfunctional group for too long. Popular practice that students self-organize into groups is not the best solution from the point of view of learning and teaching. Sometimes friendly relationships can complicate fair division of responsibility and work load in the group (p. 9).
Here, Lehtovuori et al. ( 2013 ) considered different types of group policies and concluded that frequently changing groups worked best for students. Kovac ( 1999 ) also described changing groups but assigned specific roles to individuals:
Students were divided into groups of four and assigned specific roles: manager, spokesperson, recorder, and strategy analyst. The roles were rotated from week to week. To alleviate complaints from students that they were "stuck in a bad group for the entire semester," the groups were changed after each of the two in-class exams (p. 121).
The use of four specific group roles is a potential group policy, and Kovac ( 1999 ) continued the trend of changing group members often.
Overall, these studies describe the importance of thinking about ways to implement group-based activities before enacting them during class, and they suggest that groups should be reconstituted frequently. Instructors using group activities should consider whether to use specific group member policies before implementing the activity in the classroom.
Align the course
Aligning the course emphasizes the importance of purposely connecting multiple parts of the course together. This strategy involves planning to ensure students are graded on their participation with the activities as well as considering the timing of the activities with respect to other aspects of the course.
Li et al. ( 2009 ) described aligning classroom tasks by discussing the importance of timing, and they wrote, “The coordination between the class lectures and the project phases is very important. If the project is assigned near the directly related lectures, students can instantiate class concepts almost immediately in the project and can apply the project experience in class” (p. 491).
Krishnan and Nalim ( 2009 ) aligned class activities with grades to motivate students and encourage participation: “The project was a component of the course counting for typically 10-15% of the total points for the course grade. Since the students were told about the project and that it carried a significant portion of their grade, they took the project seriously” (p. 4). McClanahan and McClanahan ( 2002 ) expanded on the idea of using grades to emphasize the importance of active learning to students:
Develop a grading policy that supports active learning. Active learning experiences that are important enough to do are important enough to be included as part of a student's grade…The class syllabus should describe your grading policy for active learning experiences and how those grades factor into the student's final grade. Clarify with the students that these points are not extra credit. These activities, just like exams, will be counted when grades are determined (p. 93).
Here, they suggest a clear grading policy that includes how activities will be assessed as part of students’ final grades.
de Justo and Delgado ( 2014 ) connected grading and assessment to learning and further suggested that reliance on exams may negatively impact student engagement:
Particular attention should be given to alignment between the course learning outcomes and assessment tasks. The tendency among faculty members to rely primarily on written examinations for assessment purposes should be overcome, because it may negatively affect students’ engagement in the course activities (p. 8).
Instructors should consider their overall assessment strategies, as overreliance on written exams could mean that students engage less with the activities.
When planning to use active learning, instructors should consider how activities are aligned with course content and students’ grades. Instructors should decide before active learning implementation whether class participation and engagement will be reflected in student grades and in the course syllabus.
Review student feedback
Reviewing student feedback includes both soliciting feedback about the activity and using that feedback to improve the course. This strategy can be an iterative process that occurs over several course offerings.
Many studies utilized student feedback to continuously revise and improve the course. For example, Metzger ( 2015 ) commented that “gathering and reviewing feedback from students can inform revisions of course design, implementation, and assessment strategies” (p. 8). Rockland et al. ( 2013 ) further described changing and improving the course in response to student feedback, “As a result of these discussions, the author made three changes to the course. This is the process of continuous improvement within a course” (p. 6).
Herkert ( 1997 ) also demonstrated the use of student feedback for improving the course over time: “Indeed, the [collaborative] learning techniques described herein have only gradually evolved over the past decade through a process of trial and error, supported by discussion with colleagues in various academic fields and helpful feedback from my students” (p. 459).
In addition to incorporating student feedback, McClanahan and McClanahan (2002) commented on how student feedback builds a stronger partnership with students: “Using student feedback to make improvements in the learning experience reinforces the notion that your class is a partnership and that you value your students’ ideas as a means to strengthen that partnership and create more successful learning” (p. 94). Making students aware that the instructor is soliciting and using feedback can help encourage students and build rapport with them.
Instructors should review student feedback for continual and iterative course improvement. Much of the student feedback review occurs outside of class time, and it appears useful for instructors to solicit student feedback to guide changes to the course and build student rapport.
Summary of strategies
We list the strategies that appear in each study in Table 1 in short-hand form. No study included all eight strategies. The studies that included the most strategies were Bullard and Felder (2007) with 7 strategies, Armbruster et al. (2009) with 5 strategies, and Lenz (2015) with 5 strategies. These three studies were exemplars, however, as most studies included only one or two strategies.
Table 2 presents a summary list of specific strategies, their categories, and descriptions. We also note the number of unique studies ( N ) and excerpts ( n ) that included the specific strategies. In total, there were eight specific strategies within three categories. Most strategies fell under the planning category ( N = 26), with align the course being the most reported strategy ( N = 14). Approaching students ( N = 13) and reviewing student feedback ( N = 11) were the second and third most common strategies, respectively. Overall, we present eight strategies to aid implementation of active learning.
Characteristics of the active learning studies
To address our first research question (What are the characteristics of studies that examine affective and behavioral outcomes of active learning and provide instructor strategies?), we discuss the different ways studies reported research on active learning.
Limitations and gaps within the final sample
First, we must discuss the gaps within our final sample of 29 studies. We excluded numerous active learning studies ( N = 383) that did not discuss or reflect upon the efficacy of their strategies to aid implementation of active learning. We also began this systematic literature review in 2015 and did not finish our coding and analysis of 2364 abstracts and 746 full-texts until 2018. We acknowledge that there have been multiple studies published on active learning since 2015. Acknowledging these limitations, we discuss our results and analysis in the context of the 29 studies in our dataset, which were published from 1990 to 2015.
Our final sample included only 2 studies that sampled mathematics and statistics courses. There was also a lack of studies beyond first-year courses. Much of the active learning research literature introduces interventions in first-year (cornerstone) or fourth-year (capstone) courses, but within our dataset we found a tendency to oversample first-year courses. Nonetheless, all four course-years were represented, as were all major STEM disciplines, with engineering (14 studies) and biology (7 studies) being the most common.
Thirteen studies implemented course-based active learning interventions, such as project-based learning (8 studies), inquiry-based learning (3 studies), or a flipped classroom (2 studies). Only one study, Lenz ( 2015 ), used a previously published active learning intervention, which was Process-Oriented Guided Inquiry Learning (POGIL). Other examples of published active learning programs include the Student-Centered Active Learning Environment for Upside-down Pedagogies (SCALE-UP, Gaffney et al., 2010 ) and Chemistry, Life, the Universe, and Everything (CLUE, Cooper & Klymkowsky, 2013 ), but these were not included in our sample of 29 studies.
In contrast, most of the active learning interventions involved adding in-class problem solving (either with individual students or groups of students) to a traditional lecture course (21 studies). For some instructors attempting to adopt active learning, using this smaller active learning intervention (in-class problem solving) may be a good starting point.
Despite the variety of quantitative, qualitative, and mixed methods research designs, most studies used either self-made instructor surveys (20 studies) or their institution’s course evaluations (10 studies). The variation among so many different versions of instructor surveys and course evaluations made it difficult to compare data or attempt a quantitative meta-analysis. Further, only 2 studies used instruments with evidence of validity. However, that trend may change as more instruments with evidence of validity become available, such as the Student Response to Instructional Practices (StRIP, DeMonbrun et al., 2017), the Biology Interest Questionnaire (BIQ, Knekta, Rowland, Corwin, & Eddy, 2020), and the Pedagogical Expectancy Violation Assessment (PEVA, Gaffney et al., 2010).
We were also concerned about the use of institutional course evaluations (10 studies) as evidence of students’ satisfaction and affective responses to active learning. Course evaluations capture more than just students’ responses to active learning, as the scores are biased toward the instructors’ gender (Mitchell & Martin, 2018) and race (Daniel, 2019), and they are strongly correlated with students’ expected grade in the class (Nguyen et al., 2017). Despite these limitations, we kept course evaluations in our keyword search and inclusion criteria, because they relate to instructors’ concerns about student resistance to active learning, and these scores continue to be used in important instructor reappointment, tenure, and promotion decisions (DeMonbrun et al., 2017).
In addition to students’ satisfaction, there were other measures related to students’ affective and behavioral responses to active learning. The most common measure was students’ self-reports of whether they thought they learned more or less (21 studies). Other important affective outcomes included enjoyment (13 studies) and self-efficacy (4 studies). The most common behavioral measure was students’ participation (16 studies). However, missing from this sample were other affective outcomes, such as students’ identities, beliefs, emotions, values, and buy-in.
Positive outcomes for using active learning
Twenty-three of the 29 studies reported positive or mostly positive outcomes for their active learning intervention. At the start of this paper, we acknowledged that much of the existing research suggests widespread positive benefits of using active learning in undergraduate STEM courses. However, most of these benefits center on students’ cognitive learning outcomes (e.g., Theobald et al., 2020) rather than students’ affective and behavioral responses to active learning. Here, we show positive affective and behavioral outcomes in terms of students’ self-reports of learning, enjoyment, self-efficacy, attendance, participation, and course satisfaction.
Given the scarcity of mixed/neutral or negative affective outcomes, it is important to acknowledge potential publication bias within our dataset. Authors may be hesitant to report negative outcomes of active learning interventions, and negative or non-significant outcomes may not be easily published in undergraduate STEM education venues. These factors could help explain the lack of mixed/neutral or negative study outcomes in our dataset.
Strategies to aid implementation of active learning
We aimed to answer the question: what instructor strategies to aid implementation of active learning do the authors of these studies provide? We addressed this question by providing instructors and readers a summary of actionable strategies they can take back to their own classrooms. Here, we discuss the range of strategies found within our systematic literature review.
Supporting instructors with actionable strategies
We identified eight specific strategies across three major categories: explanation, facilitation, and planning. Each strategy appeared in at least seven studies (Table 2 ), and each strategy was written to be actionable and practical.
Strategies in the explanation category emphasized the importance of establishing expectations and explaining the purpose of active learning to students. The facilitation category focused on approaching and encouraging students once activities were underway. Strategies in the planning category highlighted the importance of working outside of class time to thoughtfully design appropriate activities, create policies for group work, align various components of the course, and review student feedback to iteratively improve the course.
However, as we note in the “Introduction” section, these strategies are not entirely new, and they will not surprise experienced researchers and educators. Even so, there has yet to be a systematic review that compiles these instructor strategies in relation to students’ affective and behavioral responses to active learning. For example, the “explain the purpose” strategy is similar to the productive framing (e.g., Hutchison & Hammer, 2010) of the activity for students. “Design appropriate activities” and “align various components of the course” relate to Vygotsky’s (1978) theories of scaffolding for students (Shekhar et al., 2020). “Review student feedback” and “approaching students” relate to ideas on formative assessment (e.g., Pellegrino, DiBello, & Brophy, 2014) or revising course materials in response to students’ ongoing needs.
We also acknowledge that our list of specific strategies to aid implementation of active learning is not exhaustive. More work needs to be done measuring and observing these strategies in action and testing their use against specific outcomes. Some of this work has already begun (e.g., DeMonbrun et al., 2017; Finelli et al., 2018; Tharayil et al., 2018), but further testing and analysis would benefit the active learning community. We hope that our framework of explanation, facilitation, and planning strategies provides a guide for instructors adopting active learning. Since these strategies are compiled from the undergraduate STEM education literature and research on affective and behavioral responses to active learning, instructors have a compelling reason to use them.
One way to use these strategies is to map them onto the sequence of instruction. Planning strategies are most applicable during the work that occurs before classroom instruction, explanation strategies are most useful when introducing students to active learning activities, and facilitation strategies are best enacted while students are already working on the assigned activities. Of course, these strategies may also be used in conjunction with each other and are not strictly limited to these phases. For example, one plausible approach could be to emphasize the planning strategies of design and alignment during explanation. Overall, we hope that this framework of strategies supports instructors’ adoption and sustained use of active learning.
Creation of the planning category
At the start of this paper, we presented a conceptual framework for strategies consisting of only the explanation and facilitation categories (DeMonbrun et al., 2017). One of the major contributions of this paper is the addition of a third category, planning, to the existing conceptual framework. Planning strategies were common throughout the systematic literature review, and many studies emphasized how much time and effort is needed when adding active learning to a course. Although students may not see this preparation, and we did not anticipate this type of strategy initially, explicitly adding the planning category acknowledges the work instructors do outside of the classroom.
The planning strategies also highlight the need for instructors to not only think about implementing active learning before they enter the class, but to revise their implementation after the class is over. Instructors should refine their use of active learning through feedback, reflection, and practice over multiple course offerings. We hope this persistence can lead to long-term adoption of active learning.
Despite our review ending in 2015, most STEM instruction remains didactic (Laursen, 2019; Stains et al., 2018), and there has not been long-term, sustained adoption of active learning. In a push to increase the adoption of active learning within undergraduate STEM courses, we hope this study provides support and actionable strategies for instructors who are considering active learning but are concerned about student resistance.
We identified eight specific strategies to aid implementation of active learning based on three categories. The three categories of strategies were explanation, facilitation, and planning. In this review, we created the third category, planning, and we suggested that this category should be considered first when implementing active learning in the course. Instructors should then focus on explaining and facilitating their activity in the classroom. The eight specific strategies provided here can be incorporated into faculty professional development programs and readily adopted by instructors wanting to implement active learning in their STEM courses.
There remains important future work in active learning research, and we noted these gaps within our review. It would be useful to specifically review and measure instructor strategies in action and compare their use against other affective outcomes, such as identity, interest, and emotions.
There has yet to be a study that compiles and synthesizes strategies reported from multiple active learning studies, and we hope that this paper filled this important gap. The strategies identified in this review can help instructors persist beyond awkward initial implementations, avoid some problems altogether, and most importantly address student resistance to active learning. Further, the planning strategies emphasize that the use of active learning can be improved over time, which may help instructors have more realistic expectations for the first or second time they implement a new activity. There are many benefits to introducing active learning in the classroom, and we hope that these benefits are shared among more STEM instructors and students.
Availability of data and materials
The journal articles and conference proceedings that make up this review can be found through reverse citation lookup; see the Appendix for the references of all primary studies in this systematic review. We searched the following databases: Web of Science, Academic Search Complete, Compendex, Inspec, Education Source, and the Education Resources Information Center (ERIC). More details, including the keyword search strings, are provided in the “Methods” section.
Abbreviations
STEM: Science, technology, engineering, and mathematics
Student resistance to active learning
Instrument with evidence of validity
Instructor surveys
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
Problem solving
Problem or project-based learning
End of course evaluations
Armbruster, P., Patel, M., Johnson, E., & Weiss, M. (2009). Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE Life Sciences Education , 8 (3), 203–213. https://doi.org/10.1187/cbe.09-03-0025 .
Autin, M., Bateiha, S., & Marchionda, H. (2013). Power through struggle in introductory statistics. PRIMUS , 23 (10), 935–948. https://doi.org/10.1080/10511970.2013.820810 .
Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist , 37 (2), 122 https://psycnet.apa.org/doi/10.1037/0003-066X.37.2.122 .
Berkling, K., & Zundel, A. (2015). Change management: Overcoming the challenges of introducing self-driven learning. International Journal of Engineering Pedagogy (iJEP), 5(4), 38–46. https://www.learntechlib.org/p/207352/ .
Bilston, L. (1999). Lessons from a problem-based learning class in first year engineering statics . Paper presented at the 2nd Asia-Pacific Forum on Engineering and Technology Education, Clayton, Victoria.
Borrego, M., Cutler, S., Prince, M., Henderson, C., & Froyd, J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education , 102 (3), 394–425. https://doi.org/10.1002/jee.20020 .
Borrego, M., Foster, M. J., & Froyd, J. E. (2014). Systematic literature reviews in engineering education and other developing interdisciplinary fields. Journal of Engineering Education , 103 (1), 45–76. https://doi.org/10.1002/jee.20038 .
Borrego, M., Nguyen, K., Crockett, C., DeMonbrun, M., Shekhar, P., Tharayil, S., … Waters, C. (2018). Systematic literature review of students’ affective responses to active learning: Overview of results. Paper presented at the 2018 IEEE Frontiers in Education Conference (FIE), San Jose, CA. https://doi.org/10.1109/FIE.2018.8659306 .
Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development . Sage Publications Inc.
Breckler, J., & Yu, J. R. (2011). Student responses to a hands-on kinesthetic lecture activity for learning about the oxygen carrying capacity of blood. Advances in Physiology Education, 35 (1), 39–47. https://doi.org/10.1152/advan.00090.2010 .
Bullard, L., & Felder, R. (2007). A student-centered approach to the stoichiometry course. Paper presented at the 2007 ASEE Annual Conference and Exposition, Honolulu, HI. https://peer.asee.org/1543 .
Carlson, K. A., & Winquist, J. R. (2011). Evaluating an active learning approach to teaching introductory statistics: A classroom workbook approach. Journal of Statistics Education , 19 (1). https://doi.org/10.1080/10691898.2011.11889596 .
Chi, M. T., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist , 49 (4), 219–243. https://doi.org/10.1080/00461520.2014.965823 .
Christensen, T. (2005). Changing the learning environment in large general education astronomy classes. Journal of College Science Teaching, 35 (3), 34.
Cooper, M., & Klymkowsky, M. (2013). Chemistry, life, the universe, and everything: A new approach to general chemistry, and a model for curriculum reform. Journal of Chemical Education , 90 (9), 1116–1122. https://doi.org/10.1021/ed300456y .
Creswell, J. W., & Creswell, J. D. (2017). Research design: Qualitative, quantitative, and mixed methods approaches . Sage Publishing Inc.
Dancy, M., Henderson, C., & Turpen, C. (2016). How faculty learn about and implement research-based instructional strategies: The case of peer instruction. Physical Review Physics Education Research , 12 (1). https://doi.org/10.1103/PhysRevPhysEducRes.12.010110 .
Daniel, B. J. (2019). Teaching while black: Racial dynamics, evaluations, and the role of white females in the Canadian academy in carrying the racism torch. Race Ethnicity and Education , 22 (1), 21–37. https://doi.org/10.1080/13613324.2018.1468745 .
de Justo, E., & Delgado, A. (2014). Change to competence-based education in structural engineering. Journal of Professional Issues in Engineering Education and Practice , 141 (3). https://doi.org/10.1061/(ASCE)EI.1943-5541.0000215 .
DeMonbrun, R. M., Finelli, C., Prince, M., Borrego, M., Shekhar, P., Henderson, C., & Waters, C. (2017). Creating an instrument to measure student response to instructional practices. Journal of Engineering Education , 106 (2), 273–298. https://doi.org/10.1002/jee.20162 .
Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences , 116 (39), 19251–19257. https://doi.org/10.1073/pnas.1821936116 .
Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review , 95 (2), 256–273. https://doi.org/10.1037/0033-295X.95.2.256 .
Finelli, C., Nguyen, K., Henderson, C., Borrego, M., Shekhar, P., Prince, M., … Waters, C. (2018). Reducing student resistance to active learning: Strategies for instructors. Journal of College Science Teaching , 47 (5), 80–91 https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-mayjune-2018/research-and-1 .
Finelli, C. J., Daly, S. R., & Richardson, K. M. (2014). Bridging the research-to-practice gap: Designing an institutional change plan using local evidence. Journal of Engineering Education , 103 (2), 331–361. https://doi.org/10.1002/jee.20042 .
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences , 111 (23), 8410–8415. https://doi.org/10.1073/pnas.1319030111 .
Froyd, J. E., Borrego, M., Cutler, S., Henderson, C., & Prince, M. J. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education , 56 (4), 393–399. https://doi.org/10.1109/TE.2013.2244602 .
Gaffney, J. D., Gaffney, A. L. H., & Beichner, R. J. (2010). Do they see it coming? Using expectancy violation to gauge the success of pedagogical reforms. Physical Review Special Topics - Physics Education Research , 6 (1), 010102. https://doi.org/10.1103/PhysRevSTPER.6.010102 .
Gough, D., Oliver, S., & Thomas, J. (2017). An introduction to systematic reviews . Sage Publishing Inc.
Harun, N. F., Yusof, K. M., Jamaludin, M. Z., & Hassan, S. A. H. S. (2012). Motivation in problem-based learning implementation. Procedia-Social and Behavioral Sciences , 56 , 233–242. https://doi.org/10.1016/j.sbspro.2012.09.650 .
Haseeb, A. (2011). Implementation of micro-level problem based learning in a course on electronic materials. Journal of Materials Education, 33(5-6), 273–282. http://eprints.um.edu.my/id/eprint/5501 .
Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching , 48 (8), 952–984. https://doi.org/10.1002/tea.20439 .
Henderson, C., & Dancy, M. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics - Physics Education Research , 3 (2). https://doi.org/10.1103/PhysRevSTPER.3.020102 .
Henderson, C., Khan, R., & Dancy, M. (2018). Will my student evaluations decrease if I adopt an active learning instructional strategy? American Journal of Physics , 86 (12), 934–942. https://doi.org/10.1119/1.5065907 .
Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics , 3 (4), 447–462. https://doi.org/10.1007/s11948-997-0047-x .
Hodgson, Y., Benson, R., & Brack, C. (2013). Using action research to improve student engagement in a peer-assisted learning programme. Educational Action Research, 21 (3), 359-375. https://doi.org/10.1080/09650792.2013.813399 .
Hora, M. T., & Ferrare, J. J. (2013). Instructional systems of practice: A multi-dimensional analysis of math and science undergraduate course planning and classroom teaching. Journal of the Learning Sciences , 22 (2), 212–257. https://doi.org/10.1080/10508406.2012.729767 .
Hutchison, P., & Hammer, D. (2010). Attending to student epistemological framing in a science classroom. Science Education , 94 (3), 506–524. https://doi.org/10.1002/sce.20373 .
Jaeger, B., & Bilen, S. (2006). The one-minute engineer: Getting design class out of the starting blocks . Paper presented at the 2006 ASEE Annual Conference and Exposition, Chicago, IL. https://peer.asee.org/524 .
Jesiek, B. K., Mazzurco, A., Buswell, N. T., & Thompson, J. D. (2018). Boundary spanning and engineering: A qualitative systematic review. Journal of Engineering Education , 107 (3), 318–413. https://doi.org/10.1002/jee.20219 .
Kezar, A., Gehrke, S., & Elrod, S. (2015). Implicit theories of change as a barrier to change on college campuses: An examination of STEM reform. The Review of Higher Education , 38 (4), 479–506. https://doi.org/10.1353/rhe.2015.0026 .
Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and development handbook: A guide to human resource development . McGraw Hill.
Knekta, E., Rowland, A. A., Corwin, L. A., & Eddy, S. (2020). Measuring university students’ interest in biology: Evaluation of an instrument targeting Hidi and Renninger’s individual interest. International Journal of STEM Education , 7 , 1–16. https://doi.org/10.1186/s40594-020-00217-4 .
Kovac, J. (1999). Student active learning methods in general chemistry. Journal of Chemical Education , 76 (1), 120. https://doi.org/10.1021/ed076p120 .
Krishnan, S., & Nalim, M. R. (2009). Project based learning in introductory thermodynamics. Paper presented at the 2009 ASEE Annual Conference and Exposition, Austin, TX. https://peer.asee.org/5615 .
Kuh, G. D. (2005). Student engagement in the first year of college. In M. L. Upcraft, J. N. Gardner, J. N, & B. O. Barefoot (Eds.), Challenging and supporting the first-year student: A handbook for improving the first year of college , (pp. 86–107). Jossey-Bass.
Laatsch, L., Britton, L., Keating, S., Kirchner, P., Lehman, D., Madsen-Myers, K., Milson, L., Otto, C., & Spence, L. (2005). Cooperative learning effects on teamwork attitudes in clinical laboratory science students. American Society for Clinical Laboratory Science, 18(3). https://doi.org/10.29074/ascls.18.3.150 .
Lake, D. A. (2001). Student performance and perceptions of a lecture-based course compared with the same course utilizing group discussion. Physical Therapy , 81 (3), 896–902. https://doi.org/10.1093/ptj/81.3.896 .
Laursen, S. (2019). Levers for change: An assessment of progress on changing STEM instruction. American Association for the Advancement of Science. https://www.aaas.org/resources/levers-change-assessment-progress-changing-stem-instruction .
Lehtovuori, A., Honkala, M., Kettunen, H., & Leppävirta, J. (2013). Interactive engagement methods in teaching electrical engineering basic courses. Paper presented at the 2013 IEEE Global Engineering Education Conference (EDUCON), Berlin, Germany. https://doi.org/10.1109/EduCon.2013.6530089 .
Lenz, L. (2015). Active learning in a math for liberal arts classroom. PRIMUS , 25 (3), 279–296. https://doi.org/10.1080/10511970.2014.971474 .
Li, J., Zhao, Y., & Shi, L. (2009). Interactive teaching methods in information security course . Paper presented at the International Conference on Scalable Computing and Communications; The Eighth International Conference on Embedded Computing. https://doi.org/10.1109/EmbeddedCom-ScalCom.2009.94 .
Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gotzsche, P. C., Ioannidis, J. P., … Moher, D. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate healthcare interventions: Explanation and elaboration. Journal of Clinical Epidemology , 62 (10), e1–e34. https://doi.org/10.1016/j.jclinepi.2009.06.006 .
Lund, T. J., & Stains, M. (2015). The importance of context: An exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty. International Journal of STEM Education , 2 (1). https://doi.org/10.1186/s40594-015-0026-8 .
Machemer, P. L., & Crawford, P. (2007). Student perceptions of active learning in a large cross-disciplinary classroom. Active Learning in Higher Education , 8 (1), 9–30. https://doi.org/10.1177/1469787407074008 .
Maib, J., Hall, R., Collier, H., & Thomas, M. (2006). A multi-method evaluation of the implementation of a student response system . Paper presented at the 12th Americas’ Conference on Information Systems (AMCIS), Acapulco, Mexico. https://aisel.aisnet.org/amcis2006/27 .
McClanahan, E. B., & McClanahan, L. L. (2002). Active learning in a non-majors biology class: Lessons learned. College Teaching , 50 (3), 92–96. https://doi.org/10.1080/87567550209595884 .
McLoone, S., & Brennan, C. (2015). On the use and evaluation of a smart device student response system in an undergraduate mathematics classroom. AISHE-J: The All Ireland Journal of Teaching and Learning in Higher Education, 7(3). http://ojs.aishe.org/index.php/aishe-j/article/view/243 .
Metzger, K. J. (2015). Collaborative teaching practices in undergraduate active learning classrooms: A report of faculty team teaching models and student reflections from two biology courses. Bioscene: Journal of College Biology Teaching , 41 (1), 3–9 http://www.acube.org/wp-content/uploads/2017/11/2015_1.pdf .
Mitchell, K. M., & Martin, J. (2018). Gender bias in student evaluations. PS: Political Science & Politics , 51 (3), 648–652. https://doi.org/10.1017/S104909651800001X .
Nguyen, K., Husman, J., Borrego, M., Shekhar, P., Prince, M., DeMonbrun, R. M., … Waters, C. (2017). Students’ expectations, types of instruction, and instructor strategies predicting student response to active learning. International Journal of Engineering Education , 33 (1(A)), 2–18 http://www.ijee.ie/latestissues/Vol33-1A/02_ijee3363ns.pdf .
Oakley, B. A., Hanna, D. M., Kuzmyn, Z., & Felder, R. M. (2007). Best practices involving teamwork in the classroom: Results from a survey of 6435 engineering student respondents. IEEE Transactions on Education , 50 (3), 266–272. https://doi.org/10.1109/TE.2007.901982 .
Oliveira, P. C., & Oliveira, C. G. (2014). Integrator element as a promoter of active learning in engineering teaching. European Journal of Engineering Education, 39 (2), 201–211. https://doi.org/10.1080/03043797.2013.854318 .
Owens, D. C., Sadler, T. D., Barlow, A. T., & Smith-Walters, C. (2020). Student motivation from and resistance to active learning rooted in essential science practices. Research in Science Education , 50 (1), 253–277. https://doi.org/10.1007/s11165-017-9688-1 .
Parker Siburt, C. J., Bissell, A. N., & Macphail, R. A. (2011). Developing metacognitive and problem-solving skills through problem manipulation. Journal of Chemical Education, 88(11), 1489–1495. https://doi.org/10.1021/ed100891s .
Passow, H. J., & Passow, C. H. (2017). What competencies should undergraduate engineering programs emphasize? A systematic review. Journal of Engineering Education , 106 (3), 475–526. https://doi.org/10.1002/jee.20171 .
Patrick, L. E., Howell, L. A., & Wischusen, W. (2016). Perceptions of active learning between faculty and undergraduates: Differing views among departments. Journal of STEM Education: Innovations and Research , 17 (3), 55 https://www.jstem.org/jstem/index.php/JSTEM/article/view/2121/1776 .
Pellegrino, J., DiBello, L., & Brophy, S. (2014). The science and design of assessment in engineering education. In A. Johri, & B. Olds (Eds.), Cambridge handbook of engineering education research , (pp. 571–598). Cambridge University Press. https://doi.org/10.1017/CBO9781139013451.036 .
Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide . Blackwell Publishing. https://doi.org/10.1002/9780470754887 .
Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education , 93 , 223–232. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x .
Prince, M., & Felder, R. (2007). The many faces of inductive teaching and learning. Journal of College Science Teaching , 36 (5), 14–20.
Ramsier, R. D., Broadway, F. S., Cheung, H. M., Evans, E. A., & Qammar, H. K. (2003). University physics: A hybrid approach. Paper presented at the 2003 ASEE Annual Conference and Exposition, Nashville, TN. https://peer.asee.org/11934 .
Regev, G., Gause, D. C., & Wegmann, A. (2008). Requirements engineering education in the 21st century, an experiential learning approach . 2008 16th IEEE International Requirements Engineering Conference, Catalunya. https://doi.org/10.1109/RE.2008.28 .
Rockland, R., Hirsch, L., Burr-Alexander, L., Carpinelli, J. D., & Kimmel, H. S. (2013). Learning outside the classroom—Flipping an undergraduate circuits analysis course. Paper presented at the 2013 ASEE Annual Conference and Exposition, Atlanta, GA. https://peer.asee.org/19868 .
Schmidt, J. A., Rosenberg, J. M., & Beymer, P. N. (2018). A person-in-context approach to student engagement in science: Examining learning activities and choice. Journal of Research in Science Teaching , 55 (1), 19–43. https://doi.org/10.1002/tea.21409 .
Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., & Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: A systematic review of underlying reasons. Journal of College Science Teaching , 49 (6) https://www.nsta.org/journal-college-science-teaching/journal-college-science-teaching-julyaugust-2020/negative-student .
Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005). Pedagogies of engagement: Classroom-based practices. Journal of Engineering Education , 94 (1), 87–101. https://doi.org/10.1002/j.2168-9830.2005.tb00831.x .
Stains, M., Harshman, J., Barker, M., Chasteen, S., Cole, R., DeChenne-Peters, S., … Young, A. M. (2018). Anatomy of STEM teaching in north American universities. Science , 359 (6383), 1468–1470. https://doi.org/10.1126/science.aap8892 .
Stains, M., & Vickrey, T. (2017). Fidelity of implementation: An overlooked yet critical construct to establish effectiveness of evidence-based instructional practices. CBE Life Sciences Education , 16 (1). https://doi.org/10.1187/cbe.16-03-0113 .
Stump, G. S., Husman, J., & Corby, M. (2014). Engineering students' intelligence beliefs and learning. Journal of Engineering Education , 103 (3), 369–387. https://doi.org/10.1002/jee.20051 .
Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., & Waters, C. (2018). Strategies to mitigate student resistance to active learning. International Journal of STEM Education , 5 (1), 7. https://doi.org/10.1186/s40594-018-0102-y .
Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Arroyo, E. N., Behling, S., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences , 117 (12), 6476–6483. https://doi.org/10.1073/pnas.1916903117 .
Tolman, A., Kremling, J., & Tagg, J. (2016). Why students resist learning: A practical model for understanding and helping students . Stylus Publishing, LLC.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes . Harvard University Press.
Weimer, M. (2002). Learner-centered teaching: Five key changes to practice . Wiley.
Wigfield, A., & Eccles, J. S. (2000). Expectancy–value theory of achievement motivation. Contemporary Educational Psychology , 25 (1), 68–81. https://doi.org/10.1006/ceps.1999.1015 .
Winkler, I., & Rybnikova, I. (2019). Student resistance in the classroom—Functional-instrumentalist, critical-emancipatory and critical-functional conceptualisations. Higher Education Quarterly , 73 (4), 521–538. https://doi.org/10.1111/hequ.12219 .
Acknowledgements
We thank our collaborators, Charles Henderson and Michael Prince, for their early contributions to this project, including screening hundreds of abstracts and full papers. Thank you to Adam Papendieck and Katherine Doerr for their feedback on early versions of this manuscript. Finally, thank you to the anonymous reviewers at the International Journal of STEM Education for your constructive feedback.
This work was supported by the National Science Foundation through grant #1744407. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Author information
Authors and Affiliations
Hutchins School of Liberal Studies, Sonoma State University, Rohnert Park, CA, USA
Kevin A. Nguyen
Departments of Curriculum & Instruction and Mechanical Engineering, University of Texas, Austin, TX, USA
Maura Borrego & Sneha Tharayil
Departments of Electrical Engineering & Computer Science and Education, University of Michigan, 4413 EECS Building, 1301 Beal Avenue, Ann Arbor, MI, 48109, USA
Cynthia J. Finelli & Caroline Crockett
Enrollment Management Research Group, Southern Methodist University, Dallas, TX, USA
Matt DeMonbrun
School of Applied Engineering and Technology, New Jersey Institute of Technology, Newark, NJ, USA
Prateek Shekhar
Advanced Manufacturing and Materials, Naval Surface Warfare Center Carderock Division, Potomac, MD, USA
Cynthia Waters
Cabot Science Library, Harvard University, Cambridge, MA, USA
Robyn Rosenberg
Contributions
All authors contributed to the design and execution of this paper. KN, MB, and CW created the original vision for the paper. RR solicited, downloaded, and catalogued all studies for review. All authors contributed in reviewing and screening hundreds of studies. KN then led the initial analysis and creation of strategy codes. CF reviewed and finalized the analysis. All authors drafted, reviewed, and finalized sections of the paper. KN, MB, MD, and CC led the final review of the paper. All authors read and approved the final manuscript.
Corresponding author
Correspondence to Cynthia J. Finelli .
Ethics declarations
Competing interests
The authors declare that they have no competing interests.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .
Reprints and permissions
About this article
Cite this article
Nguyen, K.A., Borrego, M., Finelli, C.J. et al. Instructor strategies to aid implementation of active learning: a systematic literature review. IJ STEM Ed 8 , 9 (2021). https://doi.org/10.1186/s40594-021-00270-7
Download citation
Received : 19 June 2020
Accepted : 18 January 2021
Published : 15 March 2021
DOI : https://doi.org/10.1186/s40594-021-00270-7
- Active learning
- Systematic review
- Instructor strategies
- Student response
Trends of Active Learning in Higher Education and Students' Well-Being: A Literature Review
Affiliations
- 1 Faculty of Sport Sciences and Physical Education, University of Coimbra, Coimbra, Portugal.
- 2 Research Unit in Sport and Physical Activity (CIDAF), Coimbra, Portugal.
- 3 Centre for 20th Century Interdisciplinary Studies (CEIS20), Coimbra, Portugal.
- 4 Faculty of Education, Universidad Internacional de La Rioja, La Rioja, Spain.
- 5 Faculty of Sport, University of Porto, Porto, Portugal.
- 6 Research Centre in Education, Innovation, Intervention in Sport (CIFI2D), Porto, Portugal.
- 7 Centre for Research and Intervention in Education (CIIE), Porto, Portugal.
- PMID: 35519651
- PMCID: PMC9062227
- DOI: 10.3389/fpsyg.2022.844236
This literature review inspected how the use of active learning methodologies in higher education can impact students' well-being. Considering the Heads of State meeting at United Nations Headquarters in September 2015, at which the 2030 Agenda for Sustainable Development was adopted by all United Nations Member States, this literature review is limited to the period between September 2015 and September 2021. A preliminary search focused on existing reviews was conducted to support the conceptual framework. The search was carried out in two databases - the Web of Science main collection and Scopus - by two researchers independently, using the following search criteria: "higher education AND active learning AND student AND wellness OR well-being OR wellbeing." Study selection followed these inclusion criteria: (i) published in peer-reviewed journals; (ii) empirical studies; (iii) written in English, French, Portuguese, or Spanish; (iv) open-access full text; (v) higher education context; and (vi) focused on the topic under study. The search yielded 10 articles, which were submitted to an inductive thematic analysis in line with the purpose of this review, resulting in two themes: (i) students' well-being during confinement; and (ii) methodological solutions for students' well-being. The data show that the use of active methodologies, such as digital technologies, and the incorporation of practices such as physical activity and volunteering seem to benefit students' well-being, namely in their academic achievement and their physical, emotional, and social life, and to empower them for their professional future with multiple competencies. Higher education institutions need to understand the value of active learning methodologies in sustainable education and promote them in their practices.
Keywords: active methodologies; review; sustainability; university; well-being.
Copyright © 2022 Ribeiro-Silva, Amorim, Aparicio-Herguedas and Batista.
A space for learning: An analysis of research on active learning spaces
Robert Talbert, Anat Mor-Avi
Corresponding author. [email protected]
Received 2019 Jan 24; Revised 2019 May 14; Accepted 2019 Nov 28; Collection date 2019 Dec.
This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
Active Learning Classrooms (ALCs) are learning spaces specially designed to optimize the practice of active learning and amplify its positive effects in learners from young children through university-level learners. As interest in and adoption of ALCs has increased rapidly over the last decade, the need for grounded research on their effects on learners and schools has grown proportionately. In this paper, we review the peer-reviewed published research on ALCs, from the introduction of “studio” classrooms and the SCALE-UP program to the present day. We investigate the literature and summarize findings on the effects of ALCs on learning outcomes, student engagement, and the behaviors and practices of instructors, as well as the specific elements of ALC design that seem to contribute the most to these effects. We also look at the emerging cultural impact of ALCs on institutions of learning, and we examine the drawbacks of the published research as well as avenues for potential future research in this area.
Keywords: Education, Institutional culture, Active learning space, Active learning, 21st century classroom, Active learning classroom, Collaborative learning
1. Introduction
1.1. What is active learning, and what is an Active Learning Classroom?
Active learning is defined broadly to include any pedagogical method that involves students actively working on learning tasks and reflecting on their work, beyond watching, listening, and taking notes ( Bonwell and Eison, 1991 ). Active learning has taken hold as a normative instructional practice in K12 and higher education institutions worldwide. Recent studies, such as the 2014 meta-analysis linking active learning pedagogies with dramatically reduced failure rates in university-level STEM courses ( Freeman et al., 2014 ), have established that active learning drives increased student learning and engagement across disciplines, grade levels, and demographics.
[Figures 1–14, captions: (1) SCALE-UP classroom, adapted from http://scaleup.ncsu.edu/FAQs.html; (2) database search overview; (3–5) number of studies by educational context, location, and publication year; (6) research question 1; (7) framework for 21st century skills; (8) framework for research question 2; (9) next generation learning space, adapted from Byers and Imms (2014); (10) research question 3; (11) Steelcase LearnLab ALC; (12) ALC with nonstandard layout; (13) design qualities and elements of ALCs; (14) from the findings: the influence of ALCs on the new culture of learning.]
As schools, colleges, and universities increasingly seek to implement active learning, concerns about the learning spaces used for active learning have naturally arisen. Attempts to implement active learning pedagogies in spaces that are not attuned to its particular needs --- for example, large lecture halls with fixed seating --- have produced suboptimal results and often frustration among instructors and students alike. In an effort to link architectural design to best practices in active learning pedagogy, numerous instructors, school leaders, and architects have explored how learning spaces can be designed differently to support active learning and amplify its positive effects on student learning. The result is a category of learning spaces known as Active Learning Classrooms (ALCs).
While there is no universally accepted definition of an ALC, the spaces often described by this term have several common characteristics:
ALCs are classrooms , that is, formal spaces in which learners convene for educational activities. We do not include less-formal learning spaces such as faculty offices, library study spaces, or “in-between” spaces located in hallways or foyers.
ALCs include deliberate architectural and design attributes that are specifically intended to promote active learning. These typically include moveable furniture that can be reconfigured into a variety of different setups with ease, seating that places students in small groups, plentiful horizontal and/or vertical writing surfaces such as whiteboards, and easy access to learning technologies (including technological infrastructure such as power outlets).
In particular, most ALCs have a “polycentric” or “acentric” design in which there is no clearly-defined front of the room by default. Rather, the instructor has a station which is either movable or located in an inconspicuous location so as not to attract attention; or perhaps there is no specific location for the instructor.
Finally, ALCs typically provide easy access to digital and analog tools for learning , such as multiple digital projectors, tablet or laptop computers, wall-mounted and personal whiteboards, or classroom response systems.
1.2. A brief history of ALCs
Focused attempts to design learning spaces specifically attuned to active learning pedagogies date back to the 1990s. One of the earliest published efforts along these lines was the “studio physics” concept implemented at Rensselaer Polytechnic Institute ( Wilson and Jennings, 2000 ; Wilson, 1994 ). A study of institutional costs of large lecture approaches to teaching introductory physics (which, as Wilson found, incurred large hidden expenses in the need for staffing labs and recitation sections) along with contemporary findings on the pedagogical shortcomings of lecture methods ( Halloun and Hestenes, 1985 ) drove the design of classrooms seating 50 and 64 students each, with six-foot work tables holding two students each, arranged in concentric ovals around the center of the room.
The physics classes taught in these “studio” spaces were redesigned to focus on active lab-like activities done in groups of two students, seated so that they must turn away from the center of the room to work on their activities. Unlike traditional science courses that separate lecture and lab activities, studio physics courses conducted all learning activities in the same space.
Later, in an effort to adapt the studio physics concept to larger course sections, Robert Beichner and others at North Carolina State University redesigned their introductory physics courses using a combination of innovations in pedagogy, technology, and space in what they eventually deemed the Student Centered Active Learning Environment with Upside-down Pedagogy, or SCALE-UP ( Beichner, 1999 ; Beichner et al., 2007 ). As with studio physics, SCALE-UP physics courses combined changes in pedagogy, technology, and space to refocus class activity on active learning in a combination of lecture, recitation, and lab tasks.
Students in SCALE-UP rooms are arranged at large circular tables seating nine students each (so they can work in three small groups of three students each, then combine into a larger group of nine), with affordances for technology such as multiple large projection screens for computer work and plentiful whiteboard space, and with no clearly defined front of the room.
A similar initiative called Technology Enabled Active Learning (TEAL), also involving large polycentrically-designed rooms with round tables for active learning tasks, was undertaken at around the same time at MIT ( Dori and Belcher, 2005 ) and a few years later at Iowa State University with the TILE (Transform, Interact, Learn, Engage) project ( Van Horne, Murniati, Gaffney and Jesse, 2012 ).
While these ALC implementations were taking place in higher education, parallel investigations done by architects and designers were producing evidence against traditional lecture-focused learning spaces. One of the most prominent of these studies ( Scott-Webber et al., 2000 ), simply titled “Higher Education Classrooms Fail to Meet Needs of Faculty and Students”, used a design based on the concept of proxemics ( Hall, 1959 ) to conduct a qualitative study of faculty and students’ responses to traditional classrooms. The study found that key elements of the physical environment such as seating and noise control interfered with active learning and suppressed social activity and affective responses.
In addition to studies like that of Scott-Webber, Abraham, and Marini, research on ALCs began to accumulate shortly after their initial implementation, often conducted by the faculty members teaching in the space. Beichner's original report on the early implementation of SCALE-UP ( Beichner, 1999 ) and the later, much-expanded follow-up study involving 16,000 students and multiple universities ( Beichner et al., 2007 ) found remarkable improvements in student learning and engagement in SCALE-UP; these results were corroborated by research from the TEAL and TILE groups and later by others.
These promising results have helped fuel a surge in interest for ALCs among K12 and higher education institutions worldwide. EDUCAUSE named 2017 as “the year of the active learning classroom” ( Brooks et al., 2017 ) and placed ALCs at the top of the list of higher education's top 10 strategic technologies for 2017 ( Grajek, 2015 ). At Steelcase, Inc., a program to award 16 grants of up to $65,000 each to fund the construction of ALCs received over 1100 applications in 2018 from K12 and higher education institutions in North America. The interest in ALCs from many different educational sectors is strong and growing.
This interest has come with a common question: What evidence is there that active learning classrooms actually make a difference in the issues that matter to learners and educational institutions, such as student learning outcomes, student engagement and retention, and faculty adoption of research-based pedagogical practices? Put more simply, what assurances are there that institutions who expend large amounts of financial and other resources to install ALCs will reap positive benefits from doing so?
The purpose of this study is to examine the published literature up to the time of this writing to analyze and summarize the research on active learning classrooms, both in general and organized around several categories of importance to researchers and educational institutions alike. We seek to discern patterns of results among the studies we selected to discover the answers to the above questions about ALCs as well as best practices for using ALCs and opportunities for further research in this area.
2. Material and methods
2.1. Research questions
The main question that this study intends to investigate is: What are the effects of the use of ALCs on student learning, faculty teaching, and institutional cultures? Within this broad overall question, we will focus on four research questions:
1. What effects do ALCs have on measurable metrics of student academic achievement? Included in such metrics are measures such as exam scores, course grades, and learning gains on pre/post-test measures, along with data on the acquisition of “21st Century Skills”, which we will define using a framework ( OCDE, 2009 ) which groups “21st Century Skills” into skills pertaining to information, communication, and ethical/social impact.
2. What effects do ALCs have on student engagement? Specifically, we examine results pertaining to affective, behavioral, and cognitive elements of the idea of “engagement” as well as results that cut across these categories.
3. What effect do ALCs have on the pedagogical practices and behaviors of instructors? In addition to their effects on students, we are also interested in the effects of ALCs on the instructors who use them. Specifically, we are interested in how ALCs affect instructor attitudes toward and implementations of active learning, how ALCs influence faculty adoption of active learning pedagogies, and how the use of ALCs affects instructors’ general and environmental behavior.
4. What specific design elements of ALCs contribute significantly to the above effects? Finally, we seek to identify the critical elements of ALCs that contribute the most to their effects on student learning and instructor performance, including affordances and elements of design, architecture, and technology integration.
2.2. Identification of relevant studies
To find studies relevant to our research questions, we performed academic database queries using the databases ERIC ( eric.ed.gov ), ProQuest Education, LearnTechLib, Google Scholar, and Education Research Complete. We used the following terms for these queries:
Active learning classroom
Active learning space
Active learning environment
Active learning center
We discovered after our initial search that these terms are fraught with ambiguity. For example, many studies from searches on the term “active learning classroom” did not use the term “classroom” to refer to a physical space, but rather to a group of students (e.g. “My biology course is an active learning classroom” or “my classroom engages in active learning”). Similarly, the terms “space” and “environment” were sometimes used to refer to course design rather than physical space or environment (e.g., “I designed my course to be an active learning environment for students”). Therefore, the results of the database search were checked manually to ensure that the studies referred to physical spaces.
Three search queries were found to yield results that address our research questions: “active learning classroom”, “active learning space”, and “active learning center”. Those queries are hereafter referred to as Query 1, Query 2, and Query 3 respectively. Each query was performed on each of the databases listed above, with each search filtered to include only those studies that were peer-reviewed and published in scholarly journals. Ultimately, we used only the search results from the first three of these databases; Google Scholar lacked the tools to sufficiently narrow our search based on inclusion criteria (below), and Education Research Complete yielded only a few results, all of which were duplicated by the other databases.
We also noted that some studies on ALCs do not use the language in our queries. For example, the studies of Byers and Imms use the term “21st century classrooms” or “next generation learning spaces” which are not found elsewhere in the literature. Earlier studies do not use any special terminology referring to ALCs at all. Therefore, to capture these “false negatives” (studies known to pertain to ALCs but which did not appear in our searches), we used three known sources of information on ALCs: the literature review of Donna Gierdowski (2013) , the studies referenced in Baepler et al. (2016) , and the papers of Terry Byers and Wesley Imms along with studies found in the reference sections of the Byers and Imms papers.
2.3. Selection of studies
Performing Queries 1–3 on the ProQuest, LearnTechLib, and ERIC databases yielded 99, 35, and 117 results respectively, a large number of which were duplicates. Additionally, the reference section search using Brooks et al., Gierdowski, and the papers of Byers and Imms yielded 60 total studies with some duplicates. These results were filtered through the following inclusion criteria:
We include only those studies that pertain to ALCs and not just learning spaces in general.
We include only those studies that examine an intervention involving the introduction of an ALC to an actual student learning situation such as a class or multiple sections of a class.
We include only those studies that collect empirical qualitative or quantitative data on the effects of the intervention towards one or more of our four research questions, with sufficiently scientific methodology and in sufficient quantities to merit rigorous analysis.
For example, we excluded studies that examined the effects of space on student learning but which did not specifically involve ALCs (for example, Meyers-Levy and Zhu, 2007 ). We also excluded studies that examined the nature, architecture, or properties of ALCs but did not perform an intervention on students or which did not collect qualitative or quantitative data. We also excluded studies that focused on ALCs and did collect some data, but only collected anecdotal evidence or evidence from only a very small number of respondents.
After removing duplicate search results and filtering out studies that did not meet the inclusion criteria, the total number of distinct studies from the database queries was 21. We also found 60 studies in the references in Brooks et al., Gierdowski, and Byers and Imms; after removing duplicate entries and applying the inclusion criteria, 55 studies remained from these non-database references. The database and reference section results were then combined, duplicates were removed, and a final round of the inclusion criteria was applied, to arrive at a final count of 37 studies.
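The dedup-and-filter bookkeeping described above can be sketched in a few lines of Python. This is purely illustrative: the study IDs and the inclusion predicate are invented stand-ins, not the review's actual search results.

```python
# Invented study IDs standing in for the real search results.
database_hits = {"study_a", "study_b", "study_c"}   # from the database queries
reference_hits = {"study_b", "study_d"}             # from reference-section mining

# Set union removes studies found by both sources.
combined = database_hits | reference_hits

# Stand-in inclusion screen: pretend study_d fails the criteria.
def meets_inclusion_criteria(study_id):
    return study_id != "study_d"

included = sorted(s for s in combined if meets_inclusion_criteria(s))
print(len(included))  # → 3
```

The same union-then-filter pattern, applied to the real 21 database results and 55 reference-section results, yields the review's final count of 37 studies.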
2.4. Data charting and collation
Using the studies identified through our searches and inclusion criteria, we read each article carefully and compiled notes on the setup, methods, and results of each. We summarized each study in a table with entries for the authors and year of the study; the context (primary education, higher education, etc.) and location of the study; the type of intervention done; the design of the study; and a summary of the results. This table can be found in Appendix A. An electronic version, suitable for downloading and data analysis, can be found at the following link: http://bit.ly/ALC-litreview-table .
2.5. Overall characteristics of the results
All of the studies identified through our search took place with actual students in educational settings (as opposed to experimental studies done in controlled laboratory-like settings). The majority of these (32 out of 37) were conducted in higher education, with university courses and university students. Of the remaining studies, two were conducted in secondary education settings; two more were conducted in late primary/early secondary education settings (specifically, students in years 7 and 8 of formal schooling); and one was conducted in a primary education setting.
In addition, 27 of the 37 studies were conducted in the United States of America (and all of these were done in higher education settings). Half (5 out of 10) of the remaining studies were conducted in Australia in research connected with Terry Byers and Wesley Imms; the rest took place in various other geographic and cultural contexts.
Finally, we note that most of the research identified in this study took place between 2012 and 2016.
2.6. Methodological characteristics of the studies
The methodologies employed by the 37 studies that met our inclusion criteria varied widely. We have summarized the design and results of each of the studies in Appendix A: Summary Table of Results where the reader can find abbreviated information on the educational context, type of intervention, study design (including sample sizes and type of design), which of our research questions were addressed by each study, and results of each of the studies in this review. Because of the wide variation in the studies we considered, a few notes are in order:
The sample sizes used in the studies were sometimes not specified in the studies themselves, or different populations and samples were used within single studies. For example, some studies did not list their population sizes at all ( Connolly and Lampe, 2016 ), while others discussed the number of classes that were studied but not the number of students ( Hyun et al., 2017 ), while still others studied both students and faculty and give sample sizes for both.
We have attempted to note which of our four research questions were addressed by each study in Appendix A, and sum up how many studies address each research question in Section 3 (Results). However, it was not clear in some cases whether a particular research question was being addressed by a particular study. In particular, all of the studies in this review in some sense address research question 4 (Which elements of design most affect student and instructor outcomes?) but only a few of those studies address this question directly as an intentional element of the research design. In other cases, the line between student learning outcomes and student engagement in the design and results of a study was blurry. Therefore our estimation of which questions were addressed by each study is a judgment call.
More specific comments on the methodologies of the studies involved will be given in Section 4 (Discussion).
3. Results

3.1. What effects do ALCs have on measurable metrics of student academic achievement?
In this section we address the research question, What effects do ALCs have on measurable metrics of student academic achievement? We separate these metrics of academic achievement into two categories: grades and other quantitative outcomes of learning that are focused on content mastery, and so-called “21st century skills” focused on non-content academic skills such as the management of information. Of the 37 studies in this review, we found 10 that address this particular question.
3.1.1. Quantitative measures of student learning
Many of the studies in this review focused on quantitative measures of student learning, generally grouped into three categories: grades on individual assignments (e.g. final exams) or groups of assignments, course grades, and learning gains on pre- versus post-test measures of concept inventories. We now summarize those results using each of those categories.
The earliest study in this review, by Oliver-Hoyo et al. (2004) , focuses on assignment grades. The researchers used an implementation of the SCALE-UP model in an introductory university chemistry class that employed the “concept Advancement through chemistry Lab-Lecture” (cAcL2) pedagogical model, in which lecture and laboratory components are done together rather than separately. The study compared the grade outcomes on various assignments for students in the ALC section of the course with those of similar students in a traditional setting in which lecture and labs were conducted separately. They found no statistically significant differences between the sections when averaging across exams for each section. However, the exam × class interaction was significant; students in the ALC section scored significantly higher on the second and fourth exams than the students in the traditional section. There were no statistically significant differences on the remaining exams, although the bottom 25% of students in the ALC section scored significantly higher than the bottom 25% of students in the traditional section on the last three exams. This study illustrates two results that appear elsewhere in this review: (1) ALC sections of courses often require time for students to acclimate to the space before positive results are seen, and (2) ALC sections tend to show the strongest learning gains among the lowest-performing students.
In a study with a different focus, Muthyala and Wei (2013) compared the exam scores of students in two different layouts of ALCs, rather than ALC vs. traditional space. The two layouts, “Spoke” (students seated at tables radiating outward from the center of the room) and “Node” (students seated in moveable chairs in clusters of four; not to be confused with the Steelcase seating product of the same name), were used in two sections of an introductory organic chemistry class and two sections of a more advanced organic chemistry class, and student performance on exams between sections of the same course were compared. The study found no statistically significant differences on a composite “summative evaluation” score composed of exam scores, the final exam, and the course grade between the ALC and traditional sections. However, there was a non-significant but “noticeable” difference in the first exam, with the “Node” layout performing better than the “Spoke” layout, but on subsequent exams this was reversed, with “Spoke” having higher exam grades. The authors posit that the differences in the first exam score are due to the similarity of “Node” to more traditional layouts, but then as students become more comfortable with collaboration, they become more attuned to the affordances of the “Spoke” layout.
A study by Brooks and Solheim (2014) examined both the course grade and individual assignment grades in a SCALE-UP environment compared to a traditional space, using two sections of a university-level personal finance course. They found statistically significant differences in the course grades, with the ALC students scoring higher than the traditional section (85.5% versus 81.8%). In examining whether the differences in course grades might be an artifact of a single assignment (e.g. higher participation rates in the ALC section) rather than systematic improvement, average scores in each major group of assignments (attendance, a financial planner project, case studies, the final exam, and quizzes) were compared. The students in the ALC section scored significantly higher in each group than did students in the traditional section, at either p < 0.001 or p < 0.0001 levels.
Shifting focus to course grades alone, Brooks and Solheim (2014) also examined the differences in course grades between students in the quartiles of each section and found statistically significant difference in each quartile, with ALC students having higher course grades than their traditional section counterparts in every quartile. The grade differences in the first, second, and third quartiles were significant at the p < 0.0001 level, while the grade differences in the fourth quartile were significant but only at the p < 0.05 level.
Several other studies give results on the course grades of students in ALCs, either by comparing course grades between ALC and traditional sections of courses or by comparing proportions of students who fail courses in an ALC versus a traditional section. Beichner et al. (2007) report that the pilot implementation of SCALE-UP across several institutions resulted in a 40–60% reduction in the failure rates of students in introductory university physics courses compared to students in the same course in a traditional space. The same study examined “failure ratios” in introductory physics courses taught in SCALE-UP versus traditional environments; this term is defined to be the percentage of students who failed the course taught in a traditional space divided by the percentage of students who failed the course taught in the SCALE-UP space. Failure ratios were computed for students in various ethnic and gender groups, and the results ranged from 2.1 for Asian-American students to 4.7 for female students. The failure ratio for Hispanic students could not be computed because none of the Hispanic students failed the SCALE-UP courses.
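The failure-ratio calculation is simple arithmetic; a minimal Python sketch (our own hypothetical helper, with made-up failure percentages, not the studies' data) makes the definition concrete:

```python
def failure_ratio(pct_fail_traditional, pct_fail_scaleup):
    """Ratio of the traditional-section failure rate to the SCALE-UP failure rate.

    A ratio above 1 means fewer students failed in the SCALE-UP space.
    Returns None when no students failed in SCALE-UP, since the ratio is
    then undefined (as happened for Hispanic students in the SCALE-UP data).
    """
    if pct_fail_scaleup == 0:
        return None
    return pct_fail_traditional / pct_fail_scaleup

# Illustrative (invented) failure percentages:
print(failure_ratio(21.0, 10.0))  # → 2.1
print(failure_ratio(15.0, 0.0))   # → None
```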
Three related studies ( Brooks, 2011 ; Cotner et al., 2013 ; Walker et al., 2011 ) again studied the course grades of students in ALC sections of courses versus those of students in different sections of the same course taught under similar conditions but in a traditional space. In each of these studies, students in the ALC section were found to have significantly lower scores on their American College Test (ACT) scores than students in the corresponding traditional section. In each study, a regression model showed a statistically significant correlation between ACT score and final course grade based on past data. All three studies found no statistically significant difference between the final course grades of students in an ALC section versus those in the traditional section --- even though the regression model predicted the ALC students would have significantly lower grades. By contrast, students in the traditional sections obtained course grades in line with what the regression model predicted. The authors conclude that the ALC environment served as a catalyst for students to perform better than reliable statistical models would predict.
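The logic of these three studies can be sketched as: fit a regression of course grade on ACT score using historical data, then check whether a section's actual grades exceed what the model predicts. The following pure-Python illustration uses invented numbers and our own helper functions; it is a sketch of the analytical idea, not the studies' actual models:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    return slope, y_bar - slope * x_bar

def mean_residual(xs, ys, slope, intercept):
    """Average amount by which actual grades exceed the model's predictions."""
    return sum(y - (slope * x + intercept) for x, y in zip(xs, ys)) / len(xs)

# Historical data: grade roughly tracks ACT score (invented numbers).
hist_act = [20, 22, 24, 26, 28, 30]
hist_grade = [70, 74, 78, 82, 86, 90]
slope, intercept = fit_line(hist_act, hist_grade)

# An "ALC section" whose grades sit above what its ACT scores would predict:
alc_act = [20, 22, 24]
alc_grade = [76, 80, 84]
print(mean_residual(alc_act, alc_grade, slope, intercept))  # positive → outperforming
```

A positive mean residual for the ALC section, alongside a near-zero residual for the traditional section, corresponds to the pattern the three studies report.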
One study ( Baepler et al., 2014 ) examined the use of an ALC in a large (350-student) undergraduate general chemistry class by splitting the class into three cohorts of roughly 100–115 students each, and having each cohort meet in a 117-student SCALE-UP room once per week, instead of the entire class meeting in a traditional space three times per week. The remaining two days per week of activities for each cohort were spent working through online activities. Two sections of the course were blended in this way, and a control section was kept intact in a traditional space. Grade and demographic data were collected as well as two examinations developed by the American Chemical Society and a single exam generated by the instructors of the class. The researchers found moderate performance differences between one of the experimental blended groups and the control group, and no statistically significant difference between the other blended group and the control. After controlling for demographic and aptitude variables, however, the blended approach yielded results that were the same or better than the traditional approach.
Finally, we look at studies that examine normalized gains on concept inventories. The normalized gain ( Coletta and Phillips, 2005 ) is a numerical measure computed using the outcomes of a pre-test and a post-test administration of an assessment (in this case, a concept inventory). The normalized gain is defined as the ratio of the actual student gain between pre-test and post-test to the maximum theoretical gain:

normalized gain = (post-test % − pre-test %) / (100% − pre-test %)

The normalized gain is therefore a number between 0 and 1 inclusive, with a higher score indicating greater improvement. A score of 0 means the pre- and post-test scores were the same, indicating no improvement; a score of 1 indicates a perfect score on the post-test, i.e. the maximum amount of improvement has occurred. Typically, the normalized gain is computed for entire classes using class averages, rather than one student at a time.
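Given class-average pre- and post-test percentages, the computation is straightforward. Here is a small illustrative Python helper (our own sketch, not code from the cited studies):

```python
def normalized_gain(pre_pct, post_pct):
    """Normalized gain from class-average pre/post percentages (0-100 scale).

    Returns (post - pre) / (100 - pre): the fraction of the maximum
    possible improvement that the class actually achieved.
    """
    if pre_pct >= 100:
        raise ValueError("pre-test average must be below 100%")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# A class averaging 40% before and 70% after closed half the possible gap:
print(normalized_gain(40, 70))  # → 0.5
```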
Beichner et al. (2007) examined normalized gains across students in university physics courses at several different sites using different concept inventory instruments. In one setting, students in a SCALE-UP section of an electricity and magnetism course showed significantly higher normalized gains on the Conceptual Survey of Electricity and Magnetism, the Electric Circuit Conceptual Evaluation, and the MIT Electricity and Magnetism Test, than students in a traditional setting, with the largest differences in normalized gains found among students in the top one-third of the courses. Students in SCALE-UP sections of the mechanics portion of an introductory physics course showed significantly higher normalized gains on the Force Concept Inventory compared to students in a traditional section, with the traditional section obtaining normalized gains of 0.204 and 0.176 for regular and honors sections of the courses (respectively) compared with normalized gains of 0.483 and 0.477 for regular and honors sections of the SCALE-UP course.
3.1.2. “21st century skills”
ALCs are generally considered to have an effect not only on content-oriented measures of student learning but also on proficiencies often called “21st century skills”. We examined the studies in this review for evidence of the impact of ALCs on “21st century skills” as a form of measurable student learning outcome, separate from and orthogonal to content-based measures of achievement.
Despite their frequent use in current discourse about education, there is considerable disagreement on what constitutes “21st century skills”. A common rubric for these skills is “The Four C's” --- communication, creativity, collaboration, and critical thinking ( Framework for 21st Century Learning - P21, 2019 ). However, there is in fact no standard operationalization of this term. In many contexts, “21st century skills” refers to a set of competencies associated with processing and communicating information. In others, it refers to items more properly characterized as behaviors, such as motivation, perseverance or “grit”, or flexibility. Ananiadou and Claro (2009) point out that any list of “21st century skills” is dependent on culture, with different world cultures placing different values on individual skills.
To provide some standardization for interpreting the results of studies that purport to examine “21st century skills”, we will adopt the framework presented by the Organization for Economic Cooperation and Development ( OCDE, 2009 ), which categorizes “21st century skills” into three dimensions:
Information : This dimension is subdivided into information as source , which includes skills pertaining to finding and organizing information such as information literacy, research and inquiry, and critical analysis; and information as product , which includes skills at restructuring, modeling, and developing one's own ideas about the meaning of information. The OECD framework includes “creativity” and “innovation” in this sub-dimension. Although not explicitly included in the OECD framework, we include general “problem solving skills” in this dimension as well, since this concept occurs frequently in other discussions of “21st century skills” and is a combination of working with information as a source and as a product.
Communication : This dimension is also subdivided, into effective communication, which includes skills in critical thinking and interpersonal communication; and collaboration and virtual interaction , including skills such as teamwork and flexibility.
Ethical and social impact : As with the other two, this dimension is subdivided into social responsibility , including skills in critical thinking (seen as distinct from critical thinking in the communication dimension, although with overlap between the two), personal responsibility and decision making, and social impact which includes such skills as reflection and “digital citizenship”.
Note that this framework includes all of the “Four C's”, although “critical thinking” is mapped to two different dimensions. Critical thinking in this framework can refer to two skills: The mindful use of information when engaging in discourse with another (Communication), and the analysis of information to make decisions (Ethical/Social Impact).
We first highlight a study by Nissim et al. (2016) because it addresses a wide range of 21st century skills in an ALC used by pre-service teachers in Israel. Students were given the Active Learning Post Occupancy Evaluation (AL-POE) ( Scott-Webber et al., 2014 ), which contains items in several areas related to what the authors describe as “21st century skills”. For the purposes of this review, we will only focus on those skills that fit the OECD framework given above. The study found that:
100% of students and approximately 95% of instructors reported moderate, high, or exceptional increases in the ability to be creative in the ALC.
When asked to rate both their old/traditional classrooms and their ALCs as “adequate” on several different factors, student ratings were significantly higher for ALCs (at p < 0.002) in the following areas: collaboration, active involvement, and in-class feedback, for both pedagogical practices and pedagogical solutions. (Several other factors that do not fit into the OECD 21st century skills framework were similarly highly rated.)
Other studies in the review have more targeted results in each of the three dimensions of the OECD framework:
Information dimension : The SCALE-UP report from Beichner et al. (2007) reports not only significant improvements on student learning outcomes, as described in the previous section, but also that students’ “[a]bility to solve problems is as good or better” in the SCALE-UP environment as in a traditional college physics setting. The authors appear to extrapolate this observation from the data on physics content outcomes, rather than collecting additional data on general problem-solving skills. A study by Rands and Gansemer-Topf (2017) reports students using tools afforded by ALCs to engage spontaneously in visualizing information, even when not explicitly instructed to do so. Finally, in a very large-scale study ( Chiu and Cheng, 2016 ) involving n = 35,953 students in general education courses taught in ALCs at a Hong Kong university, students reported significantly better learning experiences overall and a significantly higher rating for “encouragement to be creative and innovative” when compared to the same course taught in a traditional space.
Communication dimension : Some studies of ALCs report students, instructors, or both stating that the ALC “affords” improvements in communication and collaboration ( Connolly and Lampe, 2016 ; Rands and Gansemer-Topf, 2017 ) --- that is, the respondents in those studies imply that communication and collaboration take place in an ALC at a level that would not be present in a traditional classroom space. (We note that the data reported in Connolly and Lampe (2016) are quite limited; survey results were not reported, and only three individual responses from student focus groups are given.) Additionally, Cotner et al. (2013) , in an observational study, report that students in an ALC objectively engage in more group activity and collaboration than in a traditional classroom space. Finally, students in the study by Byers and Imms ( Byers and Imms, 2016 ) report that significantly more interactivity and collaboration take place in an ALC than in a traditional space.
Ethical/social impact dimension : The only study in the review that reports data on this dimension of 21st Century Skills is ( Chen, 2014 ), in which 28 fourth-year students in psychology were instructed in an ALC and given the Active Open-Minded Thinking (AOT) questionnaire ( Stanovich and West, 1997 ) as a pre- and post-test measure. Chen (2014) explains that AOT is “a person's ability to actively reflect on his/her thinking, actively seek and process information that contradicts his/her beliefs, and be willing to alter his/her mindset after carefully considering opposing beliefs” (pp. 173–174). Chen found that students’ AOT generally increased over the term, although increases among students with high initial AOT were negligible.
3.2. What effects do ALCs have on student engagement?
The second research question we investigated is, What effects do ALCs have on student engagement? This dimension of the student learning experience is connected to, but quite distinct from, the effects on student learning outcomes noted in the previous section. A large number of the studies in this review (31 out of 37, by our estimation) targeted student engagement, and there has been a strong and sustained interest in “engaged learning” at all levels of discussion on teaching and learning. The results of the studies in this review that specifically target “engagement” are therefore numerous, widely varied, and of potential use to many school leaders and educators.
However, we must first come to terms with the meaning of the word “engagement” as it relates to learning. Unfortunately, despite the wide usage of “engagement” in academic and popular discourse, the term remains poorly operationalized. At times, it is defined in terms that describe certain student activities and behaviors, while at others it describes what parents, teachers, and school leaders do to elicit those activities and behaviors. Even when confined just to students, “engagement” can sometimes refer to feelings and emotions; in other cases, to the things students do, say, or intend; in still other situations, to beliefs or perceptions that students hold; or a combination of these, often undifferentiated between the aforementioned aspects of “engagement”. There is no consensus on a definition of “engagement” from which to conduct a rigorous analysis of research on the subject; and given the wide array of meanings of this term in actual use, such a definition is unlikely to come to light in the near future.
Therefore, in this literature review, we will not attempt to reconcile the various concepts of “engagement” that permeate discussions of teaching and learning. We will only give some background on the origins of the term and then state a model for “engagement” that we will use to parse and frame the various results in this review's studies that pertain to engagement.
As Axelson and Flick (2010) point out, the term “engage” originally derives from a Norman word meaning “pledge”, so that to “engage” originally meant to enter into a binding obligation to another through oaths or laws. (The term “mortgage” derives from the same root and has a similar connotation.) The modern usage of the word is similar: To be “engaged” means to pledge oneself to a binding involvement with something or someone, much like the usage of this term in the context of a pledge to be married. In an academic setting, “engagement” can therefore be understood as a state of committed involvement, in an activity into which we have entered willingly and with the intention to complete, and in which “we are entirely present and not somewhere else” (Axelson & Flick, p. 40, emphasis in original).
The concept of involvement speaks to a more modern origin of the concept of academic engagement. Alexander Astin's research on student involvement in the 1980's ( Astin, 1999 ) is considered by many scholars to be the forerunner of the modern notion of “engagement”. Astin's theory of involvement “refers to the investment of physical and psychological energy in various objects” ( Astin (1999) , p. 519); occurs along a continuum with different students contributing different levels of investment in the same objects; and has both quantitative and qualitative aspects. Astin further ties the quantity of student learning and development as well as the effectiveness of educational policy to the quality and quantity of student involvement.
In later explorations of “engagement”, the qualitative and quantitative features of “involvement” tend to take on three distinct, yet strongly overlapping categories: affective forms of engagement, which pertain to feelings and emotions attached to student involvement (or the lack thereof); behavioral evidence of engagement, describing student actions that indicate involvement; and cognitive engagement, referring to beliefs, knowledge, and perceptions connected to involvement.
We note that this tripartite framework aligns well with the “ABC” (Affective/Behavioral/Cognitive) model of attitude promulgated by Eagly and Chaiken (for example ( Eagly and Chaiken, 1998 ),) and others. Indeed, the modern concept of student engagement is closely tied to student attitudes about learning, and thus this framework for engagement is appropriate for the results found in this review.
As we begin to examine the studies on ALCs and engagement, we will use the above framework for engagement and an understanding of the etymology of the word to guide our understanding of the results. In those results, we identified two major groups of studies about engagement: Those that focused on only one of the three particular facets of engagement in our model, and those that addressed more than one form of engagement.
3.2.1. General or undifferentiated results about engagement
We begin by examining studies whose focus on engagement either did not differentiate between the three areas of engagement in our model, or did recognize the different areas but examined more than one of them.
We first highlight two results from our review that explicitly connect the elements of ALCs to various facets of student engagement. These center on the use of the Active Learning Post-Occupancy Evaluation (AL-POE) instrument developed by Scott-Webber et al. (2014) to gather quantitative data about the impact of ALC's on engagement.
Scott-Webber et al. (2014) developed the AL-POE to measure student engagement as a behavioral indicator, using twelve parameters:
“…collaboration
active involvement
opportunity to engage
repeated exposure to the material through multiple means
in-class feedback
real-life scenarios
ability to engage
physical movement
stimulation
feeling comfortable to participate
creation of enriching experience” (p. 4).
The result was an instrument that sought to measure the effect of evidence-based, intentionally designed solution interventions on student engagement in the formal learning place. The AL-POE compares the old/pre environment (row-by-column seating) with the new/post intentionally designed environment on identified student engagement factors: respondents are asked to rate the adequacy of the old learning space and of the new learning space on factors connected to the twelve parameters above.
In Scott-Webber et al. (2014) , three different universities participated in a study of 124 students and educators in three types of new environments, each outfitted with a combination of intentionally designed active learning furniture from Steelcase, Inc. The study compared the experience in a traditional classroom with the new experience in the active learning classrooms. The results showed statistically significant differences (p < 0.001) on all twelve factors between behavior in a traditional classroom and behavior in the ALC. A later study by Scott-Webber and others ( Scott-Webber et al., 2017 ), while not included in our main results because it does not involve ALCs and therefore does not meet our inclusion criteria, verifies that active learning techniques and physical movement are significant indicators of student engagement and that the design of the built environment can affect the connection between pedagogy and engagement.
A later study that also used the AL-POE ( Nissim et al., 2016 ) administered the instrument to 87 students and 15 instructors in an Israeli teacher training college and used the results to examine self-ratings of whether ALCs enhanced their creativity, motivation, engagement in general, and perceived ability to achieve a higher grade. Over 90% of instructors and 80% of students reported moderate, high, or exceptional increases in these factors in moving from a traditional space to an ALC. Further, an “overall engagement score” was calculated from the AL-POE results. The overall engagement scores for both practices and solutions in the ALC space were statistically significantly higher (at the p < 0.0001 level) than those scores in the traditional space.
A similar line of inquiry about engagement, in which students are exposed to both traditional and ALC spaces and then asked to rate each space numerically using various parameters, can be found in the papers of Australian researchers Terry Byers and Wesley Imms and their co-investigators. The studies of Byers and Imms share a common methodology, namely a single-subject research design (SSRD). Under this design, a single cohort of students (and sometimes instructors) experiences both a “control” condition (learning in a traditional space) and then an “experimental” condition (learning in an ALC), with all other variables such as pedagogy and technology held relatively constant. In most of Byers and Imms’ work, the students return to the traditional space for a third phase, and in some studies the students return again to the ALC for a fourth phase. At the beginning and end of each phase, students are given a survey that measures self-reported data on various aspects of the learning process, resulting in between four and eight sets of comparative data. A feature of Byers and Imms’ design is that major confounding variables such as pedagogical techniques and the use of technology are controlled for, allowing space to be the only true variable in the study and hence creating a much stronger connection between the variance of space and the results of the survey. Byers and Imms, in fact, claim a truly causal link between space and their results.
In 2014, Byers and Imms conducted a study that produced two separate papers ( Byers and Imms, 2014 ; Byers et al., 2014 ) connecting ALCs to improvements in engagement. The study involved 164 students, grouped into six classes comprising both year 7 and year 8 students, at a primary/secondary religious school in three buildings constructed between 1940 and 1960. Two groups of classroom spaces were involved in the study: a traditional space, and a “next generation learning space” with a polycentric layout allowing three different teaching and learning modalities to exist in the same space. Students learned first in the traditional space and then in the ALC, taking pre- and post-measure surveys in each phase.
The first paper ( Byers and Imms, 2014 ) focused on survey results pertaining to students’ perceptions of the effectiveness of using one-to-one technology in their learning. The researchers found statistically significant higher ratings of ALCs versus traditional spaces in the areas of "positive influence” (i.e. whether students perceived the use of technology as having a positive influence on their learning), "effectiveness", and "flexibility", in 11 out of the 18 possible combinations of student class and question type. Furthermore, a teacher focus group was held to follow up the surveys; in the focus group, teachers reported the ALCs afforded higher levels of student collaboration.
Whereas Byers and Imms (2014) reports on somewhat narrow facets of student engagement, the second set of results from this study ( Byers et al., 2014 ) takes a broader view, reporting back on survey results that target engagement generally speaking. The researchers found that the ALC phase of the study generated significantly higher ratings on "student learning experience” and "student engagement” in all but one class’ response to one group of questions, and the effect sizes suggest "an improvement of 1–2 standard deviations from the traditional classroom mean” (p. 8).
Another study from Byers and Imms ( Byers and Imms, 2016 ) involved 94 year 4 students in an Australian primary school. (We note that this is the only study identified in the review focusing specifically on primary schools.) The cohort attended class sessions in two spaces: a traditional space featuring rows of desks facing a well-defined front of the room, and an ALC with a polycentric layout and adjustable-height tables. Both spaces were outfitted with technology such as interactive whiteboards, wireless internet connectivity, and tablet computers. Following the pattern of single-subject research design, students spent a period of time in the traditional space, then a similar period in the ALC, and then moved back to the traditional space. At the beginning and end of each phase, students were given the Linking Technology Pedagogy and Space (LTPS) survey instrument, which poses 10 items in three domains regarding students’ perceptions of technology, learning experiences, and engagement. (According to our model, all three of these domains can be considered “engagement”.) Students were grouped into four classes, giving 40 possible combinations of class and survey item. The researchers found statistically significantly higher ratings for the ALC in 22 out of 40 of those combinations spanning all three domains, with significant results in all four classes on items pertaining to the positive influence and effectiveness of technology, and increased interest in learning. Results were significantly higher in 3 of the 4 classes in the areas of increased interactivity, collaboration, and preference for a space to learn.
A final study from Byers and Imms ( Imms and Byers, 2017 ) studied three classes (n = 170) of year 7 students in three different learning spaces: A traditional space (referred to as “mode 1”), a student-focused polycentric ALC (“mode 2”), and a “mode 3” learning space “where social activities overlap informal and active learning activities in spaces such as covered outdoor learning areas, hallway ‘nooks’ and lounge-styled rooms” (p. 140). Each class rotated through all three spaces, spending one academic term in each and taking the LTPS at the beginning and end of each term. The study found mixed results. On the LTPS items pertaining to student engagement (“domain C” in the language of the LTPS), significantly higher ratings were given for the mode 3 classroom compared to the mode 1 (traditional) space, but not for the mode 2 space; significantly higher ratings for students’ willingness to take on challenges were obtained for both mode 2 and mode 3 compared to the traditional mode 1 space; and ratings for students’ willingness to work beyond their limits of expertise were statistically significantly higher only for mode 3, and only among the high-ability students.
To round out our discussion of studies targeting general notions of engagement, we look at four other studies:
A study by Whiteside et al. (2010) , one of several conducted at the University of Minnesota, involved a quasi-experimental design in which one section of an introductory biology course was taught in a traditional lecture space, while another was taught in an ALC designed roughly according to the SCALE-UP model (large round tables seating 9 students each, ubiquitous projection technology, accessibility to whiteboards, etc.). An observational analysis was conducted to record the frequency and type of faculty and student behaviors in each section. The study found that students reported significantly higher ratings for ALCs on several parameters associated with engagement. Students rated ALCs higher for contributions to their class engagement overall, the enrichment of their learning experiences, the flexibility of their learning, and the fit of the classroom to the course. Additionally, there were significant differences in how students rated the different spaces based on their geographic background (urban versus rural) and their classification (freshman/sophomore versus junior/senior).
In another study done at the University of Minnesota ( Cotner et al., 2013 ), again two sections of an introductory biology course participated in a quasi-experiment in which one section was conducted in an ALC and the other in a traditional space. Surveys on student perceptions of those spaces were given during the last week of class, personal data were collected from students, and observations were conducted on a randomly selected 50% of those students with a focus on their on-task behavior and specific other learning behaviors. The study found significantly higher levels of engagement (measured generally through the survey and observational data) in the ALC and found significant positive correlation between the use of ALCs and the use of group activities.
In a study of social science and literature classes in a Korean university ( Park and Choi, 2014 ), one group of classes was conducted in an ALC holding 30 students at five circular tables outfitted with computer docking stations and projectors. Another group was held in a traditional space with a row-column arrangement of desks. Both groups were given a survey on their satisfaction with the rooms in which they learned. Students in the traditional space showed a preference for certain seating locations, which the authors termed the “golden zone”, over a less preferred “shadow zone”, and it was observed that the two zones led to different learning outcomes --- with one's position in the room strongly dependent on arrival time and proximity to friends. Students in the ALC, on the other hand, found it preferable to traditional learning spaces and reported a much more uniform experience of a sense of belonging in the class, a sense of fun, and a high level of concentration and class attendance.
In a smaller-scale study, Rands and Gansemer-Topf (2017) conducted a focus group of four instructors and nine students who had taught or taken classes in an ALC. The participants were given semi-structured interviews on their interaction with others, the physical and technological attributes of the room, and their perceptions of their own motivation and self-regulation. The researchers found that the ALC design helped to create a community of learners and “erased the line” between students and instructors. Respondents also reported that the open space in the ALC affords more movement and interaction, helps students work at their optimal level of challenge, provides tools that are helpful for assessing understanding and for visualizing thinking, and encourages holistic and integrated learning.
Engagement and Constructivism: As a final remark on these general results on engagement, we note one study ( Sawers et al., 2016 ) that examined how instructors perceive “engagement” in students in an ALC. Instructors teaching in an ALC were given the Classroom Utilization Survey augmented with open-ended questions. The study found that the ALC tends to have a greater impact on perceived student engagement when used by faculty who identify with a more constructivist educational philosophy (that is, a teaching philosophy based on the notion that knowledge is not transferred from one person to another but rather constructed within the minds of each person through purposeful activity). It is unclear whether this means that actual student engagement is more frequent or pronounced in an ALC under a constructivist teacher, or whether teachers subscribing to constructivism are more likely to interpret student activity as active engagement than those who are not constructivist. The results of Sawers et al. (2016) primarily serve as a caveat to those seeking to apply or interpret the research cited here: survey and self-reported data must take into account the ontological stance of the observer.
3.2.2. Specific categorical results about engagement
We also identified several studies that targeted specific forms of engagement. As we have noted, the categories in our engagement model strongly overlap, so our categorization below is approximate.
The following studies focused on affective forms of engagement:
In Miller-Cochran and Gierdowski (2013) , 195 students in sections of a university first-year writing course learned in an ALC using a “bring your own technology” (BYOT) approach (in which students brought their own computing equipment, rather than it being provided as part of the space). The ALC in turn was outfitted with LCD projectors, whiteboards, and three kinds of moveable desks. In a post-semester survey on student preferences, 78% of students preferred the flexible BYOT design over a traditional fixed design; 5% preferred a more fixed design, and 17% indicated no preference. A large portion (66%) of students said the design of the ALC "somewhat contributes", "contributes", or "contributes highly” to their learning.
In Hyun, Ediger, and Lee (2017) , sixteen classes taught by seven different faculty were given an assessment focusing on student satisfaction and the frequency of engagement in active learning tasks. Five of the classes were taught in ALCs and the remaining eleven taught in traditional spaces. The study found that active learning pedagogy and classroom type were significantly correlated with levels of individual and group satisfaction and were significant predictors of both kinds of satisfaction. Student individual and group satisfaction were significantly increased in ALCs compared to corresponding measures of satisfaction in traditional classrooms.
The following studies focused on behavioral forms of engagement:
Taylor (2009) examined two science classes taught in the same semester using a “Studio” setup (mentioned earlier as a precursor to the SCALE-UP idea). One class was an undergraduate introductory course, the other a graduate class. Each faculty member was interviewed four times and each class surveyed four times. The student surveys focused on overall impressions and usage of the Studio environment. The survey responses indicated that students attributed a significant impact on their learning to the ALC layout, particularly its affordances for group work and collaboration and for freedom of movement. Students also reported feeling more comfortable in asking questions of the instructor and of each other.
In Whiteside, Jorn, Duin, and Fitzgerald (2009) , two science classrooms (one engineering, the other biology) were renovated into ALCs using a variant of the SCALE-UP model. Surveys and exit interviews were conducted with faculty and students using the spaces (n = 17 and n = 168, respectively), along with 29 class observations focused on student attitudes and perceptions. Among other results reported in Section 6 (instructor practices), students expressed a broadly favorable opinion of the ALCs, especially in terms of their effectiveness in facilitating collaboration, connectedness, and discussion.
A somewhat unusual study by Henshaw et al. (2011) involved taking a traditional classroom and replacing all the seats with swivel chairs, bolted to the floor but capable of 360-degree rotation. Through surveys given to students about their experiences in the room, the study found significant increases in face-to-face interaction among students and changes in instructor pedagogies that promote interaction (which we will describe in more detail in the next section).
Another study ( Ge et al., 2015 ) examined five classes with 92 students in all, along with five professors, in a “technology-enhanced” ALC. Each class was studied using classroom observations, interviews, and surveys at the beginning and end of the semester. By triangulating the various data, the researchers found that students’ confidence and self-efficacy in problem-solving tasks increased significantly over the semester. However, there was no significant difference between the pre- and post-measures of intrinsic motivation to solve problems.
Finally, Connolly and Lampe (2016) studied a class in information technology management being taught in an ALC with moveable furniture, multiple projector units, and a polycentric layout. Students reported that the classroom environment encouraged them to connect with fellow students, be more sociable, better prepare for class discussions, and enjoy the class more. However, we must also note that the survey data used in this study were not disclosed in the paper, and only three self-reported results from the surveys were mentioned.
Finally, the following studies focused on cognitive forms of engagement:
Pashak and Hagen (2014) studied a single undergraduate psychology course with 24 students taught in a Steelcase LearnLab environment, with a polycentric layout featuring four tables seating six students each, and each table having a separate digital display switchable between different students at the table. The room also had PolyVision interactive whiteboards, personal whiteboards, and a document camera. Research assistants observed class sessions and noted behaviors of students including body movement and note-taking, as well as behaviors and pedagogical choices of the professors. Additionally, a survey was given to students at the end of the course to measure perceptions and preferences regarding the space. The main result of the study pertained to student attention: The timing of the class had a significant effect on students’ attention, and the study found lower levels of attention when the instructor was lecturing or conducting question-answer sessions, but higher levels of attention with less variation when engaged in question-and-answer with supervision. Additionally, the use of personal whiteboards was highly correlated with student attention.
An extensive multiple case study of 232 students and 13 instructors in newly constructed ALCs in a Canadian university ( Gebre et al., 2015 ) found that 43% of students whose professors held a “transmissive” view of effective teaching (i.e., primarily focused on direct instruction) felt their learning would have been the same or better in a traditional classroom, compared to just 27% and 8% of students whose professors viewed effective teaching as “engaging students” or “development”, respectively.
Stover and Zisweiler (2017) conducted a study of 417 students enrolled in “large lecture” courses (more than 70 students) at a university. In the Spring 2015 semester, the classes were taught in a traditional auditorium with fixed seating and primarily lecture pedagogy; in the following Fall semester, the classes were taught in an ALC by faculty trained to use the affordances of the ALC, although the pedagogical practices of each faculty member were different. A survey was given to students in the final week of the semester consisting of 34 items from the Community of Inquiry survey. The study found that the ALC sections had a significantly negative effect on the perceived level of “instructor presence” in the course for one evening section of one professor and in all the sections of another professor. The ALC, on the other hand, had a significantly positive impact on social presence in one remaining class. No statistically significant differences were found in any of the classes with regard to “cognitive presence”. We note that this is the only study in the review that reports significantly negative results for ALCs compared to traditional spaces; Stover and Zisweiler speculate that this could be a function of the instructors’ pedagogical choices rather than the ALC itself.
3.3. What effect do ALCs have on the pedagogical practices and behaviors of instructors?
In this section we address the research question, What effects do ALCs have on the pedagogical practices and behaviors of instructors? The set of results that focus on instructors is considerably smaller than the set focusing on students; indeed, as we will mention later, the effects of ALCs on instructors are an emerging area for future research. Because of the smaller number of results on instructors, both practices and behaviors will be examined in this section. However, the data emerging from our review point to pedagogical practices and instructor behaviors as distinct dimensions of the impact of ALCs, so we will treat the results in separate subsections.
We only consider studies in which data were collected after an intervention involving a reasonably large group of instructors teaching in an ALC. Some studies that we found which treated one or more of the other research questions (for example, Chen (2014) and Connolly and Lampe (2016) ) contain discussions about ALCs and instructors, but they are excluded for this question, either because the sample size was too small (e.g., just one instructor teaching in the ALC was considered) or because the discussion is too general. Of the 37 studies in this review, we found 15 that address this particular question.
3.3.1. Instructor pedagogical practices
Brooks (2012) conducted perhaps the most thorough study in this review of the impact of ALCs on instructor pedagogical practices. In his study, a custom classroom observation instrument covering 32 separate variables related to student and instructor activities was developed and used to observe two sections of a course, one of which met in a traditional classroom space and the other in an ALC. The observations found significantly less lecturing done in the ALC (54.5% of observations compared with 77.4%), significantly more class discussion (48% more in the ALC), significantly less time spent at the podium (69.2% of observations versus 95.1%), significantly more time using marker boards (10.1% more frequently), and a higher rate of consulting with students in small groups (59.4% of observations versus 27.4%). There was no significant difference in the frequency of group activities or question-and-answer sessions in the two sections, or in the frequency of the use of PowerPoint slides. This can be explained by the fact that the instructor designed both sections with active learning in mind; and yet, even though the pedagogy was not planned to be different between the two sections, the differences in lecture and class discussion appeared nonetheless. In other words, in Brooks’ study, the ALC seemed to exert an influence on instructor pedagogy, causing the instructor to engage in more active pedagogical methods even when she was specifically told not to change instructional methods between sections.
Pashak and Hagen (2014) examined both student and instructor behaviors in a Steelcase LearnLab classroom, equipped with moveable tables and chairs, interactive whiteboards, huddle boards, and a document camera. They observed classes taught in the LearnLab and recorded the frequency of instructor behavior in five categories of actions: stationary lecture only, stationary question-and-answer, stationary supervision of student work, lecturing plus movement, and question-and-answer or supervision plus movement. They also measured the frequency of usage of five categories of technological tools: one PowerPoint slide for an extended period, a series of PowerPoint slides, huddle boards, a combination of tools, and no tools. The observations were done for a single course meeting in the LearnLab; i.e., there was no control group. Pashak and Hagen found significant differences among the five behavioral categories as well as among the five tool categories. Although "lecture" was still the most commonly used pedagogical method and "none" the most common tool category, further analysis revealed that the "question-and-answer with movement" format and the huddle board tool category were strongly correlated with student attention levels.
Metzger (2015) studied the use of team-teaching in an ALC. While the impact of the ALC on the pedagogical choices of the instructors was minimally reported, Metzger does conclude based on student data that team-teaching instruction in an ALC would be improved by setting up “zones of engagement” in the ALC, with different parts of the ALC used for different learning experiences and individual instructors on the team responsible for different zones. The flexibility of an ALC affords the opportunity to make such pedagogical choices.
Other studies investigated the impact of ALCs on instructor activities that were not strictly pedagogical choices but which impacted instruction. In Henshaw et al. (2011) a classroom was outfitted with seating attached to the floor but on a swivel that allowed 360 degrees of movement by students, along with large aisles to allow instructors to access students. Observational and interview data indicate that instructors in this room moved through the room more freely, which in turn created a positive impact on collaboration and group activity. In Rands and Gansemer-Topf (2017) , a traditional classroom was renovated to include chairs and tables on casters, arranged in clusters seating four students each. As with the swivel chair configuration above, instructors in this ALC report that they moved around the classroom and engaged in discussions more frequently with students than they did in a traditional space.
3.3.2. Instructor motivation and behavior
Whiteside et al. (2009) examined the “PAIR-up” model (“PAIR” being an acronym for Partner, Assess, Integrate, Revisit) and how faculty attitudes and practices change within an ALC in which this model is used. Through interviews and surveys of instructors, the researchers studied instructor expectations and attitudes before teaching in the ALCs and how those beliefs changed during the term. The faculty reported that the ALC changed or deepened student-teacher relationships, influenced the faculty to shift their role to more of a learning coach or facilitator, created an environment where learning could easily occur, and helped to minimize class preparation and allowed for more focus on content due to the ALC being already set up for active learning.
In a case study of five classes with 92 students and nine instructors in an ALC, Ge et al. (2015) report several influences on the instructors' motivations and behaviors about teaching that were directly attributable to the ALC. The professors reported a realization that they could move about the room and meet with students in an ALC, a change from their beliefs about teaching in a traditional classroom. They also reported that the ALC provided them enhanced opportunities to discuss students' difficulties with the students, and that the technology afforded by the space helped them to visualize difficult concepts more easily.
Finally, Sawers et al. (2016) studied thirty instructors who taught in both a traditional space and an ALC to examine the effects of the instructors' personal teaching philosophies on their instruction and its results. They found that instructors who adhere to a constructivist teaching philosophy perceived their students as being more engaged in an ALC than did instructors whose teaching philosophy was less constructivist; the data indicated no significant difference among teaching philosophies on the perception of student engagement in a traditional space. Further, the type of learning space used appeared to exert an influence on the relationship between instructor teaching philosophy, the use of active learning pedagogies, and perceptions of student engagement. The results of this study can be taken as a cautionary tale when interpreting the results of other studies involving self-reported data from instructors, namely that what instructors perceive in student activity can in large part be a function of their own pre-existing beliefs about learning.
3.4. What specific design elements of ALCs contribute significantly to other results?
Our fourth and final research question is, What specific design elements of ALCs seem to make the biggest impact in the results on student learning, instructor practices, and engagement? This question is important especially for school leaders and others making decisions about the construction of new ALCs or renovations of existing classroom spaces. The answers to this question may be particularly helpful for those seeking to make incremental transitions from traditional learning spaces to an ALC and are looking for a “minimum viable product” that will provide some of the benefits of an ALC for minimal investments in construction or redesign.
We note at the beginning that the studies in this review do not point to a single architecture or set of physical objects that drive the results in those studies. What does emerge from a close reading of the literature is a single design concept that is a common thread among the most pronounced positive results on ALCs: The concept of connectedness . Any architectural design, furniture, or tools included in an ALC that promote connectedness in any of several forms tends to drive the strongest results in the literature we have reviewed. We will now look at the major facets of connectedness as found in the studies we reviewed. Of the 37 studies in this review, we found 15 that address this particular question.
3.4.1. Connectedness through mobility
Many of the results we have reviewed feature the freedom of movement as a key element in the positive results they obtained. In addition to the studies we have already reviewed, the following have a particular focus on the effects of mobility in the ALC.
Harvey and Kenyon (2013) focused on design attributes while comparing different types of ALC layouts. The study administered the Classroom Seating Rating Scale for Students (CSRS-S), comprising 15 Likert-type items on subscales for "Comfort and Spaces", "Learning Engagement", and "Interactivity". In their study, 863 students completed the CSRS-S, rating five different kinds of seating: a modern mobile "Node" chair manufactured by Steelcase (which has a flexible attached desk and casters for mobility), stationary chairs with tables, fixed chairs with tables, chairs and tables without wheels, and stationary trapezoidal tables with chairs with wheels. The modern mobile chairs and the trapezoidal tables with chairs on wheels had the highest overall ratings, as well as the highest ratings for the three separate subscales, and those ratings were statistically significantly higher than the ratings for the other three forms of furniture. That is, the furnishings that enabled mobility the most were the ones most highly rated for the key aspects of the student experience in the classroom.
We have already mentioned a unique seating arrangement in a classroom studied by Henshaw et al. (2011), where the seats were bolted to the floor in four groups but allowed 360° rotation. The instructors' option to move around the room, including to the center of the space, and the students' ability to connect with peers all around them had a strongly positive impact on collaboration with peers and the instructor, since learning modes could be switched quickly and easily and movement through the seats was unimpeded.
3.4.2. Connectedness through visual relationships
Many of the results pertaining to SCALE-UP, as well as variants such as TEAL and PAIR-UP, note that the visual layout of the ALC makes a significant difference in how students and instructors perceive the purpose of the room as well as their own role in the space. The polycentric layout of these spaces, as we have already noted, directs the attention of students not to the front of the room occupied by an authority figure but to each other, fostering group cohesion and better enabling active learning.
A study by Salter et al. (2013) explored the effect of a unique collaborative design layout on collaboration as well as the use of technology. An entire building of collaborative spaces was constructed on their campus, each space designed with different nonstandard shapes and layouts.
These layouts created vertical zones that facilitated separate types of activities, with groups on one level able to work on completely different activities from groups on other levels. The highest level was the most desirable spot and encouraged students to join the class early, while the least desirable spots were at the entrance level of the room. Technology was incorporated as a supporting tool for every activity, from small-team presentations to whole-class presentations. The collaborative learning space received high ratings from students for supporting collaboration and the use of the available technology. The most usable technologies were the big screen for the entire space, the small screens attached to tables, and the laptops provided at the tables. However, paradoxically, only half of the students in the study wished to take more courses in a collaborative space. The authors argue that this finding speaks to the need to help students orient themselves to the new learning expectations inherent in an ALC.
The work of Park and Choi (2014) found that traditional spaces with row-by-column seating tend to create zones, some of which are beneficial for learning while others are detrimental to learning, largely due to the front-centered visual layout of the room. To equalize the zones, the ALC was designed with wheels on the lectern, tables, and chairs so that the layout could be reconfigured easily. Students found the new space better for sight lines to the screens, for motivation to learn, for building relationships with other students, and for other key facets of connectedness.
According to Taylor (2009), studio classroom spaces are characterized by a combination of moveable furniture that groups students into learning teams, a centrally located or moveable teacher's station that does not create a "front" of the room, wireless laptops and computer projection, and wall spaces for writing or posting ideas. The goal is to create flexible spaces that support flexible pedagogy and hands-on activities. High flexibility and mobility introduce an element of unfamiliarity, and, as we mentioned earlier, the greatest learning occurs in unfamiliar settings. Studio spaces are still unfamiliar layouts for many students, and the flexibility of the space means that it can be frequently reconfigured to create new learning settings. The study found that students preferred studio spaces for their physical comfort and less restrictive atmosphere.
Finally, we mention again the various studies of Terry Byers, Wesley Imms, and their collaborators (Byers et al., 2016; Byers et al., 2014; Byers and Imms, 2014, 2016; Imms and Byers, 2017), all of which compare traditional spaces with "next generation learning spaces" featuring polycentric layouts capable of housing three different learning modes in the same space. All of those studies report significantly higher ratings for ALCs compared to traditional spaces on measures of student engagement and learning outcomes. We specifically highlight the study by Imms and Byers (2017), in which three types of classes were evaluated: a traditional setting, a polycentric ALC that facilitates student-centered learning, and a space representing an informal setting which was referred to as the "third space". The study found a nuanced relationship between the three different modes; however, students generally preferred the third space (mode 3) over both of the other modes for fostering a positive attitude, taking on challenges, and working beyond the limits of their expertise.
3.4.3. Connectedness through tools for learning
The ready availability of both analog and digital learning tools in an ALC promotes student activity that connects concepts to students’ own interests, experiences, and understanding; such activity also promotes the connection of knowledge by using the tools as sense-making implements.
Analog tools such as whiteboards are among the most-cited implements in an ALC that promote student learning and engagement. Beichner et al. (2007) describe the whiteboards in the SCALE-UP class as a "public thinking space" where students share knowledge and ideas. Among content delivery modes, Brooks (2012) found that marker boards were used significantly more frequently in the ALC. In addition to wall-mounted whiteboards, personal whiteboards or "huddle boards" were another channel of communication, one that attracted particular attention from students, as reported by Pashak and Hagen (2014). Writable glass table tops were mentioned in Connolly and Lampe (2016) as another way of communicating in the ALC.
Digital tools also play a role in student sense-making activities. Several studies on SCALE-UP and related spaces focus on the ready availability of digital tools such as digital projectors for student use, power supplies, and computers (either housed in the ALC or allowed through a "bring-your-own-technology" approach) as key elements of improved student learning and interaction in the classroom. Similarly, Byers and Imms (2016) investigated the interaction between learning space and digital technology, and how this interaction affects teaching and learning in primary schools; they found that students in an ALC perceive the technology as being more beneficial for their learning than the same technology used in the same ways in a traditional space. As Connolly and Lampe (2016) also point out, the availability of digital projectors makes it easier to have a room with a polycentric layout, since student visual attention can be focused in a variety of directions when a digital item is on display.
4. Discussion
In this section, we will summarize the primary themes of the findings in the studies identified for this review. We also discuss the limitations and methodological issues in those studies.
4.1. Primary themes
Looking across all the research questions from studies in this review, the following common themes emerge from research on ALCs.
ALCs are connected with improved student learning outcomes. ALCs tend to be associated with improved measurable student learning outcomes, whether those outcomes are traditional quantitative measures such as exams and course grades or measures of "21st century skills". All of the studies in the review that reported on measurable student learning outcomes reported either improved outcomes for students in ALC sections, or no significant difference in learning outcomes when comparing ALC sections to sections in traditional classroom spaces. None reported lower results for students in ALC sections. Furthermore, several of the studies report that the results on learning outcomes are the most pronounced among low-achieving and minority students.
ALCs are connected with improved student engagement, in several forms. ALCs tend to provoke strong improvements in student engagement, framed in terms of affective, behavioral, or cognitive forms or as a combination of these. Students typically report a preference for learning in an ALC compared to a traditional space as well as increased motivation to attend class. Students also report increased willingness to participate actively in class and to take on challenges and work past their comfort zone in an ALC versus in a traditional space. Students also report that ALCs lead to increased interaction and deepened relationships with their peers and instructors, and that ALCs foster a sense of community and belonging.
ALCs have a positive connection with instructor practices and beliefs. Shifting focus from students to instructors, a third theme emerges: Instructors in ALCs tend to change their instructional practices and their perceptions of their role as instructors. Our studies report that instructors in ALCs tend to use active learning techniques more frequently than they do when teaching in traditional classrooms, and they readily and effectively integrate the special affordances of ALCs (reconfigurable tables, vertical writing surfaces, ubiquitous digital technology, etc.) into their teaching. One study (Brooks and Solheim (2014)) reported increases in the use of active learning and ALC affordances even when instructors were specifically told to teach the same way they do in a traditional lecture environment. Several of the studies also report that the experience of teaching in an ALC can drive fundamental reconsiderations of instructors' teaching goals and philosophies. However, other studies indicate that those changes in mindset are to a significant extent dependent on the starting point of instructors' teaching philosophies.
ALCs are a potential tool for the evolution of a new culture of learning. A fourth theme emerges from the literature as a combination of the above three themes, namely that there is a significant potential for ALCs to have a transformative effect on the institutional and learning cultures of the schools in which they are installed. By "institutional culture", we mean the norms, practices, and beliefs of an educational institution that shape the way it and its members perform their work and interact with each other; by "learning culture", we similarly mean the norms, practices, and beliefs held and enacted by instructors and learners as they pertain to learning. While none of the studies in this review specifically studied the institutional or learning cultures of schools, for example using anthropological methods, there appears to be evidence that a synergy between active learning and ALCs has a significant, system-wide impact on the social and cultural patterns in the schools where ALCs are found and in which active learning is used.
For example, the results we have summarized pertaining to student engagement all point to greatly enhanced engagement, however it may be defined, in the presence of ALCs and the active learning that takes place within those spaces. The work of Scott-Webber et al. (2014), in particular, indicates that students learning in an ALC rated ALCs as significantly engaging on all 12 factors tested by the instrument used in that study, including "feeling comfortable to participate" and "creation of an enriching experience". Many of the studies in the review also report repeated instances of students finding learning in an ALC to be more "fun" than in a traditional space; whether this refers to the space, the pedagogy, or the technology is unclear, but the perception that learning is fun represents a significant cultural shift for most students.
Similarly, the various studies by Byers and Imms and their collaborators show that learning in an ALC leads to enhanced experiences in general areas of student engagement and student learning experiences. Some of the specific forms of engagement and learning experience pertain to broader, big-picture aspects of the student experience such as comfort level, the willingness to work outside of one's comfort zone, and perceptions of what makes teaching effective. These larger systemic items in these and other studies point to the beginnings, at least, of major changes in the ways that students conceive of their own roles as learners in a school, and of what learning looks and feels like. These are at the root of large-scale cultural change.
The results for instructors point primarily to the fact that the presence of an ALC tends to encourage instructors to use more active learning techniques in class, and this in turn often causes instructors to rethink their own roles in their profession. For example, Brooks and Solheim (2014) indicates that instructors use more active learning in an ALC even when specifically instructed not to do so; and faculty, after these experiences, tend to view themselves more as a guide or coach than as a lecturer.
The common denominator in the larger cultural effects of ALCs and active learning on students and instructors is the notion of connectedness, a concept we have already introduced in discussions of specific ALC design elements. By being freer to move and have physical and visual contact with each other in a class meeting, students feel more connected to each other and more connected to their instructor . By having an architectural design that facilitates not only movement but choice and agency --- for example, through the use of polycentric layouts and reconfigurable furniture --- the line between instructor and students is erased , turning the ALC into a vessel in which an authentic community of learners can take form. By having analog and digital tools readily available in the ALC, students are better able to connect the concepts of a class to their own interests and conceptions and are better able to draw connections between ideas. A learning experience promoting a significantly enhanced level of connectedness on a number of levels would, by itself, indicate a major cultural shift in current educational spaces and practices. While, again, none of the results in this study specifically focus on cultural change, this aspect of the presence and effect of the ALC indicates that cultural change is there to be discovered with future research.
4.2. Limitations and issues in the studies and their results
As with any research study, the studies in our review had limitations in their design and methodology that should be kept in mind when interpreting and applying their results.
Failure to control for pedagogy and technology. The first and by far most prevalent methodological limitation is that many of the studies in this review do not control for variables related to pedagogy or technology. For example, the various SCALE-UP studies appearing in this review often report significant improvements in student learning outcomes and behaviors. But it is unclear from the studies what, precisely, is primarily responsible for these results: the space itself, the active learning taking place within the space (and whether similar results might be obtained by similar pedagogy in a more traditional space), the availability and use of technology, or some multi-way interaction between these. Therefore, it is difficult or impossible to distinguish between the effects of the learning space on the one hand, and the effects of the pedagogy and technology used in that space on the other, in most of the studies in this review. Many of the studies report instructor and student beliefs that ALCs at least afford greater opportunities for engaging in effective active learning and technology integration. But in terms of isolating the role of space by itself and controlling for the variables of pedagogy and technology, only the studies involving Terry Byers and Wesley Imms explicitly design their research to accomplish this, and research that delineates between space, pedagogy, and technology is still emerging.
Limited cultural and educational contexts. Second, as the overview of the results in Section 2 shows, the cultural and educational contexts of these studies are quite limited. The vast majority of the studies in the review were done in higher education settings, with only five conducted in primary or secondary schools. We find it unusual, given the tremendous amount of interest in ALCs among primary and secondary schools, that the body of published research on ALCs in this setting is so small. Also, a large majority of the studies (27 out of 37) in this review were done in the United States; half of the remaining 10 studies were conducted in Australia. Together, these limitations on educational and cultural contexts make many of the results difficult to generalize beyond their original settings, and those wishing to apply the results of this review elsewhere should take special care.
Use of quantitative measures that lack standardization and rigor. Third, the studies in the review that focus on quantitative indicators of learning tend to use measures such as exam grades and course grades, based on instruments that are typically designed by individual faculty members and not subjected to rigorous reliability analysis, and which thus have uncertain validity. For example, course grades often include group work or attendance credit, which do not correlate with individual student learning outcomes; and exams given in courses taught by different instructors are not always examined for inter-rater reliability. The studies that use normalized learning gains on standardized concept inventories have considerably stronger validity but are fewer in number.
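For readers unfamiliar with the measure, the normalized learning gain used with concept inventories is conventionally computed as follows (we assume Hake's standard formulation here, since the individual studies do not all state their formula explicitly):

```latex
g \;=\; \frac{\langle \text{post} \rangle - \langle \text{pre} \rangle}{100\% - \langle \text{pre} \rangle}
```

where ⟨pre⟩ and ⟨post⟩ are the class-average pre-test and post-test scores as percentages, so that g expresses the improvement actually achieved as a fraction of the maximum improvement possible. Because g conditions on the pre-test score, it permits comparisons across sections and institutions with different incoming preparation, which is precisely what raw exam and course grades cannot do.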
Limited scope and sample size. Fourth, several of the studies use limited samples. Some studies identified in the database searches involved even more limited samples, for example just three responses to a survey; those studies were excluded from review. Many of the remaining studies that passed our inclusion criteria were still quite limited in scope, often to just two sections of a single course taught by a single instructor in one semester. As with the cultural and educational context limitations, the limited sampling scope makes it difficult to generalize the results of those studies with confidence.
5. Conclusion
Our understanding of the role that learning spaces play in the learning process is still in its adolescence, but the research we have reviewed here, along with the numerous new studies published since the writing of this manuscript, speaks to the growing importance of this role in all phases of education. The research emerging from the study of learning spaces highlights a growing need to understand space as a third component of effective learning experiences, complementing pedagogy and technology:
As our understanding of both pedagogy (particularly active learning pedagogies) and technology (particularly technology used for active learning tasks) evolves, a continuing evolution of our understanding of active learning spaces is necessary to help the promise of active learning reach its full potential.
To conclude this review, we will identify emerging lines of inquiry about active learning spaces that hold promise for future research. Many of these opportunities for future research are based on the limitations of the research that we have reviewed in this study.
Research outside higher education contexts. As noted, a great majority of the research reviewed here is done in higher education. This research may not be particularly useful for primary and secondary schools, given the wide differences in organizational and educational norms between primary/secondary and higher education. In fact, a great deal of work on ALCs that is either unpublished or non-peer reviewed exists in the primary and secondary education space in the form of editorials, blog posts, action reports from classroom usage, conference presentations, and reports from grants such as Steelcase, Inc.'s Active Learning Center Grant program. Given the strong interest in active learning classrooms from primary and secondary schools and the relative lack of published research done in those contexts, more research done in these schools is warranted. One avenue for generating such research would be to build on the extensive body of existing work that is unpublished or non-peer reviewed, and expand those works into research studies of the kind we have reviewed here.
Research in a wider variety of ethnic and cultural contexts. Similarly, there is a lack of research on ALCs situated in countries outside the United States. Even among the US studies, all were conducted at universities, a plurality of which are large research-focused institutions. Research that investigates ALCs in a wider diversity of cultural contexts would be useful for teasing apart the role of culture and ethnicity in the effects we have documented here.
Research that controls for pedagogy and technology. As we noted in the previous section, most of the studies reviewed here do not attempt to control for pedagogical methods or technology use. As a result, it remains unclear in many cases whether the results we have seen in this review can be isolated and attributed to the space itself, versus the active pedagogy and intentional use of technology (or combination of these) taking place in the space. Terry Byers and Wesley Imms, along with their collaborators, have specifically targeted methodologies that do control for these variables, and consequently their results reflect the influence of space alone. More such research is needed to determine which results depend on the particular ALC space involved. We believe that replications of existing studies with appropriately adjusted research methodology, for example replications of SCALE-UP studies using single subject research designs, would represent an important step in this direction.
Longitudinal research. We did not encounter any longitudinal studies in this review. As more ALCs are constructed around the world in schools and universities, the opportunities to conduct long-term studies of the persistence of ALCs' effectiveness grow proportionately. The time seems right for longitudinal research, which would extend the results found in the studies in this review over time and through different kinds of spaces to see whether the results, especially those regarding changes in attitude and behavior, persist.
Research comparing different models of ALCs. Only one of the studies in this review (Muthyala and Wei, 2013) compared two different layouts of an ALC for their relative effects on learning outcomes. (There were no significant differences in those outcomes between the two layouts.) This seems to be a promising entry point for research, particularly for schools that have already installed ALCs and can simply reconfigure them to create different groups for study while controlling for confounding variables.
Research on design and architectural principles that impact learning. Our final research question (What specific design or architectural elements of ALCs contribute the most to the results in the identified studies?) was largely answered inductively, by looking for patterns in the research data. None of the studies we encountered specifically aimed to address this question. Studies that examine design and architectural elements, while keeping other variables under control, would be useful for designers, architects, and potential buyers of ALCs.
Ethnographic research on cultural effects. In the previous section, we noted the strong potential for cultural change that is attached to the use of ALCs; however, we also noted that none of the studies in this review directly address cultural change in institutions adopting ALCs. We believe the issue of systemic culture change merits further, more targeted research. Specifically, ethnographic research done within institutions that adopt ALCs to examine the cultural impact of "connectedness" as discussed in the previous section, collaboration, territorial behaviors, and other instances of culture could yield significant insight on the impact of ALCs not directly tied to student learning.
Declarations
Author contribution statement
All authors listed have significantly contributed to the development and the writing of this article.
Funding statement
This work was commissioned by Steelcase, Inc. as a project for the sabbatical leave of one of the authors (Talbert), and Steelcase provided work space and partial funding of that author's salary during the sabbatical. Steelcase provided funding for the participation of the other author (Mor-Avi).
Competing interest statement
The authors declare the following competing interests: This work was commissioned by Steelcase, Inc. as a project for the sabbatical leave of one of the authors (Talbert), and Steelcase provided work space and partial funding of that author's salary during the sabbatical. Steelcase provided funding for the participation of the other author (Mor-Avi).
Additional information
No additional information is available for this paper.
Acknowledgements
Both authors wish to thank Steelcase for commissioning this study and providing complete freedom to research and summarize the results of this review irrespective of how those results might impact the business operations of Steelcase, Inc. Talbert specifically wishes to thank Steelcase Education for co-sponsoring his sabbatical during the 2017–2018 academic year, during which time he served as Scholar-in-Residence at Steelcase and worked on this study as one of his sabbatical projects. Mor-Avi also wishes to thank Steelcase Education for providing the opportunity and also for the sponsorship for her work on this project.
We wish to note that our use of the term “active learning center” is not connected to the “Active Learning Center Grant” program offered by Steelcase, Inc. but is merely the most common term used to describe these spaces.
Appendix A. Supplementary data
Supplementary data related to this article are available online.
References
- Ananiadou K., Claro M. Working paper: 21st century skills and competences for new millennium learners in OECD countries. Edu/Wkp. 2009;20.
- Astin A. Student involvement: a developmental theory for higher education. J. Coll. Student Dev. 1999;40:518–529.
- Axelson R.D., Flick A. Defining student engagement. Change. 2010;43(1):38–43.
- Baepler P., Walker J.D., Brooks D.C., Saichaie K., Petersen C. A Guide to Teaching in the Active Learning Classroom. Stylus Publishing; 2016.
- Baepler P., Walker J.D., Driessen M. It's not about seat time: blending, flipping, and efficiency in active learning classrooms. Comput. Educ. 2014;78:227–236.
- Beichner R. Case study of the physics component of an integrated curriculum. Am. J. Phys. 1999;67(S1):S16.
- Beichner R.J., Saul J.M., Abbott D.S., Morse J.J., Deardorff D., Allain R.J., Risley J. The student-centered activities for large enrollment undergraduate programs (SCALE-UP) project. Physics. 2007;1(1):1–42.
- Bonwell C., Eison J. Active Learning: Creating Excitement in the Classroom. Office of Educational Research and Improvement; Washington, DC: 1991.
- Brooks C., Brown M., Buckner M., Cotner S., Finkelstein A. The year of the active learning classroom. 2017. Retrieved January 14, 2019, from http://bit.ly/2Ru2CcU
- Brooks D.C. Space matters: the impact of formal learning environments on student learning. Br. J. Educ. Technol. 2011;42(5):719–726.
- Brooks D.C. Space and consequences: the impact of different formal learning spaces on instructor and student behavior. J. Learn. Spaces. 2012;1(2):1–16.
- Brooks D.C., Solheim C.A. Pedagogy matters, too: the impact of adapting teaching approaches to formal learning environments on student learning. New Directions for Teaching and Learning. 2014:53–61.
- Byers T., Hartnell-Young E., Imms W. Empirical evaluation of different classroom spaces on students' perceptions of the use and effectiveness of 1-to-1 technology. Br. J. Educ. Technol. 2016.
- Byers T., Imms W. Making the space for space: the effect of the classroom layout on teacher and student usage and perception of one-to-one technology. Aust. Comp. Educ. Conf. 2014:61–69.
- Byers T., Imms W. Evaluating the change in space in a technology-enabled primary years setting. In: The Translational Design of Schools. 2016:215–236.
- Byers T., Imms W., Hartnell-Young E. Making the case for space: the effect of learning spaces on teaching and learning. Curric. Teach. 2014;29(1):5–19.
- Chen V. "There is no single right answer": the potential for active learning classrooms to facilitate actively open-minded thinking. Collect. Essays Learn. Teach. 2014;8:171–180.
- Chiu P.H.P., Cheng S.H. Effects of active learning classrooms on student learning: a two-year empirical investigation on student perceptions and academic performance. High. Educ. Res. Dev. 2016:1–11.
- Coletta V.P., Phillips J.A. Interpreting FCI scores: normalized gain, preinstruction scores, and scientific reasoning ability. Am. J. Phys. 2005;73(12):1172–1182.
- Connolly A., Lampe M. How an active learning classroom transformed IT executive management. Inf. Syst. Electron. J. 2016;14(1):15–27.
- Cotner S., Loper J., Walker J.D., Brooks D.C. It's not you, it's the room: are the high-tech, active learning classrooms worth it? J. Coll. Sci. Teach. 2013;42:82–88.
- Dori Y.J., Belcher J. How does technology-enabled active learning affect undergraduate students' understanding of electromagnetism concepts? J. Learn. Sci. 2005;14(2):243–279.
- Eagly A.H., Chaiken S. Attitude structure and function. In: Gilbert D.T., Fiske S.T., Lindzey G., editors. The Handbook of Social Psychology. McGraw-Hill; New York, NY: 1998:269–322.
- Framework for 21st Century Learning - P21. Retrieved June 21, 2018, from http://www.p21.org/about-us/p21-framework
- Freeman S., Eddy S.L., McDonough M., Smith M.K., Okoroafor N., Jordt H., Wenderoth M.P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. 2014;111(23):8410–8415. doi: 10.1073/pnas.1319030111.
- Ge X., Yang Y.J., Liao L., Wolfe E.G. Perceived affordances of a technology-enhanced active learning classroom in promoting collaborative problem solving. In: E-learning Systems, Environments and Approaches (CELDA). 2015:305–322.
- Gebre E., Saroyan A., Aulls M.W. Conceptions of effective teaching and perceived use of computer technologies in active learning classrooms. Int. J. Teach. Learn. Higher Educ. 2015;27(2):204–220.
- Gierdowski D. Studying learning spaces: a review of selected empirical studies. In: Cases on Higher Education Spaces: Innovation, Collaboration, and Technology. 2013:14–39.
- Hall E.T. The Silent Language. Doubleday; Garden City, NJ: 1959.
- Halloun I.A., Hestenes D. The initial knowledge state of college physics students. Am. J. Phys. 1985;53(11):1043–1048.
- Harvey E.J., Kenyon M.C. Classroom seating considerations for 21st century students and faculty. J. Learn. Spaces. 2013;2(1):1–13.
- Henshaw R.G., Edwards P.M., Bagley E.J. Use of swivel desks and aisle space to promote interaction in mid-sized college classrooms. J. Learn. Spaces. 2011;1(1).
- Grajek S. Higher education's top 10 strategic technologies for 2017. EDUCAUSE Review. https://library.educause.edu/resources/2017/1/higher-educations-top-10-strategic-technologies-for-2017
- Hyun J., Ediger R., Lee D. Students' satisfaction on their learning process in active learning and traditional classrooms. Int. J. Teach. Learn. Higher Educ. 2017;29(1):108–118.
- Imms W., Byers T. Impact of classroom design on teacher pedagogy and student engagement and performance in mathematics. Learn. Environ. Res. 2017;20(1):139–152.
- Metzger K.J. Collaborative teaching practices in undergraduate active learning classrooms: a report of faculty team teaching models and student reflections from two biology courses. Bioscene. 2015;41(1):3–9.
- Meyers-Levy J., Zhu R. The influence of ceiling height: the effect of priming on the type of processing that people use. J. Consum. Res. 2007;34(2):174–186.
- Miller-Cochran S., Gierdowski D. Making peace with the rising costs of writing technologies: flexible classroom design as a sustainable solution. Comput. Compos. 2013;30(1):50–60.
- Muthyala R.S., Wei W. Does space matter? Impact of classroom space on student learning in an organic-first curriculum. J. Chem. Educ. 2013;90(1):45–50.
- Nissim Y., Weissblueth E., Scott-Webber L., Amar S. The effect of a stimulating learning environment on pre-service teachers' motivation and 21st century skills. J. Educ. Learn. 2016;5(3):29.
- OCDE. Working paper: 21st century skills and competences for new millennium learners in OECD countries. Edu/Wkp. 2009;20(41):1–33.
- Oliver-Hoyo M.T., Allen D., Hunt W.F., Hutson J., Pitts A. Effects of an active learning environment: teaching innovations at a research I institution. J. Chem. Educ. 2004;81(3):441.
- Park E.L., Choi B.K. Transformation of classroom spaces: traditional versus active learning classroom in colleges. High. Educ. 2014;68(5):749–771.
- Pashak T.J., Hagen J.W. The LearnLab: using enhanced teaching technology to improve learning in the college classroom. Curr. Adv. Psychol. Res. 2014;1(2):56–60.
- Rands M.L., Gansemer-Topf A.M. The room itself is active: how classroom design impacts student engagement. J. Learn. Spaces. 2017;6(1):26–33. Retrieved January 14, 2019, from http://bit.ly/2Rx9syI
- Salter D., Thomson D.L., Fox B., Lam J. Use and evaluation of a technology-rich experimental collaborative classroom. High. Educ. Res. Dev. 2013;32(5):805–819.
- Sawers K.M., Wicks D., Mvududu N., Seeley L., Copeland R. What drives student engagement: is it learning space, instructor behavior, or teaching philosophy? J. Learn. Spaces. 2016;5(2):26–38.
- Scott-Webber L., Abraham J., Marini M. Higher education classrooms fail to meet needs of faculty and students. J. Inter. Des. 2000;26(2):16–34.
- Scott-Webber L., Konyndyk R., French R., Lembke J., Kinney T. Spatial design makes a difference in student academic engagement levels: a pilot study for grades 9-12. Eur. Sci. J. 2017;13(16):9–12.
- Scott-Webber L., Strickland A., Kapitula L. Built environments impact behaviors: results of an active-learning post-occupancy evaluation. Plan. High. Educ. J. 2014;42(1):28–39.
- Stanovich K.E., West R.F. Reasoning independently of prior belief and individual differences in actively open-minded thinking. J. Educ. Psychol. 1997;89(2):342–357.
- Stover S., Ziswiler K. Impact of active learning environments on community of inquiry. Int. J. Teach. Learn. Higher Educ. 2017;29(3):458–470.
- Taylor S.S. Effects of studio space on teaching and learning: preliminary findings from two case studies. Innov. High. Educ. 2009;33(4):217–228.
- Van Horne S., Murniati C., Gaffney J.D.H., Jesse M. Case study: promoting active learning in technology-infused TILE classrooms at the University of Iowa. J. Learn. Spaces. 2012;1(2):1–10.
- Walker J.D., Brooks D.C., Baepler P. Pedagogy and space: empirical research on new learning environments. Educ. Q. 2011;34(4):1–21. Retrieved January 14, 2019, from http://bit.ly/2Rq7FLE
- Whiteside A.L., Brooks D.C., Walker J.D. Making the case for space: three years of empirical research on learning environments. Educ. Q. 2010;33(3):1–23.
- Whiteside A.L., Jorn L., Duin A.H., Fitzgerald S. Using the PAIR-up model to evaluate active learning spaces. Educ. Q. 2009;32:12.
- Wilson J.M. The CUPLE physics studio. Phys. Teach. 1994;32(9):518.
- Wilson J.M., Jennings W.C. Studio courses: how information technology is changing the way we teach, on campus and off. Proc. IEEE. 2000;88(1):72–80.
Active Learning—Review
- First Online: 23 November 2023
- KC Santosh 3 &
- Suprim Nakarmi 3
Part of the book series: SpringerBriefs in Applied Sciences and Technology (BRIEFSINTELL)
In the previous chapter, we discussed different Active Learning (AL) scenarios, classical query strategies, and examples of when to deploy them.
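To make "classical query strategies" concrete, here is a minimal sketch (not from this chapter) of pool-based least-confidence uncertainty sampling, one of the classical strategies in the active learning literature. The function name and the predicted class probabilities below are hypothetical stand-ins; in practice the probabilities would come from a trained classifier scoring the unlabeled pool.

```python
# Minimal sketch of pool-based uncertainty sampling (least-confidence).
# Hypothetical helper; real pipelines would plug in a trained model.

def least_confidence_query(pool_probs, k=1):
    """Return indices of the k pool instances whose most likely label
    has the lowest predicted probability (the model is least sure)."""
    # confidence = probability of the single most likely class
    confidences = [(max(probs), idx) for idx, probs in enumerate(pool_probs)]
    confidences.sort()  # least confident first
    return [idx for _, idx in confidences[:k]]

# Hypothetical predicted class distributions for 4 unlabeled instances:
pool = [
    [0.95, 0.05],  # model is very sure
    [0.55, 0.45],  # model is unsure: a good query candidate
    [0.80, 0.20],
    [0.51, 0.49],  # most uncertain of all
]
print(least_confidence_query(pool, k=2))  # -> [3, 1]
```

The selected instances would then be sent to an oracle (e.g., a human annotator) for labeling, added to the training set, and the model retrained, which is the loop the chapter's surveyed strategies vary.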
Author information
Authors and affiliations
Applied AI Research Lab, Department of Computer Science, University of South Dakota, Vermillion, USA
KC Santosh & Suprim Nakarmi
Corresponding author
Correspondence to KC Santosh.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this chapter
Santosh, K., Nakarmi, S. (2023). Active Learning—Review. In: Active Learning to Minimize the Possible Risk of Future Epidemics. SpringerBriefs in Applied Sciences and Technology. Springer, Singapore. https://doi.org/10.1007/978-981-99-7442-9_3
DOI: https://doi.org/10.1007/978-981-99-7442-9_3
Published: 23 November 2023
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-7441-2
Online ISBN: 978-981-99-7442-9
eBook Packages: Intelligent Technologies and Robotics (R0)