The effects of online education on academic success: A meta-analysis study

  • Published: 06 September 2021
  • Volume 27, pages 429–450 (2022)


  • Hakan Ulum, ORCID: orcid.org/0000-0002-1398-6935


The purpose of this study is to analyze the effect of online education, which has been used extensively since the beginning of the pandemic, on student achievement. In line with this purpose, a meta-analysis was carried out of the related studies focusing on the effect of online education on students' academic achievement in several countries between the years 2010 and 2021. Furthermore, this study will provide a source to assist future studies in comparing the effect of online education on academic achievement before and after the pandemic. This meta-analysis consists of 27 studies in total, conducted in the USA, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia. The studies included in the meta-analysis are experimental studies, and the total sample size is 1772. The funnel plot, Duval and Tweedie's trim and fill analysis, Orwin's fail-safe N analysis, and Egger's regression test were utilized to assess publication bias, which was found to be quite low. In addition, Hedges' g was employed to measure the effect size for the difference between means, calculated in accordance with the random effects model. The results show that the effect size of online education on academic achievement is at a medium level. The heterogeneity tests display that the effect size does not differ in terms of the class level, country, online education approach, and lecture moderators.


1 Introduction

Information and communication technologies have become a powerful force in transforming educational settings around the world. The pandemic has been an important factor in moving traditional physical classrooms toward information and communication technologies, and it has also accelerated this transformation. The literature supports that learning environments built on information and communication technologies highly satisfy students; therefore, interest in technology-based learning environments should be sustained. Clearly, technology has had a huge impact on young people's online lives. This digital revolution can synergize the educational ambitions and interests of digitally addicted students. In essence, COVID-19 has provided us with an opportunity to embrace online learning, as education systems have to keep up with the rapid emergence of new technologies.

Information and communication technologies, which affect all spheres of life, are also actively used in the field of education. With recent developments, using technology in education has become inevitable for personal and social reasons (Usta, 2011a). Online education may be given as an example of using information and communication technologies as a consequence of these technological developments. It is also clear that online learning is a popular way of obtaining instruction (Demiralay et al., 2016; Pillay et al., 2007), which Horton (2000) defines as education performed through a web browser or an online application without requiring extra software or a separate learning source. Furthermore, online learning is described as a way of utilizing the internet to obtain the related learning sources during the learning process, to interact with the content, the teacher, and other learners, and to get support throughout the learning process (Ally, 2004). Online learning has such benefits as learning independently at any time and place (Vrasidas & McIsaac, 2000), convenience (Poole, 2000), flexibility (Chizmar & Walbert, 1999), self-regulation skills (Usta, 2011b), collaborative learning, and the opportunity to plan one's own learning process.

Even though online education practices have not always been as comprehensive as they are now, the internet and computers have long been used in education as alternative learning tools in parallel with advances in technology. The first distance education attempt in the world was initiated by the 'Steno Courses' announcement published in a Boston newspaper in 1728. Furthermore, in the nineteenth century, a Swedish university started 'Correspondence Composition Courses' for women, and the University Correspondence College was later founded for correspondence courses in 1843 (Arat & Bakan, 2011). More recently, distance education has been performed through computers, assisted by internet technologies, and it has since evolved into mobile education practice, driven by increases in internet connection speed and the development of mobile devices.

With the emergence of the Covid-19 pandemic, face-to-face education almost came to a halt, and online education gained significant importance. The Microsoft management team reported 750 users involved in its online education activities on March 10, just before the pandemic; by March 24, however, the number of users had increased dramatically to 138,698 (OECD, 2020). This supports the view that online education should be commonly used, rather than serving merely as a traditional alternative tool, when students do not have the opportunity for face-to-face education (Geostat, 2019). The Covid-19 pandemic emerged suddenly as a period of limited opportunities, and face-to-face education stopped for a long time. The global spread of Covid-19 affected more than 850 million students around the world and caused the suspension of face-to-face education. Different countries proposed several solutions to maintain the education process during the pandemic: schools had to change their curricula, and many countries supported online education practices soon after the pandemic began. In other words, traditional education gave way to online education practices. At least 96 countries have been motivated to access online libraries, TV broadcasts, instructions, sources, video lectures, and online channels (UNESCO, 2020). In such a painful period, educational institutions moved to online education practices with the help of large companies such as Microsoft, Google, Zoom, Skype, FaceTime, and Slack. Thus, online education has been discussed on the education agenda more intensively than ever before.

Although online education approaches were not used as comprehensively in the past as they are now, they were utilized as an alternative learning approach for a long time in parallel with the development of technology, the internet, and computers. Online education approaches are often employed with the aim of promoting students' academic achievement. In this regard, academics in various countries have conducted many studies evaluating online education approaches and have published the related results. However, the accumulation of scientific data on online education creates difficulties in keeping track of, organizing, and synthesizing the findings. Studies in this area are being conducted at an increasing rate, making it difficult for scientists to be aware of all the research outside their own expertise. Another problem in this area is that online education studies are repetitive: studies often utilize slightly different methods, measures, and/or samples to avoid duplication, and this makes it difficult to distinguish significant differences between the related results. In other words, if there are significant differences in the results of the studies, it may be difficult to express which variables explain these differences. One obvious solution to these problems is to systematically review the results of various studies and uncover the sources of variation. One method of performing such systematic syntheses is meta-analysis, a methodological and statistical approach for drawing conclusions from the literature. At this point, how effective online education applications are in increasing academic success is an important question. Has online education, which is likely to be encountered frequently in the continuing pandemic period, been successful in the last ten years? If successful, how large was the impact? Did different variables have an effect on this impact?
Academics across the globe have carried out studies evaluating online education platforms and have published the related results (Chiao et al., 2018). It is quite important to evaluate the results of the studies that have been published up to now, as well as those that will be published in the future. Has online education been successful? If it has, how big is the impact? Do different variables affect this impact? What should we consider in upcoming online education practices? These questions have all motivated us to carry out this study. We have conducted a comprehensive meta-analysis that aims to provide a discussion platform on how to develop efficient online programs for educators and policy makers by reviewing the related studies on online education, presenting the effect size, and revealing the effect of diverse variables on the overall impact.

There have been many critical discussions and comprehensive studies on the differences between online and face-to-face learning; however, the focus of this paper is different in that it clarifies the magnitude of the effect of online education on the teaching and learning process and identifies which factors should be controlled to help increase the effect size. Indeed, the purpose here is to enable conscious decisions in the implementation of the online education process.

The general impact of online education on academic achievement will be determined in this study. This will provide an opportunity to get a general overview of online education, which has been practiced and discussed intensively during the pandemic period. Moreover, the general impact of online education on academic achievement will be analyzed with respect to different variables. In other words, the current study will allow a comprehensive evaluation of the study results in the related literature, analyzed across several cultures, lectures, and class levels. Considering all these points, this study seeks to answer the following research questions:

What is the effect size of online education on academic achievement?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the country?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the class level?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the lecture?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the online education approaches?

This study aims at determining the effect size of online education, which has been highly used since the beginning of the pandemic, on students’ academic achievement in different courses by using a meta-analysis method. Meta-analysis is a synthesis method that enables gathering of several study results accurately and efficiently, and getting the total results in the end (Tsagris & Fragkos, 2018 ).

2 Method

2.1 Selecting and coding the data (studies)

The required literature for the meta-analysis was reviewed in July 2020, and a follow-up review was conducted in September 2020. The purpose of the follow-up review was to include any studies that were published during the conduct of this study and that met the inclusion criteria. However, no such study was found in the follow-up review.

In order to access the studies for the meta-analysis, the Web of Science, ERIC, and SCOPUS databases were searched using the keywords 'online learning' and 'online education'. Not every database has a search engine that grants access to studies by entering keywords, and this obstacle was considered an important problem to overcome. Therefore, a specially designed platform was utilized by the researcher: through the open access system of the Cukurova University Library, detailed searches were performed using EBSCO Information Services (EBSCO), which allows the whole collection of research to be searched through a single search box. Since the fundamental variables of this study are online education and online learning, the literature in the related databases (Web of Science, ERIC, and SCOPUS) was systematically searched for these keywords. Within this scope, 225 articles were accessed, and the studies were included in the coding key list created by the researcher. The names of the researchers, the year, the database (Web of Science, ERIC, and SCOPUS), the sample group and size, the lectures in which academic achievement was tested, the country where the study was conducted, and the class levels were all included in this coding key.

The following criteria were used to screen the 225 research studies coded on the theoretical basis of the meta-analysis: (1) the studies should be published in refereed journals between the years 2010 and 2021; (2) the studies should be experimental studies aiming to determine the effect of online education and online learning on academic achievement; (3) the values of the stated variables, or the statistics required to calculate them, should be reported in the results; and (4) the sample group should be at the primary education level. These criteria also served as exclusion criteria, in the sense that studies not meeting them were excluded from the present study.

After the inclusion criteria were determined, a systematic review process was conducted via EBSCO, starting with the year criterion. Within this scope, 290,365 studies analyzing the effect of online education and online learning on academic achievement were initially retrieved. The database criterion (Web of Science, ERIC, and SCOPUS) was then applied as a filter, reducing the number of studies to 58,616. Afterwards, the keyword 'primary education' was used as a filter, and the number of studies decreased to 3152. Lastly, the results were searched using the keyword 'academic achievement', and 225 studies were obtained. All the information from these 225 articles was included in the coding key.

It is necessary for the coders to review the related studies accurately and to check the validity, reliability, and accuracy of the studies (Stewart & Kamins, 2001). Within this scope, the studies identified on the basis of the variables used in this study were first reviewed by three researchers from the primary education field; the accessed studies were then combined and processed in the coding key by the researcher. All the studies processed in the coding key were analyzed against the inclusion criteria by all the researchers in joint meetings, and it was decided that 27 studies met the inclusion criteria (Atici & Polat, 2010 ; Carreon, 2018 ; Ceylan & Elitok Kesici, 2017 ; Chae & Shin, 2016 ; Chiang et al. 2014 ; Ercan, 2014 ; Ercan et al., 2016 ; Gwo-Jen et al., 2018 ; Hayes & Stewart, 2016 ; Hwang et al., 2012 ; Kert et al., 2017 ; Lai & Chen, 2010 ; Lai et al., 2015 ; Meyers et al., 2015 ; Ravenel et al., 2014 ; Sung et al., 2016 ; Wang & Chen, 2013 ; Yu, 2019 ; Yu & Chen, 2014 ; Yu & Pan, 2014 ; Yu et al., 2010 ; Zhong et al., 2017 ). The data from the studies meeting the inclusion criteria were independently processed in the second coding key by three researchers, and consensus meetings were arranged for further discussion. After the meetings, the researchers agreed that the data were coded accurately and precisely. Having identified the effect sizes and heterogeneity of the study, moderator variables that could explain the differences between the effect sizes were determined. The data related to these moderator variables were added to the coding key by the three researchers, and a new consensus meeting was arranged, after which the researchers agreed that the moderator variables were coded accurately and precisely.

2.2 Study group

Twenty-seven studies are included in the meta-analysis. The total sample size of the included studies is 1772. The characteristics of the included studies are given in Table 1.

2.3 Publication bias

Publication bias is the limited ability of published studies on a research subject to represent all completed studies on the same subject (Card, 2011; Littell et al., 2008). Similarly, publication bias refers to a relationship between the probability that a study on a subject is published and the effect size and significance it produces. Within this scope, publication bias may occur when researchers do not publish a study after failing to obtain the expected results, or after being rejected by scientific journals, so that the study is never included in research syntheses (Makowski et al., 2019). A high probability of publication bias in a meta-analysis negatively affects the accuracy of the combined effect size (Pecoraro, 2018), causing the average effect size to be reported differently than it should be (Borenstein et al., 2009). For this reason, the possibility of publication bias in the included studies was tested before the effect sizes of the relationships between the stated variables were determined. The possibility of publication bias in this meta-analysis was analyzed using the funnel plot, Orwin's fail-safe N analysis, Duval and Tweedie's trim and fill analysis, and Egger's regression test.
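The idea behind Egger's regression test can be illustrated with a minimal sketch: the standardized effect (g divided by its standard error) is regressed on precision (1 divided by the standard error), and an intercept far from zero signals funnel-plot asymmetry. The per-study effect sizes and standard errors below are hypothetical, not values from this meta-analysis.

```python
# Minimal sketch of Egger's regression test (ordinary least squares).
# Regress the standardized effect (g / SE) on precision (1 / SE); an
# intercept far from zero suggests funnel-plot asymmetry, i.e. possible
# publication bias. Real analyses also report a t-test on the intercept.
def egger_intercept(effects, std_errors):
    y = [g / se for g, se in zip(effects, std_errors)]  # standardized effects
    x = [1 / se for se in std_errors]                   # precisions
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return mean_y - slope * mean_x                      # the intercept

# Hypothetical symmetric data: identical effects at every precision level
# give an intercept of exactly zero (no asymmetry).
print(egger_intercept([0.4, 0.4, 0.4], [0.1, 0.2, 0.3]))
```

Because the test is just a regression intercept, it is sensitive to the number of studies; with few studies its power is low, which is one reason multiple bias diagnostics are reported together in this study.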

2.4 Selecting the model

After assessing the probability of publication bias, the statistical model used to calculate the effect sizes was selected. The main approaches used in effect size calculations, according to the level of inter-study variance, are the fixed effects and random effects models (Pigott, 2012). The fixed effects model assumes that the combined studies are homogeneous in their characteristics apart from their sample sizes, while the random effects model allows parameters to vary between studies (Cumming, 2012). When calculating the average effect size under the random effects model (Deeks et al., 2008), which assumes that the effect estimates of the different studies are drawn from a common distribution of true effects, it is necessary to consider, beyond the sampling error of the combined studies, several aspects such as the characteristics of the participants and the duration, scope, and design of each study (Littell et al., 2008). When deciding on the model, the assumptions about the sample characteristics of the included studies and the inferences the researcher aims to make should be taken into consideration. The fact that the sample characteristics of studies in the social sciences are affected by various parameters indicates that the random effects model is more appropriate here. Moreover, inferences made with the random effects model extend beyond the studies included in the meta-analysis (Field, 2003; Field & Gillett, 2010); therefore, using the random effects model also contributes to the generalizability of the research findings. The criteria for statistical model selection show that, given the nature of a meta-analysis, the model should be selected before the analysis (Borenstein et al., 2007; Littell et al., 2008).
Within this framework, it was decided to make use of the random effects model, considering that the students who are the samples of the studies included in the meta-analysis are from different countries and cultures, the sample characteristics of the studies differ, and the patterns and scopes of the studies vary as well.
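The mechanics of random-effects pooling can be sketched with the DerSimonian-Laird estimator, a common (though not the only) way to implement the random effects model; the paper does not state which estimator was used, so this is an illustrative assumption, and the input numbers are hypothetical.

```python
import math

# DerSimonian-Laird random-effects pooling: estimate the between-study
# variance tau^2 from Cochran's Q, then re-weight each study by
# 1 / (v_i + tau^2) instead of the fixed-effects weight 1 / v_i.
def random_effects(effects, variances):
    w = [1 / v for v in variances]
    fixed = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    q = sum(wi * (g - fixed) ** 2 for wi, g in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # truncated at zero
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * g for wi, g in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))        # SE of the pooled effect
    return pooled, tau2, se

# Hypothetical example: two studies with effects 0.2 and 0.8, each with
# sampling variance 0.04.
print(random_effects([0.2, 0.8], [0.04, 0.04]))
```

Note that when tau-squared is estimated as zero, the random-effects weights reduce to the fixed-effects weights, which is why the two models converge for homogeneous data like those in this study.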

2.5 Heterogeneity

Meta-analysis facilitates analyzing a research subject across different parameters by showing the level of diversity between the included studies. Within this frame, whether the distribution of the included studies is heterogeneous has been evaluated in the present study. The heterogeneity of the combined studies was determined through the Q and I² tests. The Q test evaluates the probability that the differences between the observed results arise from random variation alone (Deeks et al., 2008). A Q value exceeding the critical chi-square (χ²) value, determined by the degrees of freedom and the significance level, indicates heterogeneity of the combined effect sizes (Card, 2011). The I² test, which complements the Q test, shows the amount of heterogeneity in the effect sizes (Cleophas & Zwinderman, 2017); an I² value higher than 75% is interpreted as a high level of heterogeneity.
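The Q and I² statistics described above follow directly from the per-study effect sizes and variances; a minimal sketch (with hypothetical inputs, not the study's data) is:

```python
# Cochran's Q and Higgins' I^2 from per-study effect sizes and variances.
# Q is the weighted sum of squared deviations from the fixed-effects mean;
# I^2 = max(0, (Q - df) / Q) expresses the share of total variability
# attributable to between-study heterogeneity, as a percentage.
def heterogeneity(effects, variances):
    w = [1 / v for v in variances]
    pooled = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    q = sum(wi * (g - pooled) ** 2 for wi, g in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Hypothetical example: two discrepant studies yield a large Q and an
# I^2 above the 75% "high heterogeneity" threshold mentioned above.
print(heterogeneity([0.2, 0.8], [0.04, 0.04]))
```

In this study's data, Q fell below the critical chi-square value and I² was low, so the effect sizes were treated as homogeneous.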

If heterogeneity is encountered among the studies included in the meta-analysis, its reasons can be analyzed by referring to the study characteristics. The study characteristics that may be related to the heterogeneity between the included studies can be explored through subgroup analysis or meta-regression analysis (Deeks et al., 2008). In determining the moderator variables, the sufficiency of the number of variables, the relationships between the moderators, and their capacity to explain the differences between the study results were all considered. Within this scope, it was predicted that the heterogeneity could be explained by the country, class level, and lecture moderator variables, given the effect of online education, which has been used heavily since the beginning of the pandemic, on students' academic achievement in different lectures. Some subgroups were combined and evaluated together, since the number of effect sizes in some sub-dimensions of the specified variables (e.g., the countries where the studies were conducted) was not sufficient for moderator analysis.

2.6 Interpreting the effect sizes

Effect size is a measure that shows how strongly, positively or negatively, the independent variable affects the dependent variable in each study included in the meta-analysis (Dinçer, 2014). The classifications of Cohen et al. (2007) were used to interpret the effect sizes obtained from the meta-analysis. Whether the specified relationships differ according to the country, class level, and school subject variables was identified through the Q test, degrees of freedom, and p significance value (Figs. 1 and 2).
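The effect size metric used in this study, Hedges' g, is Cohen's d computed with the pooled standard deviation and multiplied by a small-sample correction factor. A minimal sketch, using hypothetical group means, standard deviations, and sample sizes:

```python
import math

# Hedges' g: standardized mean difference with small-sample correction.
# g = J * d, where d = (m1 - m2) / pooled SD and
# J = 1 - 3 / (4 * (n1 + n2) - 9) shrinks d slightly for small samples.
def hedges_g(m1, s1, n1, m2, s2, n2):
    pooled_sd = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                          / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return j * d

# Hypothetical example: experimental group mean 80 (SD 10, n 30) vs.
# control group mean 75 (SD 10, n 30) gives g just under 0.5.
print(hedges_g(80, 10, 30, 75, 10, 30))
```

The correction matters most for the small experimental samples typical of classroom studies like those pooled here; for large n, g is nearly identical to d.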

3 Findings and results

The purpose of this study is to determine the effect size of online education on academic achievement. Before the effect sizes were determined, the probability of publication bias in this meta-analysis was analyzed using the funnel plot, Orwin's fail-safe N analysis, Duval and Tweedie's trim and fill analysis, and Egger's regression test.

When the funnel plots are examined, the studies included in the analysis are distributed symmetrically on both sides of the combined effect size axis, and they are generally concentrated in the middle and lower sections. According to the plots, the probability of publication bias is low. However, since funnel plots may lead to subjective interpretations, they were supported by additional analyses (Littell et al., 2008). Therefore, to provide additional evidence regarding the probability of publication bias, Orwin's fail-safe N analysis, Duval and Tweedie's trim and fill analysis, and Egger's regression test were applied (Table 2).

Table 2 presents the publication bias statistics computed before the effect size of online education on academic achievement was calculated. According to the table, the Orwin's fail-safe N results show that no additional studies would need to be added to the meta-analysis for Hedges' g to reach a value outside the range of ±0.01. The Duval and Tweedie trim and fill test shows that excluding the studies that disturb the symmetry of the funnel plot, or adding their exact symmetrical counterparts, does not significantly change the calculated effect size. The non-significance of Egger's test results indicates that there is no publication bias in the meta-analysis. Together, these analyses indicate the high internal validity of the effect sizes and the adequacy of the included studies in representing research on the subject.
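Orwin's fail-safe N asks how many unpublished null-result studies would have to exist to pull the mean effect down to a trivial criterion value. A minimal sketch of the formula, under the simplifying assumption that the hypothetical missing studies have an effect of zero (the number of studies and mean effect below are taken from this paper; the ±0.01 criterion is the one mentioned above):

```python
# Orwin's fail-safe N (Orwin, 1983): number of additional studies with
# mean effect `unpublished_effect` needed to reduce the combined mean
# effect from `mean_effect` to the trivial `criterion` value.
def orwin_failsafe_n(k, mean_effect, criterion, unpublished_effect=0.0):
    return k * (mean_effect - criterion) / (criterion - unpublished_effect)

# With the study's k = 27 and mean g = 0.409, driving g down to 0.01
# would take on the order of a thousand null studies.
print(orwin_failsafe_n(27, 0.409, 0.01))
```

A fail-safe N this large relative to the 27 included studies is consistent with the table's conclusion that publication bias is unlikely to threaten the result.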

After the publication bias tests, the effect size of online education on academic achievement was determined. In line with the first purpose of the study, the forest plot of the effect size of online education on academic achievement is shown in Fig. 3, and the corresponding statistics are given in Table 3.

Fig. 1 The flow chart of the screening and selection process of the studies

Fig. 2 Funnel plot representing the effect size of online education on academic success

Fig. 3 Forest plot of the effect size of online education on academic success

The square symbols in the forest plot in Fig. 3 represent the effect sizes, the horizontal lines show the 95% confidence intervals of the effect sizes, and the diamond symbol shows the overall effect size. When the forest plot is analyzed, the lower and upper limits of the combined effect sizes are generally close to each other, and the study weights are similar. This similarity in study weights indicates that the combined studies contribute similarly to the overall effect size.

Figure 3 clearly shows that the study by Liu et al. (2018) has the lowest effect size and the study by Ercan and Bilen (2014) the highest. The forest plot shows that all the combined studies and the overall effect are positive. Furthermore, from the forest plot in Fig. 3 and the effect size statistics in Table 3, the results of this meta-analysis of 27 studies illustrate that the effect of online education on academic achievement is at a medium level (g = 0.409).

After the effect size analysis, whether the studies included in the analysis are heterogeneously distributed was also examined. The heterogeneity of the combined studies was determined through the Q and I² tests. As a result of the heterogeneity test, the Q statistic was calculated as 29.576. With 26 degrees of freedom at the 0.05 significance level, the critical chi-square value is 38.885. The calculated Q statistic (29.576) is lower than this critical value. The I² value, which complements the Q statistic, is 12.100%, indicating that the true heterogeneity, i.e., the proportion of total variability attributable to variability between studies, is about 12%. In addition, the p value (0.285) is higher than 0.05. All these values [Q(26) = 29.576, p = 0.285; I² = 12.100] indicate that the effect sizes are homogeneously distributed and that the fixed effects model could be used to interpret them. However, some researchers argue that even when heterogeneity is low, results should be evaluated under the random effects model (Borenstein et al., 2007); therefore, this study reports both models. An attempt was made to explain the heterogeneity of the combined studies through the characteristics of the included studies. In this context, the final purpose of the study is to determine the effect of the country, class level, and lecture variables on the findings. Accordingly, the statistics comparing the stated relations according to the countries where the studies were conducted are given in Table 4.

As seen in Table 4, the effect of online education on academic achievement does not differ significantly according to the countries where the studies were conducted. The Q test results indicate that the relationships between the variables are homogeneous across these countries. According to the table, the effect of online education on academic achievement was highest in the 'other countries' group and lowest in the USA. The statistics comparing the stated relations according to class levels are given in Table 5.

As seen in Table 5, the effect of online education on academic achievement does not differ according to class level. However, the effect is highest in the 4th grade. The statistics comparing the stated relations according to school subjects are given in Table 6.

As seen in Table 6, the effect of online education on academic achievement does not differ according to the school subjects included in the studies. However, the effect is highest in the ICT subject.

The obtained effect size was formed from the findings of primary studies conducted in seven different countries, and these studies cover different approaches to online education (online learning environments, social networks, blended learning, etc.). In this respect, questions may be raised about the validity and generalizability of the results. However, the moderator analyses, whether for the country variable or for the approaches covered by online education, did not reveal significant differences in effect sizes. Had significant differences occurred, the comparisons made across countries under the umbrella of online education would be open to doubt in terms of generalizability. Moreover, no study was found in the literature conducted under the name of online education alone that is not based on a particular approach or technique. For instance, one commonly used concept is blended education, defined as an educational model in which online education is combined with the traditional education method (Colis & Moonen, 2001). Similarly, Rasmussen (2003) defines blended learning as "a distance education method that combines technology (high technology such as television and the internet, or low technology such as voice mail and conferences) with traditional education and training." Further, Kerres and Witt (2003) define blended learning as "combining face-to-face learning with technology-assisted learning." As is clearly observed, online education has a wide scope that includes many approaches.

As seen in Table 7, the effect of online education on academic achievement does not differ according to the online education approaches included in the studies. However, the effect is highest for the web-based problem solving approach.

4 Conclusions and discussion

Considering the developments during the pandemic, the diversity of online education applications, as an interdisciplinary and pragmatist field, is expected to increase, and learning content and processes will be enriched as new technologies are integrated into online education. Another prediction is that online education will offer more flexible and accessible learning opportunities, thereby strengthening lifelong learning. As a result, it is predicted that in the near future online education, or digital learning under a newer name, will become the main ground of education rather than an alternative to, or a support for, face-to-face learning. The lessons learned from the early period of online learning, adopted rapidly because of the COVID-19 pandemic, will serve to develop this method all over the world; in the near future, online learning will become the main learning structure as new technologies and systems increase its functionality. From this point of view, there is a clear need to strengthen online education.

In this study, the effect of online learning on academic achievement is at a moderate level. To increase this effect, the implementation of online learning requires support from teachers to prepare learning materials, to design learning appropriately, and to use various digital media such as websites, software, and other tools that support the effectiveness of online learning (Rolisca & Achadiyah, 2014). According to research conducted by Rahayu et al. (2017), the use of various types of software increases the effectiveness and quality of online learning. Implementing online learning can also improve students' ability to adapt to technological developments: it leads students to use various learning resources on the internet to access different types of information, and accustoms them to inquiry learning and active learning (Hart et al., 2019; Prestiadi et al., 2019). There may be many reasons why the effect found in this study is only moderate. The moderator variables examined here could, in principle, guide efforts to increase the practical effect; however, the effect size did not differ significantly for any of the moderator variables. Different moderator analyses could be explored to identify ways of increasing the impact of online education on academic success. If confounding variables that significantly change the effect level are detected, more precise recommendations can be made. In addition to technical and financial problems, the level of impact should increase if difficulties such as students' lack of interaction with the instructor, slow response times, and the absence of traditional classroom socialization are addressed.

In addition, the social distancing associated with the COVID-19 pandemic posed extreme difficulties for all stakeholders, who had to move online under severe time and resource constraints. Adopting an online learning environment is not just a technical issue; it is a pedagogical and instructional challenge as well. Therefore, extensive preparation of teaching materials, curriculum, and assessment is vital in online education. Technology is the delivery tool, and it requires close cross-collaboration between teaching, content, and technology teams (CoSN, 2020).

Online education applications have been used for many years, but they came to the fore during the pandemic. This necessity has prompted discussion of using online education instead of traditional methods in the future. The present research, however, shows that online education applications are only moderately effective. Replacing face-to-face education with online education can only be justified by an increase in this level of success, which may now be attainable given the experience and knowledge gained during the pandemic. Therefore, meta-analyses of experimental studies conducted in the coming years will be instructive, and such experimental studies on online education applications should be analyzed carefully. It would be useful to identify variables that can change the level of impact through different moderators; moderator analyses are valuable in meta-analysis studies (consider, for example, the role of moderators in Karl Pearson's typhoid vaccine studies). In this context, each analysis sheds light on future studies. Meta-analyses of online education would benefit from going beyond the moderators examined in this study, which would further increase the contribution of similar studies to the field.

The purpose of this study was to determine the effect of online education on academic achievement. In line with this purpose, studies analyzing the effect of online education approaches on academic achievement were included in the meta-analysis; their total sample size is 1772. The included studies were conducted in the USA, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia; no eligible studies carried out in continental Europe could be found. One possible reason is that quantitative research methods grounded in a positivist perspective may be used more widely in countries with an American academic tradition. As a result of the study, the effect size of online education on academic achievement (g = 0.409) was found to be moderate. In the studies included in the present research, online education approaches were more effective than traditional ones. However, contrary to the present study, comparisons between online and traditional education in some studies show that face-to-face learning is still considered more effective than online learning (Ahmad et al., 2016; Hamdani & Priatna, 2020; Wei & Chou, 2020). Online education has both advantages and disadvantages. Its advantages over face-to-face classroom learning include flexibility of learning time: learning is not bound to a single schedule and can be shaped according to circumstances (Lai et al., 2019). Another advantage is the ease of submitting assignments, since students can do so without having to talk to the teacher. Despite this, online education has several weaknesses, such as students' difficulty in understanding the material, teachers' inability to monitor students, and students' difficulty interacting with teachers when the internet connection fails (Swan, 2007).
According to Astuti et al. (2019), face-to-face education is still considered better by students than e-learning because the material is easier to understand and interaction with teachers is easier. The results of the present study showed that the effect size of online education on academic achievement (g = 0.409) is of medium level, and the moderator analyses showed that this effect does not differ in terms of the country, lecture, class level, or online education approach variables. A review of the literature identified several earlier meta-analyses on online education (Bernard et al., 2004; Machtmes & Asher, 2000; Zhao et al., 2005). Typically, these meta-analyses also include studies of older-generation technologies such as audio, video, or satellite transmission. One of the most comprehensive studies was conducted by Bernard et al. (2004), who analyzed 699 independent effect sizes from 232 studies published from 1985 to 2001, comparing face-to-face education with online education with respect to achievement criteria and the attitudes of learners ranging from young children to adults. That meta-analysis found an overall effect size close to zero for student achievement (g+ = 0.01).
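The pooled effect discussed here (Hedges' g under a random-effects model) can be illustrated with a minimal sketch. The functions below compute a study-level Hedges' g with its sampling variance and pool studies with the DerSimonian-Laird estimator; the function names and any example values are hypothetical and not taken from the study's data:

```python
import math

def hedges_g(m1, m2, sd1, sd2, n1, n2):
    """Hedges' g: standardized mean difference with small-sample correction."""
    # Pooled standard deviation of the experimental and control groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction factor
    g = j * d
    v = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))  # variance of g
    return g, v

def random_effects(effects):
    """Pool (g, variance) pairs with the DerSimonian-Laird random-effects model."""
    w = [1 / v for _, v in effects]
    gs = [g for g, _ in effects]
    fixed = sum(wi * gi for wi, gi in zip(w, gs)) / sum(w)
    q = sum(wi * (gi - fixed)**2 for wi, gi in zip(w, gs))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance estimate
    w_star = [1 / (v + tau2) for _, v in effects]
    return sum(wi * gi for wi, gi in zip(w_star, gs)) / sum(w_star)
```

For instance, a hypothetical study with a raw mean difference of 1.0, common SD of 1.0, and 20 participants per arm yields g close to 0.98, and identical studies pool to that same value.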

In another meta-analysis, Zhao et al. (2005) examined 98 effect sizes from 51 studies on online education conducted between 1996 and 2002. In contrast to Bernard et al. (2004), this meta-analysis focused on the activities carried out within online education courses. It, too, found an overall effect size close to zero for online education using more than one generation of technology across student levels. A salient feature of the Zhao et al. study, however, is that it averaged the different types of outcomes used within a single study to calculate an overall effect size. This practice is problematic because the factors that improve one type of learner outcome (e.g., learner retention), particularly course characteristics and practices, may be quite different from those that improve another type of outcome (e.g., learner achievement), and may even harm the latter. By mixing studies with different types of outcomes, this approach may obscure the relationship between practices and learning.

Some meta-analytic studies have focused on the effectiveness of new-generation, internet-delivered distance learning courses for specific student populations. For instance, Sitzmann et al. (2006) reviewed 96 studies published from 1996 to 2005 that compared web-based and face-to-face instruction in job-related knowledge or skills. They found that web-based instruction was, in general, slightly more effective than face-to-face instruction, but not with respect to application ("knowing how to apply"). In addition, Sitzmann et al. (2006) found that internet-based instruction had a positive effect on theoretical knowledge in quasi-experimental studies, whereas face-to-face instruction was favored in experimental studies with random assignment. This moderator analysis emphasizes the need to attend to the designs of the studies included in a meta-analysis. Study design was not examined in the present meta-analysis; this can be offered as a suggestion for future studies.

Another meta-analysis focusing on online education was conducted by Cavanaugh et al. (2004). In this study of internet-based distance education programs for students under 12 years of age, the researchers combined 116 outcomes from 14 studies published between 1999 and 2004 and calculated an overall effect that was not statistically different from zero. The moderator analysis carried out in this study showed no significant factor affecting student achievement. This meta-analysis used multiple outcomes from the same study, ignoring the fact that different outcomes from the same students are not independent of one another.

In conclusion, some meta-analytic studies have analyzed the outcomes of online education for a wide range of students (Bernard et al., 2004; Zhao et al., 2005), and the effect sizes in these studies were generally low. Furthermore, none of the large-scale meta-analyses considered moderators, database quality standards, or class levels in selecting studies, while some referred only to country and lecture moderators. Advances in internet-based learning tools, the pandemic, and the growing popularity of online learning in different contexts now require a precise meta-analysis of students' learning outcomes in online settings. Previous meta-analyses were typically based on studies involving a narrow range of confounding variables. The present study therefore examined common but significant moderators, such as class level and lecture, in the context of the pandemic. For instance, problems have been experienced during the pandemic with the suitability of online education platforms for particular class levels. There is a clear need to study, and make recommendations on, whether online education can meet the needs of teachers and students.

Besides, the main forms of online education in the past were watching the open lectures of famous universities and the educational videos of institutions. During the pandemic, by contrast, online education has mainly been classroom-based teaching delivered by teachers in their own schools, as an extension of the original school education. This meta-analysis will therefore stand as a source for comparing the effect sizes of the online education forms of the past decade with what is done today and what will be done in the future.

Lastly, the heterogeneity test results of the meta-analysis show that the effect size does not differ in terms of the class level, country, online education approach, and lecture moderators.
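A heterogeneity test of this kind rests on Cochran's Q and the I² statistic. The sketch below shows how both can be computed from study-level effect sizes and variances; it is illustrative only, with hypothetical inputs rather than the study's data:

```python
def heterogeneity(effects):
    """Cochran's Q, its degrees of freedom, and I^2 for (g, variance) pairs."""
    w = [1 / v for _, v in effects]
    gs = [g for g, _ in effects]
    pooled = sum(wi * gi for wi, gi in zip(w, gs)) / sum(w)   # fixed-effect mean
    q = sum(wi * (gi - pooled)**2 for wi, gi in zip(w, gs))   # Cochran's Q
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0       # % heterogeneity
    return q, df, i2
```

Moderator (subgroup) comparisons extend this idea by partitioning Q into within-group and between-group components; a non-significant between-group Q corresponds to the finding that the moderators above do not change the effect size.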

*Studies included in meta-analysis

Ahmad, S., Sumardi, K., & Purnawan, P. (2016). Komparasi Peningkatan Hasil Belajar Antara Pembelajaran Menggunakan Sistem Pembelajaran Online Terpadu Dengan Pembelajaran Klasikal Pada Mata Kuliah Pneumatik Dan Hidrolik. Journal of Mechanical Engineering Education, 2 (2), 286–292.


Ally, M. (2004). Foundations of educational theory for online learning. Theory and Practice of Online Learning, 2 , 15–44. Retrieved on the 11th of September, 2020 from https://eddl.tru.ca/wp-content/uploads/2018/12/01_Anderson_2008-Theory_and_Practice_of_Online_Learning.pdf

Arat, T., & Bakan, Ö. (2011). Uzaktan eğitim ve uygulamaları. Selçuk Üniversitesi Sosyal Bilimler Meslek Yüksek Okulu Dergisi , 14 (1–2), 363–374. https://doi.org/10.29249/selcuksbmyd.540741

Astuti, C. C., Sari, H. M. K., & Azizah, N. L. (2019). Perbandingan Efektifitas Proses Pembelajaran Menggunakan Metode E-Learning dan Konvensional. Proceedings of the ICECRS, 2 (1), 35–40.

*Atici, B., & Polat, O. C. (2010). Influence of the online learning environments and tools on the student achievement and opinions. Educational Research and Reviews, 5 (8), 455–464. Retrieved on the 11th of October, 2020 from https://academicjournals.org/journal/ERR/article-full-text-pdf/4C8DD044180.pdf

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et al. (2004). How does distance education compare with classroom instruction? A meta- analysis of the empirical literature. Review of Educational Research, 3 (74), 379–439. https://doi.org/10.3102/00346543074003379

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis . Wiley.


Borenstein, M., Hedges, L., & Rothstein, H. (2007). Meta-analysis: Fixed effect vs. random effects . UK: Wiley.

Card, N. A. (2011). Applied meta-analysis for social science research: Methodology in the social sciences . Guilford.


*Carreon, J. R. (2018). Facebook as integrated blended learning tool in technology and livelihood education exploratory. Retrieved on the 1st of October, 2020 from https://files.eric.ed.gov/fulltext/EJ1197714.pdf

Cavanaugh, C., Gillan, K. J., Kromrey, J., Hess, M., & Blomeyer, R. (2004). The effects of distance education on K-12 student outcomes: A meta-analysis. Learning Point Associates/North Central Regional Educational Laboratory (NCREL) . Retrieved on the 11th of September, 2020 from https://files.eric.ed.gov/fulltext/ED489533.pdf

*Ceylan, V. K., & Elitok Kesici, A. (2017). Effect of blended learning to academic achievement. Journal of Human Sciences, 14 (1), 308. https://doi.org/10.14687/jhs.v14i1.4141

*Chae, S. E., & Shin, J. H. (2016). Tutoring styles that encourage learner satisfaction, academic engagement, and achievement in an online environment. Interactive Learning Environments, 24(6), 1371–1385. https://doi.org/10.1080/10494820.2015.1009472

*Chiang, T. H. C., Yang, S. J. H., & Hwang, G. J. (2014). An augmented reality-based mobile learning system to improve students’ learning achievements and motivations in natural science inquiry activities. Educational Technology and Society, 17 (4), 352–365. Retrieved on the 11th of September, 2020 from https://www.researchgate.net/profile/Gwo_Jen_Hwang/publication/287529242_An_Augmented_Reality-based_Mobile_Learning_System_to_Improve_Students'_Learning_Achievements_and_Motivations_in_Natural_Science_Inquiry_Activities/links/57198c4808ae30c3f9f2c4ac.pdf

Chiao, H. M., Chen, Y. L., & Huang, W. H. (2018). Examining the usability of an online virtual tour-guiding platform for cultural tourism education. Journal of Hospitality, Leisure, Sport & Tourism Education, 23 (29–38), 1. https://doi.org/10.1016/j.jhlste.2018.05.002

Chizmar, J. F., & Walbert, M. S. (1999). Web-based learning environments guided by principles of good teaching practice. Journal of Economic Education, 30 (3), 248–264. https://doi.org/10.2307/1183061

Cleophas, T. J., & Zwinderman, A. H. (2017). Modern meta-analysis: Review and update of methodologies . Switzerland: Springer. https://doi.org/10.1007/978-3-319-55895-0

Cohen, L., Manion, L., & Morrison, K. (2007). Observation.  Research Methods in Education, 6 , 396–412. Retrieved on the 11th of September, 2020 from https://www.researchgate.net/profile/Nabil_Ashraf2/post/How_to_get_surface_potential_Vs_Voltage_curve_from_CV_and_GV_measurements_of_MOS_capacitor/attachment/5ac6033cb53d2f63c3c405b4/AS%3A612011817844736%401522926396219/download/Very+important_C-V+characterization+Lehigh+University+thesis.pdf

Colis, B., & Moonen, J. (2001). Flexible Learning in a Digital World: Experiences and Expectations. Open & Distance Learning Series . Stylus Publishing.

CoSN. (2020). COVID-19 Response: Preparing to Take School Online. Retrieved on the 3rd of September, 2021 from https://www.cosn.org/sites/default/files/COVID-19%20Member%20Exclusive_0.pdf

Cumming, G. (2012). Understanding new statistics: Effect sizes, confidence intervals, and meta-analysis. New York, USA: Routledge. https://doi.org/10.4324/9780203807002

Deeks, J. J., Higgins, J. P. T., & Altman, D. G. (2008). Analysing data and undertaking meta-analyses . In J. P. T. Higgins & S. Green (Eds.), Cochrane handbook for systematic reviews of interventions (pp. 243–296). Sussex: John Wiley & Sons. https://doi.org/10.1002/9780470712184.ch9

Demiralay, R., Bayır, E. A., & Gelibolu, M. F. (2016). Öğrencilerin bireysel yenilikçilik özellikleri ile çevrimiçi öğrenmeye hazır bulunuşlukları ilişkisinin incelenmesi. Eğitim ve Öğretim Araştırmaları Dergisi, 5 (1), 161–168. https://doi.org/10.23891/efdyyu.2017.10

Dinçer, S. (2014). Eğitim bilimlerinde uygulamalı meta-analiz. Pegem Atıf İndeksi, 2014(1), 1–133. https://doi.org/10.14527/pegem.001

*Durak, G., Cankaya, S., Yunkul, E., & Ozturk, G. (2017). The effects of a social learning network on students’ performances and attitudes. European Journal of Education Studies, 3 (3), 312–333. 10.5281/zenodo.292951

*Ercan, O. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes . European Journal of Educational Research, 3 (1), 9–23. https://doi.org/10.12973/eu-jer.3.1.9

Ercan, O., & Bilen, K. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes. European Journal of Educational Research, 3 (1), 9–23.

*Ercan, O., Bilen, K., & Ural, E. (2016). “Earth, sun and moon”: Computer assisted instruction in secondary school science - Achievement and attitudes. Issues in Educational Research, 26 (2), 206–224. https://doi.org/10.12973/eu-jer.3.1.9

Field, A. P. (2003). The problems in using fixed-effects models of meta-analysis on real-world data. Understanding Statistics, 2 (2), 105–124. https://doi.org/10.1207/s15328031us0202_02

Field, A. P., & Gillett, R. (2010). How to do a meta-analysis. British Journal of Mathematical and Statistical Psychology, 63 (3), 665–694. https://doi.org/10.1348/00071010x502733

Geostat. (2019). ‘Share of households with internet access’, National statistics office of Georgia . Retrieved on the 2nd September 2020 from https://www.geostat.ge/en/modules/categories/106/information-and-communication-technologies-usage-in-households

*Gwo-Jen, H., Nien-Ting, T., & Xiao-Ming, W. (2018). Creating interactive e-books through learning by design: The impacts of guided peer-feedback on students’ learning achievements and project outcomes in science courses. Journal of Educational Technology & Society., 21 (1), 25–36. Retrieved on the 2nd of October, 2020 https://ae-uploads.uoregon.edu/ISTE/ISTE2019/PROGRAM_SESSION_MODEL/HANDOUTS/112172923/CreatingInteractiveeBooksthroughLearningbyDesignArticle2018.pdf

Hamdani, A. R., & Priatna, A. (2020). Efektifitas implementasi pembelajaran daring (full online) dimasa pandemi Covid-19 pada jenjang Sekolah Dasar di Kabupaten Subang. Didaktik: Jurnal Ilmiah PGSD STKIP Subang, 6 (1), 1–9.

Hart, C. M., Berger, D., Jacob, B., Loeb, S., & Hill, M. (2019). Online learning, offline outcomes: Online course taking and high school student performance. Aera Open, 5(1).

*Hayes, J., & Stewart, I. (2016). Comparing the effects of derived relational training and computer coding on intellectual potential in school-age children. The British Journal of Educational Psychology, 86 (3), 397–411. https://doi.org/10.1111/bjep.12114

Horton, W. K. (2000). Designing web-based training: How to teach anyone anything anywhere anytime (Vol. 1). Wiley Publishing.

*Hwang, G. J., Wu, P. H., & Chen, C. C. (2012). An online game approach for improving students’ learning performance in web-based problem-solving activities. Computers and Education, 59 (4), 1246–1256. https://doi.org/10.1016/j.compedu.2012.05.009

*Kert, S. B., Köşkeroğlu Büyükimdat, M., Uzun, A., & Çayiroğlu, B. (2017). Comparing active game-playing scores and academic performances of elementary school students. Education 3–13, 45 (5), 532–542. https://doi.org/10.1080/03004279.2016.1140800

*Lai, A. F., & Chen, D. J. (2010). Web-based two-tier diagnostic test and remedial learning experiment. International Journal of Distance Education Technologies, 8 (1), 31–53. https://doi.org/10.4018/jdet.2010010103

*Lai, A. F., Lai, H. Y., Chuang W. H., & Wu, Z.H. (2015). Developing a mobile learning management system for outdoors nature science activities based on 5e learning cycle. Proceedings of the International Conference on e-Learning, ICEL. Proceedings of the International Association for Development of the Information Society (IADIS) International Conference on e-Learning (Las Palmas de Gran Canaria, Spain, July 21–24, 2015). Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/ED562095.pdf

Lai, C. H., Lin, H. W., Lin, R. M., & Tho, P. D. (2019). Effect of peer interaction among online learning community on learning engagement and achievement. International Journal of Distance Education Technologies (IJDET), 17 (1), 66–77.

Littell, J. H., Corcoran, J., & Pillai, V. (2008). Systematic reviews and meta-analysis . Oxford University.

*Liu, K. P., Tai, S. J. D., & Liu, C. C. (2018). Enhancing language learning through creation: the effect of digital storytelling on student learning motivation and performance in a school English course. Educational Technology Research and Development, 66 (4), 913–935. https://doi.org/10.1007/s11423-018-9592-z

Machtmes, K., & Asher, J. W. (2000). A meta-analysis of the effectiveness of telecourses in distance education. American Journal of Distance Education, 14 (1), 27–46. https://doi.org/10.1080/08923640009527043

Makowski, D., Piraux, F., & Brun, F. (2019). From experimental network to meta-analysis: Methods and applications with R for agronomic and environmental sciences. Dordrecht: Springer. https://doi.org/10.1007/978-94-024_1696-1

*Meyers, C., Molefe, A., & Brandt, C. (2015). The Impact of the "Enhancing Missouri's Instructional Networked Teaching Strategies" (eMINTS) Program on Student Achievement, 21st-Century Skills, and Academic Engagement--Second-Year Results. Society for Research on Educational Effectiveness. Retrieved on the 14th of November, 2020 from https://files.eric.ed.gov/fulltext/ED562508.pdf

OECD. (2020). ‘A framework to guide an education response to the COVID-19 Pandemic of 2020 ’. https://doi.org/10.26524/royal.37.6

Pecoraro, V. (2018). Appraising evidence . In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 99–114). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_9

Pigott, T. (2012). Advances in meta-analysis . Springer.

Pillay, H. , Irving, K., & Tones, M. (2007). Validation of the diagnostic tool for assessing Tertiary students’ readiness for online learning. Higher Education Research & Development, 26 (2), 217–234. https://doi.org/10.1080/07294360701310821

Prestiadi, D., Zulkarnain, W., & Sumarsono, R. B. (2019). Visionary leadership in total quality management: efforts to improve the quality of education in the industrial revolution 4.0. In the 4th International Conference on Education and Management (COEMA 2019). Atlantis Press

Poole, D. M. (2000). Student participation in a discussion-oriented online course: a case study. Journal of Research on Computing in Education, 33 (2), 162–177. https://doi.org/10.1080/08886504.2000.10782307

Rahayu, F. S., Budiyanto, D., & Palyama, D. (2017). Analisis penerimaan e-learning menggunakan technology acceptance model (Tam)(Studi Kasus: Universitas Atma Jaya Yogyakarta). Jurnal Terapan Teknologi Informasi, 1 (2), 87–98.

Rasmussen, R. C. (2003). The quantity and quality of human interaction in a synchronous blended learning environment . Brigham Young University Press.

*Ravenel, J., Lambeth, D. T., & Spires, B. (2014). Effects of computer-based programs on mathematical achievement scores for fourth-grade students. i-manager's Journal on School Educational Technology, 10 (1), 8–21. https://doi.org/10.26634/jsch.10.1.2830

Rolisca, R. U. C., & Achadiyah, B. N. (2014). Pengembangan media evaluasi pembelajaran dalam bentuk online berbasis e-learning menggunakan software wondershare quiz creator dalam mata pelajaran akuntansi SMA Brawijaya Smart School (BSS). Jurnal Pendidikan Akuntansi Indonesia, 12(2).

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology, 59 (3), 623–664. https://doi.org/10.1111/j.1744-6570.2006.00049.x

Stewart, D. W., & Kamins, M. A. (2001). Developing a coding scheme and coding study reports. In M. W. Lipsey & D. B. Wilson (Eds.), Practical meta­analysis: Applied social research methods series (Vol. 49, pp. 73–90). Sage.

Swan, K. (2007). Research on online learning. Journal of Asynchronous Learning Networks, 11 (1), 55–59.

*Sung, H. Y., Hwang, G. J., & Chang, Y. C. (2016). Development of a mobile learning system based on a collaborative problem-posing strategy. Interactive Learning Environments, 24 (3), 456–471. https://doi.org/10.1080/10494820.2013.867889

Tsagris, M., & Fragkos, K. C. (2018). Meta-analyses of clinical trials versus diagnostic test accuracy studies. In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 31–42). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_4

UNESCO. (2020, March 13). COVID-19 educational disruption and response. Retrieved on the 14th of November 2020 from https://en.unesco.org/themes/education-emergencies/coronavirus-school-closures

Usta, E. (2011a). The effect of web-based learning environments on attitudes of students regarding computer and internet. Procedia-Social and Behavioral Sciences, 28 (262–269), 1. https://doi.org/10.1016/j.sbspro.2011.11.051

Usta, E. (2011b). The examination of online self-regulated learning skills in web-based learning environments in terms of different variables. Turkish Online Journal of Educational Technology-TOJET, 10 (3), 278–286. Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/EJ944994.pdf

Vrasidas, C., & McIsaac, M. S. (2000). Principles of pedagogy and evaluation for web-based learning. Educational Media International, 37 (2), 105–111. https://doi.org/10.1080/095239800410405

*Wang, C. H., & Chen, C. P. (2013). Effects of facebook tutoring on learning english as a second language. Proceedings of the International Conference e-Learning 2013, (2009), 135–142. Retrieved on the 15th November 2020 from https://files.eric.ed.gov/fulltext/ED562299.pdf

Wei, H. C., & Chou, C. (2020). Online learning performance and satisfaction: Do perceptions and readiness matter? Distance Education, 41 (1), 48–69.

*Yu, F. Y. (2019). The learning potential of online student-constructed tests with citing peer-generated questions. Interactive Learning Environments, 27 (2), 226–241. https://doi.org/10.1080/10494820.2018.1458040

*Yu, F. Y., & Chen, Y. J. (2014). Effects of student-generated questions as the source of online drill-and-practice activities on learning . British Journal of Educational Technology, 45 (2), 316–329. https://doi.org/10.1111/bjet.12036

*Yu, F. Y., & Pan, K. J. (2014). The effects of student question-generation with online prompts on learning. Educational Technology and Society, 17 (3), 267–279. Retrieved on the 15th November 2020 from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.565.643&rep=rep1&type=pdf

*Yu, W. F., She, H. C., & Lee, Y. M. (2010). The effects of web-based/non-web-based problem-solving instruction and high/low achievement on students’ problem-solving ability and biology achievement. Innovations in Education and Teaching International, 47 (2), 187–199. https://doi.org/10.1080/14703291003718927

Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, S. (2005). A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107 (8). https://doi.org/10.1111/j.1467-9620.2005.00544.x

*Zhong, B., Wang, Q., Chen, J., & Li, Y. (2017). Investigating the period of switching roles in pair programming in a primary school. Educational Technology and Society, 20 (3), 220–233. Retrieved on the 15th November 2020 from https://repository.nie.edu.sg/bitstream/10497/18946/1/ETS-20-3-220.pdf


Author information

Authors and affiliations.

Primary Education, Ministry of Turkish National Education, Mersin, Turkey


Corresponding author

Correspondence to Hakan Ulum .



About this article

Ulum, H. The effects of online education on academic success: A meta-analysis study. Educ Inf Technol 27 , 429–450 (2022). https://doi.org/10.1007/s10639-021-10740-8


Received : 06 December 2020

Accepted : 30 August 2021

Published : 06 September 2021

Issue Date : January 2022



  • Online education
  • Student achievement
  • Academic success
  • Meta-analysis


Open Access

Peer-reviewed

Research Article

COVID-19’s impacts on the scope, effectiveness, and interaction characteristics of online learning: A social network analysis

Roles Data curation, Formal analysis, Methodology, Writing – review & editing

¶ ‡ JZ and YD contributed equally to this work as first authors.

Affiliation School of Educational Information Technology, South China Normal University, Guangzhou, Guangdong, China

Roles Data curation, Formal analysis, Methodology, Writing – original draft

Affiliations School of Educational Information Technology, South China Normal University, Guangzhou, Guangdong, China, Hangzhou Zhongce Vocational School Qiantang, Hangzhou, Zhejiang, China

Roles Data curation, Writing – original draft

Roles Data curation

Roles Writing – original draft

Affiliation Faculty of Education, Shenzhen University, Shenzhen, Guangdong, China

Roles Conceptualization, Supervision, Writing – review & editing

* E-mail: [email protected] (JH); [email protected] (YZ)


  • Junyi Zhang, 
  • Yigang Ding, 
  • Xinru Yang, 
  • Jinping Zhong, 
  • XinXin Qiu, 
  • Zhishan Zou, 
  • Yujie Xu, 
  • Xiunan Jin, 
  • Xiaomin Wu, 


  • Published: August 23, 2022
  • https://doi.org/10.1371/journal.pone.0273016


The COVID-19 outbreak brought online learning to the forefront of education. Scholars have conducted many studies on online learning during the pandemic, but only a few have performed quantitative comparative analyses of students’ online learning behavior before and after the outbreak. We collected review data from China’s massive open online course platform called icourse.163 and performed social network analysis on 15 courses to explore courses’ interaction characteristics before, during, and after the COVID-19 pandemic. Specifically, we focused on the following aspects: (1) variations in the scale of online learning amid COVID-19; (2a) the characteristics of online learning interaction during the pandemic; (2b) the characteristics of online learning interaction after the pandemic; and (3) differences in the interaction characteristics of social science courses and natural science courses. Results revealed that only a small number of courses witnessed an uptick in online interaction, suggesting that the pandemic’s role in promoting the scale of courses was not significant. During the pandemic, online learning interaction became more frequent among course network members whose interaction scale increased. After the pandemic, although the scale of interaction declined, online learning interaction became more effective. The scale and level of interaction in Electrodynamics (a natural science course) and Economics (a social science course) both rose during the pandemic. However, long after the pandemic, the Economics course sustained online interaction whereas interaction in the Electrodynamics course steadily declined. This discrepancy could be due to the unique characteristics of natural science courses and social science courses.

Citation: Zhang J, Ding Y, Yang X, Zhong J, Qiu X, Zou Z, et al. (2022) COVID-19’s impacts on the scope, effectiveness, and interaction characteristics of online learning: A social network analysis. PLoS ONE 17(8): e0273016. https://doi.org/10.1371/journal.pone.0273016

Editor: Heng Luo, Central China Normal University, CHINA

Received: April 20, 2022; Accepted: July 29, 2022; Published: August 23, 2022

Copyright: © 2022 Zhang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: The data underlying the results presented in the study were downloaded from https://www.icourse163.org/ and are now shared fully on Github ( https://github.com/zjyzhangjunyi/dataset-from-icourse163-for-SNA ). These data have no private information and can be used for academic research free of charge.

Funding: The author(s) received no specific funding for this work.

Competing interests: The authors have declared that no competing interests exist.

1. Introduction

The development of the mobile internet has spurred rapid advances in online learning, offering novel prospects for teaching and learning and a learning experience completely different from traditional instruction. Online learning harnesses the advantages of network technology and multimedia technology to transcend the boundaries of conventional education [ 1 ]. Online courses have become a popular learning mode owing to their flexibility and openness. During online learning, teachers and students are in different physical locations but interact in multiple ways (e.g., via online forum discussions and asynchronous group discussions). An analysis of online learning therefore calls for attention to students’ participation. Alqurashi [ 2 ] defined interaction in online learning as the process of constructing meaningful information and thought exchanges between more than two people; such interaction typically occurs between teachers and learners, learners and learners, and the course content and learners.

Massive open online courses (MOOCs), a 21st-century teaching mode, have greatly influenced global education. Data released by China’s Ministry of Education in 2020 show that the country ranks first globally in the number and scale of higher education MOOCs. The COVID-19 outbreak has further propelled this learning mode, with universities being urged to leverage MOOCs and other online resource platforms to respond to government’s “School’s Out, But Class’s On” policy [ 3 ]. Besides MOOCs, to reduce in-person gatherings and curb the spread of COVID-19, various online learning methods have since become ubiquitous [ 4 ]. Though Lederman asserted that the COVID-19 outbreak has positioned online learning technologies as the best way for teachers and students to obtain satisfactory learning experiences [ 5 ], it remains unclear whether the COVID-19 pandemic has encouraged interaction in online learning, as interactions between students and others play key roles in academic performance and largely determine the quality of learning experiences [ 6 ]. Similarly, it is also unclear what impact the COVID-19 pandemic has had on the scale of online learning.

Social constructivism paints learning as a social phenomenon. As such, analyzing the social structures or patterns that emerge during the learning process can shed light on learning-based interaction [ 7 ]. Social network analysis helps to explain how a social network, rooted in interactions between learners and their peers, guides individuals’ behavior, emotions, and outcomes. This analytical approach is especially useful for evaluating interactive relationships between network members [ 8 ]. Mohammed cited social network analysis (SNA) as a method that can provide timely information about students, learning communities and interactive networks. SNA has been applied in numerous fields, including education, to identify the number and characteristics of interelement relationships. For example, Lee et al. also used SNA to explore the effects of blogs on peer relationships [ 7 ]. Therefore, adopting SNA to examine interactions in online learning communities during the COVID-19 pandemic can uncover potential issues with this online learning model.

Taking China’s icourse.163 MOOC platform as an example, we chose 15 courses with a large number of participants for SNA, focusing on learners’ interaction characteristics before, during, and after the COVID-19 outbreak. We visually assessed changes in the scale of network interaction before, during, and after the outbreak along with the characteristics of interaction in Gephi. Examining students’ interactions in different courses revealed distinct interactive network characteristics, the pandemic’s impact on online courses, and relevant suggestions. Findings are expected to promote effective interaction and deep learning among students in addition to serving as a reference for the development of other online learning communities.

2. Literature review and research questions

Interaction is deemed central to the educational experience and is a major focus of research on online learning. Moore began to study the problem of interaction in distance education as early as 1989, defining three core types of interaction: student–teacher, student–content, and student–student [ 9 ]. Lear et al. [ 10 ] described an interactivity/community-process model of distance education: they discussed the relationships among interactivity, community awareness, and learner engagement and found interactivity and community awareness to be correlated with engagement. Zulfikar et al. [ 11 ] suggested that discussions initiated by students encourage more student engagement than discussions initiated by instructors. It is most important to afford learners opportunities to interact purposefully with teachers, and improving the quality of learner interaction is crucial to fostering profound learning [ 12 ]. Interaction is an important way for learners to communicate and share information, and a key factor in the quality of online learning [ 13 ].

Timely feedback is a main component of online learning interaction. Woo and Reeves discovered that students often become frustrated when they fail to receive prompt feedback [ 14 ]. Shelley et al. conducted a three-year study of graduate and undergraduate students’ satisfaction with online learning at universities and found that interaction with educators and peers is the main factor affecting satisfaction [ 15 ]. Teachers therefore need to provide students with scoring justification, support, and constructive criticism during online learning. Some researchers have examined online learning during the COVID-19 pandemic. They found that most students preferred face-to-face learning over online learning due to obstacles faced online, such as a lack of motivation, limited teacher-student interaction, and a sense of isolation when learning in different times and spaces [ 16 , 17 ]. However, this sense of isolation can be reduced by enhancing online interaction between teachers and students [ 18 ].

Research showed that interactions contributed to maintaining students’ motivation to continue learning [ 19 ]. Baber argued that interaction played a key role in students’ academic performance and influenced the quality of the online learning experience [ 20 ]. Hodges et al. maintained that well-designed online instruction can lead to unique teaching experiences [ 21 ]. Banna et al. mentioned that using discussion boards, chat sessions, blogs, wikis, and other tools could promote student interaction and improve participation in online courses [ 22 ]. During the COVID-19 pandemic, Mahmood proposed a series of teaching strategies suitable for distance learning to improve its effectiveness [ 23 ]. Lapitan et al. devised an online strategy to ease the transition from traditional face-to-face instruction to online learning [ 24 ]. The preceding discussion suggests that online learning goes beyond simply providing learning resources; teachers should ideally design real-life activities to give learners more opportunities to participate.

As mentioned, COVID-19 has driven many scholars to explore the online learning environment. However, most have ignored the uniqueness of online learning during this time and have rarely compared pre- and post-pandemic online learning interaction. Taking China’s icourse.163 MOOC platform as an example, we chose 15 courses with a large number of participants for SNA, centering on student interaction before and after the pandemic. Gephi was used to visually analyze changes in the scale and characteristics of network interaction. The following questions were of particular interest:

  • (1) Can the COVID-19 pandemic promote the expansion of online learning?
  • (2a) What are the characteristics of online learning interaction during the pandemic?
  • (2b) What are the characteristics of online learning interaction after the pandemic?
  • (3) How do interaction characteristics differ between social science courses and natural science courses?

3. Methodology

3.1 Research context

We selected several courses with a large number of participants and extensive online interaction among hundreds of courses on the icourse.163 MOOC platform. These courses had been offered on the platform for at least three semesters, covering three periods (i.e., before, during, and after the COVID-19 outbreak). To eliminate the effects of shifts in irrelevant variables (e.g., course teaching activities), we chose several courses with similar teaching activities and compared them on multiple dimensions. All course content was taught online. The teachers of each course posted discussion threads related to learning topics; students were expected to reply via comments. Learners could exchange ideas freely in their responses in addition to asking questions and sharing their learning experiences. Teachers could answer students’ questions as well. Conversations in the comment area could partly compensate for a relative absence of online classroom interaction. Teacher–student interaction is conducive to the formation of a social network structure and enabled us to examine teachers’ and students’ learning behavior through SNA. The comment areas in these courses were intended for learners to construct knowledge via reciprocal communication. Meanwhile, by answering students’ questions, teachers could encourage them to reflect on their learning progress. These courses’ successive terms also spanned several phases of COVID-19, allowing us to ascertain the pandemic’s impact on online learning.

3.2 Data collection and preprocessing

To avoid interference from invalid or unclear data, the following criteria were applied to select representative courses: (1) generality (i.e., public courses and professional courses were chosen from different schools across China); (2) time validity (i.e., courses were held before, during, and after the pandemic); and (3) notability (i.e., each course had at least 2,000 participants). We ultimately chose 15 courses across the social sciences and natural sciences (see Table 1 ). Codes are used to represent the course names.


https://doi.org/10.1371/journal.pone.0273016.t001

To discern courses’ evolution during the pandemic, we gathered data on three terms before, during, and after the COVID-19 outbreak in addition to obtaining data from two terms completed well before the pandemic and long after. Our final dataset comprised five sets of interactive data. Finally, we collected about 120,000 comments for SNA. Because each course had a different start time—in line with fluctuations in the number of confirmed COVID-19 cases in China and the opening dates of most colleges and universities—we divided our sample into five phases: well before the pandemic (Phase I); before the pandemic (Phase Ⅱ); during the pandemic (Phase Ⅲ); after the pandemic (Phase Ⅳ); and long after the pandemic (Phase Ⅴ). We sought to preserve consistent time spans to balance the amount of data in each period ( Fig 1 ).


https://doi.org/10.1371/journal.pone.0273016.g001

3.3 Instrumentation

Participants’ comments and “thumbs-up” behavior data were converted into a network structure and compared using social network analysis (SNA). Network analysis, according to M’Chirgui, is an effective tool for clarifying network relationships by employing sophisticated techniques [ 25 ]. Specifically, SNA can help explain the underlying relationships among team members and provide a better understanding of their internal processes. Yang and Tang used SNA to discuss the relationship between team structure and team performance [ 26 ]. Golbeck argued that SNA could improve the understanding of students’ learning processes and reveal learners’ and teachers’ role dynamics [ 27 ].
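The conversion described above can be sketched with networkx, a standard Python library for this kind of analysis: each commenter and reply target becomes a node, and repeated replies accumulate as edge weights. The reply records below are hypothetical stand-ins, not the paper’s raw icourse.163 data.

```python
import networkx as nx

# Hypothetical reply records: (commenter, replied-to, number of replies).
# The actual icourse.163 export is structured differently.
replies = [
    ("studentA", "teacher", 3),
    ("studentB", "studentA", 1),
    ("studentC", "teacher", 2),
    ("studentB", "teacher", 1),
]

# Build a directed, weighted interaction network.
G = nx.DiGraph()
for src, dst, n in replies:
    if G.has_edge(src, dst):
        G[src][dst]["weight"] += n
    else:
        G.add_edge(src, dst, weight=n)

print(G.number_of_nodes(), G.number_of_edges())  # 4 nodes, 4 edges
```

A graph built this way can be exported with `nx.write_gexf(G, "course.gexf")` for visualization in Gephi, the tool used in this study.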

To analyze Question (1), the number of nodes and the diameter of the generated network were taken as indicators of changes in network size. Social networks are typically represented as graphs with nodes and degrees, and node count indicates the sample size [ 15 ]. Wellman et al. proposed that the larger the network scale, the greater the number of network members providing emotional support, goods, services, and companionship [ 28 ]. Jan’s study measured network size by counting the nodes, which represented students, lecturers, and tutors [ 29 ]. Similarly, network nodes in the present study indicated how many learners and teachers participated in the course, with more nodes indicating more participants. Furthermore, we investigated the network diameter, a structural feature of social networks and a common metric for measuring network size in SNA [ 30 ]. The network diameter is the longest of the shortest paths between any two nodes in the network. There has been evidence that a larger network diameter leads to greater spread of behavior [ 31 ]. Likewise, Gašević et al. found that larger networks were more likely to spread innovative ideas about educational technology when analyzing MOOC-related research citations [ 32 ]. Therefore, we employed node count and network diameter to measure the network’s spatial size and further explore the expansion characteristic of online courses. These indicators are briefly summarized in Table 2.
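As a minimal sketch of these two size measures, the snippet below uses networkx’s built-in karate club graph as a stand-in for a course interaction network; note that `nx.diameter` requires a connected graph, so on a fragmented course network one would first take the largest connected component.

```python
import networkx as nx

G = nx.karate_club_graph()  # stand-in for a course interaction network

n_nodes = G.number_of_nodes()  # how many members participated
diameter = nx.diameter(G)      # longest shortest path in the network

print(n_nodes, diameter)
```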


https://doi.org/10.1371/journal.pone.0273016.t002

To address Question (2), a list of interactive analysis metrics in SNA were introduced to scrutinize learners’ interaction characteristics in online learning during and after the pandemic, as shown below:

  • (1) The average degree reflects the density of the network by calculating the average number of connections for each node. As Rong and Xu suggested, the average degree of a network indicates how active its participants are [ 33 ]. According to Hu, a higher average degree implies that more students are interacting directly with each other in a learning context [ 34 ]. The present study inherited the concept of the average degree from these previous studies: the higher the average degree, the more frequent the interaction between individuals in the network.
  • (2) Essentially, a weighted average degree in a network is calculated by multiplying each degree by its respective weight, and then taking the average. Bydžovská took the strength of the relationship into account when determining the weighted average degree [ 35 ]. By calculating friendship’s weighted value, Maroulis assessed peer achievement within a small-school reform [ 36 ]. Accordingly, we considered the number of interactions as the weight of the degree, with a higher average degree indicating more active interaction among learners.
  • (3) Network density is the ratio between actual connections and potential connections in a network. The more connections group members have with each other, the higher the network density. In SNA, network density is similar to group cohesion, i.e., a network of more strong relationships is more cohesive [ 37 ]. Network density also reflects how much all members are connected together [ 38 ]. Therefore, we adopted network density to indicate the closeness among network members. Higher network density indicates more frequent interaction and closer communication among students.
  • (4) Clustering coefficient describes local network attributes and indicates that two nodes in the network could be connected through adjacent nodes. The clustering coefficient measures users’ tendency to gather (cluster) with others in the network: the higher the clustering coefficient, the more frequently users communicate with other group members. We regarded this indicator as a reflection of the cohesiveness of the group [ 39 ].
  • (5) In a network, the average path length is the average number of steps along the shortest paths between any two nodes. Oliveres has observed that when an average path length is small, the route from one node to another is shorter when graphed [ 40 ]. This is especially true in educational settings where students tend to become closer friends. So we consider that the smaller the average path length, the greater the possibility of interaction between individuals in the network.
  • (6) A network with a large number of nodes but a surprisingly small average path length exhibits the small-world effect [ 41 ]. A higher clustering coefficient and shorter average path length are important indicators of a small-world network: a shorter average path length enables the network to spread information faster and more accurately, while a higher clustering coefficient can promote frequent knowledge exchange within the group and boost the timeliness and accuracy of knowledge dissemination [ 42 ]. These indicators are briefly summarized in Table 3.
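The first five indicators above map directly onto networkx routines, sketched here on the built-in weighted Les Misérables co-occurrence graph as a stand-in course network (small-world sigma is omitted because computing it requires comparisons against many random graphs and is slow):

```python
import networkx as nx

G = nx.les_miserables_graph()  # weighted, undirected stand-in network
n = G.number_of_nodes()

avg_degree = sum(d for _, d in G.degree()) / n
# Weight each edge by its interaction count, as in the paper's networks.
weighted_avg_degree = sum(d for _, d in G.degree(weight="weight")) / n
density = nx.density(G)                    # actual vs. potential edges
clustering = nx.average_clustering(G)      # tendency to form tight groups
apl = nx.average_shortest_path_length(G)   # defined on connected graphs

print(avg_degree, weighted_avg_degree, density, clustering, apl)
```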


https://doi.org/10.1371/journal.pone.0273016.t003

To analyze Question (3), we used the concept of closeness centrality, which determines how close a vertex is to the others in the network. As Opsahl et al. explained, closeness centrality reveals how closely actors are coupled with their entire social network [ 43 ]. In order to analyze social network-based engineering education, Putnik et al. examined closeness centrality and found that it was significantly correlated with grades [ 38 ]. We used closeness centrality to measure the position of an individual in the network. This indicator is briefly summarized in Table 4.
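Closeness centrality can be illustrated on a five-node path graph, where the middle node reaches every other node in the fewest steps and therefore scores highest:

```python
import networkx as nx

G = nx.path_graph(5)  # toy network: 0 - 1 - 2 - 3 - 4
cc = nx.closeness_centrality(G)

# Closeness of a node = (n - 1) / (sum of shortest-path distances to all others).
central = max(cc, key=cc.get)
print(central, round(cc[central], 3))  # node 2 is the most central
```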


https://doi.org/10.1371/journal.pone.0273016.t004

3.4 Ethics statement

This study was approved by the Academic Committee Office (ACO) of South China Normal University ( http://fzghb.scnu.edu.cn/ ), Guangzhou, China. Research data were collected from the open platform and analyzed anonymously. There are thus no privacy issues involved in this study.

4.1 COVID-19’s role in promoting the scale of online courses was not as important as expected

As shown in Fig 2 , the numbers of course participants and nodes are closely correlated with the pandemic’s trajectory. Because the number of participants in each course varied widely, we normalized the numbers of participants and nodes to visualize course trends more conveniently. Fig 2 depicts changes in the chosen courses’ numbers of participants and nodes before the pandemic (Phase II), during the pandemic (Phase III), and after the pandemic (Phase IV). The number of participants in most courses during the pandemic exceeded those before and after the pandemic, but the number of people participating in interaction did not increase in some courses.


https://doi.org/10.1371/journal.pone.0273016.g002

To better analyze the trend of interaction scale in online courses before, during, and after the pandemic, we categorized the selected courses by scale change. When the number of participants increased (decreased) by more than 20% (an empirically chosen threshold) and the diameter also increased (decreased), the course scale was judged to have increased (decreased); otherwise, no significant change was identified in the course’s interaction scale. Courses were subsequently divided into three categories: increased interaction scale, decreased interaction scale, and no significant change. Results appear in Table 5 .
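The categorization rule above can be written as a small pure-Python function; the threshold follows the paper’s rule, while the example numbers are illustrative rather than taken from the dataset:

```python
def classify_scale_change(participants_before, participants_during,
                          diameter_before, diameter_during,
                          threshold=0.20):
    """Classify interaction-scale change between two phases, following the
    paper's rule: the participant count must change by more than the 20%
    threshold AND the network diameter must move in the same direction."""
    change = (participants_during - participants_before) / participants_before
    if change > threshold and diameter_during > diameter_before:
        return "increased"
    if change < -threshold and diameter_during < diameter_before:
        return "decreased"
    return "no significant change"

print(classify_scale_change(2000, 2600, 8, 10))  # → increased
print(classify_scale_change(2000, 2100, 8, 10))  # → no significant change
```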


https://doi.org/10.1371/journal.pone.0273016.t005

From before the pandemic until its outbreak, the interaction scale of five courses increased, accounting for 33.3% of the full sample; one course’s interaction scale declined, accounting for 6.7%; and nine courses showed no significant change, accounting for 60%. The pandemic’s role in promoting online courses thus was not as important as anticipated, and most courses’ interaction scale did not change significantly throughout.

No courses displayed growing interaction scale after the pandemic: the interaction scale of nine courses fell, accounting for 60%, and six courses showed no significant shift, accounting for 40%. Courses whose interaction scale increased during the pandemic did not maintain this upward trend. On the contrary, as the pandemic situation improved, learners’ enthusiasm for online learning waned. We next analyzed several interaction metrics to further explore course interaction during different pandemic periods.

4.2 Characteristics of online learning interaction amid COVID-19

4.2.1 During the COVID-19 pandemic, online learning interaction in some courses became more active.

Changes in indicators for the courses whose interaction scale grew during the pandemic are presented in Fig 3 , including SS5, SS6, NS1, NS3, and NS8. The horizontal axis indicates the number of courses, with red representing a rise in the indicator named on the vertical axis and blue representing a decline.


https://doi.org/10.1371/journal.pone.0273016.g003

Specifically: (1) The average degree and weighted average degree of the five course networks demonstrated an upward trend. The emergence of the pandemic promoted students’ enthusiasm, and learners were more active in the interactive network. (2) Fig 3 shows that three courses had increased network density and two had decreased. The higher the network density, the more communication within the team. Even though the pandemic increased the interaction scale and frequency, the closeness between learners in some courses did not improve. (3) The clustering coefficient of social science courses rose, whereas the clustering coefficient and small-world property of natural science courses fell. The higher the clustering coefficient and the small-world property, the better the relationship between adjacent nodes and the higher the cohesion [ 39 ]. (4) Most courses’ average path length increased as the interaction scale increased. A growing average path length can have adverse effects: communication between learners might be limited to a small group without multi-directional interaction.

When the pandemic emerged, the only declining network scale belonged to a natural science course (NS2). The change in each course index is pictured in Fig 4 . The abscissa indicates the size of the value, with larger values to the right. The red dot indicates the index value before the pandemic; the blue dot indicates its value during the pandemic. If the blue dot is to the right of the red dot, the value of the index increased; otherwise, it declined. Only the weighted average degree of the course network increased. The average degree and network density decreased, indicating that network members were not active and that learners’ interaction degree and communication frequency lessened. Despite reduced learner interaction, the average path length was small and the connectivity between learners was adequate.


https://doi.org/10.1371/journal.pone.0273016.g004

4.2.2 After the COVID-19 pandemic, the scale decreased rapidly, but most course interaction was more effective.

Fig 5 shows the changes in various courses’ interaction indicators after the pandemic, including SS1, SS2, SS3, SS6, SS7, NS2, NS3, NS7, and NS8.


https://doi.org/10.1371/journal.pone.0273016.g005

Specifically: (1) The average degree and weighted average degree of most course networks decreased. The scope and intensity of interaction among network members declined rapidly, as did learners’ enthusiasm for communication. (2) The network density of seven courses also fell, indicating weaker connections between learners in most courses. (3) In addition, the clustering coefficient and small-world property of most course networks decreased, suggesting little possibility of small groups in the network. The scope of interaction between learners was not limited to a specific space, and the interaction objects had no significant tendencies. (4) Although the scale of course interaction became smaller in this phase, the average path length of members’ social networks shortened in nine courses. This shorter average path length would expedite the spread of information within the network as well as communication and sharing among network members.

Fig 6 displays the evolution of course interaction indicators without significant changes in interaction scale after the pandemic, including SS4, SS5, NS1, NS4, NS5, and NS6.


https://doi.org/10.1371/journal.pone.0273016.g006

Specifically: (1) In some courses, members’ social networks exhibited increases in the average degree and weighted average degree. In these cases, even though the course network’s scale did not continue to increase, communication among network members rose and interaction became more frequent and deeper than before. (2) Network density and average path length are indicators of social network density: the greater the network density, the denser the social network; the shorter the average path length, the more concentrated the communication among network members. At this phase, however, the average path length and network density in most courses had increased. Yet the network density remained small despite having risen ( Table 6 ). Even with more frequent learner interaction, connections remained distant and the social network was comparatively sparse.


https://doi.org/10.1371/journal.pone.0273016.t006

In summary, the scale of interaction did not change significantly overall. Nonetheless, some course members’ frequency and extent of interaction increased, and the relationships between network members became closer as well. Interestingly, the interaction scale of Economics (a social science course) and Electrodynamics (a natural science course) expanded rapidly during the pandemic, and both courses retained their interaction scale thereafter. We next assessed these two courses to determine whether their level of interaction persisted after the pandemic.

4.3 Analyses of natural science courses and social science courses

4.3.1 Analyses of the interaction characteristics of Economics and Electrodynamics.

Economics and Electrodynamics are a social science course and a natural science course, respectively. Members’ interaction within these courses was similar: the interaction scale increased significantly when COVID-19 broke out (Phase Ⅲ), and no significant changes emerged after the pandemic (Phase Ⅴ). We hence focused on course interaction long after the outbreak (Phase V) and compared changes across multiple indicators, as listed in Table 7 .


https://doi.org/10.1371/journal.pone.0273016.t007

As the pandemic situation continued to improve, the number of participants and the diameter for Economics each declined long after the outbreak (Phase V) compared with after the pandemic (Phase IV). The interaction scale decreased, but the interaction between learners was much deeper. Specifically: (1) The weighted average degree, network density, clustering coefficient, and small-world property each reflected upward trends. The pandemic therefore exerted a strong impact on this course, and interaction was well maintained even after the pandemic. The smaller network scale promoted members’ interaction and communication. (2) Compared with after the pandemic (Phase IV), members’ network density increased significantly, showing that relationships between learners were closer and that cohesion was improving. (3) At the same time, as the clustering coefficient and small-world property grew, network members demonstrated strong small-group characteristics: the communication between them was deepening and their enthusiasm for interaction was higher. (4) Long after the COVID-19 outbreak (Phase V), the average path length was reduced compared with previous terms, knowledge flowed more quickly among network members, and the degree of interaction gradually deepened.

The average degree, weighted average degree, network density, clustering coefficient, and small-world property of Electrodynamics all decreased long after the COVID-19 outbreak (Phase V) and were lower than during the outbreak (Phase III). The level of learner interaction therefore gradually declined long after the outbreak (Phase V), and connections between learners were no longer active. Although the pandemic increased course members' extent of interaction, this rise was merely temporary: students' enthusiasm for learning waned rapidly and their interaction decreased after the pandemic (Phase IV). To further analyze the interaction characteristics of course members in Economics and Electrodynamics, we evaluated the closeness centrality of their social networks, as shown in Section 4.3.2.
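The indicators compared above (network density, average degree, clustering coefficient, average path length) have standard graph-theoretic definitions. The following pure-Python sketch computes them for a toy undirected reply network; the data are invented for illustration, whereas the study's actual networks were built from icourse.163 forum posts.

```python
from collections import deque
from itertools import combinations

def density(g):
    """Fraction of possible edges present: 2m / (n * (n - 1))."""
    n = len(g)
    m = sum(len(nbrs) for nbrs in g.values()) // 2
    return 2 * m / (n * (n - 1))

def average_degree(g):
    return sum(len(nbrs) for nbrs in g.values()) / len(g)

def clustering_coefficient(g):
    """Mean local clustering coefficient over all nodes."""
    total = 0.0
    for node, nbrs in g.items():
        k = len(nbrs)
        if k < 2:
            continue  # nodes with fewer than 2 neighbours contribute 0
        links = sum(1 for u, v in combinations(nbrs, 2) if v in g[u])
        total += 2 * links / (k * (k - 1))
    return total / len(g)

def average_path_length(g):
    """Mean shortest-path length over connected node pairs, via BFS."""
    total, pairs = 0, 0
    for src in g:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in g[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for node, d in dist.items() if node != src)
        pairs += len(dist) - 1
    return total / pairs

# Toy reply network: an edge links two learners who replied to each other.
graph = {
    "A": {"B", "C"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"C"},
}
print(round(density(graph), 3))                 # 0.667
print(round(average_degree(graph), 3))          # 2.0
print(round(clustering_coefficient(graph), 3))  # 0.583
print(round(average_path_length(graph), 3))     # 1.333
```

A falling average path length with a rising clustering coefficient, as reported for Economics in Phase V, is exactly the small-world signature the paper refers to.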

4.3.2 Analysis of the closeness centrality of Economics and Electrodynamics.

The change in the closeness centrality of the Economics social network was small, and no sharp upward trend appeared during the pandemic outbreak, as shown in Fig 7. The emergence of COVID-19 apparently fostered learners' interaction in Economics, albeit without a significant impact. The pattern of closeness centrality in Electrodynamics differed from that in Economics: upon the COVID-19 outbreak, closeness centrality was significantly different from other semesters; communication between learners was closer and interaction was more effective. The social network proximity of Electrodynamics course members then decreased rapidly after the pandemic, and learners' communication lessened. In general, the Economics course showed better interaction before the outbreak and was less affected by the pandemic; the Electrodynamics course was more affected and showed different interaction characteristics at different periods of the pandemic.


(Note: "****" indicates a significant difference in closeness centrality between the two periods; otherwise the difference is not significant.)

https://doi.org/10.1371/journal.pone.0273016.g007
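Closeness centrality, as used in this comparison, is the reciprocal of a node's average shortest-path distance to every other node: central learners can reach the rest of the network in few steps. A minimal BFS-based sketch on a toy graph (not the study's data):

```python
from collections import deque

def closeness_centrality(g, node):
    """Closeness = (n - 1) / sum of shortest-path distances from `node`.

    Assumes a connected, undirected graph given as {node: set(neighbours)}.
    """
    dist = {node: 0}
    q = deque([node])
    while q:
        u = q.popleft()
        for v in g[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return (len(g) - 1) / sum(dist.values())

graph = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
for n in sorted(graph):
    print(n, round(closeness_centrality(graph, n), 2))
# A 0.75, B 0.75, C 1.0, D 0.6 — C is the hub, D the peripheral learner
```

Rising average closeness, as Electrodynamics showed during the outbreak, means learners were on average fewer reply-steps away from one another.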

5. Discussion

We analyzed discussion forums from several courses on the icourse.163 MOOC platform to compare online learning before, during, and after the COVID-19 pandemic via SNA and to delineate the pandemic's effects on online courses. Only 33.3% of the courses in our sample increased in interaction during the pandemic, and the scale of interaction did not rise in any course thereafter. Where a course's scale rose, its scope and frequency of interaction trended upward during the pandemic, and the clustering coefficients of natural science and social science courses diverged: the coefficient for social science courses tended to rise whereas that for natural science courses generally declined. When the pandemic broke out, the interaction scale of one natural science course decreased along with its interaction scope and frequency. The amount of interaction in most courses shrank rapidly after the pandemic, and network members were not as active as they had been before; however, some courses saw declining interaction but greater communication between members, with interaction becoming more frequent and deeper than before.

5.1 During the COVID-19 pandemic, the scale of interaction increased in only a few courses

The pandemic outbreak led to a rapid increase in the number of participants in most courses; however, the change in network scale was not significant. The scale of online interaction expanded swiftly in only a few courses; in others, it either did not change significantly or trended downward. After the pandemic, the interaction scale in most courses decreased quickly, as did communication between network members. Learners' enthusiasm for online interaction diminished as the circumstances of the pandemic improved, potentially because, during the pandemic, China's Ministry of Education had declared the "School's Out, But Class's On" policy: major colleges and universities were encouraged to use the Internet and informational resources to provide learning support, hence the sudden increase in the number of participants and interaction in online courses [ 46 ]. After the pandemic, students' enthusiasm for online learning gradually weakened, presumably due to the easing of the pandemic [ 47 ]. More activities also transitioned from online to offline, which tempered learners' online discussion. Research has shown that long-term online learning can even bore students [ 48 ].

Most courses’ interaction scale decreased significantly after the pandemic. First, teachers and students occupied separate spaces during the outbreak, had few opportunities for mutual cooperation and friendship, and lacked a sense of belonging [ 49 ]. Students’ enthusiasm for learning dissipated over time [ 50 ]. Second, some teachers were especially concerned about adapting in-person instructional materials for digital platforms; their pedagogical methods were ineffective, and they did not provide learning activities germane to student interaction [ 51 ]. Third, although teachers and students in remote areas were actively engaged in online learning, some students could not continue to participate in distance learning due to inadequate technology later in the outbreak [ 52 ].

5.2 Characteristics of online learning interaction during and after the COVID-19 pandemic

5.2.1 During the COVID-19 pandemic, online interaction in most courses did not change significantly.

The interaction scale of only a few courses increased during the pandemic. The interaction scope and frequency of these courses climbed as well. Yet even as the degree of network interaction rose, course network density did not expand in all cases. The pandemic sparked a surge in the number of online learners and a rapid increase in network scale, but students found it difficult to interact with all learners. Yau pointed out that a greater network scale did not enrich the range of interaction between individuals; rather, the number of individuals who could interact directly was limited [ 53 ]. The internet facilitates interpersonal communication. However, not everyone has the time or ability to establish close ties with others [ 54 ].

In addition, social science courses and natural science courses in our sample revealed disparate trends in this regard: the clustering coefficient of social science courses increased and that of natural science courses decreased. Social science courses usually employ learning approaches distinct from those in natural science courses [ 55 ]. Social science courses emphasize critical and innovative thinking along with personal expression [ 56 ]. Natural science courses focus on practical skills, methods, and principles [ 57 ]. Therefore, the content of social science courses can spur large-scale discussion among learners. Some course evaluations indicated that the course content design was suboptimal as well: teachers paid close attention to knowledge transmission and much less to piquing students’ interest in learning. In addition, the thread topics that teachers posted were scarcely diversified and teachers’ questions lacked openness. These attributes could not spark active discussion among learners.

5.2.2 Online learning interaction declined after the COVID-19 pandemic.

Most courses' interaction scale and intensity decreased rapidly after the pandemic, but some did not change. Courses with a larger network scale did not continue to expand after the outbreak, and students' enthusiasm for learning waned. The pandemic's reduced severity also influenced the number of participants in online courses. Meanwhile, restored school order moved many learning activities from virtual to in-person spaces. Face-to-face learning gradually replaced online learning, resulting in lower enrollment and less interaction in online courses. Prolonged online courses could also have led students to feel lonely and to lack a sense of belonging [ 58 ].

The scale of interaction in some courses did not change substantially after the pandemic, yet learners' connections became tighter. We hence recommend that teachers seize pandemic-related opportunities to design suitable activities. Additionally, instructors should promote student-teacher and student-student interaction, encourage students to participate actively online, and generally intensify the impact of online learning.

5.3 What are the characteristics of interaction in social science courses and natural science courses?

The level of interaction in Economics (a social science course) was significantly higher than in Electrodynamics (a natural science course), and the small-world property of Economics increased as well. To boost online courses' learning-related impacts, teachers can divide learners into groups based on the clustering coefficient and the average path length. Small groups can benefit instruction in several ways: students participate more actively in activities intended to expand their knowledge, and capable students can serve as key actors within their groups. Cultivating students' keenness to participate in class activities and to manage their own learning can also help teachers guide learner interaction and foster deep knowledge construction.

As evidenced by comments posted in the Electrodynamics course, we observed less interaction between students. Teachers also rarely urged students to contribute to conversations. These trends may have arisen because teachers and students were in different spaces. Teachers might have struggled to discern students’ interaction status. Teachers could also have failed to intervene in time, to design online learning activities that piqued learners’ interest, and to employ sound interactive theme planning and guidance. Teachers are often active in traditional classroom settings. Their roles are comparatively weakened online, such that they possess less control over instruction [ 59 ]. Online instruction also requires a stronger hand in learning: teachers should play a leading role in regulating network members’ interactive communication [ 60 ]. Teachers can guide learners to participate, help learners establish social networks, and heighten students’ interest in learning [ 61 ]. Teachers should attend to core members in online learning while also considering edge members; by doing so, all network members can be driven to share their knowledge and become more engaged. Finally, teachers and assistant teachers should help learners develop knowledge, exchange topic-related ideas, pose relevant questions during course discussions, and craft activities that enable learners to interact online [ 62 ]. These tactics can improve the effectiveness of online learning.

As described, network members displayed distinct interaction behavior in the Economics and Electrodynamics courses. First, the courses varied in difficulty: the social science course seemed easier to understand and focused on divergent thinking, and learners were often willing to express their views in comments and to ponder others' perspectives [ 63 ]. The natural science course seemed more demanding and was oriented around logical thinking and skills [ 64 ]. Second, the courses' content differed. In general, social science courses favor the acquisition of declarative and creative knowledge more than natural science courses do, and they entertain open questions [ 65 ]. Natural science courses revolve around principle knowledge, strategic knowledge, and transfer knowledge [ 66 ]; problems in these courses are normally more complicated than those in social science courses. Third, the indicators affecting students' attitudes toward learning differed. Guo et al. discovered that "teacher feedback" most strongly influenced students' attitudes toward learning in social science courses but had less impact on students in natural science courses [ 67 ]. Therefore, learners in social science courses likely expect more feedback from teachers and greater interaction with others.

6. Conclusion and future work

Our findings show that the network interaction scale of some online courses expanded during the COVID-19 pandemic. The network scale of most courses did not change significantly, demonstrating that the pandemic did not notably alter the scale of course interaction. Among courses whose interaction scale increased, interaction between network members also became more frequent during the pandemic. Once the outbreak was under control, although the scale of interaction declined, the level and scope of some courses' interactive networks continued to rise; interaction was thus particularly effective in these cases. Overall, the pandemic appeared to have a relatively positive impact on online learning interaction. We examined a pair of courses in detail and found that Economics (a social science course) fared much better than Electrodynamics (a natural science course) in classroom interaction; learners were more willing to take part in class activities, perhaps due to these courses' unique characteristics. Brint et al. came to similar conclusions [ 57 ].

This study was intended to be rigorous. Even so, several constraints can be addressed in future work. The first limitation involves our sample: we focused on a select set of courses hosted on China’s icourse.163 MOOC platform. Future studies should involve an expansive collection of courses to provide a more holistic understanding of how the pandemic has influenced online interaction. Second, we only explored the interactive relationship between learners and did not analyze interactive content. More in-depth content analysis should be carried out in subsequent research. All in all, the emergence of COVID-19 has provided a new path for online learning and has reshaped the distance learning landscape. To cope with associated challenges, educational practitioners will need to continue innovating in online instructional design, strengthen related pedagogy, optimize online learning conditions, and bolster teachers’ and students’ competence in online learning.

  • 30. Serrat O. Social network analysis. In: Knowledge Solutions. Springer; 2017. p. 39–43. https://doi.org/10.1007/978-981-10-0983-9_9
  • 33. Rong Y, Xu E, editors. Strategies for the management of the government affairs microblogs in China based on the SNA of fifty government affairs microblogs in Beijing. In: 14th International Conference on Service Systems and Service Management; 2017.
  • 34. Hu X, Chu S, editors. A comparison on using social media in a professional experience course. In: International Conference on Social Media and Society; 2013.
  • 35. Bydžovská H. A comparative analysis of techniques for predicting student performance. In: Proceedings of the 9th International Conference on Educational Data Mining; Raleigh, NC, USA. International Educational Data Mining Society; 2016. p. 306–311.
  • 40. Olivares D, Adesope O, Hundhausen C, et al., editors. Using social network analysis to measure the effect of learning analytics in computing education. In: 19th IEEE International Conference on Advanced Learning Technologies; 2019.
  • 41. Travers J, Milgram S. An experimental study of the small world problem. In: Social Networks. Elsevier; 1977. p. 179–197. https://doi.org/10.1016/B978-0-12-442450-0.50018-3
  • 43. Okamoto K, Chen W, Li X-Y, editors. Ranking of closeness centrality for large-scale social networks. In: International Workshop on Frontiers in Algorithmics; 2008. Berlin, Heidelberg: Springer.
  • 47. Ding Y, Yang X, Zheng Y, editors. COVID-19's effects on the scope, effectiveness, and roles of teachers in online learning based on social network analysis: a case study. In: International Conference on Blended Learning; 2021. Springer.
  • 64. Boys C, Brennan J, Henkel M, Kirkland J, Kogan M, Youl P. Higher Education and Preparation for Work. Jessica Kingsley Publishers; 1988. https://doi.org/10.1080/03075079612331381467


  • Open access
  • Published: 09 January 2024

Online vs in-person learning in higher education: effects on student achievement and recommendations for leadership

  • Bandar N. Alarifi 1 &
  • Steve Song 2  

Humanities and Social Sciences Communications, volume 11, Article number: 86 (2024)


This study is a comparative analysis of online distance learning and traditional in-person education at King Saud University in Saudi Arabia, with a focus on understanding how different educational modalities affect student achievement. The justification for this study lies in the rapid shift towards online learning, especially highlighted by the educational changes during the COVID-19 pandemic. By analyzing the final test scores of freshman students in five core courses over the 2020 (in-person) and 2021 (online) academic years, the research provides empirical insights into the efficacy of online versus traditional education. Initial observations suggested that students in online settings scored lower in most courses. However, after adjusting for variables such as gender, class size, and admission scores using multiple linear regression, a more nuanced picture emerged: three courses showed better performance in the 2021 online cohort, one favored the 2020 in-person group, and one was unaffected by the teaching format. The study emphasizes the need for a nuanced, data-driven strategy when integrating online learning into higher education, highlighting that the success of educational methodologies is highly contingent on contextual factors. It therefore advocates for educational administrators and policymakers to exercise careful, informed judgment when adopting online learning modalities: to evaluate thoroughly how different subjects and instructional approaches interact with online formats, and to weigh the variable effects these might have on learning outcomes. Such an approach ensures that decisions about implementing online education rest on a comprehensive understanding of its diverse, context-specific impacts, with the aim of optimizing educational effectiveness and student success.


Introduction

The year 2020 marked an extraordinary period, characterized by the global disruption caused by the COVID-19 pandemic. Governments and institutions worldwide had to adapt to unforeseen challenges across various domains, including health, the economy, and education. In response, many educational institutions quickly transitioned to distance teaching (also known as e-learning, online learning, or virtual classrooms) to ensure continued access to education for their students. Despite this rapid and widespread shift, however, a comprehensive examination of online learning's effects on student achievement compared with traditional in-person instruction has remained largely lacking.

In research examining student outcomes in the context of online learning, the prevailing trend is the consistent observation that online learners often achieve less favorable results when compared to their peers in traditional classroom settings (e.g., Fischer et al., 2020 ; Bettinger et al., 2017 ; Edvardsson and Oskarsson, 2008 ). However, it is important to note that a significant portion of research on online learning has primarily focused on its potential impact (Kuhfeld et al., 2020 ; Azevedo et al., 2020 ; Di Pietro et al., 2020 ) or explored various perspectives (Aucejo et al., 2020 ; Radha et al., 2020 ) concerning distance education. These studies have often omitted a comprehensive and nuanced examination of its concrete academic consequences, particularly in terms of test scores and grades.

Given the dearth of research on the academic impact of online learning, especially in light of COVID-19's effect on the educational arena, the present study aims to address that gap by assessing the effectiveness of distance learning compared with in-person teaching in five required freshman-level courses at King Saud University, Saudi Arabia. To accomplish this objective, the study compared the final exam results of 8297 freshman students who took the five courses in person in 2020 with those of their 8425 first-year counterparts who took the same courses at the same institution in 2021 in an online format.

The final test results of the five courses (i.e., University Skills 101, Entrepreneurship 101, Computer Skills 101, Computer Skills 102, and Fitness and Health Culture 101) were examined, accounting for potential confounding factors such as gender, class size, and admission scores, which past research has found to correlate with student achievement (e.g., Meinck and Brese, 2019; Jepsen, 2015). Additionally, as the preparatory year at King Saud University is divided into five tracks (health, nursing, science, business, and humanities), the study classified students based on their respective disciplines.
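The adjustment described here, a multiple linear regression of final scores on a cohort indicator plus covariates, can be sketched in pure Python via the normal equations. The data below are invented for illustration (the real model also included gender and class size, and used thousands of students per cohort):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting, for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Ordinary least squares: beta = (X'X)^(-1) X'y via the normal equations."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

# Hypothetical rows: [intercept, online (1 = 2021 cohort), admission score]
rows = [(1, 0, 70), (1, 0, 80), (1, 0, 90), (1, 1, 72), (1, 1, 85), (1, 1, 95)]
X = [list(r) for r in rows]
y = [50 + 5 * online + 0.3 * adm for _, online, adm in rows]  # noiseless toy scores
beta = ols(X, y)
print([round(b, 2) for b in beta])  # recovers the generating coefficients [50.0, 5.0, 0.3]
```

The coefficient on the `online` indicator is the adjusted cohort difference; in the study it is this term, not the raw mean gap, that decides which format "won" each course.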

Motivation for the study

The rapid expansion of distance learning in higher education, particularly highlighted during the recent COVID-19 pandemic (Volk et al., 2020 ; Bettinger et al., 2017 ), underscores the need for alternative educational approaches during crises. Such disruptions can catalyze innovation and the adoption of distance learning as a contingency plan (Christensen et al., 2015 ). King Saud University, like many institutions worldwide, faced the challenge of transitioning abruptly to online learning in response to the pandemic.

E-learning has gained prominence in higher education due to technological advancements, offering institutions a competitive edge (Valverde-Berrocoso et al., 2020 ). Especially during conditions like the COVID-19 pandemic, electronic communication was utilized across the globe as a feasible means to overcome barriers and enhance interactions (Bozkurt, 2019 ).

Distance learning, characterized by flexibility, becomes crucial when traditional in-person classes are hindered by unforeseen circumstances such as those posed by COVID-19 (Arkorful and Abaidoo, 2015 ). Scholars argue that it allows students to learn at their own pace, often referred to as self-directed learning (Hiemstra, 1994 ) or self-education (Gadamer, 2001 ). Additional advantages include accessibility, cost-effectiveness, and flexibility (Sadeghi, 2019 ).

However, distance learning is not immune to its own set of challenges. Technical impediments, encompassing network issues, device limitations, and communication hiccups, represent formidable hurdles (Sadeghi, 2019 ). Furthermore, concerns about potential distractions in the online learning environment, fueled by the ubiquity of the internet and social media, have surfaced (Hall et al., 2020 ; Ravizza et al., 2017 ). The absence of traditional face-to-face interactions among students and between students and instructors is also viewed as a potential drawback (Sadeghi, 2019 ).

Given the evolving understanding of the pros and cons of distance learning, this study aims to contribute to the existing literature by assessing the effectiveness of distance learning, specifically in terms of student achievement, as compared to in-person classroom learning at King Saud University, one of Saudi Arabia’s largest higher education institutions.

Academic achievement: in-person vs online learning

The primary driving force behind the rapid integration of technology in education has been its emphasis on student performance (Lai and Bower, 2019 ). Over the past decade, numerous studies have undertaken comparisons of student academic achievement in online and in-person settings (e.g., Bettinger et al., 2017 ; Fischer et al., 2020 ; Iglesias-Pradas et al., 2021 ). This section offers a concise review of the disparities in academic achievement between college students engaged in in-person and online learning, as identified in existing research.

A number of studies point to the superiority of traditional in-person education over online learning in terms of academic outcomes. For example, Fischer et al. ( 2020 ) conducted a comprehensive study involving 72,000 university students across 433 subjects, revealing that online students tend to achieve slightly lower academic results than their in-class counterparts. Similarly, Bettinger et al. ( 2017 ) found that students at for-profit online universities generally underperformed when compared to their in-person peers. Supporting this trend, Figlio et al. ( 2013 ) indicated that in-person instruction consistently produced better results, particularly among specific subgroups like males, lower-performing students, and Hispanic learners. Additionally, Kaupp’s ( 2012 ) research in California community colleges demonstrated that online students faced lower completion and success rates compared to their traditional in-person counterparts (Fig. 1 ).

Fig. 1: Student achievement on the final tests in the five courses, by year. Independent-samples t-tests show a statistically significant drop in test scores from 2020 (in person) to 2021 (online) for all courses except CT_101.
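The independent-samples comparison behind Fig. 1 can be reproduced in miniature with a stdlib-only Welch's t statistic. The scores below are invented; the study's cohorts had roughly 8300 students each, and a full analysis would also compute a p-value from the t distribution:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's unequal-variance t statistic and its degrees of freedom."""
    ma, mb = mean(a), mean(b)
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (ma - mb) / (va + vb) ** 0.5
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

in_person = [82, 78, 91, 85, 79, 88]  # hypothetical 2020 final scores
online = [76, 74, 83, 80, 71, 78]     # hypothetical 2021 final scores
t, df = welch_t(in_person, online)
print(round(t, 2), round(df, 1))      # positive t: the in-person mean is higher
```

Welch's variant is a reasonable default here because the two cohorts need not share a variance; with equal-variance data it closely matches the pooled (Student's) t-test.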

In contrast, other studies present evidence of online students outperforming their in-person peers. For example, Iglesias-Pradas et al. ( 2021 ) conducted a comparative analysis of 43 bachelor courses at Telecommunication Engineering College in Malaysia, revealing that online students achieved higher academic outcomes than their in-person counterparts. Similarly, during the COVID-19 pandemic, Gonzalez et al. ( 2020 ) found that students engaged in online learning performed better than those who had previously taken the same subjects in traditional in-class settings.

Expanding on this topic, several studies have reported mixed results when comparing the academic performance of online and in-person students, with various student and instructor factors emerging as influential variables. Chesser et al. ( 2020 ) noted that student traits such as conscientiousness, agreeableness, and extraversion play a substantial role in academic achievement, regardless of the learning environment—be it traditional in-person classrooms or online settings. Furthermore, Cacault et al. ( 2021 ) discovered that online students with higher academic proficiency tend to outperform those with lower academic capabilities, suggesting that differences in students’ academic abilities may impact their performance. In contrast, Bergstrand and Savage ( 2013 ) found that online classes received lower overall ratings and exhibited a less respectful learning environment when compared to in-person instruction. Nevertheless, they also observed that the teaching efficiency of both in-class and online courses varied significantly depending on the instructors’ backgrounds and approaches. These findings underscore the multifaceted nature of the online vs. in-person learning debate, highlighting the need for a nuanced understanding of the factors at play.

Theoretical framework

Constructivism is a well-established learning theory that places learners at the forefront of their educational experience, emphasizing their active role in constructing knowledge through interactions with their environment (Duffy and Jonassen, 2009 ). According to constructivist principles, learners build their understanding by assimilating new information into their existing cognitive frameworks (Vygotsky, 1978 ). This theory highlights the importance of context, active engagement, and the social nature of learning (Dewey, 1938 ). Constructivist approaches often involve hands-on activities, problem-solving tasks, and opportunities for collaborative exploration (Brooks and Brooks, 1999 ).

In the realm of education, subject-specific pedagogy emerges as a vital perspective that acknowledges the distinctive nature of different academic disciplines (Shulman, 1986 ). It suggests that teaching methods should be tailored to the specific characteristics of each subject, recognizing that subjects like mathematics, literature, or science require different approaches to facilitate effective learning (Shulman, 1987 ). Subject-specific pedagogy emphasizes that the methods of instruction should mirror the ways experts in a particular field think, reason, and engage with their subject matter (Cochran-Smith and Zeichner, 2005 ).

When applying these principles to the design of instruction for online and in-person learning environments, the significance of adapting methods becomes even more pronounced. Online learning often requires unique approaches due to its reliance on technology, asynchronous interactions, and potential for reduced social presence (Anderson, 2003 ). In-person learning, on the other hand, benefits from face-to-face interactions and immediate feedback (Allen and Seaman, 2016 ). Here, the interplay of constructivism and subject-specific pedagogy becomes evident.

Online learning. In an online environment, constructivist principles can be upheld by creating interactive online activities that promote exploration, reflection, and collaborative learning (Salmon, 2000 ). Discussion forums, virtual labs, and multimedia presentations can provide opportunities for students to actively engage with the subject matter (Harasim, 2017 ). By integrating subject-specific pedagogy, educators can design online content that mirrors the discipline’s methodologies while leveraging technology for authentic experiences (Koehler and Mishra, 2009 ). For instance, an online history course might incorporate virtual museum tours, primary source analysis, and collaborative timeline projects.

In-person learning. In a traditional brick-and-mortar classroom setting, constructivist methods can be implemented through group activities, problem-solving tasks, and in-depth discussions that encourage active participation (Jonassen et al., 2003 ). Subject-specific pedagogy complements this by shaping instructional methods to align with the inherent characteristics of the subject (Hattie, 2009). For instance, in a physics class, hands-on experiments and real-world applications can bring theoretical concepts to life (Hake, 1998 ).

In sum, the fusion of constructivism and subject-specific pedagogy offers a versatile approach to instructional design that adapts to different learning environments (Garrison, 2011 ). By incorporating the principles of both theories, educators can tailor their methods to suit the unique demands of online and in-person learning, ultimately providing students with engaging and effective learning experiences that align with the nature of the subject matter and the mode of instruction.

Course description

The Self-Development Skills Department at King Saud University (KSU) offers five mandatory freshman-level courses. These courses aim to foster advanced thinking skills and cultivate scientific research abilities in students. They do so by imparting essential skills, identifying higher-level thinking patterns, and facilitating hands-on experience in scientific research. The design of these classes is centered around aiding students’ smooth transition into university life. Brief descriptions of these courses are as follows:

University Skills 101 (CI 101) is a three-hour credit course designed to nurture essential academic, communication, and personal skills among all preparatory year students at King Saud University. The primary goal of this course is to equip students with the practical abilities they need to excel in their academic pursuits and navigate their university lives effectively. CI 101 comprises 12 sessions and is an integral part of the curriculum for all incoming freshmen, ensuring a standardized foundation for skill development.

Fitness and Health 101 (FAJB 101) is a one-hour credit course. FAJB 101 focuses on self-development skills related to health and physical fitness, covering personal health, nutrition, sports, preventive and psychological care, reproductive health, and first aid. The course aims to motivate students through entertainment, sports activities, and physical exercises that help them maintain their health. It is required for all incoming freshman students at King Saud University.

Entrepreneurship 101 (ENT 101) is a one-hour credit course. ENT 101 aims to develop students’ entrepreneurial skills, providing the knowledge needed to generate ideas and transform innovations into practical commercial projects in business settings. The course consists of 14 sessions and is taught only to students in the business track.

Computer Skills 101 (CT 101) is a three-hour credit course. It introduces basic computer skills, e.g., hardware components, operating systems, applications, and communication and backup. The course also covers data visualization, introductory modern programming with algorithms, and information security. CT 101 is taught in all tracks except the human track.

Computer Skills 102 (CT 102) is a three-hour credit course. It equips students with the IT skills to use computers efficiently, develops their research and scientific skills, and builds their capability to design basic educational software. CT 102 focuses on application software such as Microsoft Office. This course is taught only to students in the human track.

Structure and activities

These courses range from one to three credit hours. A one-hour credit means that students must attend an hour of class each week during the academic semester; the same arrangement applies to two- and three-credit-hour courses. The types of activities in each course are shown in Table 1.

At King Saud University, each semester spans 15 weeks in duration. The total number of semester hours allocated to each course serves as an indicator of its significance within the broader context of the academic program, including the diverse tracks available to students. Throughout the two years under study (i.e., 2020 and 2021), course placements (fall or spring), course content, and the organizational structure remained consistent and uniform.

Participants

The study’s data comes from test scores of a cohort of 16,722 first-year college students enrolled at King Saud University in Saudi Arabia over the span of two academic years: 2020 and 2021. Among these students, 8297 were engaged in traditional, in-person learning in 2020, while 8425 had transitioned to online instruction for the same courses in 2021 due to the Covid-19 pandemic. In 2020, the student population consisted of 51.5% females and 48.5% males. However, in 2021, there was a reversal in these proportions, with female students accounting for 48.5% and male students comprising 51.5% of the total participants.

Regarding student enrollment in the five courses, Table 2 provides a detailed breakdown by average class size, admission scores, and the number of students enrolled in the courses during the two years covered by this study. While the total number of students in each course remained relatively consistent across the two years, there were noticeable fluctuations in average class sizes. Specifically, four out of the five courses experienced substantial increases in class size, with some nearly doubling in size (e.g., ENT_101 and CT_102), while one course (CT_101) showed a reduction in its average class size.

In this study, it must be noted that while some students enrolled in up to three different courses within the same academic year, none repeated the same exam in both years. Specifically, students who failed to pass their courses in 2020 were required to complete them in summer sessions and were consequently not included in this study’s dataset. To ensure clarity and precision in our analysis, the research focused exclusively on student test scores to evaluate and compare the academic effectiveness of online and traditional in-person learning methods. This approach was chosen to provide a clear, direct comparison of the educational impacts associated with each teaching format.

Descriptive analyses of the final exam scores for the two years (2020 and 2021) were conducted. Student outcomes in the 2020 in-person classes were then compared with those of their 2021 online peers using an independent-samples t-test. Subsequently, to address potential disparities between the two groups arising from variables such as gender, class size, and admission scores (which serve as an indicator of students’ academic aptitude and pre-enrollment knowledge), multiple regression analyses were conducted. In these multivariate analyses, outcomes of both in-person and online cohorts were assessed within their respective tracks. By controlling for these essential variables linked to student performance, the study aimed to ensure a comprehensive and equitable evaluation.
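As an illustration of the first analytic stage, the t-test comparison can be sketched in Python. All numbers below (cohort sizes, means, spread) are synthetic stand-ins, not the study's data:

```python
# Independent-samples t-test comparing synthetic 2020 (in-person) and
# 2021 (online) final-exam scores. Illustrative only; not the KSU records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
inperson_2020 = rng.normal(loc=75.0, scale=8.0, size=800)  # synthetic cohort
online_2021 = rng.normal(loc=73.0, scale=8.0, size=800)    # synthetic cohort

t_stat, p_value = stats.ttest_ind(inperson_2020, online_2021)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A positive t here indicates higher mean scores for the in-person cohort, mirroring the direction of the raw comparisons reported in Table 4.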

Study instrument

The study obtained students’ final exam scores for the years 2020 (in-person) and 2021 (online) from the school’s records office through their examination management system. In the preparatory year at King Saud University, final exams for all courses are developed by committees composed of faculty members from each department. To ensure valid comparisons, the final exam questions, crafted by departmental committees of professors, remained consistent and uniform for the two years under examination.

Table 3 provides a comprehensive assessment of the reliability of all five tests included in our analysis. The tests exhibit a strong degree of internal consistency, with Cronbach’s alpha coefficients ranging from 0.77 to 0.86, affirming their reliability and suitability for the study’s objectives.
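Internal-consistency figures of this kind come from the standard Cronbach's alpha formula; the sketch below applies it to a synthetic item-score matrix (the student count, item count, and score model are invented for illustration):

```python
# Cronbach's alpha on a synthetic item-score matrix
# (rows = students, columns = items); not the study's data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: array of shape (n_students, n_items)."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

rng = np.random.default_rng(2)
ability = rng.normal(0.0, 1.0, size=(300, 1))            # shared trait per student
items = ability + rng.normal(0.0, 1.0, size=(300, 10))   # 10 correlated item scores

alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")
```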

In terms of test validity, content validity was ensured through a thorough review by university subject matter experts, resulting in test items that align well with the content domain and learning objectives. Criterion-related validity was established by correlating students’ admissions test scores with their final required freshman test scores in the five subject areas, which showed a moderate, acceptable relationship (0.37 to 0.56). Finally, construct validity was established through reviews by experienced subject instructors, with guidance from university subject experts, leading to improvements in test content and affirming the effectiveness of the final tests in assessing students’ subject knowledge at the end of their coursework.
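Criterion-related validity of this kind is a Pearson correlation between the two score sets; a minimal sketch on synthetic data (the slope and noise level below are invented, chosen to land in the moderate range the authors report):

```python
# Pearson correlation between synthetic admissions scores and synthetic
# final-exam scores. Parameters are illustrative choices, not study values.
import numpy as np

rng = np.random.default_rng(3)
admissions = rng.normal(80.0, 5.0, 400)
final = 70.0 + 0.9 * (admissions - 80.0) + rng.normal(0.0, 9.0, 400)

r = np.corrcoef(admissions, final)[0, 1]
print(f"r = {r:.2f}")
```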

Collectively, these validity and reliability measures affirm the soundness and integrity of the final subject tests, establishing their suitability as effective assessment tools for evaluating students’ knowledge in their five mandatory freshman courses at King Saud University.

After obtaining research approval from the Research Committee at King Saud University, the coordinators of the five courses (CI_101, ENT_101, CT_101, CT_102, and FAJB_101) supplied the researchers with the final exam scores of all first-year preparatory year students at King Saud University for the initial semester of the academic years 2020 and 2021. The sample encompassed all students who had completed these five courses during both years, resulting in a total of 16,722 students forming the final group of participants.

Limitations

Several limitations warrant acknowledgment in this study. First, the research was conducted within a well-resourced major public university. As such, the experiences with online classes at other types of institutions (e.g., community colleges, private institutions) may vary significantly. Additionally, the limited data pertaining to in-class teaching practices and the diversity of learning activities across different courses represents a gap that could have provided valuable insights for a more thorough interpretation and explanation of the study’s findings.

To compare student achievement on the final tests in the five courses by year, independent-samples t-tests were conducted. Table 4 shows a statistically significant drop in test scores from 2020 (in person) to 2021 (online) for all courses except CT_101. The largest decline was in CT_102 (3.58 points) and the smallest in CI_101 (0.18 points).

However, such a simple comparison of means between the two years (via t-tests) does not account for differences in gender composition, class size, and admission scores across the two academic years, all of which have been associated with student outcomes (e.g., Ho and Kelman, 2014; De Paola et al., 2013). To account for these potential confounding variables, multiple regressions were conducted to compare the two years’ results while controlling for these three factors associated with student achievement.
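For concreteness, a regression of this shape (final score on a year indicator plus the three controls) can be sketched with ordinary least squares. The +1-point "online effect" and the covariate weights below are invented for the demonstration:

```python
# OLS sketch: score ~ intercept + year_online + female + class_size + admission.
# All coefficients and noise are synthetic; not the study's estimates.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
year_online = rng.integers(0, 2, n)              # 0 = 2020 in-person, 1 = 2021 online
female = rng.integers(0, 2, n)
class_size = rng.integers(20, 80, n).astype(float)
admission = rng.normal(80.0, 5.0, n)

score = (60.0 + 1.0 * year_online + 0.5 * female
         - 0.02 * class_size + 0.15 * admission
         + rng.normal(0.0, 5.0, n))

# Design matrix with an intercept column; least squares via lstsq.
X = np.column_stack([np.ones(n), year_online, female, class_size, admission])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(f"adjusted 2021-vs-2020 difference: {beta[1]:+.2f} points")
```

The coefficient on the year indicator plays the role of the adjusted difference in Table 5: the 2021-vs-2020 gap net of the three controls.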

Table 5 presents the regression results, illustrating the variation in final exam scores between 2020 and 2021 while controlling for gender, class size, and admission scores. Importantly, these results diverge significantly from the outcomes obtained through the independent-samples t-test analyses.

Taking into consideration the variables mentioned earlier, students in the 2021 online cohort demonstrated superior performance compared to their 2020 in-person counterparts in CI_101, FAJB_101, and CT_101, with score advantages of 0.89, 0.56, and 5.28 points, respectively. Conversely, in the case of ENT_101, online students in 2021 scored 0.69 points lower than their 2020 in-person counterparts. With CT_102, there were no statistically significant differences in final exam scores between the two cohorts of students.

The study sought to assess the effectiveness of distance learning compared to in-person learning in the higher education setting in Saudi Arabia. We analyzed the final exam scores of 16,722 first-year college students at King Saud University in five required subjects (i.e., CI_101, ENT_101, CT_101, CT_102, and FAJB_101). The study first performed a simple comparison of mean scores by track and year (via t-tests), followed by multiple regression analyses controlling for class size, gender composition, and admission scores.

Overall, the study’s more in-depth findings using multiple regression painted a wholly different picture than the results obtained using t -tests. After controlling for class size, gender composition, and admissions scores, online students in 2021 performed better than their in-person instruction peers in 2020 in University Skills (CI_101), Fitness and Health (FAJB_101), and Computer Skills (CT_101), whereas in-person students outperformed their online peers in Entrepreneurship (ENT_101). There was no meaningful difference in outcomes for students in the Computer Skills (CT_102) course for the two years.

In light of these findings, a question arises: why do we observe minimal differences (less than a one-point gain or loss) in student outcomes in courses like University Skills, Fitness and Health, Entrepreneurship, and Advanced Computer Skills based on the mode of instruction? Is it possible that when subjects are primarily at a basic or introductory level, as is the case with these courses, the mode of instruction has a limited impact as long as the concepts are effectively communicated in a manner familiar and accessible to students?

In today’s digital age, one could argue that students in more developed countries, such as Saudi Arabia, generally possess the skills and capabilities to effectively engage with materials presented in both in-person and online formats. However, there is a notable exception in the Basic Computer Skills course, where the online cohort outperformed their in-person counterparts by more than 5 points. Insights from interviews with the instructors of this course suggest that this result may be attributed to the course’s basic and conceptual nature, coupled with the availability of instructional videos that students could revisit at their own pace.

Given that students enter this course with varying levels of computer skills, self-paced learning may have allowed them to cover course materials at their preferred speed, concentrating on less familiar topics while swiftly progressing through concepts they already understood. The advantages of such self-paced learning have been documented by scholars like Tullis and Benjamin ( 2011 ), who found that self-paced learners often outperform those who spend the same amount of time studying identical materials. This approach allows learners to allocate their time more effectively according to their individual learning pace, providing greater ownership and control over their learning experience. As such, in courses like introductory computer skills, it can be argued that becoming familiar with fundamental and conceptual topics may not require extensive in-class collaboration. Instead, it may be more about exposure to and digestion of materials in a format and at a pace tailored to students with diverse backgrounds, knowledge levels, and skill sets.

Further investigation is needed to understand more fully why some classes benefitted from online instruction while others did not. Perhaps some content areas are simply more conducive to one format than the other. Or the differing results of the two modes of learning may have been driven by students of varying academic abilities and engagement, with low-achieving students being more vulnerable to the limitations of online learning (e.g., Kofoed et al., 2021). Whatever the reasons, the results of the current study would be enriched by a more in-depth analysis of the factors associated with each form of learning. What the current study does provide, however, is additional evidence against dire consequences for student learning (at least in the higher ed setting) from a sudden increase in online learning, while showcasing possible benefits of its wider use.

Based on the findings of this study, we recommend that educational leaders adopt a measured approach to online learning—a stance that neither fully embraces nor outright denounces it. The impact on students’ experiences and engagement appears to vary depending on the subjects and methods of instruction, sometimes hindering, other times promoting effective learning, while some classes remain relatively unaffected.

Rather than taking a one-size-fits-all approach, educational leaders should be open to exploring the nuances behind these outcomes. This involves examining why certain courses thrived with online delivery, while others either experienced a decline in student achievement or remained largely unaffected. By exploring these differentiated outcomes associated with diverse instructional formats, leaders in higher education institutions and beyond can make informed decisions about resource allocation. For instance, resources could be channeled towards in-person learning for courses that benefit from it, while simultaneously expanding online access for courses that have demonstrated improved outcomes through its virtual format. This strategic approach not only optimizes resource allocation but could also open up additional revenue streams for the institution.

Considering the enduring presence of online learning, both before the pandemic and its accelerated adoption due to Covid-19, there is an increasing need for institutions of learning and scholars in higher education, as well as other fields, to prioritize the study of its effects and optimal utilization. This study, which compares student outcomes between two cohorts exposed to in-person and online instruction (before and during Covid-19) at the largest university in Saudi Arabia, represents a meaningful step in this direction.

Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Allen IE, Seaman J (2016) Online report card: Tracking online education in the United States. Babson Survey Group

Anderson T (2003) Getting the mix right again: an updated and theoretical rationale for interaction. Int Rev Res Open Distrib Learn 4(2). https://doi.org/10.19173/irrodl.v4i2.149

Arkorful V, Abaidoo N (2015) The role of e-learning, advantages and disadvantages of its adoption in higher education. Int J Instruct Technol Distance Learn 12(1):29–42

Aucejo EM, French J, Araya MP, Zafar B (2020) The impact of COVID-19 on student experiences and expectations: Evidence from a survey. Journal of Public Economics 191:104271. https://doi.org/10.1016/j.jpubeco.2020.104271

Azevedo JP, Hasan A, Goldemberg D, Iqbal SA, Geven K (2020) Simulating the potential impacts of COVID-19 school closures on schooling and learning outcomes: a set of global estimates. World Bank Policy Research Working Paper

Bergstrand K, Savage SV (2013) The chalkboard versus the avatar: Comparing the effectiveness of online and in-class courses. Teach Sociol 41(3):294–306. https://doi.org/10.1177/0092055X13479949

Bettinger EP, Fox L, Loeb S, Taylor ES (2017) Virtual classrooms: How online college courses affect student success. Am Econ Rev 107(9):2855–2875. https://doi.org/10.1257/aer.20151193

Bozkurt A (2019) From distance education to open and distance learning: a holistic evaluation of history, definitions, and theories. Handbook of research on learning in the age of transhumanism, 252–273. https://doi.org/10.4018/978-1-5225-8431-5.ch016

Brooks JG, Brooks MG (1999) In search of understanding: the case for constructivist classrooms. Association for Supervision and Curriculum Development

Cacault MP, Hildebrand C, Laurent-Lucchetti J, Pellizzari M (2021) Distance learning in higher education: evidence from a randomized experiment. J Eur Econ Assoc 19(4):2322–2372. https://doi.org/10.1093/jeea/jvaa060

Chesser S, Murrah W, Forbes SA (2020) Impact of personality on choice of instructional delivery and students’ performance. Am Distance Educ 34(3):211–223. https://doi.org/10.1080/08923647.2019.1705116

Christensen CM, Raynor M, McDonald R (2015) What is disruptive innovation? Harv Bus Rev 93(12):44–53

Cochran-Smith M, Zeichner KM (2005) Studying teacher education: the report of the AERA panel on research and teacher education. Choice Rev Online 43(4). https://doi.org/10.5860/choice.43-2338

De Paola M, Ponzo M, Scoppa V (2013) Class size effects on student achievement: heterogeneity across abilities and fields. Educ Econ 21(2):135–153. https://doi.org/10.1080/09645292.2010.511811

Dewey J (1938) Experience and education. Simon & Schuster

Di Pietro G, Biagi F, Costa P, Karpinski Z, Mazza J (2020) The likely impact of COVID-19 on education: reflections based on the existing literature and recent international datasets. Publications Office of the European Union, Luxembourg

Duffy TM, Jonassen DH (2009) Constructivism and the technology of instruction: a conversation. Routledge, Taylor & Francis Group

Edvardsson IR, Oskarsson GK (2008) Distance education and academic achievement in business administration: the case of the University of Akureyri. Int Rev Res Open Distrib Learn 9(3). https://doi.org/10.19173/irrodl.v9i3.542

Figlio D, Rush M, Yin L (2013) Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. J Labor Econ 31(4):763–784. https://doi.org/10.3386/w16089

Fischer C, Xu D, Rodriguez F, Denaro K, Warschauer M (2020) Effects of course modality in summer session: enrollment patterns and student performance in face-to-face and online classes. Internet Higher Educ 45:100710. https://doi.org/10.1016/j.iheduc.2019.100710

Gadamer HG (2001) Education is self‐education. J Philos Educ 35(4):529–538

Garrison DR (2011) E-learning in the 21st century: a framework for research and practice. Routledge. https://doi.org/10.4324/9780203838761

Gonzalez T, de la Rubia MA, Hincz KP, Comas-Lopez M, Subirats L, Fort S, Sacha GM (2020) Influence of COVID-19 confinement on students’ performance in higher education. PLOS One 15(10). https://doi.org/10.1371/journal.pone.0239490

Hake RR (1998) Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 66(1):64–74. https://doi.org/10.1119/1.18809

Hall ACG, Lineweaver TT, Hogan EE, O’Brien SW (2020) On or off task: the negative influence of laptops on neighboring students’ learning depends on how they are used. Comput Educ 153:1–8. https://doi.org/10.1016/j.compedu.2020.103901

Harasim L (2017) Learning theory and online technologies. Routledge. https://doi.org/10.4324/9780203846933

Hiemstra R (1994) Self-directed learning. In: Rothwell WJ, Sensenig KJ (eds) The sourcebook for self-directed learning. HRD Press, pp 9–20

Ho DE, Kelman MG (2014) Does class size affect the gender gap? A natural experiment in law. J Legal Stud 43(2):291–321

Iglesias-Pradas S, Hernández-García Á, Chaparro-Peláez J, Prieto JL (2021) Emergency remote teaching and students’ academic performance in higher education during the COVID-19 pandemic: a case study. Comput Hum Behav 119:106713. https://doi.org/10.1016/j.chb.2021.106713

Jepsen C (2015) Class size: does it matter for student achievement? IZA World of Labor . https://doi.org/10.15185/izawol.190

Jonassen DH, Howland J, Moore J, Marra RM (2003) Learning to solve problems with technology: a constructivist perspective, 2nd edn. Prentice Hall, Columbus

Kaupp R (2012) Online penalty: the impact of online instruction on the Latino-White achievement gap. J Appli Res Community Coll 19(2):3–11. https://doi.org/10.46569/10211.3/99362

Koehler MJ, Mishra P (2009) What is technological pedagogical content knowledge? Contemp Issues Technol Teacher Educ 9(1):60–70

Kofoed M, Gebhart L, Gilmore D, Moschitto R (2021) Zooming to class?: Experimental evidence on college students’ online learning during COVID-19. SSRN Electron J. https://doi.org/10.2139/ssrn.3846700

Kuhfeld M, Soland J, Tarasawa B, Johnson A, Ruzek E, Liu J (2020) Projecting the potential impact of COVID-19 school closures on academic achievement. Educ Res 49(8):549–565. https://doi.org/10.3102/0013189x20965918

Lai JW, Bower M (2019) How is the use of technology in education evaluated? A systematic review. Comput Educ 133:27–42

Meinck S, Brese F (2019) Trends in gender gaps: using 20 years of evidence from TIMSS. Large-Scale Assess Educ 7(1). https://doi.org/10.1186/s40536-019-0076-3

Radha R, Mahalakshmi K, Kumar VS, Saravanakumar AR (2020) E-Learning during lockdown of COVID-19 pandemic: a global perspective. Int J Control Autom 13(4):1088–1099

Ravizza SM, Uitvlugt MG, Fenn KM (2017) Logged in and zoned out: How laptop Internet use relates to classroom learning. Psychol Sci 28(2):171–180. https://doi.org/10.1177/095679761667731

Sadeghi M (2019) A shift from classroom to distance learning: advantages and limitations. Int J Res Engl Educ 4(1):80–88

Salmon G (2000) E-moderating: the key to teaching and learning online. Routledge. https://doi.org/10.4324/9780203816684

Shulman LS (1986) Those who understand: knowledge growth in teaching. Edu Res 15(2):4–14

Shulman LS (1987) Knowledge and teaching: foundations of the new reform. Harv Educ Rev 57(1):1–22

Tullis JG, Benjamin AS (2011) On the effectiveness of self-paced learning. J Mem Lang 64(2):109–118. https://doi.org/10.1016/j.jml.2010.11.002

Valverde-Berrocoso J, Garrido-Arroyo MDC, Burgos-Videla C, Morales-Cevallos MB (2020) Trends in educational research about e-learning: a systematic literature review (2009–2018). Sustainability 12(12):5153

Volk F, Floyd CG, Shaler L, Ferguson L, Gavulic AM (2020) Active duty military learners and distance education: factors of persistence and attrition. Am J Distance Educ 34(3):1–15. https://doi.org/10.1080/08923647.2019.1708842

Vygotsky LS (1978) Mind in society: the development of higher psychological processes. Harvard University Press

Author information

Authors and Affiliations

Department of Sports and Recreation Management, King Saud University, Riyadh, Saudi Arabia

Bandar N. Alarifi

Division of Research and Doctoral Studies, Concordia University Chicago, 7400 Augusta Street, River Forest, IL, 60305, USA

Steve Song

Contributions

Dr. Bandar Alarifi collected and organized data for the five courses and wrote the manuscript. Dr. Steve Song analyzed and interpreted the data regarding student achievement and revised the manuscript. These authors jointly supervised this work and approved the final manuscript.

Corresponding author

Correspondence to Bandar N. Alarifi .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This study was approved by the Research Ethics Committee at King Saud University on 25 March 2021 (No. 4/4/255639). This research does not involve the collection or analysis of data that could be used to identify participants (including email addresses or other contact details). All information is anonymized and the submission does not include images that may identify the person. The procedures used in this study adhere to the tenets of the Declaration of Helsinki.

Informed consent

This article does not contain any studies with human participants performed by any of the authors.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article

Alarifi, B.N., Song, S. Online vs in-person learning in higher education: effects on student achievement and recommendations for leadership. Humanit Soc Sci Commun 11, 86 (2024). https://doi.org/10.1057/s41599-023-02590-1

Download citation

Received : 07 June 2023

Accepted : 21 December 2023

Published : 09 January 2024

DOI : https://doi.org/10.1057/s41599-023-02590-1

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

Quick links

  • Explore articles by subject
  • Guide to authors
  • Editorial policies

research study about online learning

Information

  • Author Services

Initiatives

You are accessing a machine-readable page. In order to be human-readable, please install an RSS reader.

All articles published by MDPI are made immediately available worldwide under an open access license. No special permission is required to reuse all or part of the article published by MDPI, including figures and tables. For articles published under an open access Creative Common CC BY license, any part of the article may be reused without permission provided that the original article is clearly cited. For more information, please refer to https://www.mdpi.com/openaccess .

Feature papers represent the most advanced research with significant potential for high impact in the field. A Feature Paper should be a substantial original Article that involves several techniques or approaches, provides an outlook for future research directions and describes possible research applications.

Feature papers are submitted upon individual invitation or recommendation by the scientific editors and must receive positive feedback from the reviewers.

Editor’s Choice articles are based on recommendations by the scientific editors of MDPI journals from around the world. Editors select a small number of articles recently published in the journal that they believe will be particularly interesting to readers, or important in the respective research area. The aim is to provide a snapshot of some of the most exciting work published in the various research areas of the journal.

Original Submission Date Received: .

  • Active Journals
  • Find a Journal
  • Proceedings Series
  • For Authors
  • For Reviewers
  • For Editors
  • For Librarians
  • For Publishers
  • For Societies
  • For Conference Organizers
  • Open Access Policy
  • Institutional Open Access Program
  • Special Issues Guidelines
  • Editorial Process
  • Research and Publication Ethics
  • Article Processing Charges
  • Testimonials
  • Preprints.org
  • SciProfiles
  • Encyclopedia

sustainability-logo

Article Menu

research study about online learning

  • Subscribe SciFeed
  • Recommended Articles
  • Google Scholar
  • on Google Scholar
  • Table of Contents

Find support for a specific problem in the support section of our website.

Please let us know what you think of our products and services.

Visit our dedicated information section to learn more about MDPI.

JSmol Viewer

Assessing the impact of online-learning effectiveness and benefits in knowledge management, the antecedent of online-learning strategies and motivations: an empirical study.

research study about online learning

1. Introduction

2. literature review and research hypothesis, 2.1. online-learning self-efficacy terminology, 2.2. online-learning monitoring terminology, 2.3. online-learning confidence in technology terminology, 2.4. online-learning willpower terminology, 2.5. online-learning attitude terminology, 2.6. online-learning motivation terminology, 2.7. online-learning strategies and online-learning effectiveness terminology, 2.8. online-learning effectiveness terminology, 3. research method, 3.1. instruments, 3.2. data analysis and results, 4.1. reliability and validity analysis, 4.2. hypothesis result, 5. discussion, 6. conclusions, 7. limitations and future directions, author contributions, institutional review board statement, informed consent statement, data availability statement, conflicts of interest.



Respondent demographics:

| Variables | Category | Frequency | Percentage |
|---|---|---|---|
| Gender | Male | 243 | 51.81 |
| | Female | 226 | 48.19 |
| Education program level | Undergraduate program | 210 | 44.78 |
| | Master program | 154 | 32.84 |
| | Doctoral program | 105 | 22.39 |
| Online learning tools | Smartphone | 255 | 54.37 |
| | Computer/PC | 125 | 26.65 |
| | Tablet | 89 | 18.98 |
| Online learning media | Google Meet | 132 | 28.14 |
| | Microsoft Teams | 99 | 21.11 |
| | Zoom | 196 | 41.79 |
| | Others | 42 | 8.96 |
Measurement model results:

| Construct | Measurement Item | Factor Loading/Coefficient (t-Value) | AVE | Composite Reliability | Cronbach's Alpha |
|---|---|---|---|---|---|
| Online-learning benefit (LBE) | LBE1 | 0.88 | 0.68 | 0.86 | 0.75 |
| | LBE2 | 0.86 | | | |
| | LBE3 | 0.71 | | | |
| Online-learning effectiveness (LEF) | LEF1 | 0.83 | 0.76 | 0.90 | 0.84 |
| | LEF2 | 0.88 | | | |
| | LEF3 | 0.90 | | | |
| Online-learning motivation (LMT) | LMT1 | 0.86 | 0.77 | 0.91 | 0.85 |
| | LMT2 | 0.91 | | | |
| | LMT3 | 0.85 | | | |
| Online-learning strategies (LST) | LST1 | 0.90 | 0.75 | 0.90 | 0.84 |
| | LST2 | 0.87 | | | |
| | LST3 | 0.83 | | | |
| Online-learning attitude (OLA) | OLA1 | 0.89 | 0.75 | 0.90 | 0.84 |
| | OLA2 | 0.83 | | | |
| | OLA3 | 0.87 | | | |
| Online-learning confidence-in-technology (OLC) | OLC1 | 0.87 | 0.69 | 0.87 | 0.76 |
| | OLC2 | 0.71 | | | |
| | OLC3 | 0.89 | | | |
| Online-learning monitoring (OLM) | OLM1 | 0.88 | 0.75 | 0.89 | 0.83 |
| | OLM2 | 0.91 | | | |
| | OLM3 | 0.79 | | | |
| Online-learning self-efficacy (OLS) | OLS1 | 0.79 | 0.64 | 0.84 | 0.73 |
| | OLS2 | 0.81 | | | |
| | OLS3 | 0.89 | | | |
| Online-learning willpower (OLW) | OLW1 | 0.91 | 0.69 | 0.87 | 0.77 |
| | OLW2 | 0.84 | | | |
| | OLW3 | 0.73 | | | |
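As a cross-check, the AVE and composite reliability figures in the measurement table above can be recomputed directly from the reported standardized loadings. The sketch below is illustrative only; small discrepancies against the published values are expected because the loadings are rounded to two decimals.

```python
# Recompute AVE and composite reliability (CR) from standardized loadings.
# Values may differ slightly from the published table due to rounding.

def ave(loadings):
    """Average variance extracted: mean of the squared loadings."""
    return sum(l * l for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    s = sum(loadings)
    error = sum(1 - l * l for l in loadings)
    return s * s / (s * s + error)

lbe = [0.88, 0.86, 0.71]  # loadings reported for Online-learning benefit (LBE)
print(round(ave(lbe), 2), round(composite_reliability(lbe), 2))  # ~0.67 and 0.86
```

Applied to the LBE loadings, this reproduces the published CR of 0.86 and comes within rounding error of the published AVE of 0.68.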
Inter-construct correlation matrix (lower triangle):

| | LBE | LEF | LMT | LST | OLA | OLC | OLM | OLS | OLW |
|---|---|---|---|---|---|---|---|---|---|
| LBE | | | | | | | | | |
| LEF | 0.82 | | | | | | | | |
| LMT | 0.81 | 0.80 | | | | | | | |
| LST | 0.80 | 0.84 | 0.86 | | | | | | |
| OLA | 0.69 | 0.63 | 0.78 | 0.81 | | | | | |
| OLC | 0.76 | 0.79 | 0.85 | 0.79 | 0.72 | | | | |
| OLM | 0.81 | 0.85 | 0.81 | 0.76 | 0.63 | 0.83 | | | |
| OLS | 0.71 | 0.59 | 0.69 | 0.57 | 0.56 | 0.69 | 0.75 | | |
| OLW | 0.75 | 0.75 | 0.80 | 0.74 | 0.64 | 0.81 | 0.80 | 0.79 | |
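A common way to assess discriminant validity with tables like the two above is the Fornell-Larcker criterion: the square root of each construct's AVE should exceed its correlations with the other constructs. The sketch below illustrates this check on a few values taken from the tables; it is an assumption about the intended reading of the matrix, not the authors' exact procedure.

```python
import math

# Illustrative Fornell-Larcker discriminant-validity check. A construct pair
# passes when sqrt(AVE) of each construct exceeds their correlation.
# AVE values come from the reliability table; correlations from the matrix.

ave_values = {"LBE": 0.68, "LEF": 0.76, "LMT": 0.77, "LST": 0.75}
correlations = {("LBE", "LEF"): 0.82, ("LMT", "LST"): 0.86}

for (a, b), r in correlations.items():
    passes = math.sqrt(ave_values[a]) > r and math.sqrt(ave_values[b]) > r
    print(f"{a}-{b}: {'pass' if passes else 'check'}")
```

With the rounded values reported here, both pairs pass, though some margins are narrow (e.g., sqrt(0.75) = 0.866 against a correlation of 0.86 for LMT-LST).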
Item cross-loadings:

| Item | LBE | LEF | LMT | LST | OLA | OLC | OLM | OLS | OLW |
|---|---|---|---|---|---|---|---|---|---|
| LBE1 | 0.88 | 0.76 | 0.87 | 0.66 | 0.54 | 0.79 | 0.78 | 0.63 | 0.74 |
| LBE2 | 0.86 | 0.68 | 0.74 | 0.63 | 0.57 | 0.75 | 0.91 | 0.73 | 0.79 |
| LBE3 | 0.71 | 0.54 | 0.59 | 0.71 | 0.63 | 0.55 | 0.50 | 0.36 | 0.53 |
| LEF1 | 0.63 | 0.83 | 0.72 | 0.65 | 0.51 | 0.62 | 0.69 | 0.46 | 0.57 |
| LEF2 | 0.77 | 0.88 | 0.78 | 0.71 | 0.55 | 0.73 | 0.78 | 0.52 | 0.69 |
| LEF3 | 0.72 | 0.90 | 0.80 | 0.83 | 0.57 | 0.72 | 0.76 | 0.58 | 0.69 |
| LMT1 | 0.88 | 0.76 | 0.87 | 0.66 | 0.54 | 0.79 | 0.78 | 0.63 | 0.74 |
| LMT2 | 0.79 | 0.89 | 0.91 | 0.79 | 0.62 | 0.73 | 0.88 | 0.61 | 0.67 |
| LMT3 | 0.72 | 0.65 | 0.85 | 0.77 | 0.89 | 0.72 | 0.67 | 0.59 | 0.69 |
| LST1 | 0.61 | 0.63 | 0.68 | 0.90 | 0.78 | 0.64 | 0.57 | 0.39 | 0.57 |
| LST2 | 0.74 | 0.59 | 0.72 | 0.87 | 0.78 | 0.68 | 0.61 | 0.48 | 0.63 |
| LST3 | 0.72 | 0.90 | 0.80 | 0.83 | 0.57 | 0.72 | 0.76 | 0.58 | 0.69 |
| OLA1 | 0.72 | 0.65 | 0.85 | 0.79 | 0.89 | 0.72 | 0.67 | 0.59 | 0.69 |
| OLA2 | 0.51 | 0.48 | 0.55 | 0.59 | 0.83 | 0.58 | 0.47 | 0.42 | 0.43 |
| OLA3 | 0.52 | 0.44 | 0.55 | 0.70 | 0.87 | 0.55 | 0.43 | 0.39 | 0.47 |
| OLC1 | 0.78 | 0.70 | 0.73 | 0.65 | 0.53 | 0.87 | 0.77 | 0.65 | 0.91 |
| OLC2 | 0.51 | 0.53 | 0.57 | 0.62 | 0.75 | 0.71 | 0.46 | 0.39 | 0.47 |
| OLC3 | 0.81 | 0.73 | 0.78 | 0.69 | 0.55 | 0.89 | 0.80 | 0.66 | 0.75 |
| OLM1 | 0.79 | 0.89 | 0.91 | 0.79 | 0.62 | 0.73 | 0.88 | 0.61 | 0.69 |
| OLM2 | 0.86 | 0.68 | 0.74 | 0.63 | 0.57 | 0.75 | 0.91 | 0.73 | 0.79 |
| OLM3 | 0.69 | 0.55 | 0.57 | 0.47 | 0.39 | 0.67 | 0.79 | 0.61 | 0.73 |
| OLS1 | 0.41 | 0.23 | 0.35 | 0.28 | 0.39 | 0.41 | 0.40 | 0.69 | 0.49 |
| OLS2 | 0.45 | 0.41 | 0.48 | 0.38 | 0.43 | 0.48 | 0.52 | 0.81 | 0.49 |
| OLS3 | 0.75 | 0.66 | 0.72 | 0.60 | 0.49 | 0.69 | 0.77 | 0.89 | 0.82 |
| OLW1 | 0.78 | 0.70 | 0.73 | 0.65 | 0.53 | 0.87 | 0.77 | 0.65 | 0.91 |
| OLW2 | 0.75 | 0.65 | 0.71 | 0.59 | 0.51 | 0.69 | 0.77 | 0.87 | 0.84 |
| OLW3 | 0.57 | 0.49 | 0.54 | 0.59 | 0.57 | 0.57 | 0.53 | 0.39 | 0.73 |
Hypothesis test results:

| Hypothesis | Path | Standardized Path Coefficient | t-Value | Result |
|---|---|---|---|---|
| H1 | OLS → LST | 0.29 *** | 2.14 | Accepted |
| H2 | OLM → LST | 0.24 *** | 2.29 | Accepted |
| H3 | OLC → LST | 0.28 *** | 1.99 | Accepted |
| H4 | OLC → LMT | 0.36 *** | 2.96 | Accepted |
| H5 | OLW → LMT | 0.26 *** | 2.55 | Accepted |
| H6 | OLA → LMT | 0.34 *** | 4.68 | Accepted |
| H7 | LMT → LST | 0.71 *** | 4.96 | Accepted |
| H8 | LMT → LEF | 0.60 *** | 5.89 | Accepted |
| H9 | LST → LEF | 0.32 *** | 3.04 | Accepted |
| H10 | LEF → LBE | 0.81 *** | 23.6 | Accepted |

Share and Cite

Hongsuchon, T.; Emary, I.M.M.E.; Hariguna, T.; Qhal, E.M.A. Assessing the Impact of Online-Learning Effectiveness and Benefits in Knowledge Management, the Antecedent of Online-Learning Strategies and Motivations: An Empirical Study. Sustainability 2022 , 14 , 2570. https://doi.org/10.3390/su14052570



Original Research Article

Insights Into Students’ Experiences and Perceptions of Remote Learning Methods: From the COVID-19 Pandemic to Best Practice for the Future

Trang Nguyen

  • 1 Minerva Schools at Keck Graduate Institute, San Francisco, CA, United States
  • 2 Ronin Institute for Independent Scholarship, Montclair, NJ, United States
  • 3 Department of Physics, University of Toronto, Toronto, ON, Canada

This spring, students across the globe transitioned from in-person classes to remote learning as a result of the COVID-19 pandemic. This unprecedented change to undergraduate education saw institutions adopting multiple online teaching modalities and instructional platforms. We sought to understand students’ experiences with and perspectives on those methods of remote instruction in order to inform pedagogical decisions during the current pandemic and in future development of online courses and virtual learning experiences. Our survey gathered quantitative and qualitative data regarding students’ experiences with synchronous and asynchronous methods of remote learning and specific pedagogical techniques associated with each. A total of 4,789 undergraduate participants representing institutions across 95 countries were recruited via Instagram. We find that most students prefer synchronous online classes, and students whose primary mode of remote instruction has been synchronous report being more engaged and motivated. Our qualitative data show that students miss the social aspects of learning on campus, and it is possible that synchronous learning helps to mitigate some feelings of isolation. Students whose synchronous classes include active-learning techniques (which are inherently more social) report significantly higher levels of engagement, motivation, enjoyment, and satisfaction with instruction. Respondents’ recommendations for changes emphasize increased engagement, interaction, and student participation. We conclude that active-learning methods, which are known to increase motivation, engagement, and learning in traditional classrooms, also have a positive impact in the remote-learning environment. Integrating these elements into online courses will improve the student experience.

Introduction

The COVID-19 pandemic has dramatically changed the demographics of online students. Previously, almost all students engaged in online learning elected the online format, starting with individual online courses in the mid-1990s through today’s robust online degree and certificate programs. These students prioritize convenience, flexibility and ability to work while studying and are older than traditional college age students ( Harris and Martin, 2012 ; Levitz, 2016 ). These students also find asynchronous elements of a course are more useful than synchronous elements ( Gillingham and Molinari, 2012 ). In contrast, students who chose to take courses in-person prioritize face-to-face instruction and connection with others and skew considerably younger ( Harris and Martin, 2012 ). This leaves open the question of whether students who prefer to learn in-person but are forced to learn remotely will prefer synchronous or asynchronous methods. One study of student preferences following a switch to remote learning during the COVID-19 pandemic indicates that students enjoy synchronous over asynchronous course elements and find them more effective ( Gillis and Krull, 2020 ). Now that millions of traditional in-person courses have transitioned online, our survey expands the data on student preferences and explores if those preferences align with pedagogical best practices.

An extensive body of research has explored what instructional methods improve student learning outcomes ( Fink, 2013 ). Considerable evidence indicates that active-learning or student-centered approaches result in better learning outcomes than passive-learning or instructor-centered approaches, both in-person and online ( Freeman et al., 2014 ; Chen et al., 2018 ; Davis et al., 2018 ). Active-learning approaches include student activities or discussion in class, whereas passive-learning approaches emphasize extensive exposition by the instructor ( Freeman et al., 2014 ). Constructivist learning theories argue that students must be active participants in creating their own learning, and that listening to expert explanations is seldom sufficient to trigger the neurological changes necessary for learning ( Bostock, 1998 ; Zull, 2002 ). Some studies conclude that, while students learn more via active learning, they may report greater perceptions of their learning and greater enjoyment when passive approaches are used ( Deslauriers et al., 2019 ). We examine student perceptions of remote learning experiences in light of these previous findings.

In this study, we administered a survey focused on student perceptions of remote learning in late May 2020 through the social media account of @unjadedjade to a global population of English speaking undergraduate students representing institutions across 95 countries. We aim to explore how students were being taught, the relationship between pedagogical methods and student perceptions of their experience, and the reasons behind those perceptions. Here we present an initial analysis of the results and share our data set for further inquiry. We find that positive student perceptions correlate with synchronous courses that employ a variety of interactive pedagogical techniques, and that students overwhelmingly suggest behavioral and pedagogical changes that increase social engagement and interaction. We argue that these results support the importance of active learning in an online environment.

Materials and Methods

Participant Pool

Students were recruited through the Instagram account @unjadedjade. This social media platform, run by influencer Jade Bowler, focuses on education, effective study tips, ethical lifestyle, and promotes a positive mindset. For this reason, the audience is presumably academically inclined, and interested in self-improvement. The survey was posted to her account and received 10,563 responses within the first 36 h. Here we analyze the 4,789 of those responses that came from undergraduates. While we did not collect demographic or identifying information, we suspect that women are overrepresented in these data as followers of @unjadedjade are 80% women. A large minority of respondents were from the United Kingdom as Jade Bowler is a British influencer. Specifically, 43.3% of participants attend United Kingdom institutions, followed by 6.7% attending university in the Netherlands, 6.1% in Germany, 5.8% in the United States and 4.2% in Australia. Ninety additional countries are represented in these data (see Supplementary Figure 1 ).

Survey Design

The purpose of this survey is to learn about students’ instructional experiences following the transition to remote learning in the spring of 2020.

This survey was initially created for a student assignment for the undergraduate course Empirical Analysis at Minerva Schools at KGI. That version served as a robust pre-test and allowed for identification of the primary online platforms used, and the four primary modes of learning: synchronous (live) classes, recorded lectures and videos, uploaded or emailed materials, and chat-based communication. We did not adapt any open-ended questions based on the pre-test survey to avoid biasing the results and only corrected language in questions for clarity. We used these data along with an analysis of common practices in online learning to revise the survey. Our revised survey asked students to identify the synchronous and asynchronous pedagogical methods and platforms that they were using for remote learning. Pedagogical methods were drawn from literature assessing active and passive teaching strategies in North American institutions ( Fink, 2013 ; Chen et al., 2018 ; Davis et al., 2018 ). Open-ended questions asked students to describe why they preferred certain modes of learning and how they could improve their learning experience. Students also reported on their affective response to learning and participation using a Likert scale.

The revised survey also asked whether students had responded to the earlier survey. No significant differences were found between responses of those answering for the first and second times (data not shown). See Supplementary Appendix 1 for survey questions. Survey data was collected from 5/21/20 to 5/23/20.

Qualitative Coding

We applied a qualitative coding framework adapted from Gale et al. (2013) to analyze student responses to open-ended questions. Four researchers read several hundred responses and noted themes that surfaced. We then developed a list of themes inductively from the survey data and deductively from the literature on pedagogical practice ( Garrison et al., 1999 ; Zull, 2002 ; Fink, 2013 ; Freeman et al., 2014 ). The initial codebook was revised collaboratively based on feedback from researchers after coding 20–80 qualitative comments each. Before coding their assigned questions, alignment was examined through coding of 20 additional responses. Researchers aligned in identifying the same major themes. Discrepancies in terms identified were resolved through discussion. Researchers continued to meet weekly to discuss progress and alignment. The majority of responses were coded by a single researcher using the final codebook ( Supplementary Table 1 ). All responses to questions 3 (4,318 responses) and 8 (4,704 responses), and 2,512 of 4,776 responses to question 12 were analyzed. Valence was also indicated where necessary (i.e., positive or negative discussion of terms). This paper focuses on the most prevalent themes from our initial analysis of the qualitative responses. The corresponding author reviewed codes to ensure consistency and accuracy of reported data.

Statistical Analysis

The survey included two sets of Likert-scale questions, one consisting of a set of six statements about students’ perceptions of their experiences following the transition to remote learning ( Table 1 ). For each statement, students indicated their level of agreement with the statement on a five-point scale ranging from 1 (“Strongly Disagree”) to 5 (“Strongly Agree”). The second set asked the students to respond to the same set of statements, but about their retroactive perceptions of their experiences with in-person instruction before the transition to remote learning. This set was not the subject of our analysis but is present in the published survey results. To explore correlations among student responses, we used CrossCat analysis to calculate the probability of dependence between Likert-scale responses ( Mansinghka et al., 2016 ).


Table 1. Likert-scale questions.

Mean values are calculated based on the numerical scores associated with each response. Measures of statistical significance for comparisons between different subgroups of respondents were calculated using a two-sided Mann-Whitney U -test, and p -values reported here are based on this test statistic. We report effect sizes in pairwise comparisons using the common-language effect size, f , which is the probability that the response from a random sample from subgroup 1 is greater than the response from a random sample from subgroup 2. We also examined the effects of different modes of remote learning and technological platforms using ordinal logistic regression. With the exception of the mean values, all of these analyses treat Likert-scale responses as ordinal-scale, rather than interval-scale data.
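The analysis described above can be sketched as follows, using hypothetical Likert responses (the scores below are illustrative, not the survey data): a two-sided Mann-Whitney U test between two subgroups, plus the common-language effect size f, i.e., the probability that a random response from group 1 exceeds a random response from group 2, with ties counted as one half.

```python
# Two-sided Mann-Whitney U test on ordinal Likert data, plus the
# common-language effect size f. Data are hypothetical for illustration.
from scipy.stats import mannwhitneyu

synchronous = [4, 5, 3, 4, 5, 4, 2, 5, 4, 3]   # hypothetical subgroup 1
asynchronous = [3, 2, 4, 3, 2, 3, 4, 2, 3, 1]  # hypothetical subgroup 2

u_stat, p_value = mannwhitneyu(synchronous, asynchronous, alternative="two-sided")

# Common-language effect size: P(sample from group 1 > sample from group 2),
# counting ties as 0.5, over all pairwise comparisons.
wins = sum(a > b for a in synchronous for b in asynchronous)
ties = sum(a == b for a in synchronous for b in asynchronous)
f = (wins + 0.5 * ties) / (len(synchronous) * len(asynchronous))

print(f"f = {f:.3f}, p = {p_value:.4f}")
```

Note that f is equivalently U divided by the product of the two sample sizes, which is why it pairs naturally with the Mann-Whitney test.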

Students Prefer Synchronous Class Sessions

Students were asked to identify their primary mode of learning given four categories of remote course design that emerged from the pilot survey and from the literature on online teaching: live (synchronous) classes, recorded lectures and videos, emailed or uploaded materials, and chats and discussion forums. While 42.7% (n = 2,045) of students identified live classes as their primary mode of learning, 54.6% (n = 2,613) preferred this mode (Figure 1). Both recorded lectures and live classes were preferred over uploaded materials (6.22%, n = 298) and chat (3.36%, n = 161).

Figure 1. Actual (A) and preferred (B) primary modes of learning.

In addition to a preference for live classes, students whose primary mode was synchronous were more likely to enjoy the class, feel motivated and engaged, be satisfied with instruction, and report higher levels of participation (Table 2 and Supplementary Figure 2). Regardless of primary mode, over two-thirds of students reported that they were often distracted during remote courses.

Table 2. The effect of synchronous vs. asynchronous primary modes of learning on student perceptions.

Variation in Pedagogical Techniques for Synchronous Classes Results in More Positive Perceptions of the Student Learning Experience

To survey the use of passive vs. active instructional methods, students reported the pedagogical techniques used in their live classes. Among the synchronous methods, we identified three categories (National Research Council, 2000; Freeman et al., 2014). Passive methods (P) include lectures, presentations, and explanation using diagrams, whiteboards, and/or other media; these methods all rely on instructor delivery rather than student participation. The second category represents active learning through primarily one-on-one interactions (A): in-class assessment, question-and-answer (Q&A), and classroom chat. Group interactions (F) included classroom discussions and small-group activities. Given these categories, Mann-Whitney U pairwise comparisons between the 7 possible combinations and Likert-scale responses about student experience showed that the use of a variety of methods resulted in higher ratings of experience than the use of a single method, whether that single method was active or passive (Table 3). Indeed, students whose classes used methods from all three categories (PAF) had higher ratings of enjoyment, motivation, and satisfaction with instruction than those who chose any single method (p < 0.0001), and also reported higher levels of participation and engagement than students whose only method was passive (P) or active through one-on-one interactions (A) (p < 0.00001). Student ratings of distraction were not significantly different for any comparison. Given that sets of Likert responses often appeared significant together in these comparisons, we ran a CrossCat analysis to look at the probability of dependence across Likert responses. Responses have a high probability of dependence on each other, limiting what we can claim about any discrete response (Supplementary Figure 3).

Table 3. Comparison of combinations of synchronous methods on student perceptions. Effect size (f).

Mann-Whitney U pairwise comparisons were also used to check whether improvement in student experience was associated with the number of methods used vs. the variety of types of methods. For every comparison, we found that more methods resulted in higher scores on all Likert measures except distraction (Table 4). Even the comparison between four or fewer methods and more than four methods resulted in a 59% chance that the latter group enjoyed their courses more (p < 0.00001) and a 60% chance that they felt more motivated to learn (p < 0.00001). Students who selected more than four methods (n = 417) were also more satisfied with instruction, more engaged, and more actively participating, with effect sizes f of 0.651, 0.629, and 0.643, respectively (all p < 0.00001). Therefore, there was an overlap between how the number and variety of methods influenced students’ experiences. Since the number of techniques per category is 2–3, we cannot fully disentangle the effect of number vs. variety. Pairwise comparisons of subsets of data with 2–3 methods from a single group vs. 2–3 methods across groups controlled for this but had low sample numbers in most groups and yielded no significant findings (data not shown). Therefore, from our survey data, there appears to be an interdependence between the number and variety of methods in shaping students’ learning experiences.

Table 4. Comparison of the number of synchronous methods on student perceptions. Effect size (f).

Variation in Asynchronous Pedagogical Techniques Results in More Positive Perceptions of the Student Learning Experience

Along with synchronous pedagogical methods, students reported the asynchronous methods used in their classes. We divided these methods into three main categories and conducted pairwise comparisons. Learning methods include video lectures, video content, and posted study materials. Interacting methods include discussion/chat forums, live office hours, and email Q&A with professors. Testing methods include assignments and exams. Our results again show the importance of variety in students’ perceptions (Table 5). For example, compared to providing learning materials only, providing learning materials, interaction, and testing improved enjoyment (f = 0.546, p < 0.001), motivation (f = 0.553, p < 0.0001), satisfaction with instruction (f = 0.596, p < 0.00001), engagement (f = 0.572, p < 0.00001), and active participation (f = 0.563, p < 0.00001) (row 6). Similarly, compared to interacting methods alone, the combination of all three categories improved five of the six indicators, the exception being distraction in class (row 11).

Table 5. Comparison of combinations of asynchronous methods on student perceptions. Effect size (f).

Ordinal logistic regression was used to assess the likelihood that the platforms students used predicted student perceptions ( Supplementary Table 2 ). Platform choices were based on the answers to open-ended questions in the pre-test survey. The synchronous and asynchronous methods used were consistently more predictive of Likert responses than the specific platforms. Likewise, distraction continued to be our outlier with no differences across methods or platforms.
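For readers unfamiliar with ordinal (proportional-odds) logistic regression, the following is a minimal, self-contained sketch of the model fit by maximum likelihood on synthetic data. It is not the authors' actual analysis: the predictor names, the threshold parameterization, and the use of a hand-rolled likelihood (rather than a statistics package) are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Synthetic data: two binary "method used" indicators and a 5-level
# Likert-style outcome generated from a proportional-odds model.
# Names and values are illustrative, not the survey's actual variables.
rng = np.random.default_rng(1)
n = 500
X = rng.integers(0, 2, size=(n, 2)).astype(float)  # e.g., used_active, used_group
true_beta = np.array([0.8, 0.4])
latent = X @ true_beta + rng.logistic(size=n)
cuts = np.array([-1.0, 0.0, 1.0, 2.0])             # 4 thresholds -> 5 ordinal levels
y = np.searchsorted(cuts, latent)                  # outcome coded 0..4

def nll(params, X, y, n_levels=5):
    """Negative log-likelihood of the cumulative-logit model."""
    p = X.shape[1]
    beta = params[:p]
    # Strictly increasing thresholds: first value free, gaps via exp().
    raw = params[p:p + n_levels - 1]
    theta = np.cumsum(np.concatenate([raw[:1], np.exp(raw[1:])]))
    eta = X @ beta
    cum = expit(theta[None, :] - eta[:, None])      # P(Y <= k | x)
    cum = np.hstack([np.zeros((len(y), 1)), cum, np.ones((len(y), 1))])
    probs = cum[np.arange(len(y)), y + 1] - cum[np.arange(len(y)), y]
    return -np.sum(np.log(np.clip(probs, 1e-12, None)))

init = np.concatenate([np.zeros(2), [-1.0, 0.0, 0.0, 0.0]])
res = minimize(nll, init, args=(X, y), method="BFGS")
beta_hat = res.x[:2]
print("estimated coefficients:", beta_hat)  # should be near [0.8, 0.4]
```

In practice one would compare models with method indicators against models with platform indicators, asking which set of coefficients better predicts the ordinal Likert responses, which is the comparison summarized in Supplementary Table 2.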

Students Prefer In-Person and Synchronous Online Learning Largely Due to Social-Emotional Reasoning

As expected, 86.1% (4,123) of survey participants report a preference for in-person courses, while 13.9% (666) prefer online courses. When asked to explain the reasons for their preference, students who prefer in-person courses most often mention the importance of social interaction (693 mentions), engagement (639 mentions), and motivation (440 mentions). These students are also more likely to mention a preference for a fixed schedule (185 mentions) vs. a flexible schedule (2 mentions).

In addition to identifying social reasons for their preference for in-person learning, students’ suggestions for improvements in online learning focus primarily on increasing interaction and engagement, with 845 mentions of live classes, 685 mentions of interaction, 126 calls for increased participation and calls for changes related to these topics such as, “Smaller teaching groups for live sessions so that everyone is encouraged to talk as some people don’t say anything and don’t participate in group work,” and “Make it less of the professor reading the pdf that was given to us and more interaction.”

Students who prefer online learning primarily identify independence and flexibility (214 mentions) and reasons related to anxiety and discomfort in in-person settings (41 mentions). Anxiety was only mentioned 12 times in the much larger group that prefers in-person learning.

The preference for synchronous vs. asynchronous modes of learning follows similar trends ( Table 6 ). Students who prefer live classes mention engagement and interaction most often while those who prefer recorded lectures mention flexibility.

Table 6. Most prevalent themes for students based on their preferred mode of remote learning.

Student Perceptions Align With Research on Active Learning

The first, and most robust, conclusion is that incorporation of active-learning methods correlates with more positive student perceptions of affect and engagement. We can see this clearly in the substantial differences on a number of measures, where students whose classes used only passive-learning techniques reported lower levels of engagement, satisfaction, participation, and motivation when compared with students whose classes incorporated at least some active-learning elements. This result is consistent with prior research on the value of active learning ( Freeman et al., 2014 ).

Though research shows that student learning improves in active learning classes, on campus, student perceptions of their learning, enjoyment, and satisfaction with instruction are often lower in active-learning courses ( Deslauriers et al., 2019 ). Our finding that students rate enjoyment and satisfaction with instruction higher for active learning online suggests that the preference for passive lectures on campus relies on elements outside of the lecture itself. That might include the lecture hall environment, the social physical presence of peers, or normalization of passive lectures as the expected mode for on-campus classes. This implies that there may be more buy-in for active learning online vs. in-person.

A second result from our survey is that student perceptions of affect and engagement are associated with students experiencing a greater diversity of learning modalities. We see this in two different results. First, in addition to the fact that classes that include active learning outperform classes that rely solely on passive methods, we find that on all measures besides distraction, the highest student ratings are associated with a combination of active and passive methods. Second, we find that these higher scores are associated with classes that make use of a larger number of different methods.

This second result suggests that students benefit from classes that make use of multiple different techniques, possibly invoking a combination of passive and active methods. However, it is unclear from our data whether this effect is associated specifically with combining active and passive methods, or if it is associated simply with the use of multiple different methods, irrespective of whether those methods are active, passive, or some combination. The problem is that the number of methods used is confounded with the diversity of methods (e.g., it is impossible for a classroom using only one method to use both active and passive methods). In an attempt to address this question, we looked separately at the effect of number and diversity of methods while holding the other constant. Across a large number of such comparisons, we found few statistically significant differences, which may be a consequence of the fact that each comparison focused on a small subset of the data.

Thus, our data suggest that using a greater diversity of learning methods in the classroom may lead to better student outcomes. This is supported by research on student attention span, which suggests varying delivery after 10–15 min to retain students’ attention (Bradbury, 2016). This is likely even more relevant for online learning, where students report high levels of distraction across methods, modalities, and platforms. Given that number and variety are key, and there are few passive learning methods, we can assume that some combination of methods that includes active learning improves student experience. However, it is not clear whether we should predict that this benefit would come simply from increasing the number of different methods used, or if there are benefits specific to combining particular methods. Disentangling these effects would be an interesting avenue for future research.

Students Value Social Presence in Remote Learning

Student responses across our open-ended survey questions show a striking difference in the reasons for their preferences compared with traditional online learners, who prefer flexibility (Harris and Martin, 2012; Levitz, 2016). Students’ reasons for preferring in-person classes and synchronous remote classes emphasize the desire for social interaction and echo the research on the importance of social presence for learning in online courses.

Short et al. (1976) outlined Social Presence Theory, which describes the degree to which communicators perceive each other as real across different telecommunication media. These ideas translate directly to questions surrounding online education and pedagogy in networked learning design, where connection across learners and instructors improves learning outcomes, especially with “Human-Human interaction” (Goodyear, 2002, 2005; Tu, 2002). They also bear on asynchronous vs. synchronous learning: Tu reports students responding positively to synchronous “real-time discussion in pleasantness, responsiveness and comfort with familiar topics,” with real-time discussions edging out asynchronous computer-mediated communications in immediacy of replies and responsiveness. Tu’s research indicates that students perceive more interaction in synchronous media such as discussions because of this immediacy, which enhances social presence and supports the use of active learning techniques (Gunawardena, 1995; Tu, 2002). Thus, verbal immediacy and communities with face-to-face interactions, such as those in synchronous learning classrooms, lessen the psychological distance of communicators online and can simultaneously improve instructional satisfaction and reported learning (Gunawardena and Zittle, 1997; Richardson and Swan, 2019; Shea et al., 2019). While synchronous learning may not be ideal for traditional online students and a subset of our participants, this research suggests that non-traditional online learners are more likely to appreciate the value of social presence.

Social presence also connects to the importance of social connections in learning. Too often, current systems of education emphasize course content in narrow ways that fail to embrace the full humanity of students and instructors ( Gay, 2000 ). With the COVID-19 pandemic leading to further social isolation for many students, the importance of social presence in courses, including live interactions that build social connections with classmates and with instructors, may be increased.

Limitations of These Data

Our undergraduate data consisted of 4,789 responses from 95 different countries, an unprecedented global scale for research on online learning. However, since respondents were followers of @unjadedjade, who focuses on learning and wellness, they may not represent the average student. Survey samples are shaped by their recruitment techniques; ours likely produced more robust and thoughtful responses to free-response questions and may have influenced the preference for synchronous classes. It is unlikely, however, to have changed students’ reporting of the pedagogical methods used in their remote courses, since those are outside student control.

Though we surveyed a global population, our design was rooted in literature assessing pedagogy in North American institutions. Therefore, our survey may not represent a global array of teaching practices.

This survey was sent out during the initial phase of emergency remote learning for most countries. This has two important implications. First, perceptions of remote learning may be clouded by complications of the pandemic which has increased social, mental, and financial stresses globally. Future research could disaggregate the impact of the pandemic from students’ learning experiences with a more detailed and holistic analysis of the impact of the pandemic on students.

Second, instructors, students and institutions were not able to fully prepare for effective remote education in terms of infrastructure, mentality, curriculum building, and pedagogy. Therefore, student experiences reflect this emergency transition. Single-modality courses may correlate with instructors who lacked the resources or time to learn or integrate more than one modality. Regardless, the main insights of this research align well with the science of teaching and learning and can be used to inform both education during future emergencies and course development for online programs that wish to attract traditional college students.

Global Student Voices Improve Our Understanding of the Experience of Emergency Remote Learning

Our survey shows that global student perspectives on remote learning agree with pedagogical best practices, breaking with the often-found negative reactions of students to these practices in traditional classrooms ( Shekhar et al., 2020 ). Our analysis of open-ended questions and preferences show that a majority of students prefer pedagogical approaches that promote both active learning and social interaction. These results can serve as a guide to instructors as they design online classes, especially for students whose first choice may be in-person learning. Indeed, with the near ubiquitous adoption of remote learning during the COVID-19 pandemic, remote learning may be the default for colleges during temporary emergencies. This has already been used at the K-12 level as snow days become virtual learning days ( Aspergren, 2020 ).

In addition to informing pedagogical decisions, the results of this survey can be used to inform future research. Although we surveyed a global population, our recruitment method selected for students who are English speakers, likely majority female, and interested in self-improvement. Repeating this study with a more diverse and representative sample of university students could improve the generalizability of our findings. While the use of a variety of pedagogical methods is better than a single method, more research is needed to determine the optimal combinations and implementations for courses in different disciplines. Though we identified social presence as the major trend in student responses, the over 12,000 open-ended responses from students could be analyzed in greater detail to gain a more nuanced understanding of student preferences and suggestions for improvement. Likewise, outliers could shed light on the diversity of student perspectives that we may encounter in our own classrooms. Beyond this, our findings can inform research that collects demographic data and/or measures learning outcomes to understand the impact of remote learning on different populations.

Importantly, this paper focuses on a subset of responses from the full data set, which includes 10,563 students from secondary school, undergraduate, graduate, or professional school and additional questions about in-person learning. Our full data set is available here for anyone to download for continued exploration: https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/2TGOPH.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

GS: project lead, survey design, qualitative coding, writing, review, and editing. TN: data analysis, writing, review, and editing. CN and PB: qualitative coding. JW: data analysis, writing, and editing. CS: writing, review, and editing. EV and KL: original survey design and qualitative coding. PP: data analysis. JB: original survey design and survey distribution. HH: data analysis. MP: writing. All authors contributed to the article and approved the submitted version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We want to thank Minerva Schools at KGI for providing funding for summer undergraduate research internships. We also want to thank Josh Fost and Christopher V. H.-H. Chen for discussion that helped shape this project.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2021.647986/full#supplementary-material

Aspergren, E. (2020). Snow Days Canceled Because of COVID-19 Online School? Not in These School Districts. USA Today, sec. Education. Available online at: https://www.usatoday.com/story/news/education/2020/12/15/covid-school-canceled-snow-day-online-learning/3905780001/ (accessed December 15, 2020).

Bostock, S. J. (1998). Constructivism in mass higher education: a case study. Br. J. Educ. Technol. 29, 225–240. doi: 10.1111/1467-8535.00066

Bradbury, N. A. (2016). Attention span during lectures: 8 seconds, 10 minutes, or more? Adv. Physiol. Educ. 40, 509–513. doi: 10.1152/advan.00109.2016

Chen, B., Bastedo, K., and Howard, W. (2018). Exploring best practices for online STEM courses: active learning, interaction & assessment design. Online Learn. 22, 59–75. doi: 10.24059/olj.v22i2.1369

Davis, D., Chen, G., Hauff, C., and Houben, G.-J. (2018). Activating learning at scale: a review of innovations in online learning strategies. Comput. Educ. 125, 327–344. doi: 10.1016/j.compedu.2018.05.019

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., and Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. 116, 19251–19257. doi: 10.1073/pnas.1821936116

Fink, L. D. (2013). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. Somerset, NJ: John Wiley & Sons, Incorporated.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. 111, 8410–8415. doi: 10.1073/pnas.1319030111

Gale, N. K., Heath, G., Cameron, E., Rashid, S., and Redwood, S. (2013). Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med. Res. Methodol. 13:117. doi: 10.1186/1471-2288-13-117

Garrison, D. R., Anderson, T., and Archer, W. (1999). Critical inquiry in a text-based environment: computer conferencing in higher education. Internet High. Educ. 2, 87–105. doi: 10.1016/S1096-7516(00)00016-6

Gay, G. (2000). Culturally Responsive Teaching: Theory, Research, and Practice. Multicultural Education Series. New York, NY: Teachers College Press.

Gillingham, and Molinari, C. (2012). Online courses: student preferences survey. Internet Learn. 1, 36–45. doi: 10.18278/il.1.1.4

Gillis, A., and Krull, L. M. (2020). COVID-19 remote learning transition in spring 2020: class structures, student perceptions, and inequality in college courses. Teach. Sociol. 48, 283–299. doi: 10.1177/0092055X20954263

Goodyear, P. (2002). “Psychological foundations for networked learning,” in Networked Learning: Perspectives and Issues. Computer Supported Cooperative Work , eds C. Steeples and C. Jones (London: Springer), 49–75. doi: 10.1007/978-1-4471-0181-9_4

Goodyear, P. (2005). Educational design and networked learning: patterns, pattern languages and design practice. Australas. J. Educ. Technol. 21, 82–101. doi: 10.14742/ajet.1344

Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. Int. J. Educ. Telecommun. 1, 147–166.

Gunawardena, C. N., and Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer mediated conferencing environment. Am. J. Distance Educ. 11, 8–26. doi: 10.1080/08923649709526970

Harris, H. S., and Martin, E. (2012). Student motivations for choosing online classes. Int. J. Scholarsh. Teach. Learn. 6, 1–8. doi: 10.20429/ijsotl.2012.060211

Levitz, R. N. (2016). 2015-16 National Online Learners Satisfaction and Priorities Report. Cedar Rapids: Ruffalo Noel Levitz, 12.

Mansinghka, V., Shafto, P., Jonas, E., Petschulat, C., Gasner, M., and Tenenbaum, J. B. (2016). CrossCat: a fully Bayesian nonparametric method for analyzing heterogeneous, high dimensional data. J. Mach. Learn. Res. 17, 1–49. doi: 10.1007/978-0-387-69765-9_7

National Research Council (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: National Academies Press, doi: 10.17226/9853

Richardson, J. C., and Swan, K. (2019). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Online Learn. 7, 68–88. doi: 10.24059/olj.v7i1.1864

Shea, P., Pickett, A. M., and Pelz, W. E. (2019). A Follow-up investigation of ‘teaching presence’ in the suny learning network. Online Learn. 7, 73–75. doi: 10.24059/olj.v7i2.1856

Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., and Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: a systematic review of underlying reasons. J. Coll. Sci. Teach. 49, 45–54.

Short, J., Williams, E., and Christie, B. (1976). The Social Psychology of Telecommunications. London: John Wiley & Sons.

Tu, C.-H. (2002). The measurement of social presence in an online learning environment. Int. J. E Learn. 1, 34–45. doi: 10.17471/2499-4324/421

Zull, J. E. (2002). The Art of Changing the Brain: Enriching Teaching by Exploring the Biology of Learning , 1st Edn. Sterling, VA: Stylus Publishing.

Keywords : online learning, COVID-19, active learning, higher education, pedagogy, survey, international

Citation: Nguyen T, Netto CLM, Wilkins JF, Bröker P, Vargas EE, Sealfon CD, Puthipiroj P, Li KS, Bowler JE, Hinson HR, Pujar M and Stein GM (2021) Insights Into Students’ Experiences and Perceptions of Remote Learning Methods: From the COVID-19 Pandemic to Best Practice for the Future. Front. Educ. 6:647986. doi: 10.3389/feduc.2021.647986

Received: 30 December 2020; Accepted: 09 March 2021; Published: 09 April 2021.

Copyright © 2021 Nguyen, Netto, Wilkins, Bröker, Vargas, Sealfon, Puthipiroj, Li, Bowler, Hinson, Pujar and Stein. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Geneva M. Stein, [email protected]

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.

  • Open access
  • Published: 16 September 2021

Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study

  • Meixun Zheng 1 ,
  • Daniel Bender 1 &
  • Cindy Lyon 1  

BMC Medical Education volume 21, Article number: 495 (2021)

The COVID-19 pandemic forced dental schools to close their campuses and move didactic instruction online. The abrupt transition to online learning, however, has raised several issues that have not been resolved. While several studies have investigated dental students’ attitudes towards online learning during the pandemic, mixed results have been reported. Additionally, little research has been conducted to identify and understand the factors, especially pedagogical factors, that impacted students’ acceptance of online learning during campus closure. Furthermore, how online learning during the pandemic impacted students’ learning performance has not been empirically investigated. In March 2020, the dental school studied here moved didactic instruction online in response to government-issued stay-at-home orders. This first-of-its-kind comparative study examined students’ perceived effectiveness of online courses during summer quarter 2020, explored pedagogical factors impacting their acceptance of online courses, and empirically evaluated the impact of online learning on students’ course performance during the pandemic.

The study employed a quasi-experimental design. Participants were 482 pre-doctoral students in a U.S. dental school. Students’ perceived effectiveness of online courses during the pandemic was assessed with a survey. Students’ course grades for online courses during summer quarter 2020 were compared with those of a control group who received face-to-face instruction in the same courses before the pandemic, in summer quarter 2019.

Survey results revealed that most online courses were well accepted by the students, and 80 % of them wanted to continue with some online instruction post pandemic. Regression analyses revealed that students’ perceived engagement with faculty and classmates predicted their perceived effectiveness of the online course. More notably, chi-square tests demonstrated that in 16 of the 17 courses compared, the online cohort during summer quarter 2020 was equally or more likely to earn an A course grade than the analogous face-to-face cohort during summer quarter 2019.

Conclusions

This is the first empirical study in dental education to demonstrate that online courses during the pandemic could achieve equivalent or better student course performance than the same pre-pandemic in-person courses. The findings fill in gaps in literature and may inform online learning design moving forward.

Peer Review reports

Introduction

Research across disciplines has demonstrated that well-designed online learning can lead to students’ enhanced motivation, satisfaction, and learning [ 1 , 2 , 3 , 4 , 5 , 6 , 7 ]. A report by the U.S. Department of Education [ 8 ], based on examinations of comparative studies of online and face-to-face versions of the same course from 1996 to 2008, concluded that online learning could produce learning outcomes equivalent to or better than face-to-face learning. The more recent systematic review by Pei and Wu [ 9 ] provided additional evidence that online learning is at least as effective as face-to-face learning for undergraduate medical students.

To take advantage of the opportunities presented by online learning, thought leaders in dental education in the U.S. have advocated for the adoption of online learning in the nation’s dental schools [ 10 , 11 , 12 ]. However, digital innovation has been a slow process in academic dentistry [ 13 , 14 , 15 ]. In March 2020, the COVID-19 pandemic brought unprecedented disruption to dental education by necessitating the need for online learning. In accordance with stay-at-home orders to prevent the spread of the virus, dental schools around the world closed their campuses and moved didactic instruction online.

The abrupt transition to online learning, however, has raised several concerns and questions. First, while several studies have examined dental students’ online learning satisfaction during the pandemic, mixed results have been reported. Some studies have reported students’ positive attitudes towards online learning [ 15 , 16 , 17 , 18 , 19 , 20 ]. Sadid-Zadeh et al. [ 18 ] found that 99 % of the surveyed dental students at University of Buffalo, in the U.S., were satisfied with live web-based lectures during the pandemic. Schlenz et al. [ 15 ] reported that students in a German dental school had a favorable attitude towards online learning and wanted to continue with online instruction in their future curriculum. Other studies, however, have reported students’ negative online learning experiences during the pandemic [ 21 , 22 , 23 , 24 , 25 , 26 ]. For instance, dental students at Harvard University felt that learning during the pandemic had worsened and engagement had decreased [ 23 , 24 ]. In a study with medical and dental students in Pakistan, Abbasi et al. [ 21 ] found that 77 % of the students had negative perceptions about online learning and 84 % reported reduced student-instructor interactions.

In addition to these mixed results, little attention has been given to factors affecting students’ acceptance of online learning during the pandemic. With the likelihood that online learning will persist post pandemic [ 27 ], research in this area is warranted to inform online course design moving forward. In particular, prior research has demonstrated that one of the most important factors influencing students’ performance in any learning environment is a sense of belonging, the feeling of being connected with and supported by the instructor and classmates [ 28 , 29 , 30 , 31 ]. Unfortunately, this aspect of the classroom experience has suffered during school closure. While educational events can be held using a video conferencing system, virtual peer interaction on such platforms has been perceived by medical trainees to be not as easy and personal as physical interaction [ 32 ]. The pandemic highlights the need to examine instructional strategies most suited to the current situation to support students’ engagement with faculty and classmates.

Furthermore, there is considerable concern from the academic community about the quality of online learning. Pre-pandemic, some faculty and students were already skeptical about the value of online learning [ 33 ]. The longer the pandemic lasts, the more they may question the value of online education, asking: Can online learning during the pandemic produce learning outcomes that are similar to face-to-face learning before the pandemic? Despite the documented benefits of online learning prior to the pandemic, the actual impact of online learning during the pandemic on students’ academic performance is still unknown due to reasons outlined below.

On one hand, several factors beyond the technology used could influence the effectiveness of online learning, one of which is the teaching context [ 34 ]. The sudden transition to online learning has posed many challenges to faculty and students. Faculty may not have had adequate time to carefully design online courses to take full advantage of the possibilities of the online format. Some faculty may not have had prior online teaching experience and experienced a deeper learning curve when it came to adopting online teaching methods [ 35 ]. Students may have been at the risk of increased anxiety due to concerns about contracting the virus, on time graduation, finances, and employment [ 36 , 37 ], which may have negatively impacted learning performance [ 38 ]. Therefore, whether online learning during the pandemic could produce learning outcomes similar to those of online learning implemented during more normal times remains to be determined.

Most existing studies on online learning in dental education during the pandemic have only reported students’ satisfaction. The actual impact of the online format on academic performance has not been empirically investigated. The few studies that have examined students’ learning outcomes have only used students’ self-reported data from surveys and focus groups. According to Kaczmarek et al. [ 24 ], 50 % of the participating dental faculty at Harvard University perceived student learning to have worsened during the pandemic and 70 % of the students felt the same. Abbasi et al. [ 21 ] reported that 86 % of medical and dental students in a Pakistan college felt that they learned less online. While student opinions are important, research has demonstrated a poor correlation between students’ perceived learning and actual learning gains [ 39 ]. As we continue to navigate the “new normal” in teaching, students’ learning performance needs to be empirically evaluated to help institutions gauge the impact of this grand online learning experiment.

Research purposes

In March 2020, the University of the Pacific Arthur A. Dugoni School of Dentistry, in the U.S., moved didactic instruction online to ensure the continuity of education during building closure. This study examined students’ acceptance of online learning during the pandemic and its impacting factors, focusing on instructional practices pertaining to students’ engagement/interaction with faculty and classmates. Another purpose of this study was to empirically evaluate the impact of online learning during the pandemic on students’ actual course performance by comparing it with that of a pre-pandemic cohort. To understand the broader impact of the institutional-wide online learning effort, we examined all online courses offered in summer quarter 2020 (July to September) that had a didactic component.

This is the first empirical study in dental education to evaluate students’ learning performance during the pandemic. The study aimed to answer the following three questions.

1. How well was online learning accepted by students during the summer quarter 2020 pandemic interruption?

2. How did instructional strategies, centered around students’ engagement with faculty and classmates, impact their acceptance of online learning?

3. How did online learning during summer quarter 2020 impact students’ course performance as compared with a previous analogous cohort who received face-to-face instruction in summer quarter 2019?

Methods

This study employed a quasi-experimental design. The study was approved by the university’s institutional review board (#2020-68).

Study context and participants

The study was conducted at the Arthur A. Dugoni School of Dentistry, University of the Pacific. The program runs on a quarter system. It offers a 3-year accelerated Doctor of Dental Surgery (DDS) program and a 2-year International Dental Studies (IDS) program for international dentists who have obtained a doctoral degree in dentistry from a country outside the U.S. and want to practice in the U.S. Students advance throughout the program in cohorts. IDS students take some courses together with their DDS peers. All three DDS classes (D1/DDS 2023, D2/DDS 2022, and D3/DDS 2021) and both IDS classes (I1/IDS 2022 and I2/IDS 2021) were invited to participate in the study. The number of students in each class was: D1 = 145, D2 = 143, D3 = 143, I1 = 26, and I2 = 25. This resulted in a total of 482 student participants.

During campus closure, faculty delivered remote instruction in various ways, including live online classes via Zoom® [ 40 ], self-paced online modules on the school’s learning management system Canvas® [ 41 ], or a combination of live and self-paced delivery. For self-paced modules, students studied assigned readings and/or viewings such as videos and pre-recorded slide presentations. Some faculty also developed self-paced online lessons with SoftChalk® [ 42 ], a cloud-based platform that supports the inclusion of gamified learning by insertion of various mini learning activities. The SoftChalk lessons were integrated with Canvas® [ 41 ], and faculty could monitor students’ progress. After students completed the pre-assigned online materials, some faculty held virtual office hours or live online discussion sessions for students to ask questions and discuss key concepts.

Data collection and analysis

Student survey.

Students’ perceived effectiveness of summer quarter 2020 online courses was evaluated by the school’s Office of Academic Affairs in lieu of the regular course evaluation process. A total of 19 courses for DDS students and 10 courses for IDS students were evaluated. An 8-question survey developed by the researchers (Additional file 1 ) was administered online in the last week of summer quarter 2020. Course directors invited students to take the survey during live online classes. The survey introduction stated that taking the survey was voluntary and that their anonymous responses would be reported in aggregated form for research purposes. Students were invited to continue with the survey if they chose to participate; otherwise, they could exit the survey. The number of students in each class who took the survey was as follows: D1 ( n  = 142; 98 %), D2 ( n  = 133; 93 %), D3 ( n  = 61; 43 %), I1 ( n  = 23; 88 %), and I2 ( n  = 20; 80 %). This resulted in a total of 379 (79 %) respondents across all classes.

The survey questions were on a 4-point scale: Strongly Disagree (1 point), Disagree (2 points), Agree (3 points), and Strongly Agree (4 points). Students were asked to rate each online course by responding to four statements: “I could fully engage with the instructor and classmates in this course”; “The online format of this course supported my learning”; “Overall, this online course is effective”; and “I would have preferred face-to-face instruction for this course”. For the first three survey questions, a higher mean score indicated a more positive attitude toward the online course. For the fourth question, a higher mean score indicated that more students would have preferred face-to-face instruction for the course. Two additional survey questions asked students to select their preferred online delivery method for fully online courses during the pandemic from three given choices (synchronous online/live, asynchronous online/self-paced, and a combination of both), and to report whether they wanted to continue with some online instruction post pandemic. Finally, two open-ended questions at the end of the survey allowed students to comment on the aspects of the online format that they found helpful and to provide suggestions for improvement. For the purpose of this study, we focused on the quantitative data from the Likert-scale questions.
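
The scoring scheme above can be sketched in a few lines. This is an illustrative example only: the point values follow the survey design described in the text, but the sample responses are invented, not actual study data.

```python
# Illustrative sketch of the 4-point Likert scoring described above.
# Point values follow the survey design; responses below are invented.
LIKERT_POINTS = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Agree": 3,
    "Strongly Agree": 4,
}

def mean_score(responses):
    """Mean Likert score for one survey item in one course."""
    points = [LIKERT_POINTS[r] for r in responses]
    return sum(points) / len(points)

# Four hypothetical responses to "Overall, this online course is effective"
print(mean_score(["Agree", "Strongly Agree", "Agree", "Disagree"]))  # → 3.0
```

A per-course mean near or above 3.0 on this scale corresponds to the "approaching or over 3 points" threshold used in the Results to indicate acceptance.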

Descriptive data such as the mean scores were reported for each course. Regression analyses were conducted to examine the relationship between instructional strategies focusing on students’ engagement with faculty and classmates, and their overall perceived effectiveness of the online course. The independent variable was student responses to the question “ I could fully engage with the instructor and classmates in this course ”, and the dependent variable was their answer to the question “ Overall, this online course is effective .”
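
The regression just described can be illustrated with a minimal ordinary-least-squares sketch: the effectiveness rating is regressed on the engagement rating, and r² serves as the effect size. This is not the authors' actual analysis code, and the ratings below are invented.

```python
# Simple OLS regression of perceived course effectiveness (y) on perceived
# engagement with faculty and classmates (x), both on the 4-point scale.
# Data are invented for illustration; this is not the study's dataset.
def simple_regression(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    r2 = sxy ** 2 / (sxx * syy)  # proportion of variance in y explained by x
    return slope, intercept, r2

engagement    = [4, 3, 3, 2, 4, 1, 2, 3]  # hypothetical ratings
effectiveness = [4, 3, 4, 2, 4, 2, 2, 3]
slope, intercept, r2 = simple_regression(engagement, effectiveness)
print(round(r2, 2))  # → 0.8
```

An r² of 0.8 in this invented example would sit at the high end of the 0.22–0.77 ranges reported for the actual courses.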

Student course grades

Using Chi-square tests, student course grade distributions (A, B, C, D, and F) for summer quarter 2020 online courses were compared with those of a previous cohort who received face-to-face instruction for the same course in summer quarter 2019. Note that as a result of the school’s pre-doctoral curriculum redesign implemented in July 2019, not all courses offered in summer quarter 2020 were offered in the previous year in summer quarter 2019. In other words, some of the courses offered in summer quarter 2020 were new courses offered for the first time. Because these new courses did not have a previous face-to-face version to compare to, they were excluded from data analysis. For some other courses, while course content remained the same between 2019 and 2020, the sequence of course topics within the course had changed. These courses were also excluded from data analysis.

After excluding the aforementioned courses, a total of 17 “comparable” courses were included in data analysis (see the subsequent section). For these courses, the instructor, course content, and course goals were the same in both 2019 and 2020. The assessment methods and grading policies also remained the same through both years. For exams and quizzes, multiple-choice questions were the dominant format in both years. While some exam questions in 2020 were different from 2019, faculty reported that the overall exam difficulty level was similar. The main difference in assessment was testing conditions. The 2019 cohort took computer-based exams in the physical classroom with faculty proctoring, and the 2020 cohort took exams at home with remote proctoring to ensure exam integrity. The remote proctoring software monitored students during the exam through a web camera on their computer/laptop and flagged suspicious activities in the recorded video for faculty review after exam completion.
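
The cohort comparison above amounts to a chi-square test of independence on a 2 × 5 contingency table (cohort × letter grade). The sketch below computes the Pearson statistic from scratch; the counts are invented for illustration and are not taken from the study's data.

```python
# Pearson chi-square statistic for a cohort-by-grade contingency table.
# Counts below are invented for illustration, not actual course data.
def chi_square_statistic(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of cohort and grade
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: 2019 face-to-face cohort, 2020 online cohort; columns: A, B, C, D, F
observed = [
    [60, 50, 20, 8, 2],
    [80, 40, 15, 4, 1],
]
# df = (2 - 1) * (5 - 1) = 4; compare the statistic to a chi-square
# critical value (e.g., 9.49 at alpha = 0.05) or compute a p value.
print(round(chi_square_statistic(observed), 2))  # → 6.35
```

With this hypothetical table, the statistic (≈ 6.35) falls below the 9.49 critical value at α = 0.05 with 4 degrees of freedom, so the two grade distributions would not differ significantly.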

Results

Students’ perceived effectiveness of online learning

Table  1 summarizes data on DDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “ Overall, this online course is effective ”, the majority of courses received a mean score approaching or over 3 points on the 4-point scale, suggesting that online learning was generally well accepted by students. Despite overall positive online course experiences, for many of the courses examined, there was an equal split in student responses to the question “ I would have preferred face-to-face instruction for this course .” Additionally, for students’ preferred online delivery method for fully online courses, about half of the students in each class preferred a combination of synchronous and asynchronous online learning (see Fig.  1 ). Finally, the majority of students wanted faculty to continue with some online instruction post pandemic: D1 class (110; 78.60 %), D2 class (104; 80 %), and D3 class (49; 83.10 %).

While most online courses received favorable ratings, some variations did exist among courses. For D1 courses, “ Anatomy & Histology ” received lower ratings than others. This could be explained by its lab component, which did not lend itself as well to the online format. For D2 courses, several received lower ratings than others, especially for the survey question on students’ perceived engagement with faculty and classmates.

Fig. 1 DDS students’ preferred online delivery method for fully online courses

Table  2 summarizes IDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “ Overall, this online course is effective ”, all courses received a mean score approaching or over 3 points on the 4-point scale, suggesting that online learning was well accepted by students. For the survey question “ I would have preferred face-to-face instruction for this course ”, for most online courses examined, the percentage of students who would have preferred face-to-face instruction was similar to that of students who preferred online instruction for the course. Like their DDS peers, about half of the IDS students in each class also preferred a combination of synchronous and asynchronous online delivery for fully online courses (see Fig.  2 ). Finally, the majority of IDS students (I1, n = 18, 81.80 %; I2, n = 16, 84.20 %) wanted to continue with some online learning after the pandemic.

Fig. 2 IDS students’ preferred online delivery method for fully online courses

Factors impacting students’ acceptance of online learning

For all 19 online courses taken by DDS students, regression analyses indicated that there was a significantly positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across all courses. The ranges of effect size (r²) were: D1 courses (0.26 to 0.50), D2 courses (0.39 to 0.65), and D3 courses (0.22 to 0.44), indicating moderate to high correlations across courses.

For 9 out of the 10 online courses taken by IDS students, there was a positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across courses. The ranges of effect size (r²) were: I1 courses (0.35 to 0.77) and I2 courses (0.47 to 0.63), indicating consistently high correlations across courses. The only course in which students’ perceived engagement with faculty and classmates did not predict perceived effectiveness of the course was “ Integrated Clinical Science III (ICS III) ”, which the I2 class took together with their D3 peers.

Impact of online learning on students’ course performance

Chi-square test results (Table  3 ) indicated that in 4 out of the 17 courses compared, the online cohort during summer quarter 2020 was more likely to receive an A grade than the face-to-face cohort during summer quarter 2019. In 12 of the courses, the online cohort was equally likely to receive an A grade as the face-to-face cohort. In the remaining course, the online cohort was less likely to receive an A grade than the face-to-face cohort.

Discussion

Students’ acceptance of online learning during the pandemic

Survey results revealed that students had generally positive perceptions about online learning during the pandemic and the majority of them wanted to continue with some online learning post pandemic. Overall, our findings are consistent with several other studies in dental [ 18 , 20 ], medical [ 43 , 44 ], and nursing [ 45 ] education that have also reported students’ positive attitudes towards online learning during the pandemic. In their written comments in the survey, students cited enhanced flexibility as one of the greatest benefits of online learning. Some students also commented that typing questions in the chat box during live online classes was less intimidating than speaking in class. Others explicitly stated that not having to commute to/from school provided more time for sleep, which helped with self-care and mental health. Our findings are in line with previous studies which have also demonstrated that online learning offered higher flexibility [ 46 , 47 ]. Meanwhile, consistent with findings of other researchers [ 19 , 21 , 46 ], our students had difficulty engaging with faculty and classmates in several online courses.

There were some variations among individual courses in students’ acceptance of the online format. One factor that could partially account for the observed differences was instructional strategies. In particular, our regression analysis results demonstrated a positive correlation between students’ perceived engagement with faculty and classmates and their perceived overall effectiveness of the online course. Other aspects of course design might also have influenced students’ overall rating of the online course. For instance, some D2 students commented that the requirements of the course “ Integrated Case-based Seminars (ICS II) ” were not clear and that assessment did not align with lecture materials. It is important to remember that communicating course requirements clearly and aligning course content and assessment are principles that should be applied in any course, whether face-to-face or online. Our results highlighted the importance of providing faculty training on basic educational design principles and online learning design strategies. Furthermore, the nature of the course might also have impacted student ratings. For example, D1 course “ Anatomy and Histology ” had a lab component, which did not lend itself as well to the online format. Many students reported that it was difficult to see faculty’s live demonstration during Zoom lectures, which may have resulted in a lower student satisfaction rating.

As for students’ preferred online delivery method for fully online courses during the pandemic, about half of them preferred a combination of synchronous and asynchronous online learning. In light of this finding, as we continue with remote learning until public health directives allow a return to campus, we will encourage faculty to integrate these two online delivery modalities. Finally, in view of the result that over 80 % of the students wanted to continue with some online instruction after the pandemic, the school will advocate for blended learning in the post-pandemic world [ 48 ]. For future face-to-face courses on campus after the pandemic, faculty are encouraged to deliver some content online to reduce classroom seat time and make learning more flexible. Taken together, our findings not only add to the overall picture of the current situation but may inform learning design moving forward.

Role of online engagement and interaction

To reiterate, we found that students’ perceived engagement with faculty and classmates predicted their perceived overall effectiveness of the online course. This aligns with the larger literature on best practices in online learning design. Extensive research prior to the pandemic has confirmed that the effectiveness of online learning is determined by a number of factors beyond the tools used, including students’ interactions with the instructor and classmates [ 49 , 50 , 51 , 52 ]. Online students may feel isolated due to reduced or lack of interaction [ 53 , 54 ]. Therefore, in designing online learning experiences, it is important to remember that learning is a social process [ 55 ]. Faculty’s role is not only to transmit content but also to promote the different types of interactions that are an integral part of the online learning process [ 33 ]. The online teaching model in which faculty upload materials online but teach them in the same way as in the physical classroom, without special effort to engage students, does not make the best use of the online format. Putting the “sage on the screen” during a live class meeting on a video conferencing system is no different from the “sage on the stage” in the physical classroom - both provide limited space for engagement. Such one-way monologue squanders the potential that online learning presents.

In light of the critical role that social interaction plays in online learning, faculty are encouraged to use the interactive features of online learning platforms to provide clear channels for student-instructor and student-student interactions. In the open-ended comments, students highlighted several instructional strategies that they perceived to be helpful for learning. For live online classes, these included conducting breakout room activities, using the chat box to facilitate discussions, polling, and integrating gameplay with apps such as Kahoot!® [ 56 ]. For self-paced classes, students appreciated that faculty held virtual office hours or subsequent live online discussion sessions to reinforce understanding of the pre-assigned materials.

Quality of online education during the pandemic

This study provided empirical evidence in dental education that it was possible to ensure the continuity of education without sacrificing the quality of education provided to students during forced migration to distance learning upon building closure. To reiterate, in all but one online course offered in summer quarter 2020, students were equally or more likely to get an A grade than the face-to-face cohort from summer quarter 2019. Even for courses that had less student support for the online format (e.g., the D1 course “ Anatomy and Histology ”), there was a significant increase in the number of students who earned an A grade in 2020 as compared with the previous year. The reduced capacity for technical training during the pandemic may have resulted in more study time for didactic content. Overall, our results resonate with several studies in health sciences education before the pandemic showing that the quality of learning is comparable in face-to-face and online formats [ 9 , 57 , 58 ]. For the only course (“ Integrated Case-based Seminars (ICS II) ”) in which the online cohort performed worse than the face-to-face cohort, as mentioned earlier, students reported that assessment was not aligned with course materials and that course expectations were not clear. This might explain why students’ course performance was not as strong as expected.

Limitations

This study used a pre-existing control group from the previous year. There may have been individual differences between students in the online and the face-to-face cohorts, such as motivation, learning style, and prior knowledge, that could have impacted the observed outcomes. Additionally, even though course content and assessment methods were largely the same in 2019 and 2020, changes in other aspects of the course could have impacted students’ course performance. Some faculty may have been more compassionate with grading (e.g., more flexible with assignment deadlines) in summer quarter 2020 given the hardship students experienced during the pandemic. On the other hand, remote proctoring in summer quarter 2020 may have heightened some students’ exam anxiety knowing that they were being monitored through a webcam. The existence and magnitude of effect of these factors needs to be further investigated.

This present study only examined the correlation between students’ perceived online engagement and their perceived overall effectiveness of the online course. Other factors that might impact their acceptance of the online format need to be further researched in future studies. Another future direction is to examine how students’ perceived online engagement correlates with their actual course performance. Because the survey data collected for our present study are anonymous, we cannot match students’ perceived online engagement data with their course grades to run this additional analysis. It should also be noted that this study was focused on didactic online instruction. Future studies might examine how technical training was impacted during the COVID building closure. It was also out of the scope of this study to examine how student characteristics, especially high and low academic performance as reflected by individual grades, affect their online learning experience and performance. We plan to conduct a follow-up study to examine which group of students is most affected by the online format. Finally, this study was conducted in a single dental school, and so the findings may not be generalizable to other schools and disciplines. Future studies could be conducted in other schools or disciplines to compare results.

Conclusions

This study revealed that dental students had generally favorable attitudes towards online learning during the COVID-19 pandemic and that their perceived engagement with faculty and classmates predicted their acceptance of the online course. Most notably, this is the first study in dental education to demonstrate that online learning during the pandemic could achieve similar or better learning outcomes than face-to-face learning before the pandemic. Findings of our study could contribute significantly to the literature on online learning during the COVID-19 pandemic in health sciences education. The results could also inform future online learning design as we re-envision the future of online learning.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Bello G, Pennisi MA, Maviglia R, Maggiore SM, Bocci MG, Montini L, et al. Online vs live methods for teaching difficult airway management to anesthesiology residents. Intensive Care Med. 2005; 31 (4): 547–552.

Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006; 81(3): 207–12.

Kavadella A, Tsiklakis K, Vougiouklakis G, Lionarakis A. Evaluation of a blended learning course for teaching oral radiology to undergraduate dental students. Eur J Dent Educ. 2012; 16(1): 88–95.

de Jong N, Verstegen DL, Tan FS, O’Connor SJ. A comparison of classroom and online asynchronous problem-based learning for students undertaking statistics training as part of a public health master’s degree. Adv Health Sci Educ. 2013; 18(2):245–64.

Hegeman JS. Using instructor-generated video lectures in online mathematics courses improves student learning. Online Learn. 2015;19(3):70–87.

Gaupp R, Körner M, Fabry G. Effects of a case-based interactive e-learning course on knowledge and attitudes about patient safety: a quasi-experimental study with third-year medical students. BMC Med Educ. 2016; 16(1):172.

Zheng M, Bender D, Reid L, Milani J. An interactive online approach to teaching evidence-based dentistry with Web 2.0 technology. J Dent Educ. 2017; 81(8): 995–1003.

Means B, Toyama Y, Murphy R, Bakia M, Jones K. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Washington D.C. 2009.

Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019; 24(1):1666538.

Andrews KG, Demps EL. Distance education in the U.S. and Canadian undergraduate dental curriculum. J Dent Educ. 2003; 67(4):427–38.

Kassebaum DK, Hendricson WD, Taft T, Haden NK. The dental curriculum at North American dental institutions in 2002–03: a survey of current structure, recent innovations, and planned changes. J Dent Educ. 2004; 68(9):914–931.

Haden NK, Hendricson WD, Kassebaum DK, Ranney RR, Weinstein G, Anderson EL, et al. Curriculum changes in dental education, 2003–09. J Dent Educ. 2010; 74(5):539–57.

DeBate RD, Cragun D, Severson HH, Shaw T, Christiansen S, Koerber A, et al. Factors for increasing adoption of e-courses among dental and dental hygiene faculty members. J Dent Educ. 2011; 75 (5): 589–597.

Saeed SG, Bain J, Khoo E, Siqueira WL. COVID-19: Finding silver linings for dental education. J Dent Educ. 2020; 84(10):1060–1063.

Schlenz MA, Schmidt A, Wöstmann B, Krämer N, Schulz-Weidner N. Students’ and lecturers’ perspective on the implementation of online learning in dental education due to SARS-CoV-2 (COVID-19): a cross-sectional study. BMC Med Educ. 2020;20(1):1–7.

Donn J, Scott JA, Binnie V, Bell A. A pilot of a virtual Objective Structured Clinical Examination in dental education. A response to COVID-19. Eur J Dent Educ. 2020; https://doi.org/10.1111/eje.12624

Hung M, Licari FW, Hon ES, Lauren E, Su S, Birmingham WC, Wadsworth LL, Lassetter JH, Graff TC, Harman W, et al. In an era of uncertainty: impact of COVID-19 on dental education. J Dent Educ. 2020; 85 (2): 148–156.

Sadid-Zadeh R, Wee A, Li R, Somogyi‐Ganss E. Audience and presenter comparison of live web‐based lectures and traditional classroom lectures during the COVID‐19 pandemic. J Prosthodont. 2020. doi: https://doi.org/10.1111/jopr.13301

Wang K, Zhang L, Ye L. A nationwide survey of online teaching strategies in dental education in China. J Dent Educ. 2020; 85 (2): 128–134.

Rad FA, Otaki F, Baqain Z, Zary N, Al-Halabi M. Rapid transition to distance learning due to COVID-19: Perceptions of postgraduate dental learners and instructors. PLoS One. 2021; 16(2): e0246584.

Abbasi S, Ayoob T, Malik A, Memon SI. Perceptions of students regarding E-learning during Covid-19 at a private medical college. Pak J Med Sci. 2020; 36: 57–61.

Al-Azzam N, Elsalem L, Gombedza F. A cross-sectional study to determine factors affecting dental and medical students’ preference for virtual learning during the COVID-19 outbreak. Heliyon. 6(12). 2020. doi: https://doi.org/10.1016/j.heliyon.2020.e05704

Chen E, Kaczmarek K, Ohyama H. Student perceptions of distance learning strategies during COVID-19. J Dent Educ. 2020. doi: https://doi.org/10.1002/jdd.12339

Kaczmarek K, Chen E, Ohyama H. Distance learning in the COVID-19 era: Comparison of student and faculty perceptions. J Dent Educ. 2020. https://doi.org/10.1002/jdd.12469

Sarwar H, Akhtar H, Naeem MM, Khan JA, Waraich K, Shabbir S, et al. Self-reported effectiveness of e-learning classes during COVID-19 pandemic: A nation-wide survey of Pakistani undergraduate dentistry students. Eur J Dent. 2020; 14(S01): S34–S43.

Al-Taweel FB, Abdulkareem AA, Gul SS, Alshami ML. Evaluation of technology‐based learning by dental students during the pandemic outbreak of coronavirus disease 2019. Eur J Dent Educ. 2021; 25(1): 183–190.

Elangovan S, Mahrous A, Marchini L. Disruptions during a pandemic: Gaps identified and lessons learned. J Dent Educ. 2020; 84(11): 1270–1274.

Goodenow C. Classroom belonging among early adolescent students: Relationships to motivation and achievement. J Early Adolesc.1993; 13(1): 21–43.

Goodenow C. The psychological sense of school membership among adolescents: Scale development and educational correlates. Psychol Sch. 1993; 30(1): 79–90.

St-Amand J, Girard S, Smith J. Sense of belonging at school: Defining attributes, determinants, and sustaining strategies. IAFOR Journal of Education. 2017; 5(2):105–19.

Peacock S, Cowan J. Promoting sense of belonging in online learning communities of inquiry at accredited courses. Online Learn. 2019; 23(2): 67–81.

Chan GM, Kanneganti A, Yasin N, Ismail-Pratt I, Logan SJ. Well‐being, obstetrics and gynecology and COVID‐19: Leaving no trainee behind. Aust N Z J Obstet Gynaecol. 2020; 60(6): 983–986.

Hodges C, Moore S, Lockee B, Trust T, Bond A. The difference between emergency remote teaching and online learning. Educause Review. 2020; 27: 1–12.

Means B, Bakia M, Murphy R. Learning online: What research tells us about whether, when and how. Routledge. 2014.

Iyer P, Aziz K, Ojcius DM. Impact of COVID-19 on dental education in the United States. J Dent Educ. 2020; 84(6): 718–722.

Machado RA, Bonan PRF, Perez DEDC, Martelli Júnior H. COVID-19 pandemic and the impact on dental education: Discussing current and future perspectives. Braz Oral Res. 2020; 34: e083.

Wu DT, Wu KY, Nguyen TT, Tran SD. The impact of COVID-19 on dental education in North America-Where do we go next? Eur J Dent Educ. 2020; 24(4): 825–827.

de Oliveira Araújo FJ, de Lima LSA, Cidade PIM, Nobre CB, Neto MLR. Impact of Sars-Cov-2 and its reverberation in global higher education and mental health. Psychiatry Res. 2020; 288:112977. doi: https://doi.org/10.1016/j.psychres.2020.112977

Persky AM, Lee E, Schlesselman LS. Perception of learning versus performance as outcome measures of educational research. Am J Pharm Educ. 2020; 84(7): ajpe7782.

Zoom. Zoom Video Communications, San Jose, CA, USA. https://zoom.us/

Canvas. Instructure, Inc., Salt Lake City, UT, USA. https://www.instructure.com/canvas

SoftChalk. SoftChalk LLC, San Antonio, TX, USA. https://www.softchalkcloud.com/

Agarwal S, Kaushik JS. Student’s perception of online learning during COVID pandemic. Indian J Pediatr. 2020; 87: 554–554.

Khalil R, Mansour AE, Fadda WA, Almisnid K, Aldamegh M, Al-Nafeesah A, et al. The sudden transition to synchronized online learning during the COVID-19 pandemic in Saudi Arabia: a qualitative study exploring medical students’ perspectives. BMC Med Educ. 2020; 20(1): 1–10.

Riley E, Capps N, Ward N, McCormack L, Staley J. Maintaining academic performance and student satisfaction during the remote transition of a nursing obstetrics course to online instruction. Online Learn. 2021; 25(1), 220–229.

Amir LR, Tanti I, Maharani DA, Wimardhani YS, Julia V, Sulijaya B, et al. Student perspective of classroom and distance learning during COVID-19 pandemic in the undergraduate dental study program Universitas Indonesia. BMC Med Educ. 2020; 20(1):1–8.

Dost S, Hossain A, Shehab M, Abdelwahed A, Al-Nusair L. Perceptions of medical students towards online teaching during the COVID-19 pandemic: a national cross-sectional survey of 2721 UK medical students. BMJ Open. 2020; 10(11).

Graham CR, Woodfield W, Harrison JB. A framework for institutional adoption and implementation of blended learning in higher education. Internet High Educ. 2013; 18: 4–14.

Sing C, Khine M. An analysis of interaction and participation patterns in online community. J Educ Techno Soc. 2006; 9(1): 250–261.

Bernard RM, Abrami PC, Borokhovski E, Wade CA, Tamim RM, Surkes MA, et al. A meta-analysis of three types of interaction treatments in distance education. Rev Educ Res. 2009; 79(3): 1243–1289.

Fedynich L, Bradley KS, Bradley J. Graduate students’ perceptions of online learning. Res High Educ. 2015; 27.

Tanis CJ. The seven principles of online learning: Feedback from faculty and alumni on its importance for teaching and learning. Res Learn Technol. 2020; 28. https://doi.org/10.25304/rlt.v28.2319

Dixson MD. Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learn. 2015; 19(4).

Kwary DA, Fauzie S. Students’ achievement and opinions on the implementation of e-learning for phonetics and phonology lectures at Airlangga University. Educ Pesqui. 2018; 44.

Vygotsky LS. Mind in society: The development of higher psychological processes. Cambridge (MA): Harvard University Press. 1978.

Kahoot!. Oslo, Norway. https://kahoot.com/

Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomised controlled trial. BMC Med Educ. 2007; 7(1): 1–6.

Davis J, Crabb S, Rogers E, Zamora J, Khan K. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomized controlled trial. Med Teach. 2008; 30(3): 302–307.


Acknowledgements

Not applicable.

Authors’ information

MZ is an Associate Professor of Learning Sciences and Senior Instructional Designer at School of Dentistry, University of the Pacific. She has a PhD in Education, with a specialty on learning sciences and technology. She has dedicated her entire career to conducting research on online learning, learning technology, and faculty development. Her research has resulted in several peer-reviewed publications in medical, dental, and educational technology journals. MZ has also presented regularly at national conferences.

DB is an Assistant Dean for Academic Affairs at School of Dentistry, University of the Pacific. He has an EdD degree in education, with a concentration on learning and instruction. Over the past decades, DB has been overseeing and delivering faculty pedagogical development programs to dental faculty. His research interest lies in educational leadership and instructional innovation. DB has co-authored several peer-reviewed publications in health sciences education and presented regularly at national conferences.

CL is Associate Dean of Oral Healthcare Education, School of Dentistry, University of the Pacific. She has a Doctor of Dental Surgery (DDS) degree and an EdD degree with a focus on educational leadership. Her professional interest lies in educational leadership, oral healthcare education innovation, and faculty development. CL has co-authored several publications in peer-reviewed journals in health sciences education and presented regularly at national conferences.

Author information

Authors and Affiliations

Office of Academic Affairs, Arthur A. Dugoni School of Dentistry, University of the Pacific, San Francisco, CA, USA

Meixun Zheng, Daniel Bender & Cindy Lyon


Contributions

MZ analyzed the data and wrote the initial draft of the manuscript. DB and CL both provided assistance with research design, data collection, and reviewed and edited the manuscript. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Meixun Zheng.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the institutional review board at University of the Pacific in the U.S. (#2020-68). Informed consent was obtained from all participants. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1

Survey of online courses during COVID-19 pandemic.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Zheng, M., Bender, D. & Lyon, C. Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study. BMC Med Educ 21, 495 (2021). https://doi.org/10.1186/s12909-021-02909-z


Received: 31 March 2021

Accepted: 26 August 2021

Published: 16 September 2021

DOI: https://doi.org/10.1186/s12909-021-02909-z


Keywords

  • Dental education
  • Online learning
  • COVID-19 pandemic
  • Instructional strategies
  • Interaction
  • Learning performance

BMC Medical Education

ISSN: 1472-6920


RESEARCH ON ONLINE LEARNING

  • February 2019
  • Online Learning 11(1)

Karen Swan, University of Illinois Springfield


How Effective Is Online Learning? What the Research Does and Doesn’t Tell Us


Editor’s Note: This is part of a series on the practical takeaways from research.

The times have dictated school closings and the rapid expansion of online education. Can online lessons replace in-school time?

Clearly online time cannot provide many of the informal social interactions students have at school, but how will online courses do in terms of moving student learning forward? Research to date gives us some clues and also points us to what we could be doing to support students who are most likely to struggle in the online setting.

The use of virtual courses among K-12 students has grown rapidly in recent years. Florida, for example, requires all high school students to take at least one online course. Online learning can take a number of different forms. Often people think of Massive Open Online Courses, or MOOCs, where thousands of students watch a video online and fill out questionnaires or take exams based on those lectures.


Most online courses, however, particularly those serving K-12 students, have a format much more similar to in-person courses. The teacher helps to run virtual discussion among the students, assigns homework, and follows up with individual students. Sometimes these courses are synchronous (teachers and students all meet at the same time) and sometimes they are asynchronous (non-concurrent). In both cases, the teacher is supposed to provide opportunities for students to engage thoughtfully with subject matter, and students, in most cases, are required to interact with each other virtually.


Online courses provide opportunities for students. Students in a school that doesn’t offer statistics classes may be able to learn statistics with virtual lessons. If students fail algebra, they may be able to catch up during evenings or summer using online classes, and not disrupt their math trajectory at school. So, almost certainly, online classes sometimes benefit students.

In comparisons of online and in-person classes, however, online classes aren’t as effective as in-person classes for most students. Only a little research has assessed the effects of online lessons for elementary and high school students, and even less has used the “gold standard” method of comparing the results for students assigned randomly to online or in-person courses. Jessica Heppen and colleagues at the American Institutes for Research and the University of Chicago Consortium on School Research randomly assigned students who had failed second semester Algebra I to either face-to-face or online credit recovery courses over the summer. Students’ credit-recovery success rates and algebra test scores were lower in the online setting. Students assigned to the online option also rated their class as more difficult than did their peers assigned to the face-to-face option.

Most of the research on online courses for K-12 students has used large-scale administrative data, looking at otherwise similar students in the two settings. One of these studies, by June Ahn of New York University and Andrew McEachin of the RAND Corp., examined Ohio charter schools; I did another with colleagues looking at Florida public school coursework. Both studies found evidence that online coursetaking was less effective.

About this series


This essay is the fifth in a series that aims to put the pieces of research together so that education decisionmakers can evaluate which policies and practices to implement.

The conveners of this project—Susanna Loeb, the director of Brown University’s Annenberg Institute for School Reform, and Harvard education professor Heather Hill—have received grant support from the Annenberg Institute for this series.


It is not surprising that in-person courses are, on average, more effective. Being in person with teachers and other students creates social pressures and benefits that can help motivate students to engage. Some students do as well in online courses as in in-person courses, some may actually do better, but, on average, students do worse in the online setting, and this is particularly true for students with weaker academic backgrounds.

Students who struggle in in-person classes are likely to struggle even more online. While the research on virtual schools in K-12 education doesn’t address these differences directly, a study of college students that I worked on with Stanford colleagues found very little difference in learning for high-performing students in the online and in-person settings. On the other hand, lower performing students performed meaningfully worse in online courses than in in-person courses.

But just because students who struggle in in-person classes are even more likely to struggle online doesn’t mean that’s inevitable. Online teachers will need to consider the needs of less-engaged students and work to engage them. Online courses might be made to work for these students on average, even if they have not in the past.

Just like in brick-and-mortar classrooms, online courses need a strong curriculum and strong pedagogical practices. Teachers need to understand what students know and what they don’t know, as well as how to help them learn new material. What is different in the online setting is that students may have more distractions and less oversight, which can reduce their motivation. The teacher will need to set norms for engagement—such as requiring students to regularly ask questions and respond to their peers—that are different than the norms in the in-person setting.

Online courses are generally not as effective as in-person classes, but they are certainly better than no classes. A substantial research base developed by Karl Alexander at Johns Hopkins University and many others shows that students, especially students with fewer resources at home, learn less when they are not in school. Right now, virtual courses are allowing students to access lessons and exercises and interact with teachers in ways that would have been impossible if an epidemic had closed schools even a decade or two earlier. So we may be skeptical of online learning, but it is also time to embrace and improve it.

A version of this article appeared in the April 01, 2020 edition of Education Week as How Effective Is Online Learning?


  • Research article
  • Open access
  • Published: 02 December 2020

Integrating students’ perspectives about online learning: a hierarchy of factors

  • Montgomery Van Wart 1 ,
  • Anna Ni 1 ,
  • Pamela Medina 1 ,
  • Jesus Canelon 1 ,
  • Melika Kordrostami 1 ,
  • Jing Zhang 1 &

International Journal of Educational Technology in Higher Education, volume 17, Article number: 53 (2020)

150k Accesses

53 Citations

24 Altmetric


This article reports on a large-scale (n = 987) exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students’ perspective, and then determines their hierarchical significance. Seven factors (Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Online Interactive Modality, and Social Presence) were identified as significant and reliable. Regression analysis indicates that the minimal factors for enrollment in future classes, when students consider convenience and scheduling, were Basic Online Modality, Cognitive Presence, and Online Social Comfort. Students who accepted or embraced online courses on their own merits wanted a minimum of Basic Online Modality, Teaching Presence, Cognitive Presence, Online Social Comfort, and Social Presence. Students who preferred face-to-face classes and demanded a comparable experience valued Online Interactive Modality and Instructional Support more highly. Recommendations for online course design, policy, and future research are provided.
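The core of the method described above (exploratory factor analysis over student survey items, with a criterion for how many factors to retain) can be illustrated with a minimal, self-contained sketch. The data below are simulated under an assumed two-factor structure; the factor interpretation in the comments is illustrative only and does not reproduce the study’s actual instrument or its seven-factor solution:

```python
import numpy as np

# Simulate survey responses: two assumed latent factors drive six observed items.
rng = np.random.default_rng(0)
n = 500
latent = rng.normal(size=(n, 2))                  # e.g., a "teaching" and a "social" dimension
true_loadings = np.array([
    [0.8, 0.0], [0.7, 0.1], [0.9, 0.0],           # items dominated by factor 1
    [0.0, 0.8], [0.1, 0.7], [0.0, 0.9],           # items dominated by factor 2
])
items = latent @ true_loadings.T + 0.4 * rng.normal(size=(n, 6))  # add item noise

# Exploratory extraction from the item correlation matrix:
R = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)              # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = int((eigvals > 1.0).sum())                    # Kaiser criterion: retain eigenvalues > 1
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])  # unrotated loading estimates
print("factors retained:", k)                     # recovers the simulated 2-factor structure
print("estimated loadings:\n", loadings.round(2))
```

In practice a rotation (e.g., varimax) would be applied to the retained loadings before interpreting and naming factors, and regression on the resulting factor scores would then test their relationship to enrollment intentions, as the abstract describes.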

Introduction

While there are different perspectives on the learning process, such as learning achievement and faculty perspectives, students’ perspectives are especially critical since they are ultimately the raison d’être of the educational endeavor (Chickering & Gamson, 1987). More pragmatically, students’ perspectives provide invaluable, first-hand insights into their experiences and expectations (Dawson et al., 2019). The student perspective is especially important when new teaching approaches are used and when new technologies are being introduced (Arthur, 2009; Crews & Butterfield, 2014; Van Wart, Ni, Ready, Shayo, & Court, 2020). With the renewed interest in “active” education in general (Arruabarrena, Sánchez, Blanco, et al., 2019; Kay, MacDonald, & DiGiuseppe, 2019; Nouri, 2016; Vlachopoulos & Makri, 2017) and the flipped classroom approach in particular (Flores, del-Arco, & Silva, 2016; Gong, Yang, & Cai, 2020; Lundin et al., 2018; Maycock, 2019; McGivney-Burelle, 2013; O’Flaherty & Phillips, 2015; Tucker, 2012), along with extraordinary shifts in technology, the student perspective on online education is profoundly important. Students’ perceptions of quality integrate their own sense of learning achievement, satisfaction with the support they receive, technical proficiency of the process, intellectual and emotional stimulation, comfort with the process, and sense of learning community. What students perceive as quality online teaching, however, has not been as clear as it might be, for at least two reasons.

First, it is important to note that the overall online learning experience for students is also composed of non-teaching factors, which we briefly mention. Three such factors are (1) convenience, (2) learner characteristics and readiness, and (3) antecedent conditions that may foster teaching quality but are not directly responsible for it. (1) Convenience is an enormous non-quality factor for students (Artino, 2010), which has driven up online demand around the world (Fidalgo, Thormann, Kulyk, et al., 2020; Inside Higher Education and Gallup, 2019; Legon & Garrett, 2019; Ortagus, 2017). This is important since satisfaction with online classes is frequently somewhat lower than with face-to-face classes (Macon, 2011). However, the literature generally supports the relative equivalence of face-to-face and online modes regarding learning achievement criteria (Bernard et al., 2004; Nguyen, 2015; Ni, 2013; Sitzmann, Kraiger, Stewart, & Wisher, 2006; see Xu & Jaggars, 2014 for an alternate perspective). These contrasts are exemplified in a recent study of business students, in which online students using a flipped classroom approach outperformed their face-to-face peers, but ironically rated instructor performance lower (Harjoto, 2017). (2) Learner characteristics, such as self-regulation in an active learning model, comfort with technology, and age, among others, affect both receptiveness and readiness for online instruction (Alqurashi, 2016; Cohen & Baruth, 2017; Kintu, Zhu, & Kagambe, 2017; Kuo, Walker, Schroder, & Belland, 2013; Ventura & Moscoloni, 2015). (3) Finally, numerous antecedent factors may lead to improved instruction but are not themselves directly perceived by students, such as instructor training (Brinkley-Etzkorn, 2018) and the sources of faculty motivation (e.g., incentives, recognition, social influence, and voluntariness) (Wingo, Ivankova, & Moss, 2017). Important as these factors are, mixing them with perceptions of quality tends to obfuscate the quality factors directly perceived by students.

Second, while student perceptions of quality are used in innumerable studies, our overall understanding still needs to integrate them more holistically. Many studies use student perceptions of the quality and overall effectiveness of individual tools and strategies in online contexts, such as mobile devices (Drew & Mann, 2018), small groups (Choi, Land, & Turgeon, 2005), journals (Nair, Tay, & Koh, 2013), simulations (Vlachopoulos & Makri, 2017), and video (Lange & Costley, 2020). Such studies, however, cannot provide the overall context and comparative importance. Some studies have examined the overall learning experience of students with exploratory lists, but have mixed non-quality factors with quality-of-teaching factors, making it difficult to discern the instructor’s versus contextual roles in quality (e.g., Asoodar, Vaezi, & Izanloo, 2016; Bollinger & Martindale, 2004; Farrell & Brunton, 2020; Hong, 2002; Song, Singleton, Hill, & Koh, 2004; Sun, Tsai, Finger, Chen, & Yeh, 2008). Technology adoption studies also fall into this category by essentially aggregating all teaching quality into the single category of performance (Al-Gahtani, 2016; Artino, 2010). Some studies have used high-level teaching-oriented models, primarily the Community of Inquiry model (le Roux & Nagel, 2018), but empirical support has been mixed (Arbaugh et al., 2008), and its elegance (i.e., relying on only three factors) has not provided much insight to practitioners (Anderson, 2016; Cleveland-Innes & Campbell, 2012).

Research questions

Although the number of empirical studies of student perceptions of quality factors has increased, the integration of the studies and concepts explored continues to be fragmented and confusing. It is important to have an empirical view of what students value in a single comprehensive study and, also, to know whether there is a hierarchy of factors, ranging from students who are least to most critical of the online learning experience. This research study has two research questions.

The first research question is: What are the significant factors in creating a high-quality online learning experience from students’ perspectives? That is important to know because it should have a significant effect on instructors’ design of online classes. The goal of this research question is to identify a more articulated and empirically supported set of factors capturing the full range of student expectations.

The second research question is: Is there a priority or hierarchy of factors related to students’ perceptions of online teaching quality that relate to their decisions to enroll in online classes? For example, is it possible to distinguish which factors are critical for enrollment decisions when students are primarily motivated by convenience and scheduling flexibility (minimum threshold)? Do these factors differ from students with a genuine acceptance of the general quality of online courses (a moderate threshold)? What are the factors that are important for the students who are the most critical of online course delivery (highest threshold)?

This article next reviews the literature on online education quality, focusing on the student perspective and reviews eight factors derived from it. The research methods section discusses the study structure and methods. Demographic data related to the sample are next, followed by the results, discussion, and conclusion.

Literature review

Online education is much discussed (Prinsloo, 2016; Van Wart et al., 2019; Zawacki-Richter & Naidu, 2016), but its perception is substantially influenced by where you stand and what you value (Otter et al., 2013; Tanner, Noser, & Totaro, 2009). Accrediting bodies care about meeting technical standards, proof of effectiveness, and consistency (Grandzol & Grandzol, 2006). Institutions care about reputation, rigor, student satisfaction, and institutional efficiency (Jung, 2011). Faculty care about subject coverage, student participation, faculty satisfaction, and faculty workload (Horvitz, Beach, Anderson, & Xia, 2015; Mansbach & Austin, 2018). For their part, students care about learning achievement (Marks, Sibley, & Arbaugh, 2005; O’Neill & Sai, 2014; Shen, Cho, Tsai, & Marra, 2013), but they also view online education as a function of their enjoyment of classes, instructor capability and responsiveness, and comfort in the learning environment (e.g., Asoodar et al., 2016; Sebastianelli, Swift, & Tamimi, 2015). It is this last perspective, that of students, upon which we focus.

It is important to note that students do not sign up for online classes solely based on perceived quality. Perceptions of quality derive from notions of the capacity of online learning when ideal (relative to both learning achievement and satisfaction/enjoyment) and perceptions about the likelihood of classes living up to those expectations. Students also sign up because of convenience and flexibility, and because of personal notions of suitability about learning. Convenience and flexibility are enormous drivers of online registration (Lee, Stringer, & Du, 2017; Mann & Henneberry, 2012). Even when students say they prefer face-to-face classes to online, many enroll in online classes and re-enroll in the future if the experience meets minimum expectations. This study examines the threshold expectations of students when they are considering taking online classes.

When discussing students’ perceptions of quality, there is little clarity about the actual range of concepts because no integrated empirical studies exist comparing major factors found throughout the literature. Rather, there are practitioner-generated lists of micro-competencies, such as the Quality Matters consortium for higher education (Quality Matters, 2018), or broad frameworks encompassing many aspects of quality beyond teaching (Open and Distant Learning Quality Council, 2012). While checklists are useful for practitioners and accreditation processes, they do not provide robust theoretical bases for scholarly development. Overarching frameworks are heuristically useful, but not for pragmatic purposes or theory building. The most prominent theoretical framework used in the online literature is the Community of Inquiry (CoI) model (Arbaugh et al., 2008; Garrison, Anderson, & Archer, 2003), which divides instruction into teaching, cognitive, and social presence. As with other deductive theories, however, the supporting evidence is mixed (Rourke & Kanuka, 2009), especially regarding the importance of social presence (Annand, 2011; Armellini & De Stefani, 2016). Conceptually, the problem is not so much with the narrow articulation of cognitive or social presence; cognitive presence is how the instructor provides opportunities for students to interact with material in robust, thought-provoking ways, and social presence refers to building a community of learning that incorporates student-to-student interactions. However, teaching presence includes everything else the instructor does: structuring the course, providing lectures, explaining assignments, creating rehearsal opportunities, supplying tests, grading, answering questions, and so on. These challenges become even more prominent in the online context. While the lecture as a single medium is paramount in face-to-face classes, it fades as the primary vehicle in online classes with increased use of detailed syllabi, electronic announcements, recorded and synchronous lectures, 24/7 communications related to student questions, etc. Amassing the pedagogical and technological elements related to teaching under a single concept provides little insight.

In addition to the CoI model, numerous concepts are suggested in single-factor empirical studies focusing on quality from a student’s perspective, with overlapping conceptualizations and nonstandardized naming conventions. Seven distinct factors are derived here from the literature on student perceptions of online quality: Instructional Support, Teaching Presence, Basic Online Modality, Social Presence, Online Social Comfort, Cognitive Presence, and Online Interactive Modality.

Instructional support

Instructional Support refers to students’ perceptions of the techniques used by the instructor for input, rehearsal, feedback, and evaluation. Specifically, this entails providing detailed instructions, the designed use of multimedia, and a balance between repetitive class features for ease of use and varied techniques to prevent boredom. Instructional Support is often included as an element of Teaching Presence, but is also labeled “structure” (Lee & Rha, 2009; So & Brush, 2008) and instructor facilitation (Eom, Wen, & Ashill, 2006). A prime example of the difference between face-to-face and online education is the extensive use of the “flipped classroom” (Maycock, 2019; Wang, Huang, & Schunn, 2019), in which students move to rehearsal activities faster and more frequently than in traditional classrooms, with less instructor lecture (Jung, 2011; Martin, Wang, & Sadaf, 2018). It has been consistently supported as an element of student perceptions of quality (Espasa & Meneses, 2010).

Teaching presence

Teaching Presence refers to students’ perceptions of the quality of communication in lectures, directions, and individual feedback, including encouragement (Jaggars & Xu, 2016; Marks et al., 2005). Specifically, instructor communication is clear, focused, and encouraging, and instructor feedback is customized and timely. If Instructional Support is what an instructor plans before the course begins, then Teaching Presence is what the instructor does while the class is conducted and in response to specific circumstances. For example, a course could be well designed but poorly delivered because the instructor is distracted, or a course could be poorly designed but an instructor might make up for the deficit by spending time and energy on elaborate communications and ad hoc teaching techniques. Teaching Presence is especially important for student satisfaction (Sebastianelli et al., 2015; Young, 2006) and is also referred to as instructor presence (Asoodar et al., 2016), learner-instructor interaction (Marks et al., 2005), and staff support (Jung, 2011). As with Instructional Support, it has been consistently supported as an element of student perceptions of quality.

Basic online modality

Basic Online Modality refers to the competent use of basic online class tools: online grading, navigation methods, the online grade book, and the announcements function. It is frequently clumped with instructional quality (Artino, 2010), service quality (Mohammadi, 2015), instructor expertise in e-teaching (Paechter, Maier, & Macher, 2010), and similar terms. As a narrowly defined concept, it is sometimes called technology (Asoodar et al., 2016; Bollinger & Martindale, 2004; Sun et al., 2008). The only empirical study that did not find Basic Online Modality (as technology) significant was Sun et al. (2008). Because Basic Online Modality is addressed with basic instructor training, some studies assert the importance of training (e.g., Asoodar et al., 2016).

Social presence

Social Presence refers to students’ perceptions of the quality of student-to-student interaction. Social Presence focuses on the quality of shared learning and collaboration among students, such as in threaded discussion responses (Garrison et al., 2003 ; Kehrwald, 2008 ). Much emphasized but challenged in the CoI literature (Rourke & Kanuka, 2009 ), it has mixed support in the online literature. While some studies found Social Presence or related concepts to be significant (e.g., Asoodar et al., 2016 ; Bollinger & Martindale, 2004 ; Eom et al., 2006 ; Richardson, Maeda, Lv, & Caskurlu, 2017 ), others found Social Presence insignificant (Joo, Lim, & Kim, 2011 ; So & Brush, 2008 ; Sun et al., 2008 ).

Online social comfort

Online Social Comfort refers to the instructor’s ability to provide an environment in which anxiety is low, and students feel comfortable interacting even when expressing opposing viewpoints. While numerous studies have examined anxiety (e.g., Liaw & Huang, 2013 ; Otter et al., 2013 ; Sun et al., 2008 ), only one found anxiety insignificant (Asoodar et al., 2016 ); many others have not examined the concept.

Cognitive presence

Cognitive Presence refers to the engagement of students such that they perceive they are stimulated by the material and instructor to reflect deeply and critically, and seek to understand different perspectives (Garrison et al., 2003 ). The instructor provides instructional materials and facilitates an environment that piques interest, is reflective, and enhances inclusiveness of perspectives (Durabi, Arrastia, Nelson, Cornille, & Liang, 2011 ). Cognitive Presence includes enhancing the applicability of material for student’s potential or current careers. Cognitive Presence is supported as significant in many online studies (e.g., Artino, 2010 ; Asoodar et al., 2016 ; Joo et al., 2011 ; Marks et al., 2005 ; Sebastianelli et al., 2015 ; Sun et al., 2008 ). Further, while many instructors perceive that cognitive presence is diminished in online settings, neuroscientific studies indicate this need not be the case (Takamine, 2017 ). While numerous studies failed to examine Cognitive Presence, this review found no studies that lessened its significance for students.

Interactive online modality

Interactive Online Modality refers to the “high-end” use of online functionality; that is, the instructor makes good use of interactive online class tools such as video lectures, videoconferencing, and small group discussions. It is often included in concepts such as instructional quality (Artino, 2010; Asoodar et al., 2016; Mohammadi, 2015; Otter et al., 2013; Paechter et al., 2010) or engagement (Clayton, Blumberg, & Anthony, 2018). While individual methods have been investigated (e.g., Durabi et al., 2011), high-end engagement methods as a group have not.

Other independent variables affecting perceptions of quality include age, undergraduate versus graduate status, gender, ethnicity/race, discipline, educational motivation of students, and previous online experience. While the effects of age have been found to be small or insignificant, more notable effects have been reported for level of study, with graduate students reporting higher “success” (Macon, 2011) and community college students having greater difficulty with online classes (Legon & Garrett, 2019; Xu & Jaggars, 2014). The effects of ethnicity and race have also been small or insignificant. Some situational variations and student preferences can be captured by paying attention to disciplinary differences (Arbaugh, 2005; Macon, 2011). Motivation levels of students have been reported to be significant in completion and achievement, with better students doing as well across face-to-face and online modes, and weaker students facing greater completion and achievement challenges (Clayton et al., 2018; Lu & Lemonde, 2013).

Research methods

To examine the various quality factors, we apply a critical success factor methodology, initially introduced to schools of business research in the 1970s. In 1981, Rockhart and Bullen codified an approach embodying principles of critical success factors (CSFs) as a way to identify the information needs of executives, detailing steps for the collection and analysis of data to create a set of organizational CSFs (Rockhart & Bullen, 1981). CSFs describe the underlying or guiding principles which must be incorporated to ensure success.

Utilizing this methodology, CSFs in the context of this paper define key areas of instruction and design essential for an online class to be successful from a student’s perspective. Instructors implicitly know and consider these areas when setting up an online class and designing and directing activities and tasks important to achieving learning goals. CSFs make explicit those things good instructors may intuitively know and (should) do to enhance student learning. When made explicit, CSFs not only confirm the knowledge of successful instructors, but tap their intuition to guide and direct the accomplishment of quality instruction for entire programs. In addition, CSFs are linked with goals and objectives, helping generate a small number of truly important matters an instructor should focus attention on to achieve different thresholds of online success.

After a comprehensive literature review, an instrument was created to measure students’ perceptions of the importance of techniques and indicators leading to quality online classes. Items were designed to capture the major factors in the literature. The instrument was piloted during the 2017–18 academic year with a 397-student sample, facilitating an exploratory factor analysis that led to important preliminary findings (reference withheld for review). Based on the pilot, survey items were added and refined to include seven groups of quality teaching factors, two groups of items related to students’ overall acceptance of online classes, and a variable on their future online class enrollment. Demographic information was gathered to determine the effects of age, year in program, major, distance from university, number of online classes taken, high school experience with online classes, and communication preferences on students’ levels of acceptance of online classes.

This paper draws evidence from a sample of students enrolled in educational programs at the Jack H. Brown College of Business and Public Administration (JHBC), California State University San Bernardino (CSUSB). The JHBC offers a wide range of online courses for undergraduate and graduate programs. To ensure comparable learning outcomes, online and face-to-face classes of a given subject are similar in size (undergraduate classes are generally capped at 60 and graduate classes at 30) and often taught by the same instructors. Students sometimes have the option to choose between face-to-face and online modes of learning.

A Qualtrics survey link was sent out by 11 instructors to students who were unlikely to be cross-enrolled in classes during the 2018–19 academic year. Approximately 2500 students were contacted, with some instructors providing class time to complete the anonymous survey. All students, whether they had taken an online class or not, were encouraged to respond. Nine hundred eighty-seven students responded, a response rate of approximately 40%. Although drawn from a single business school, the sample is broad, representing students from several disciplines (management, accounting and finance, marketing, information decision sciences, and public administration) as well as both graduate and undergraduate programs of study.

The sample skews young, with 78% of students under 30. It contains almost no lower-division students (i.e., freshmen and sophomores), 73% upper-division students (i.e., juniors and seniors), and 24% graduate students (master’s level). Only 17% reported having taken a hybrid or online class in high school. There was a wide range of exposure to university-level online courses, with 47% reporting having taken one to four classes and 21% reporting no online class experience. Reflecting a Hispanic-serving institution, 54% self-identified as Latino, 18% as White, and 13% as Asian or Pacific Islander. The five largest majors were accounting & finance (25%), management (21%), master of public administration (16%), marketing (12%), and information decision sciences (10%). Seventy-four percent work full- or part-time. See Table  1 for demographic data.

Measures and procedure

To increase the reliability of evaluation scores, composite evaluation variables are formed after an exploratory factor analysis of individual evaluation items. A principal component method with Quartimin (oblique) rotation was applied to explore the factor structure of student perceptions of online teaching CSFs. Items with loading coefficients greater than .30 were included, a commonly accepted threshold in factor analysis. A simple least-squares regression analysis was applied to test the significance of the factors for students’ impressions of online classes.
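As an illustration of the extraction step, the sketch below runs a principal component analysis on synthetic survey data and applies the .30 loading threshold. This is a hypothetical reconstruction for exposition only: it uses six toy items rather than the study's 37-item instrument, and it omits the Quartimin rotation.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for survey responses: 200 respondents x 6 items
# (hypothetical; the actual instrument had 37 items).
X = rng.normal(size=(200, 6))
X[:, 1] += X[:, 0]   # induce correlation so a component emerges
X[:, 3] += X[:, 2]

R = np.corrcoef(X, rowvar=False)        # item correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)    # eigh returns ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2                                   # number of components retained
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
salient = np.abs(loadings) > 0.30       # the .30 inclusion threshold
explained = eigvals[:k].sum() / eigvals.sum()
```

An oblique rotation such as Quartimin would then be applied to the retained loadings to aid interpretation; dedicated statistics packages handle that step.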

Exploratory factor constructs

Using a threshold loading of 0.3 for items, 37 items loaded on seven factors. All factors were logically consistent. The first factor, with eight items, was labeled Teaching Presence. Items included providing clear instructions, staying on task, clear deadlines, and customized feedback on strengths and weaknesses. Teaching Presence items all related to instructor involvement during the course as a director, monitor, and learning facilitator. The second factor, with seven items, aligned with Cognitive Presence. Items included stimulating curiosity, opportunities for reflection, helping students construct explanations posed in online courses, and the applicability of material. The third factor, with six items, aligned with Social Presence, defined as providing student-to-student learning opportunities. Items included getting to know course participants for a sense of belonging, forming impressions of other students, and interacting with others. The fourth factor, with six new items as well as two (“interaction with other students” and “a sense of community in the class”) shared with the third factor, was Instructional Support, which related to the instructor’s role in providing students a cohesive learning experience. Items included providing sufficient rehearsal, structured feedback, techniques for communication, a navigation guide, a detailed syllabus, and coordinating student interaction and creating a sense of online community. This factor also included enthusiasm, which students generally interpreted as a robustly designed course rather than animation in a traditional lecture. The fifth factor was labeled Basic Online Modality and focused on the basic technological requirements for a functional online course. Three items included allowing students to make online submissions, use of online gradebooks, and online grading.
A fourth item was the use of online quizzes, viewed by students as mechanical practice opportunities rather than small tests, and a fifth was navigation, a key component of Basic Online Modality. The sixth factor, loading on four items, was labeled Online Social Comfort. Items here included comfort discussing ideas online, comfort disagreeing, developing a sense of collaboration via discussion, and considering online communication an excellent medium for social interaction. The final factor was called Interactive Online Modality because it included items for “richer” communications or interactions, whether one- or two-way. Items included videoconferencing, instructor-generated videos, and small group discussions. Taken together, these seven factors explained 67% of the variance, which is considered within the acceptable range in social science research for a robust model (Hair, Black, Babin, & Anderson, 2014). See Table  2 for the full list.

To test factor reliability, Cronbach’s alpha was calculated for each variable. All produced values greater than 0.7, the standard threshold for reliability, except system trust, which was therefore dropped. To gauge students’ sense of factor importance, all items were mean averaged. Factor means (lower means indicating higher importance to students) ranged from 1.5 to 2.6 on a 5-point scale. Basic Online Modality was most important, followed by Instructional Support and Teaching Presence. Students deemed Cognitive Presence, Online Social Comfort, and Interactive Online Modality less important. The least important for this sample was Social Presence. Table  3 arrays the critical success factor means, standard deviations, and Cronbach alphas.
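Cronbach's alpha itself is straightforward to compute from item-level data. The sketch below is a minimal illustration with simulated items, not the study's data; the 0.7 cutoff is the conventional threshold cited above.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability for an (n_respondents, k_items) array."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated factor: three noisy indicators of one latent trait (hypothetical)
rng = np.random.default_rng(1)
latent = rng.normal(size=300)
items = np.column_stack(
    [latent + rng.normal(scale=0.5, size=300) for _ in range(3)]
)
alpha = cronbach_alpha(items)   # comfortably above the 0.7 threshold here
```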

To determine whether particular subgroups of respondents viewed factors differently, a series of ANOVAs was conducted using factor means as dependent variables. Six demographic variables were used as independent variables: graduate vs. undergraduate status, age, work status, ethnicity, discipline, and past online experience. To determine the strength of association between the independent variables and each of the seven CSFs, eta squared was calculated for each ANOVA. Eta squared indicates the proportion of variance in the dependent variable explained by the independent variable; values greater than .01, .06, and .14 are conventionally interpreted as small, medium, and large effect sizes, respectively (Green & Salkind, 2003). Table  4 summarizes the eta squared values for the ANOVA tests, with values less than .01 omitted.
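For a one-way ANOVA, eta squared is simply the between-group sum of squares divided by the total sum of squares. A minimal sketch with hypothetical group data (not the study's):

```python
import numpy as np

def eta_squared(groups):
    """Proportion of DV variance explained by group membership (one-way ANOVA)."""
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_total = ((all_vals - grand_mean) ** 2).sum()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    return ss_between / ss_total

# Hypothetical factor means for two subgroups of respondents
rng = np.random.default_rng(2)
undergrad = rng.normal(loc=2.0, scale=0.8, size=120)
grad = rng.normal(loc=1.7, scale=0.8, size=40)
eta2 = eta_squared([undergrad, grad])   # compare against .01/.06/.14 benchmarks
```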

While no significant differences in factor means occur among students in different disciplines in the College, all five other independent variables have some small effect on some or all CSFs. Graduate students tend to rate Interactive Online Modality, Instructional Support, Teaching Presence, and Cognitive Presence higher than undergraduates. Older students value Interactive Online Modality more. Full-time working students rate all factors except Online Social Comfort slightly higher than part-timers and non-working students. Latino and White students rate Basic Online Modality and Instructional Support higher; Asian and Pacific Islander students rate Social Presence higher. Students who have taken more online classes rate all factors higher.

In addition to factor scores, two variables were constructed to capture students’ resultant impressions of the online experience. Both were logically consistent, with Cronbach’s α greater than 0.75. The first variable, with six items, was labeled “online acceptance” and included items such as “I enjoy online learning,” “My overall impression of hybrid/online learning is very good,” and “the instructors of online/hybrid classes are generally responsive.” The second variable, labeled “face-to-face preference,” combines four items: enjoying, learning, and communicating more in face-to-face classes, as well as perceiving greater fairness and equity there. In addition to these two constructed variables, a one-item variable, “online enrollment,” was also used subsequently in the regression analysis. That item asked: if hybrid/online classes are well taught and available, how much of your course selection going forward would online classes make up?

Regression results

As noted above, two constructed variables and one item were used as dependent variables for the regression analysis: online acceptance, face-to-face preference, and online enrollment. In addition to the seven quality-of-teaching factors identified by factor analysis, control variables included level of education (graduate versus undergraduate), age, ethnicity, work status, distance to university, and number of online/hybrid classes taken in the past. See Table  5 .

When eta squared values were examined for the control factors, only one approached a medium effect: graduate versus undergraduate status had a .05 effect (approaching medium) on Interactive Online Modality, meaning graduate students were more sensitive to interactive modality than undergraduates. Multiple regression analyses of critical success factors and online impressions were conducted to compare the conditions under which factors were significant. The only consistently significant control factor was the number of online classes taken: the more classes students had taken online, the more inclined they were to take future ones. Level of program, age, ethnicity, and working status do not significantly affect students’ choice or overall acceptance of online classes.
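The regression setup can be sketched as ordinary least squares of a dependent variable on factor scores plus controls. The following toy illustration uses simulated data and hypothetical coefficients, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
# Hypothetical predictors: two factor scores and one control variable
basic_modality = rng.normal(size=n)
cognitive_presence = rng.normal(size=n)
n_online_taken = rng.integers(0, 10, size=n).astype(float)

# Simulated dependent variable (e.g., online enrollment intent)
enroll = (0.4 * basic_modality + 0.3 * cognitive_presence
          + 0.2 * n_online_taken + rng.normal(scale=0.5, size=n))

# OLS via least squares; the column of ones estimates the intercept
X = np.column_stack([np.ones(n), basic_modality, cognitive_presence, n_online_taken])
beta, *_ = np.linalg.lstsq(X, enroll, rcond=None)
```

In practice, a statistics package would also report standard errors and significance levels for each coefficient, which is what Tables 6, 7, and 8 summarize.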

The least restrictive condition was online enrollment (Table  6 ). That is, students might not feel online courses were ideal, but because of convenience and scheduling might enroll in them if minimum threshold expectations were met. When considering online enrollment three factors were significant and positive (at the 0.1 level): Basic Online Modality, Cognitive Presence, and Online Social Comfort. These least-demanding students expected classes to have basic technological functionality, provide good opportunities for knowledge acquisition, and provide comfortable interaction in small groups. Students who demand good Instructional Support (e.g., rehearsal opportunities, standardized feedback, clear syllabus) are less likely to enroll.

Online acceptance was more restrictive (see Table  7 ). This variable captured the idea that students not only enrolled in online classes out of necessity, but with an appreciation of the positive attributes of online instruction, which balanced the negative aspects. When this standard was applied, students expected not only Basic Online Modality, Cognitive Presence, and Online Social Comfort, but expected their instructors to be highly engaged virtually as the course progressed (Teaching Presence), and to create strong student-to-student dynamics (Social Presence). Students who rated Instructional Support higher are less accepting of online classes.

Another restrictive condition was catering to the needs of students who preferred face-to-face classes (see Table  8 ); that is, they preferred face-to-face classes even when online classes were well taught. Unlike students more accepting of, or more likely to enroll in, online classes, this group rates Instructional Support as critical to enrolling, rather than as a negative factor when absent. Also unlike the other two groups, these students demand appropriate interactive mechanisms (Interactive Online Modality) to enable richer communication (e.g., videoconferencing). Student-to-student collaboration (Social Presence) was also significant. This group also rated Cognitive Presence and Online Social Comfort as significant, but only in their absence; that is, these students were most attached to direct interaction with the instructor and other students rather than to specific teaching methods. Interestingly, Basic Online Modality and Teaching Presence were not significant. Our interpretation is that this student group, the most critical of online classes for their loss of physical interaction, is beyond concerns about mechanical technical interaction and demands higher levels of interactivity and instructional sophistication.

Discussion and study limitations

Some past studies have used robust empirical methods to identify a single factor or a small number of factors related to quality from a student’s perspective, but have not sought to be relatively comprehensive. Others have used a longer series of itemized factors, but have employed less robust methods and have not tied those factors back to the literature. This study used the literature to develop a relatively comprehensive list of items focused on quality teaching in a single rigorous protocol. While a pilot test had identified five coherent factors, the current survey made substantial changes that sharpened the focus on quality factors rather than antecedent factors and better articulated the array of factors often lumped under the mantle of “teaching presence.” In addition, the study examined these factors against threshold expectations: from minimal, such as when flexibility is the driving consideration; to modest, such as when students want a “good” online class; to high, when students demand an interactive virtual experience equivalent to face-to-face.

Exploratory factor analysis identified seven factors that were reliable, coherent, and significant under different conditions. In order of students’ overall sense of importance, they are: Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Interactive Online Modality, and Social Presence. Students are most concerned with the basics of a course first, that is, technological functionality and instructor competence. Next they want engagement and virtual comfort. Social Presence, while valued, is the least critical from this overall perspective.

The factor analysis is quite consistent with the range of factors identified in the literature, indicating that students can differentiate among aspects of what have been clumped together as larger concepts, such as teaching presence. Essentially, the instructor’s role in quality can be divided into her/his command of basic online functionality, good design, and good presence during the class. The instructor’s command of basic functionality is paramount. Because so much of an online class must be built in advance, the quality of the class design is rated more highly than the instructor’s role in facilitating the class. Taken as a whole, the instructor’s role in traditional teaching elements is primary, as we would expect it to be. Cognitive Presence, especially the pertinence of the instructional material and its applicability to student interests, has always been found significant when studied, and was rated highly here as a distinct factor. Finally, the degree to which students feel comfortable with the online environment and enjoy the learner-learner aspect has been less supported in empirical studies; it was found significant here, but was rated lowest among the quality factors by students.

Regression analysis paints a more nuanced picture, depending on student focus. It also helps explain some of the heterogeneity of previous studies, depending on what their dependent variables were. If convenience and scheduling are critical and students are less demanding, the minimum requirements are Basic Online Modality, Cognitive Presence, and Online Social Comfort. That is, students expect an instructor who knows how to use an online platform, delivers useful information, and provides a comfortable learning environment. This is not to say they expect poor design; rather, they do not expect much in terms of quality Teaching Presence, learner-to-learner interaction, or interactive teaching.

When students are signing up for critical classes, or they have both face-to-face and online options, they have a higher standard. They not only expect the factors behind decisions about enrolling in noncritical classes, but also good Teaching Presence and Social Presence. Students who simply need a class may be willing to teach themselves a bit more, but students who want a good class expect a highly present instructor in terms of responsiveness and immediacy. “Good” classes must not only create a comfortable atmosphere but, in social science classes at least, must provide strong learner-to-learner interactions as well. At the time of the research, most students believed that you can have a good class without high interactivity via pre-recorded video and videoconferencing. That may, or may not, change over time as various video media become easier to use, more reliable, and more commonplace.

The most demanding students are those who prefer face-to-face classes because of learning style preferences, poor past experiences, or both. Such students seem to assume that a worthwhile online class has basic functionality and that the instructor provides a strong presence. They are also critical of the absence of Cognitive Presence and Online Social Comfort. They want strong Instructional Support and Social Presence. But in addition, and uniquely, they expect Interactive Online Modality, which provides the greatest possible verisimilitude to the traditional classroom. More than the other two groups, these students crave human interaction in the learning process, both with the instructor and with other students.

These findings shed light on the possible ramifications of the COVID-19 aftermath. Many universities around the world jumped from relatively low levels of online instruction in the beginning of spring 2020 to nearly 100% by mandate by the end of the spring term. The question becomes, what will happen after the mandate is removed? Will demand resume pre-crisis levels, will it increase modestly, or will it skyrocket? Time will be the best judge, but the findings here would suggest that the ability/interest of instructors and institutions to “rise to the occasion” with quality teaching will have as much effect on demand as students becoming more acclimated to online learning. If in the rush to get classes online many students experience shoddy basic functional competence, poor instructional design, sporadic teaching presence, and poorly implemented cognitive and social aspects, they may be quite willing to return to the traditional classroom. If faculty and institutions supporting them are able to increase the quality of classes despite time pressures, then most students may be interested in more hybrid and fully online classes. If instructors are able to introduce high quality interactive teaching, nearly the entire student population will be interested in more online classes. Of course students will have a variety of experiences, but this analysis suggests that those instructors, departments, and institutions that put greater effort into the temporary adjustment (and who resist less), will be substantially more likely to have increases in demand beyond what the modest national trajectory has been for the last decade or so.

There are several study limitations. First, the study does not include a sample of non-respondents, who may have a somewhat different profile. Second, the study draws from a single college at a single university; the profile derived here may vary significantly by type of student. Third, some survey statements may have led respondents to rate quality based on experience rather than to assess the general importance of online course elements; for example, “I felt comfortable participating in the course discussions” could be revised to “comfort in participating in course discussions.” The authors judged differences among subgroups (e.g., among majors) to be small and statistically insignificant. However, it is possible that differences between, say, biology and marketing students would be significant, leading factors to be ordered differently. Emphasis and ordering might also vary at a community college versus a research-oriented university (Gonzalez, 2009 ).

Availability of data and materials

We will make the data available.

Al-Gahtani, S. S. (2016). Empirical investigation of e-learning acceptance and assimilation: A structural equation model. Applied Comput Information , 12 , 27–50.

Alqurashi, E. (2016). Self-efficacy in online learning environments: A literature review. Contemporary Issues Educ Res (CIER) , 9 (1), 45–52.

Anderson, T. (2016). A fourth presence for the Community of Inquiry model? Retrieved from https://virtualcanuck.ca/2016/01/04/a-fourth-presence-for-the-community-of-inquiry-model/ .

Annand, D. (2011). Social presence within the community of inquiry framework. The International Review of Research in Open and Distributed Learning , 12 (5), 40.

Arbaugh, J. B. (2005). How much does “subject matter” matter? A study of disciplinary effects in on-line MBA courses. Academy of Management Learning & Education , 4 (1), 57–73.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education , 11 , 133–136.

Armellini, A., & De Stefani, M. (2016). Social presence in the 21st century: An adjustment to the Community of Inquiry framework. British Journal of Educational Technology , 47 (6), 1202–1216.

Arruabarrena, R., Sánchez, A., Blanco, J. M., et al. (2019). Integration of good practices of active methodologies with the reuse of student-generated content. International Journal of Educational Technology in Higher Education , 16 , #10.

Arthur, L. (2009). From performativity to professionalism: Lecturers’ responses to student feedback. Teaching in Higher Education , 14 (4), 441–454.

Artino, A. R. (2010). Online or face-to-face learning? Exploring the personal factors that predict students’ choice of instructional format. Internet and Higher Education , 13 , 272–276.

Asoodar, M., Vaezi, S., & Izanloo, B. (2016). Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Computers in Human Behavior , 63 , 704–716.

Bernard, R. M., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research , 74 (3), 379–439.

Bollinger, D., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning , 3 (1), 61–67.

Brinkley-Etzkorn, K. E. (2018). Learning to teach online: Measuring the influence of faculty development training on teaching effectiveness through a TPACK lens. The Internet and Higher Education , 38 , 28–35.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin , 3 , 7.

Choi, I., Land, S. M., & Turgeon, A. J. (2005). Scaffolding peer-questioning strategies to facilitate metacognition during online small group discussion. Instructional Science , 33 , 483–511.

Clayton, K. E., Blumberg, F. C., & Anthony, J. A. (2018). Linkages between course status, perceived course value, and students’ preferences for traditional versus non-traditional learning environments. Computers & Education , 125 , 175–181.

Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. The International Review of Research in Open and Distributed Learning , 13 (4), 269–292.

Cohen, A., & Baruth, O. (2017). Personality, learning, and satisfaction in fully online academic courses. Computers in Human Behavior , 72 , 1–12.

Crews, T., & Butterfield, J. (2014). Data for flipped classroom design: Using student feedback to identify the best components from online and face-to-face classes. Higher Education Studies , 4 (3), 38–47.

Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D., & Molloy, E. (2019). What makes for effective feedback: Staff and student perspectives. Assessment & Evaluation in Higher Education , 44 (1), 25–36.

Drew, C., & Mann, A. (2018). Unfitting, uncomfortable, unacademic: A sociological reading of an interactive mobile phone app in university lectures. International Journal of Educational Technology in Higher Education , 15 , #43.

Durabi, A., Arrastia, M., Nelson, D., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning , 27 (3), 216–227.

Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education , 4 (2), 215–235.

Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education , 59 (3), 277–292.

Farrell, O., & Brunton, J. (2020). A balancing act: A window into online student engagement experiences. International Journal of Educational Technology in High Education , 17 , #25.

Fidalgo, P., Thormann, J., Kulyk, O., et al. (2020). Students’ perceptions on distance education: A multinational study. International Journal of Educational Technology in High Education , 17 , #18.

Flores, Ò., del-Arco, I., & Silva, P. (2016). The flipped classroom model at the university: Analysis based on professors’ and students’ assessment in the educational field. International Journal of Educational Technology in Higher Education , 13 , #21.

Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education , 1 , 113–127.

Gong, D., Yang, H. H., & Cai, J. (2020). Exploring the key influencing factors on college students’ computational thinking skills through flipped-classroom instruction. International Journal of Educational Technology in Higher Education , 17 , #19.

Gonzalez, C. (2009). Conceptions of, and approaches to, teaching online: A study of lecturers teaching postgraduate distance courses. Higher Education , 57 (3), 299–314.

Grandzol, J. R., & Grandzol, C. J. (2006). Best practices for online business Education. International Review of Research in Open and Distance Learning , 7 (1), 1–18.

Green, S. B., & Salkind, N. J. (2003). Using SPSS: Analyzing and understanding data , (3rd ed., ). Upper Saddle River: Prentice Hall.

Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2014). Multivariate data analysis: Pearson new international edition . Essex: Pearson Education Limited.

Harjoto, M. A. (2017). Blended versus face-to-face: Evidence from a graduate corporate finance class. Journal of Education for Business , 92 (3), 129–137.

Hong, K.-S. (2002). Relationships between students’ instructional variables with satisfaction and learning from a web-based course. The Internet and Higher Education , 5 , 267–281.

Horvitz, B. S., Beach, A. L., Anderson, M. L., & Xia, J. (2015). Examination of faculty self-efficacy related to online teaching. Innovation Higher Education , 40 , 305–316.

Inside Higher Education and Gallup. (2019). The 2019 survey of faculty attitudes on technology. Author .

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers and Education , 95 , 270–284.

Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictor in a structural model. Computers & Education , 57 (2), 1654–1664.

Jung, I. (2011). The dimensions of e-learning quality: From the learner’s perspective. Educational Technology Research and Development , 59 (4), 445–464.

Kay, R., MacDonald, T., & DiGiuseppe, M. (2019). A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. Journal of Computing in Higher Education , 31 , 449–471.

Kehrwald, B. (2008). Understanding social presence in text-based online learning environments. Distance Education , 29 (1), 89–106.

Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education , 14 , #7.

Kuo, Y.-C., Walker, A. E., Schroder, K. E., & Belland, B. R. (2013). Interaction, internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Education , 20 , 35–50.

Lange, C., & Costley, J. (2020). Improving online video lectures: Learning challenges created by media. International Journal of Educational Technology in Higher Education , 17 , #16.

le Roux, I., & Nagel, L. (2018). Seeking the best blend for deep learning in a flipped classroom – Viewing student perceptions through the Community of Inquiry lens. International Journal of Educational Technology in High Education , 15 , #16.

Lee, H.-J., & Rha, I. (2009). Influence of structure and interaction on student achievement and satisfaction in web-based distance learning. Educational Technology & Society , 12 (4), 372–382.

Lee, Y., Stringer, D., & Du, J. (2017). What determines students’ preference of online to F2F class? Business Education Innovation Journal , 9 (2), 97–102.

Legon, R., & Garrett, R. (2019). CHLOE 3: Behind the numbers . Published online by Quality Matters and Eduventures. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/CHLOE-3-Report-2019-Behind-the-Numbers.pdf

Liaw, S.-S., & Huang, H.-M. (2013). Perceived satisfaction, perceived usefulness and interactive learning environments as predictors of self-regulation in e-learning environments. Computers & Education , 60 (1), 14–24.

Lu, F., & Lemonde, M. (2013). A comparison of online versus face-to-face students teaching delivery in statistics instruction for undergraduate health science students. Advances in Health Science Education , 18 , 963–973.

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: a systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1).

Macon, D. K. (2011). Student satisfaction with online courses versus traditional courses: A meta-analysis . Dissertation, Northcentral University, CA.

Mann, J., & Henneberry, S. (2012). What characteristics of college students influence their decisions to select online courses? Online Journal of Distance Learning Administration , 15 (5), 1–14.

Mansbach, J., & Austin, A. E. (2018). Nuanced perspectives about online teaching: Mid-career senior faculty voices reflecting on academic work in the digital age. Innovative Higher Education , 43 (4), 257–272.

Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education , 29 (4), 531–563.

Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet and Higher Education , 37 , 52–65.

Maycock, K. W. (2019). Chalk and talk versus flipped learning: A case study. Journal of Computer Assisted Learning , 35 , 121–126.

McGivney-Burelle, J. (2013). Flipping Calculus. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies , 23 (5), 477–486.

Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Computers in Human Behavior , 45 , 359–374.

Nair, S. S., Tay, L. Y., & Koh, J. H. L. (2013). Students’ motivation and teachers’ teaching practices towards the use of blogs for writing of online journals. Educational Media International , 50 (2), 108–119.

Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching , 11 (2), 309–319.

Ni, A. Y. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education , 19 (2), 199–215.

Nouri, J. (2016). The flipped classroom: For active, effective and increased learning – Especially for low achievers. International Journal of Educational Technology in Higher Education , 13 , #33.

O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education , 68 (1), 1–14.

O'Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95.

Open & Distance Learning Quality Council (2012). ODLQC standards . England: Author https://www.odlqc.org.uk/odlqc-standards .

Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. Internet and Higher Education , 32 , 47–57.

Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., … Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education , 19 , 27–35.

Paechter, M., Maier, B., & Macher, D. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 , 292–329.

Prinsloo, P. (2016). (re)considering distance education: Exploring its relevance, sustainability and value contribution. Distance Education , 37 (2), 139–145.

Quality Matters (2018). Specific review standards from the QM higher Education rubric , (6th ed., ). MD: MarylandOnline.

Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior , 71 , 402–417.

Rockhart, J. F., & Bullen, C. V. (1981). A primer on critical success factors . Cambridge: Center for Information Systems Research, Massachusetts Institute of Technology.

Rourke, L., & Kanuka, H. (2009). Learning in Communities of Inquiry: A review of the literature. The Journal of Distance Education / Revue de l'éducation à distance , 23 (1), 19–48. Athabasca University Press. Retrieved August 2, 2020 from https://www.learntechlib.org/p/105542/ .

Sebastianelli, R., Swift, C., & Tamimi, N. (2015). Factors affecting perceived learning, satisfaction, and quality in the online MBA: A structural equation modeling approach. Journal of Education for Business , 90 (6), 296–305.

Shen, D., Cho, M.-H., Tsai, C.-L., & Marra, R. (2013). Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. Internet and Higher Education , 19 , 10–17.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology , 59 (3), 623–664.

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education , 51 (1), 318–336.

Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education , 7 (1), 59–70.

Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education , 50 (4), 1183–1202.

Takamine, K. (2017). Michelle D. Miller: Minds online: Teaching effectively with technology. Higher Education , 73 , 789–791.

Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate students’ perceptions of online learning: A comparative study. Journal of Information Systems Education , 20 (1), 29.

Tucker, B. (2012). The flipped classroom. Education Next , 12 (1), 82–83.

Van Wart, M., Ni, A., Ready, D., Shayo, C., & Court, J. (2020). Factors leading to online learner satisfaction. Business Educational Innovation Journal , 12 (1), 15–24.

Van Wart, M., Ni, A., Rose, L., McWeeney, T., & Worrell, R. A. (2019). Literature review and model of online teaching effectiveness integrating concerns for learning achievement, student satisfaction, faculty satisfaction, and institutional results. Pan-Pacific Journal of Business Research , 10 (1), 1–22.

Ventura, A. C., & Moscoloni, N. (2015). Learning styles and disciplinary differences: A cross-sectional study of undergraduate students. International Journal of Learning and Teaching , 1 (2), 88–93.

Vlachopoulos, D., & Makri, A. (2017). The effect of games and simulations on higher education: A systematic literature review. International Journal of Educational Technology in Higher Education , 14 , #22.

Wang, Y., Huang, X., & Schunn, C. D. (2019). Redesigning flipped classrooms: A learning model and its effects on student perceptions. Higher Education , 78 , 711–728.

Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learning , 21 (1), 15–35.

Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education , 85 (5), 633–659.

Young, S. (2006). Student views of effective online teaching in higher education. American Journal of Distance Education , 20 (2), 65–77.

Zawacki-Richter, O., & Naidu, S. (2016). Mapping research trends from 35 years of publications in distance Education. Distance Education , 37 (3), 245–269.

Acknowledgements

No external funding; not applicable.

Author information

Authors and Affiliations

Development for the JHB College of Business and Public Administration, 5500 University Parkway, San Bernardino, California, 92407, USA

Montgomery Van Wart, Anna Ni, Pamela Medina, Jesus Canelon, Melika Kordrostami, Jing Zhang & Yu Liu

Contributions

Equal. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Montgomery Van Wart .

Ethics declarations

Competing interests

We have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article

Van Wart, M., Ni, A., Medina, P. et al. Integrating students’ perspectives about online learning: a hierarchy of factors. Int J Educ Technol High Educ 17 , 53 (2020). https://doi.org/10.1186/s41239-020-00229-8

Received : 29 April 2020

Accepted : 30 July 2020

Published : 02 December 2020

DOI : https://doi.org/10.1186/s41239-020-00229-8

Keywords

  • Online education
  • Online teaching
  • Student perceptions
  • Online quality
  • Student presence


Students’ experience of online learning during the COVID‐19 pandemic: A province‐wide survey study

Lixiang Yan

1 Centre for Learning Analytics at Monash, Faculty of Information Technology, Monash University, Clayton VIC, Australia

Alexander Whitelock‐Wainwright

2 Portfolio of the Deputy Vice‐Chancellor (Education), Monash University, Melbourne VIC, Australia

Quanlong Guan

3 Department of Computer Science, Jinan University, Guangzhou China

Gangxin Wen

4 College of Cyber Security, Jinan University, Guangzhou China

Dragan Gašević

Guanliang Chen

Associated Data

The data is not openly available as it is restricted by the Chinese government.

Abstract

Online learning has been adopted by educational institutions worldwide to provide students with ongoing education during the COVID‐19 pandemic. Even though online learning research has been advancing in uncovering student experiences in various settings (i.e., tertiary, adult, and professional education), very little progress has been achieved in understanding the experience of the K‐12 student population, especially when narrowed down to different school‐year segments (i.e., primary and secondary school students). This study explores how students at different stages of their K‐12 education reacted to the mandatory full‐time online learning during the COVID‐19 pandemic. For this purpose, we conducted a province‐wide survey study in which the online learning experience of 1,170,769 Chinese students was collected from the Guangdong Province of China. We performed cross‐tabulation and Chi‐square analysis to compare students’ online learning conditions, experiences, and expectations. Results from this survey study provide evidence that students’ online learning experiences differ significantly across school years. Finally, policy implications are made to advise government authorities and schools on improving the delivery of online learning, and potential directions are identified for future research into K‐12 online learning.
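As a hypothetical illustration of the cross‐tabulation and Chi‐square analysis described above (not the study's actual code, and with invented counts rather than the survey's data), a comparison of school‐year groups can be sketched with scipy:

```python
# Illustrative only: all counts below are invented, not the survey's real data.
import numpy as np
from scipy.stats import chi2_contingency

# Cross-tabulation: rows are school segments, columns are counts of
# students who did / did not report a given online-learning issue.
observed = np.array([
    [320, 680],  # primary school (hypothetical)
    [410, 590],  # middle school (hypothetical)
    [500, 500],  # high school (hypothetical)
])

# Chi-square test of independence between school segment and the issue.
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.3g}")
```

A small p-value would indicate that the proportion of students reporting the issue differs across school years, which is the kind of between-segment comparison the study reports.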

Practitioner notes

What is already known about this topic

  • Online learning has been widely adopted during the COVID‐19 pandemic to ensure the continuation of K‐12 education.
  • Student success in K‐12 online education is substantially lower than in conventional schools.
  • Students experienced various difficulties related to the delivery of online learning.

What this paper adds

  • Provide empirical evidence for the online learning experience of students in different school years.
  • Identify the different needs of students in primary, middle, and high school.
  • Identify the challenges of delivering online learning to students of different ages.

Implications for practice and/or policy

  • Authorities and schools need to provide sufficient technical support to students in online learning.
  • The delivery of online learning needs to be customised for students in different school years.

INTRODUCTION

The ongoing COVID‐19 pandemic poses significant challenges to the global education system. By July 2020, the UN Educational, Scientific and Cultural Organization (2020) reported nationwide school closures in 111 countries, affecting over 1.07 billion students, around 61% of the global student population. Traditional brick‐and‐mortar schools have been forced to transform into full‐time virtual schools to provide students with ongoing education (Van Lancker & Parolin,  2020 ). Consequently, students must adapt to the transition from face‐to‐face learning to fully remote online learning, where synchronous video conferences, social media, and asynchronous discussion forums become their primary venues for knowledge construction and peer communication.

For K‐12 students, this sudden transition is problematic, as they often lack prior online learning experience (Barbour & Reeves,  2009 ). Barbour and LaBonte ( 2017 ) estimated that even in countries where online learning is growing rapidly, such as the USA and Canada, less than 10% of the K‐12 student population had prior experience with this format. Maladaptation to online learning could expose inexperienced students to various vulnerabilities, including decrements in academic performance (Molnar et al.,  2019 ), feelings of isolation (Song et al.,  2004 ), and lack of learning motivation (Muilenburg & Berge,  2005 ). Unfortunately, with confirmed cases continuing to rise each day and new outbreaks occurring on a global scale, full‐time online learning for most students could last longer than anticipated (World Health Organization,  2020 ). Even after the pandemic, the current mass adoption of online learning could have lasting impacts on the global education system and potentially accelerate and expand the rapid growth of virtual schools on a global scale (Molnar et al.,  2019 ). Thus, understanding students' learning conditions and their experiences of online learning during the COVID‐19 pandemic becomes imperative.

Emerging evidence on students’ online learning experience during the COVID‐19 pandemic has identified several major concerns, including issues with internet connection (Agung et al.,  2020 ; Basuony et al.,  2020 ), problems with IT equipment (Bączek et al.,  2021 ; Niemi & Kousa,  2020 ), limited collaborative learning opportunities (Bączek et al.,  2021 ; Yates et al.,  2020 ), reduced learning motivation (Basuony et al.,  2020 ; Niemi & Kousa,  2020 ; Yates et al.,  2020 ), and increased learning burdens (Niemi & Kousa,  2020 ). Although these findings provide valuable insights into the issues students experienced during online learning, information about their learning conditions and future expectations is less often reported. Such information could assist educational authorities and institutions to better comprehend students’ difficulties and potentially improve their online learning experience. Additionally, most of these recent studies were limited to higher education, except for Yates et al.’s ( 2020 ) and Niemi and Kousa’s ( 2020 ) studies on senior high school students. Empirical research targeting the full spectrum of K‐12 students remains scarce. Therefore, to address these gaps, the current paper reports the findings of a large‐scale study that sought to explore K‐12 students’ online learning experience during the COVID‐19 pandemic in a provincial sample of over one million Chinese students. The findings of this study provide policy recommendations to educational institutions and authorities regarding the delivery of K‐12 online education.

LITERATURE REVIEW

Learning conditions and technologies

Having stable access to the internet is critical to students’ experience of online learning. Berge ( 2005 ) expressed the concern that the divide in digital‐readiness and pedagogical approaches between countries could influence students’ online learning experience. Digital‐readiness is the availability and adoption of information technologies and infrastructures in a country. Western countries like the USA (3rd) score significantly higher in digital‐readiness than Asian countries like China (54th; Cisco,  2019 ). Students in countries with low digital‐readiness could experience additional technology‐related problems. Supporting evidence is emerging in recent studies conducted during the COVID‐19 pandemic. In Egypt's capital city, Basuony et al. ( 2020 ) found that only around 13.9% of students experienced issues with their internet connection, whereas more than two‐thirds of students in rural Indonesia reported unstable internet, insufficient internet data, and incompatible learning devices (Agung et al.,  2020 ).

Another influential factor in K‐12 students' adaptation to online learning is access to appropriate technological devices, especially a desktop or laptop (Barbour et al., 2018 ). However, most students are unlikely to satisfy this requirement. Even in higher education, around 76% of students reported having devices incompatible with online learning; only 15% of students used a laptop for online learning, whereas around 85% used a smartphone (Agung et al.,  2020 ). K‐12 students very likely suffer from the same availability issue, as they depend on their parents for access to suitable learning devices.

Technical issues surrounding technological devices could also influence students’ experience of online learning. Barbour and Reeves ( 2009 ) argue that students need a high level of digital literacy to find and use relevant information and to communicate with others through technological devices. Students lacking this ability could experience difficulties in online learning. Bączek et al. ( 2021 ) found that around 54% of medical students experienced technical problems with IT equipment, and that this issue was more prevalent among students in earlier years of tertiary education. Likewise, Niemi and Kousa ( 2020 ) found that students in a Finnish high school experienced more technical problems during the examination period, which involved additional technical applications. These findings are concerning, as young children and adolescents in primary and lower secondary school could be more vulnerable to such problems because they are less experienced with the technologies of online learning (Barbour & LaBonte,  2017 ). It is therefore essential to investigate the learning conditions and related difficulties experienced by students in K‐12 education, as the extent of these effects remains underexplored.

Learning experience and interactions

Apart from the aforementioned issues, the extent of interaction and the collaborative learning opportunities available in online learning could also influence students’ experience. The literature on online learning has long emphasised the role of effective interaction in the success of student learning. According to Muirhead and Juwah ( 2004 ), interaction is an event that can take the shape of any type of communication between two or more subjects and objects. Specifically, the literature acknowledges three typical forms of interaction (Moore,  1989 ): (i) student‐content, (ii) student‐student, and (iii) student‐teacher. Anderson ( 2003 ) posits, in the well‐known interaction equivalency theorem, that learning experiences will not deteriorate as long as one of the three forms of interaction is of high quality; the other two can be reduced or even eliminated. Quality interaction can be accomplished across two dimensions: (i) structure—pedagogical means that guide student interaction with contents or other students and (ii) dialogue—communication that happens between students and teachers and among students. To scale online learning and prevent the growth of teaching costs, the emphasis is typically on structure (i.e., pedagogy) that can promote effective student‐content and student‐student interaction. The role of technology and media is typically recognised as a way to amplify the effect of pedagogy (Lou et al.,  2006 ). Novel technological innovations—for example, learning analytics‐based personalised feedback at scale (Pardo et al.,  2019 )—can also empower teachers to promote their interaction with students.

Online education can lead to a sense of isolation, which can be detrimental to student success (McInnerney & Roberts,  2004 ). Therefore, integrating social interaction into pedagogy for online learning is essential, especially when students do not know each other or have underdeveloped communication and collaboration skills (Garrison et al.,  2010 ; Gašević et al.,  2015 ). Unfortunately, existing evidence suggests that online learning delivered during the COVID‐19 pandemic often lacked interactivity and collaborative experiences (Bączek et al.,  2021 ; Yates et al.,  2020 ). Bączek et al. ( 2021 ) found that around half of the medical students reported reduced interaction with teachers, and only 4% of students thought online learning classes were interactive. Likewise, Yates et al.’s ( 2020 ) study of high school students revealed that over half of the students preferred in‐class collaboration over online collaboration, as they valued the immediate support and the proximity to teachers and peers in class.

Learning expectations and age differentiation

Although these studies have provided valuable insights and stressed the need for more interactivity in online learning, K‐12 students in different school years could hold different expectations for the desired activities in online learning. Piaget's Cognitive Developmental Theory illustrates children's difficulties in understanding abstract and hypothetical concepts (Thomas,  2000 ). Primary school students encounter many abstract concepts in their STEM education (Uttal & Cohen,  2012 ). In face‐to‐face learning, teachers provide constant guidance on students’ learning progress and can help them understand difficult concepts. Unfortunately, the level of guidance drops significantly in online learning, and, in most cases, children have to face learning obstacles by themselves (Barbour,  2013 ). Additionally, lower primary school students may lack the metacognitive skills to use various online learning functions, maintain engagement in synchronous online learning, develop and execute self‐regulated learning plans, and engage in meaningful peer interactions during online learning (Barbour,  2013 ; Broadbent & Poon,  2015 ; Huffaker & Calvert, 2003; Wang et al.,  2013 ). Thus, understanding these younger students’ expectations is imperative, as delivering online learning to them in the same way as to a virtual high school could hinder their learning experiences. For students with more mature metacognition, expectations of online learning could be substantially different. Niemi and Kousa's ( 2020 ) study of students in a Finnish high school found that students often reported heavy workloads and fatigue during online learning. These issues could cause anxiety and reduce students’ learning motivation, with negative consequences for their emotional well‐being and academic performance (Niemi & Kousa,  2020 ; Yates et al.,  2020 ), especially for senior students under the pressure of examinations. Consequently, their expectations of online learning could be oriented toward additional learning support functions and materials. Likewise, they could also prefer more opportunities for peer interaction, as such interactions benefit their emotional well‐being and learning performance (Gašević et al., 2013 ; Montague & Rinaldi, 2001 ). Therefore, it is imperative to investigate differences in online learning expectations among students of different school years to better suit their needs.

Research questions

By building upon the aforementioned relevant works, this study aimed to contribute to the online learning literature with a comprehensive understanding of the online learning experience that K‐12 students had during the COVID‐19 pandemic period in China. Additionally, this study also aimed to provide a thorough discussion of what potential actions can be undertaken to improve online learning delivery. Formally, this study was guided by three research questions (RQs):

RQ1 . What learning conditions were experienced by students across 12 years of education during their online learning process in the pandemic period?

RQ2 . What benefits and obstacles were perceived by students across 12 years of education when performing online learning?

RQ3 . What expectations do students, across 12 years of education, have for future online learning practices?

Participants

The total number of K‐12 students in the Guangdong Province of China is around 15 million. In China, students of Year 1–6, Year 7–9, and Year 10–12 are referred to as students of primary school, middle school, and high school, respectively. Typically, students in China start primary school at the age of around six. At the end of their high‐school study, students take the National College Entrance Examination (NCEE; also known as Gaokao) to apply for tertiary education. The survey was administered across the whole Guangdong Province, that is, it was made available to all of the 15 million K‐12 students, though completing it was not mandatory. A total of 1,170,769 students completed the survey, a response rate of 7.80%. After removing responses with missing values and responses submitted from the same IP address (duplicates), we had 1,048,575 valid responses, which accounts for about 7% of the total K‐12 students in the Guangdong Province. The number of students in different school years is shown in Figure  1 . Overall, students were evenly distributed across school years, except for a smaller sample of Year 10–12 students.
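The reported response rates follow directly from the counts above; a quick arithmetic check (the 15 million population figure is the approximate value stated in the text):

```python
# Verify the survey response rates from the reported counts.
TOTAL_K12 = 15_000_000   # approximate K-12 population of Guangdong Province
COMPLETED = 1_170_769    # surveys completed
VALID = 1_048_575        # responses kept after removing missing/duplicate entries

response_rate = COMPLETED / TOTAL_K12
valid_rate = VALID / TOTAL_K12

print(f"Response rate: {response_rate:.1%}")        # ≈ 7.8%, as reported
print(f"Valid-response rate: {valid_rate:.1%}")     # ≈ 7.0%, as reported
```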

Figure 1. The number of students in each school year

Survey design

The survey was designed collaboratively by multiple relevant parties. Firstly, three educational researchers working in colleges and universities and three educational practitioners working in the Department of Education in Guangdong Province were recruited to co‐design the survey. Then, the initial draft of the survey was sent to 30 teachers from different primary and secondary schools, whose feedback and suggestions were considered to improve the survey. The final survey consisted of a total of 20 questions, which, broadly, can be classified into four categories: demographic, behaviours, experiences, and expectations. Details are available in Appendix.

All K‐12 students in the Guangdong Province were required to move to full‐time online learning from March 1, 2020, after the outbreak of COVID‐19 in China in January. A province‐level online learning platform was provided to all schools by the government. In addition to this platform, schools could also use third‐party platforms to facilitate teaching activities, for example WeChat and Dingding, which provide services similar to WhatsApp and Zoom. The main change for most teachers was that they had to shift classroom‐based lectures to online lectures with the aid of web‐conferencing tools. Likewise, teachers also needed to mark homework and hold consultation sessions online.

The Department of Education in the Guangdong Province of China distributed the survey to all K‐12 schools in the province on March 21, 2020 and collected responses on March 26, 2020. Students could access and answer the survey anonymously by either scanning the Quick Response (QR) code distributed along with the survey or clicking the survey link on their mobile device. The survey was administered on a completely voluntary basis and no incentives were given to the participants. Ethical approval was granted by the Department of Education in the Guangdong Province. Parental approval was not required since the survey was entirely anonymous and facilitated by the regulating authority, which satisfies China's ethical process.

The original survey was in Chinese and was later translated by two bilingual researchers and verified by an external translator certified by the Australian National Accreditation Authority of Translators and Interpreters. The original and translated survey questionnaires are available in the Supporting Information. Given the limited space and the fact that not every survey item is relevant to the RQs, the following items were chosen to answer them: items Q3 (learning media) and Q11 (learning approaches) for RQ1, items Q13 (perceived obstacles) and Q17 (perceived benefits) for RQ2, and item Q19 (expected learning activities) for RQ3. Cross‐tabulation‐based approaches were used to analyse the collected data. To scrutinise whether the differences displayed by students of different school years were statistically significant, we performed Chi‐square tests and, where significance was found, calculated Cramer's V to assess the strength of the association.
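The cross‐tabulation analysis described above can be sketched as follows. Cramer's V is computed as √(χ²/(n·df*)), where df* = min(rows, columns) − 1; the contingency table below is illustrative only, not the study's data:

```python
import math

def chi_square_cramers_v(table):
    """Pearson chi-square statistic and Cramer's V for a contingency table.
    V = sqrt(chi2 / (n * df*)), with df* = min(rows, cols) - 1."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    df_star = min(len(row_totals), len(col_totals)) - 1
    v = math.sqrt(chi2 / (n * df_star))
    return chi2, v

# Illustrative table: rows = four year groups, columns = two media choices.
toy = [[400, 100], [350, 150], [300, 200], [250, 250]]
chi2, v = chi_square_cramers_v(toy)
print(f"chi2 = {chi2:.2f}, Cramer's V = {v:.3f}")  # chi2 ≈ 109.89, V ≈ 0.234
```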

For the analyses, students were segmented into four categories based on their school years, that is, Year 1–3, Year 4–6, Year 7–9, and Year 10–12, to provide a clearer picture of the different experiences and needs students had for online learning. This segmentation was based on the educational structure of Chinese schools: primary school (Year 1–6), middle school (Year 7–9), and high school (Year 10–12). Children in primary school can further be segmented into junior (Year 1–3) and senior (Year 4–6) students because senior primary students in China face heavier workloads than junior students due to the provincial Middle School Entry Examination at the end of Year 6.
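The segmentation above is a simple binning step; a minimal sketch (the group labels are ours):

```python
def year_group(year: int) -> str:
    """Map a school year (1-12) to the four analysis segments."""
    if 1 <= year <= 3:
        return "Year 1-3"    # junior primary
    if 4 <= year <= 6:
        return "Year 4-6"    # senior primary
    if 7 <= year <= 9:
        return "Year 7-9"    # middle school
    if 10 <= year <= 12:
        return "Year 10-12"  # high school
    raise ValueError(f"invalid school year: {year}")

print(year_group(5))   # → Year 4-6
```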

Learning conditions—RQ1

Learning media

The Chi‐square test showed a significant association between school years and students' reported usage of learning media, χ 2 (55, N  = 1,853,952) = 46,675.38, p  < 0.001. The Cramer's V is 0.07 ( df ∗ = 5), which indicates a small‐to‐medium effect according to Cohen's ( 1988 ) guidelines. Based on Figure  2 , we observed that up to 87.39% of students used smartphones to perform online learning, while only 25.43% used computers, which suggests that smartphones, given their widespread availability in China, have been widely adopted by students for online learning. As for the prevalence of the two media, both smartphones ( χ 2 (3, N  = 1,048,575) = 9,395.05, p  < 0.001, Cramer's V  = 0.10 ( df ∗ = 1)) and computers ( χ 2 (3, N  = 1,048,575) = 11,025.58, p  < 0.001, Cramer's V  = 0.10 ( df ∗ = 1)) were adopted more by high‐school‐year students (Year 7–12) than by early‐school‐year students (Year 1–6), both with a small effect size. Besides, apparent discrepancies can be observed between the usage of TV and paper‐based materials across school years: early‐school‐year students reported more TV usage ( χ 2 (3, N  = 1,048,575) = 19,505.08, p  < 0.001), with a small‐to‐medium effect size, Cramer's V  = 0.14 ( df ∗ = 1), whereas high‐school‐year students (especially Year 10–12) reported more usage of paper‐based materials ( χ 2 (3, N  = 1,048,575) = 23,401.64, p  < 0.001), with a small‐to‐medium effect size, Cramer's V  = 0.15 ( df ∗ = 1).
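The effect‐size labels used throughout follow Cohen's (1988) benchmarks, which for Cramer's V scale with df*: the small, medium, and large cut‐offs are 0.1/√df*, 0.3/√df*, and 0.5/√df* respectively. A sketch of this interpretation rule (the band labels are one reasonable reading of the thresholds, not a standard API):

```python
import math

def interpret_cramers_v(v: float, df_star: int) -> str:
    """Label a Cramer's V against Cohen's (1988) benchmarks,
    which shrink as df* = min(rows, cols) - 1 grows."""
    small, medium, large = (c / math.sqrt(df_star) for c in (0.1, 0.3, 0.5))
    if v < small:
        return "negligible"
    if v < medium:
        return "small-to-medium"
    if v < large:
        return "medium-to-large"
    return "large"

# df* = 5: medium cut-off ≈ 0.134, so V = 0.07 sits between small and medium.
print(interpret_cramers_v(0.07, 5))  # → small-to-medium
print(interpret_cramers_v(0.14, 1))  # → small-to-medium
```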

Figure 2. Learning media used by students in online learning

Learning approaches

School year is also significantly associated with the learning approaches students used to tackle difficult concepts during online learning, χ 2 (55, N  = 2,383,751) = 58,030.74, p  < 0.001. The strength of this association is weak to moderate, as shown by the Cramer's V (0.07, df ∗ = 5; Cohen,  1988 ). When encountering difficult concepts, students typically chose to “solve independently by searching online” or “rewatch recorded lectures” instead of consulting their teachers or peers (Figure  3 ). This is probably because, compared to classroom‐based education, it is relatively less convenient and more challenging for students to seek help from others when learning online. Besides, compared to high‐school‐year students, early‐school‐year students (Year 1–6) reported much less use of “solve independently by searching online” ( χ 2 (3, N  = 1,048,575) = 48,100.15, p  < 0.001), with a small‐to‐medium effect size, Cramer's V  = 0.21 ( df ∗ = 1). Also, among the approaches for seeking help from others, significantly more high‐school‐year students preferred “communicating with other students” than early‐school‐year students ( χ 2 (3, N  = 1,048,575) = 81,723.37, p  < 0.001), with a medium effect size, Cramer's V  = 0.28 ( df ∗ = 1).

Figure 3. Learning approaches used by students in online learning

Perceived benefits and obstacles—RQ2

Perceived benefits

The association between school years and perceived benefits of online learning is statistically significant, χ 2 (66, N  = 2,716,127) = 29,534.23, p  < 0.001, and the Cramer's V (0.04, df ∗ = 6) indicates a small effect (Cohen,  1988 ). Unsurprisingly, benefits brought by the convenience of online learning were widely recognised by students across all school years (Figure  4 ): up to 75% of students reported that it is “more convenient to review course content” and 54% said that they “can learn anytime and anywhere” . Besides, we noticed that about 50% of early‐school‐year students appreciated the “access to courses delivered by famous teachers” and 40%–47% of high‐school‐year students indicated that online learning is “helpful to develop self‐regulation and autonomy” .

Figure 4. Perceived benefits of online learning reported by students

Perceived obstacles

The Chi‐square test shows a significant association between school years and students' perceived obstacles in online learning, χ 2 (77, N  = 2,699,003) = 31,987.56, p  < 0.001. This association is relatively weak, as shown by the Cramer's V (0.04, df ∗ = 7; Cohen,  1988 ). As shown in Figure  5 , the biggest obstacle, encountered by up to 73% of students, was “eyestrain caused by long staring at screens” . Disengagement caused by nearby disturbance was reported by around 40% of students, especially those of Year 1–3 and 10–12. Technology‐wise, about 50% of students experienced poor Internet connection during their learning process, and around 20% of students reported “confusion in setting up the platforms” across all school years.

Figure 5. Perceived obstacles of online learning reported by students

Expectations for future practices of online learning—RQ3

Online learning activities

The association between school years and students' expected online learning activities is significant, χ 2 (66, N  = 2,416,093) = 38,784.81, p  < 0.001. The Cramer's V is 0.05 ( df ∗ = 6), which suggests a small effect (Cohen,  1988 ). As shown in Figure  6 , the most expected activity for future online learning is “real‐time interaction with teachers” (55%), followed by “online group discussion and collaboration” (38%). We also observed that more early‐school‐year students expect reflective activities, such as “regular online practice examinations” ( χ 2 (3, N  = 1,048,575) = 11,644.98, p  < 0.001), with a small effect size, Cramer's V  = 0.11 ( df ∗ = 1). In contrast, more high‐school‐year students expect “intelligent recommendation system …” ( χ 2 (3, N  = 1,048,575) = 15,327.00, p  < 0.001), with a small effect size, Cramer's V  = 0.12 ( df ∗ = 1).

Figure 6. Students' expected online learning activities

Discussion

Regarding students' learning conditions, substantial differences were observed between students in different school years in the learning media, family dependency, and learning approaches adopted in online learning. The finding of more computer and smartphone usage among high‐school‐year than early‐school‐year students can probably be explained by the fact that, with growing abilities in using these media and the educational systems and tools that run on them, high‐school‐year students tend to make better use of these media for online learning. The differences in paper‐based materials, in turn, may reflect that high‐school‐year students in China have to complete a substantial amount of exercises, assignments, and exam papers to prepare for the National College Entrance Examination (NCEE), whose delivery was not entirely digitised due to the sudden transition to online learning. Meanwhile, high‐school‐year students may also have preferred paper‐based materials for exam practice, as they would eventually take the NCEE in paper format. These substantial differences in students' usage of learning media should therefore be addressed by customising the delivery method of online learning for different school years.

Beyond these between‐age differences in learning media, the prevalence of smartphones in online learning resonates with Agung et al.'s ( 2020 ) finding on the issues surrounding the availability of compatible learning devices. The prevalence of smartphones among K‐12 students is potentially problematic as the majority of online learning platforms and content are designed for computer‐based learning (Berge,  2005 ; Molnar et al.,  2019 ), while learning with smartphones has its own unique challenges. For example, Gikas and Grant ( 2013 ) discovered that students who learn with smartphones experience frustration with the small screen size, especially when trying to type with the tiny keypad. Another challenge relates to the distraction of various social media applications. Although similar distractions exist in computer and web‐based social media, the level of popularity, especially among the young generation, is much higher for mobile‐based social media (Montag et al.,  2018 ). In particular, the message notification function in smartphones could disengage students from learning activities and lure them toward social media applications (Gikas & Grant,  2013 ). Given these challenges, more research effort should be devoted to analysing students' online learning behaviour in mobile settings to accommodate their needs better.

The differences in learning approaches, once again, illustrate that early‐school‐year students have different needs from high‐school‐year students. In particular, the low usage of independent learning methods among early‐school‐year students may reflect their inability to engage in independent learning. Besides, the differences in help‐seeking behaviours demonstrate the distinctive needs for communication and interaction among different students: early‐school‐year students rely strongly on teachers, whereas high‐school‐year students, who are equipped with stronger communication abilities, are more inclined to interact with their peers. This finding implies that the design of online learning platforms should take students' different needs into account. Thus, customisation is urgently needed in the delivery of online learning to different school years.

In terms of the perceived benefits and challenges of online learning, our results resonate with several previous findings. In particular, the benefits of convenience are in line with the flexibility advantages of online learning mentioned in prior works (Appana,  2008 ; Bączek et al.,  2021 ; Barbour,  2013 ; Basuony et al.,  2020 ; Harvey et al.,  2014 ). Early‐school‐year students' higher appreciation of “access to courses delivered by famous teachers” and lower appreciation of the independent learning skills developed through online learning are also in line with previous literature (Barbour,  2013 ; Harvey et al.,  2014 ; Oliver et al.,  2009 ). Again, these findings may indicate the strong reliance that early‐school‐year students place on teachers, while high‐school‐year students are more capable of adapting to online learning by developing independent learning skills.

Technology‐wise, students' experience of poor internet connection and confusion in setting up online learning platforms is particularly concerning. The problem of poor internet connection corroborates findings reported in prior studies (Agung et al.,  2020 ; Barbour,  2013 ; Basuony et al.,  2020 ; Berge,  2005 ; Rice,  2006 ), namely that access issues surrounding the digital divide remain one of the main challenges of online learning. In the era of 4G and 5G networks, educational authorities and institutions that deliver online education could fall into the misconception that most students have a stable internet connection at home. The internet issue we observed is particularly important to students' online learning experience because most students prefer real‐time communication (Figure  6 ), which relies heavily on a stable internet connection. Likewise, the finding of students' confusion with technology is also consistent with prior studies (Bączek et al.,  2021 ; Muilenburg & Berge,  2005 ; Niemi & Kousa,  2020 ; Song et al.,  2004 ). Students who are unsuccessful in setting up the online learning platforms could experience declines in confidence and enthusiasm for online learning, leading to a subsequently unpleasant learning experience. Therefore, both the readiness of internet infrastructure and students' technical skills remain significant challenges for the mass adoption of online learning.

On the other hand, students' experience of eyestrain from extended screen time provides empirical evidence for Spitzer's ( 2001 ) speculation about the potential ergonomic impact of online learning. This negative effect is potentially related to the prevalence of smartphone devices and their limited screen size. The finding not only demonstrates the potential ergonomic issues of smartphone‐based online learning but also resonates with the aforementioned necessity of different platform and content designs for different students.

A less‐mentioned problem in previous studies of online learning experiences is the disengagement caused by nearby disturbance, especially in Year 1–3 and 10–12. Early‐school‐year students likely suffered from this problem because of underdeveloped metacognitive skills for concentrating on online learning without teachers' guidance. As for high‐school‐year students, the reasons behind their disengagement require further investigation. In particular, it would be worthwhile to scrutinise whether this type of disengagement is caused by the substantial amount of coursework they have to undertake and the resulting higher pressure and lower concentration while learning.

Across‐age differences are also apparent in students' expectations of online learning. Our results demonstrated students' need for social interaction with others during online learning, in line with previous findings (Bączek et al.,  2021 ; Harvey et al.,  2014 ; Kuo et al.,  2014 ; Liu & Cavanaugh,  2012 ; Yates et al.,  2020 ). This need manifested differently across school years, with early‐school‐year students preferring more teacher interaction and learning regulation support. Once again, this may imply that early‐school‐year students struggle to engage with online learning without proper guidance from their teachers. High‐school‐year students, in contrast, prefer more peer interaction and recommendations of learning resources. This expectation can probably be explained by the large amount of coursework assigned to them; high‐school‐year students need further guidance to help them better direct their learning efforts. These differences in students' expectations could guide the customisation of online learning delivery.

Implications

As shown in our results, improving the delivery of online learning not only requires the efforts of policymakers but also depends on the actions of teachers and parents. The following sub‐sections provide recommendations for relevant stakeholders and discuss their essential roles in supporting online education.

Technical support

The majority of students experienced technical problems during online learning, including internet lag and confusion in setting up the learning platforms. These problems with technology could impair students' learning experience (Kauffman,  2015 ; Muilenburg & Berge,  2005 ). Educational authorities and schools should always provide a thorough guide and assistance for students who experience technical problems with online learning platforms or other related tools. Early screening and detection could also help schools and teachers direct their efforts more effectively toward students with low technology skills (Wilkinson et al.,  2010 ). A potential identification method involves distributing age‐specific surveys that assess students' Information and Communication Technology (ICT) skills at the beginning of online learning. For example, empirically validated ICT surveys are available for both primary (Aesaert et al.,  2014 ) and high school (Claro et al.,  2012 ) students.

For students who have problems with internet lag, the delivery of online learning should provide options that require less data and bandwidth. Lecture recordings are an existing option but fail to address students' need for real‐time interaction (Clark et al.,  2015 ; Malik & Fatima,  2017 ). A potential alternative is to give students the option to learn with digital or physical textbooks and audio‐conferencing, instead of screen sharing and video‐conferencing. This approach significantly reduces data usage and lowers the bandwidth required for students to engage in smooth online interactions (Cisco,  2018 ). It also requires little additional effort from teachers, as official textbooks are often available for each school year, and thus they only need to guide students through the materials during audio‐conferencing. Educational authorities can further support this approach by making digital textbooks available to teachers and students, especially those in financial hardship. However, the lack of visual and instructor presence could reduce students' attention, recall of information, and satisfaction in online learning (Wang & Antonenko,  2017 ). Therefore, further research is required to understand whether the combination of digital or physical textbooks and audio‐conferencing is appropriate for students with internet problems. Alternatively, where the local technological infrastructure is well developed, governments and schools can collaborate with internet providers to issue data and bandwidth vouchers for students experiencing internet problems due to financial hardship.
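The data‐saving argument for audio‐plus‐textbook delivery can be roughed out with typical bitrates. The figures below are generic assumptions for illustration (roughly 64 kbps for audio‐conferencing, 1.5 Mbps for standard‐definition video‐conferencing), not values taken from the Cisco report:

```python
# Back-of-envelope data usage for a 40-minute online lesson.
AUDIO_KBPS = 64     # assumed audio-conferencing bitrate
VIDEO_KBPS = 1500   # assumed SD video-conferencing bitrate
LESSON_MIN = 40

def megabytes(kbps: int, minutes: int) -> float:
    """Convert a sustained bitrate (kbps) into total data transferred (MB)."""
    return kbps * 1000 / 8 * minutes * 60 / 1e6

audio_mb = megabytes(AUDIO_KBPS, LESSON_MIN)
video_mb = megabytes(VIDEO_KBPS, LESSON_MIN)
print(f"audio: {audio_mb:.0f} MB, video: {video_mb:.0f} MB "
      f"({video_mb / audio_mb:.0f}x more)")
```

Under these assumptions an audio‐only lesson transfers roughly 19 MB against about 450 MB for video, which is the order‐of‐magnitude saving the recommendation relies on.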

For future adoption of online learning, policymakers should consider the readiness of the local internet infrastructure. This recommendation is particularly important for developing countries, like Bangladesh, where the majority of students reported a lack of internet infrastructure (Ramij & Sultana,  2020 ). In such environments, online education may be infeasible, and alternative delivery methods could be more appropriate; for example, the Telesecundaria program provides TV‐based education for rural areas of Mexico (Calderoni,  1998 ).

Other than technical problems, choosing a suitable online learning platform is also vital for providing students with a better learning experience. Governments and schools should choose an online learning platform that is customised for smartphone‐based learning, as the majority of students could be using smartphones for online learning. This recommendation is highly relevant for situations where students are forced or involuntarily engaged in online learning, like during the COVID‐19 pandemic, as they might not have access to a personal computer (Molnar et al.,  2019 ).

Customisation of delivery methods

Customising the delivery of online learning for students in different school years is the theme that appeared consistently across our findings. This customisation process is vital for making online learning an opportunity for students to develop independent learning skills, which could help prepare them for tertiary education and lifelong learning. However, the pedagogical design of K‐12 online learning programs should be differentiated from adult‐orientated programs as these programs are designed for independent learners, which is rarely the case for students in K‐12 education (Barbour & Reeves,  2009 ).

For early‐school‐year students, especially Year 1–3 students, the priority should be providing sufficient guidance from both teachers and parents, as these students often lack the ability to monitor and reflect on their learning progress. In particular, these students would prefer more real‐time interaction with teachers, tutoring from parents, and regular online practice examinations. These forms of guidance could help early‐school‐year students cope with involuntary online learning and potentially enhance their experience in future online learning. It should be noted that early‐school‐year students demonstrated interest in intelligent monitoring and feedback systems for learning. Additional research is required to understand whether these young children are capable of understanding and using learning analytics that relay information on their learning progress. Similarly, future research should also investigate whether young children can communicate effectively through digital tools, as any inability to do so could hinder learning in online group activities. Therefore, the design of online learning for early‐school‐year students should focus less on independent learning and more on ensuring that students learn effectively under the guidance of teachers and parents.

In contrast, group learning and peer interaction are essential for older children and adolescents. The delivery of online learning for these students should focus on providing them with more opportunities to communicate with each other and engage in collaborative learning. Potential methods to achieve this goal involve assigning or encouraging students to form study groups (Lee et al.,  2011 ), directing students to use social media for peer communication (Dabbagh & Kitsantas,  2012 ), and providing students with online group assignments (Bickle & Rucker,  2018 ).

Special attention should be paid to students enrolled in high schools. For high‐school‐year students, in particular Year 10–12 students, we also recommend providing sufficient access to paper‐based learning materials, such as revision booklets and practice exam papers, so they remain familiar with paper‐based examinations. This recommendation applies to any students who engage in online learning but have to take their final examinations in paper format. It is also imperative to help high‐school‐year students who are facing examinations to direct their learning efforts better. Teachers can fulfil this need by sharing useful learning resources on the learning management system, if available, or through social media groups. Alternatively, students are interested in intelligent recommendation systems for learning resources, which are emerging in the literature (Corbi & Solans,  2014 ; Shishehchi et al.,  2010 ). These systems could provide personalised recommendations based on a series of evaluations of learners' knowledge. Although such systems are infeasible in situations where the transition to online learning happens rapidly (i.e., during the COVID‐19 pandemic), policymakers can consider embedding them in future online education.

Limitations

The current findings are limited to primary and secondary Chinese students who were involuntarily engaged in online learning during the COVID‐19 pandemic. Despite the large sample size, the population may not be representative, as participants are all from a single province. Also, information about the quality of online learning platforms, teaching content, and pedagogical approaches was missing because of the large scale of our study. It is likely that the infrastructure of online learning in China, such as learning platforms, instructional designs, and teachers' knowledge about online pedagogy, was underprepared for the sudden transition. Thus, our findings may not represent the experience of students who voluntarily participated in well‐prepared online learning programs, in particular the virtual school programs in America and Canada (Barbour & LaBonte,  2017 ; Molnar et al.,  2019 ). Lastly, the survey was only evaluated and validated by teachers, not students. Therefore, students with the lowest reading comprehension levels might have understood the items differently, especially terminology involving abstract constructs like self‐regulation and autonomy in item Q17.

In conclusion, we identified across‐year differences in primary and secondary school students' online learning experience during the COVID‐19 pandemic. Several recommendations were made for the future practice and research of online learning in the K‐12 student population. First, educational authorities and schools should provide sufficient technical support to help students overcome potential internet and technical problems, and should choose online learning platforms that have been customised for smartphones. Second, the online pedagogy design should be customised for students in different school years: in particular, sufficient guidance for young children, more online collaborative opportunities for older children and adolescents, and additional learning resources for senior students who are facing final examinations.

CONFLICT OF INTEREST

There is no potential conflict of interest in this study.

ETHICS STATEMENT

The data were collected by the Department of Education of the Guangdong Province, which also has the authority to approve research studies in K‐12 education in the province.

Supporting information

Supplementary Material

ACKNOWLEDGEMENTS

This work is supported by the National Natural Science Foundation of China (62077028, 61877029), the Science and Technology Planning Project of Guangdong (2020B0909030005, 2020B1212030003, 2020ZDZX3013, 2019B1515120010, 2018KTSCX016, 2019A050510024), the Science and Technology Planning Project of Guangzhou (201902010041), and the Fundamental Research Funds for the Central Universities (21617408, 21619404).

SURVEY ITEMS

| Dimension | Question text | Question type |
| --- | --- | --- |
| Demographic | Q1. What is the location and category of your school? | Single‐response MCQ |
| Demographic | Q2. Which school year are you in? | Single‐response MCQ |
| Behaviour | Q3. What equipment and materials did you use for online learning during the COVID‐19 pandemic period? | Multiple‐response MCQ |
| Behaviour | Q4. Other than the lecture function, which features of the online education platform have you used? | Multiple‐response MCQ |
| Behaviour | Q5. What is the longest class time for your online courses? | Single‐response MCQ |
| Behaviour | Q6. How long do you study online every day? | Slider question |
| Behaviour | Q8. Did you need family companionship when studying online? | Single‐response MCQ |
| Behaviour | Q10. What content does your online course include? | Multiple‐response MCQ |
| Behaviour | Q11. What approaches did you use to tackle the unlearnt concepts you had when performing online learning? | Multiple‐response MCQ |
| Behaviour | Q12. How often do you interact with your classroom in online learning? | Single‐response MCQ |
| Behaviour | Q14. Regarding the following online learning behaviours, please select the answer that fits your situation in the form below. | Yes/No questions |
| Experience | Q7. Which of the following learning statuses is appropriate for your situation? | Multiple‐response MCQ |
| Experience | Q13. What obstacles did you encounter when studying online? | Multiple‐response MCQ |
| Experience | Q15. What skills do you think are developed from online education? | Multiple‐response MCQ |
| Experience | Q16. How satisfied are you with the following aspects of online learning? | Four‐point bipolar scale |
| Experience | Q17. Compared to classroom‐based learning, what are the advantages of online learning? | Multiple‐response MCQ |
| Experience | Q18. What do you think are the deficiencies of online learning compared to physical classrooms? | Multiple‐response MCQ |
| Expectations | Q9. What is your preferred online classroom format? | Single‐response MCQ |
| Expectations | Q19. What online activities or experiences do you expect to have that will enhance your online learning? | Multiple‐response MCQ |
| Expectations | Q20. After the COVID‐19 pandemic, which type of learning would you prefer? | Single‐response MCQ |
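Multiple‐response MCQs such as Q3 or Q13 yield one binary indicator per option, so per‐option selection rates are tallied across respondents rather than summing to 100%. A minimal sketch of that tallying, with hypothetical response data (the option labels below are illustrative, not the survey's actual answer options):

```python
from collections import Counter

# Hypothetical answers to a multiple-response MCQ such as Q13
# ("What obstacles did you encounter when studying online?").
# Each respondent's answer is the set of options they ticked.
responses = [
    {"internet lag", "distraction"},
    {"internet lag"},
    {"eye strain", "distraction"},
    {"internet lag", "eye strain"},
]

# Count how many respondents selected each option.
counts = Counter(option for answer in responses for option in answer)

# Selection rate = respondents who ticked the option / all respondents.
# Rates do not sum to 1 because respondents may tick several options.
rates = {option: n / len(responses) for option, n in counts.items()}

print(rates["internet lag"])  # 3 of 4 respondents -> 0.75
```

Single‐response MCQs reduce to the same computation with one‐element answer sets, in which case the rates do sum to 1.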

Yan, L., Whitelock‐Wainwright, A., Guan, Q., Wen, G., Gašević, D., & Chen, G. (2021). Students’ experience of online learning during the COVID‐19 pandemic: A province‐wide survey study. British Journal of Educational Technology, 52, 2038–2057. 10.1111/bjet.13102

DATA AVAILABILITY STATEMENT

REFERENCES

  • Aesaert, K., Van Nijlen, D., Vanderlinde, R., & van Braak, J. (2014). Direct measures of digital information processing and communication skills in primary education: Using item response theory for the development and validation of an ICT competence scale. Computers & Education, 76, 168–181. 10.1016/j.compedu.2014.03.013
  • Agung, A. S. N., Surtikanti, M. W., & Quinones, C. A. (2020). Students’ perception of online learning during COVID‐19 pandemic: A case study on the English students of STKIP Pamane Talino. SOSHUM: Jurnal Sosial Dan Humaniora, 10(2), 225–235. 10.31940/soshum.v10i2.1316
  • Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction. The International Review of Research in Open and Distributed Learning, 4(2). 10.19173/irrodl.v4i2.149
  • Appana, S. (2008). A review of benefits and limitations of online learning in the context of the student, the instructor and the tenured faculty. International Journal on E‐learning, 7(1), 5–22.
  • Bączek, M., Zagańczyk‐Bączek, M., Szpringer, M., Jaroszyński, A., & Wożakowska‐Kapłon, B. (2021). Students’ perception of online learning during the COVID‐19 pandemic: A survey study of Polish medical students. Medicine, 100(7), e24821. 10.1097/MD.0000000000024821
  • Barbour, M. K. (2013). The landscape of K‐12 online learning: Examining what is known. Handbook of Distance Education, 3, 574–593.
  • Barbour, M., Huerta, L., & Miron, G. (2018). Virtual schools in the US: Case studies of policy, performance and research evidence. In Society for Information Technology & Teacher Education International Conference (pp. 672–677). Association for the Advancement of Computing in Education (AACE).
  • Barbour, M. K., & LaBonte, R. (2017). State of the nation: K‐12 e‐learning in Canada, 2017 edition. http://k12sotn.ca/wp‐content/uploads/2018/02/StateNation17.pdf
  • Barbour, M. K., & Reeves, T. C. (2009). The reality of virtual schools: A review of the literature. Computers & Education, 52(2), 402–416.
  • Basuony, M. A. K., EmadEldeen, R., Farghaly, M., El‐Bassiouny, N., & Mohamed, E. K. A. (2020). The factors affecting student satisfaction with online education during the COVID‐19 pandemic: An empirical study of an emerging Muslim country. Journal of Islamic Marketing. 10.1108/JIMA-09-2020-0301
  • Berge, Z. L. (2005). Virtual schools: Planning for success. Teachers College Press, Columbia University.
  • Bickle, M. C., & Rucker, R. (2018). Student‐to‐student interaction: Humanizing the online classroom using technology and group assignments. Quarterly Review of Distance Education, 19(1), 1–56.
  • Broadbent, J., & Poon, W. L. (2015). Self‐regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. The Internet and Higher Education, 27, 1–13.
  • Calderoni, J. (1998). Telesecundaria: Using TV to bring education to rural Mexico (Tech. Rep.). The World Bank.
  • Cisco. (2018). Bandwidth requirements for meetings with Cisco Webex and collaboration meeting rooms white paper. http://dwz.date/dpbc
  • Cisco. (2019). Cisco digital readiness 2019. https://www.cisco.com/c/m/en_us/about/corporate‐social‐responsibility/research‐resources/digital‐readiness‐index.html#/
  • Clark, C., Strudler, N., & Grove, K. (2015). Comparing asynchronous and synchronous video vs. text based discussions in an online teacher education course. Online Learning, 19(3), 48–69.
  • Claro, M., Preiss, D. D., San Martín, E., Jara, I., Hinostroza, J. E., Valenzuela, S., Cortes, F., & Nussbaum, M. (2012). Assessment of 21st century ICT skills in Chile: Test design and results from high school level students. Computers & Education, 59(3), 1042–1053. 10.1016/j.compedu.2012.04.004
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Routledge Academic.
  • Corbi, A., & Solans, D. B. (2014). Review of current student‐monitoring techniques used in elearning‐focused recommender systems and learning analytics: The experience API & LIME model case study. IJIMAI, 2(7), 44–52.
  • Dabbagh, N., & Kitsantas, A. (2012). Personal learning environments, social media, and self‐regulated learning: A natural formula for connecting formal and informal learning. The Internet and Higher Education, 15(1), 3–8. 10.1016/j.iheduc.2011.06.002
  • Garrison, D. R., Cleveland‐Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. The Internet and Higher Education, 13(1–2), 31–36. 10.1016/j.iheduc.2009.10.002
  • Gašević, D., Adesope, O., Joksimović, S., & Kovanović, V. (2015). Externally‐facilitated regulation scaffolding and role assignment to develop cognitive presence in asynchronous online discussions. The Internet and Higher Education, 24, 53–65. 10.1016/j.iheduc.2014.09.006
  • Gašević, D., Zouaq, A., & Janzen, R. (2013). “Choose your classmates, your GPA is at stake!” The association of cross‐class social ties and academic performance. American Behavioral Scientist, 57(10), 1460–1479.
  • Gikas, J., & Grant, M. M. (2013). Mobile computing devices in higher education: Student perspectives on learning with cellphones, smartphones & social media. The Internet and Higher Education, 19, 18–26.
  • Harvey, D., Greer, D., Basham, J., & Hu, B. (2014). From the student perspective: Experiences of middle and high school students in online learning. American Journal of Distance Education, 28(1), 14–26. 10.1080/08923647.2014.868739
  • Kauffman, H. (2015). A review of predictive factors of student success in and satisfaction with online learning. Research in Learning Technology, 23. 10.3402/rlt.v23.26507
  • Kuo, Y.‐C., Walker, A. E., Belland, B. R., Schroder, K. E., & Kuo, Y.‐T. (2014). A case study of integrating Interwise: Interaction, internet self‐efficacy, and satisfaction in synchronous online learning environments. International Review of Research in Open and Distributed Learning, 15(1), 161–181. 10.19173/irrodl.v15i1.1664
  • Lee, S. J., Srinivasan, S., Trail, T., Lewis, D., & Lopez, S. (2011). Examining the relationship among student perception of support, course satisfaction, and learning outcomes in online learning. The Internet and Higher Education, 14(3), 158–163. 10.1016/j.iheduc.2011.04.001
  • Liu, F., & Cavanaugh, C. (2012). Factors influencing student academic performance in online high school algebra. Open Learning: The Journal of Open, Distance and e‐Learning, 27(2), 149–167. 10.1080/02680513.2012.678613
  • Lou, Y., Bernard, R. M., & Abrami, P. C. (2006). Media and pedagogy in undergraduate distance education: A theory‐based meta‐analysis of empirical literature. Educational Technology Research and Development, 54(2), 141–176. 10.1007/s11423-006-8252-x
  • Malik, M., & Fatima, G. (2017). E‐learning: Students’ perspectives about asynchronous and synchronous resources at higher education level. Bulletin of Education and Research, 39(2), 183–195.
  • McInnerney, J. M., & Roberts, T. S. (2004). Online learning: Social interaction and the creation of a sense of community. Journal of Educational Technology & Society, 7(3), 73–81.
  • Molnar, A., Miron, G., Elgeberi, N., Barbour, M. K., Huerta, L., Shafer, S. R., & Rice, J. K. (2019). Virtual schools in the US 2019. National Education Policy Center.
  • Montague, M., & Rinaldi, C. (2001). Classroom dynamics and children at risk: A followup. Learning Disability Quarterly, 24(2), 75–83.
  • Montag, C., Becker, B., & Gan, C. (2018). The multipurpose application WeChat: A review on recent research. Frontiers in Psychology, 9, 2247. 10.3389/fpsyg.2018.02247
  • Moore, M. G. (1989). Editorial: Three types of interaction. American Journal of Distance Education, 3(2), 1–7. 10.1080/08923648909526659
  • Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor analytic study. Distance Education, 26(1), 29–48. 10.1080/01587910500081269
  • Muirhead, B., & Juwah, C. (2004). Interactivity in computer‐mediated college and university education: A recent review of the literature. Journal of Educational Technology & Society, 7(1), 12–20.
  • Niemi, H. M., & Kousa, P. (2020). A case study of students’ and teachers’ perceptions in a Finnish high school during the COVID pandemic. International Journal of Technology in Education and Science, 4(4), 352–369. 10.46328/ijtes.v4i4.167
  • Oliver, K., Osborne, J., & Brady, K. (2009). What are secondary students’ expectations for teachers in virtual school environments? Distance Education, 30(1), 23–45. 10.1080/01587910902845923
  • Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology, 50(1), 128–138. 10.1111/bjet.12592
  • Ramij, M., & Sultana, A. (2020). Preparedness of online classes in developing countries amid COVID‐19 outbreak: A perspective from Bangladesh (June 29, 2020).
  • Rice, K. L. (2006). A comprehensive look at distance education in the K–12 context. Journal of Research on Technology in Education, 38(4), 425–448. 10.1080/15391523.2006.10782468
  • Shishehchi, S., Banihashem, S. Y., & Zin, N. A. M. (2010). A proposed semantic recommendation system for e‐learning: A rule and ontology based e‐learning recommendation system. In 2010 International Symposium on Information Technology (Vol. 1, pp. 1–5).
  • Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education, 7(1), 59–70. 10.1016/j.iheduc.2003.11.003
  • Spitzer, D. R. (2001). Don’t forget the high‐touch with the high‐tech in distance learning. Educational Technology, 41(2), 51–55.
  • Thomas, R. M. (2000). Comparing theories of child development. Wadsworth/Thomson Learning.
  • United Nations Educational, Scientific and Cultural Organization. (2020, March). Education: From disruption to recovery. https://en.unesco.org/covid19/educationresponse
  • Uttal, D. H., & Cohen, C. A. (2012). Spatial thinking and STEM education: When, why, and how? In Psychology of learning and motivation (Vol. 57, pp. 147–181). Elsevier.
  • Van Lancker, W., & Parolin, Z. (2020). COVID‐19, school closures, and child poverty: A social crisis in the making. The Lancet Public Health, 5(5), e243–e244. 10.1016/S2468-2667(20)30084-0
  • Wang, C.‐H., Shannon, D. M., & Ross, M. E. (2013). Students’ characteristics, self‐regulated learning, technology self‐efficacy, and course outcomes in online learning. Distance Education, 34(3), 302–323. 10.1080/01587919.2013.835779
  • Wang, J., & Antonenko, P. D. (2017). Instructor presence in instructional video: Effects on visual attention, recall, and perceived learning. Computers in Human Behavior, 71, 79–89. 10.1016/j.chb.2017.01.049
  • Wilkinson, A., Roberts, J., & While, A. E. (2010). Construction of an instrument to measure student information and communication technology skills, experience and attitudes to e‐learning. Computers in Human Behavior, 26(6), 1369–1376. 10.1016/j.chb.2010.04.010
  • World Health Organization. (2020, July). Coronavirus disease 2019 (COVID‐19): Situation Report‐164 (Situation Report No. 164). https://www.who.int/docs/default‐source/coronaviruse/situation‐reports/20200702‐covid‐19‐sitrep‐164.pdf?sfvrsn=ac074f58_2
  • Yates, A., Starkey, L., Egerton, B., & Flueggen, F. (2020). High school students’ experience of online learning during Covid‐19: The influence of technology and pedagogy. Technology, Pedagogy and Education, 9, 1–15. 10.1080/1475939X.2020.1854337


What we know about online learning and the homework gap amid the pandemic

A sixth grader completes his homework online in his family's living room in Boston on March 31, 2020.

America’s K-12 students are returning to classrooms this fall after 18 months of virtual learning at home during the COVID-19 pandemic. Some students who lacked the home internet connectivity needed to finish schoolwork during this time – an experience often called the “ homework gap ” – may continue to feel the effects this school year.

Here is what Pew Research Center surveys found about the students most likely to be affected by the homework gap and their experiences learning from home.

Children across the United States are returning to physical classrooms this fall after 18 months at home, raising questions about how digital disparities at home will affect the existing homework gap between certain groups of students.

Methodology for each Pew Research Center poll can be found at the links in the post.

With the exception of the 2018 survey, everyone who took part in the surveys is a member of the Center’s American Trends Panel (ATP), an online survey panel that is recruited through national, random sampling of residential addresses. This way nearly all U.S. adults have a chance of selection. The survey is weighted to be representative of the U.S. adult population by gender, race, ethnicity, partisan affiliation, education and other categories. Read more about the  ATP’s methodology .

The 2018 data on U.S. teens comes from a Center poll of 743 U.S. teens ages 13 to 17 conducted March 7 to April 10, 2018, using the NORC AmeriSpeak panel. AmeriSpeak is a nationally representative, probability-based panel of the U.S. household population. Randomly selected U.S. households are sampled with a known, nonzero probability of selection from the NORC National Frame, and then contacted by U.S. mail, telephone or face-to-face interviewers. Read more details about the NORC AmeriSpeak panel methodology .
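The panel weighting described above can be sketched with raking (iterative proportional fitting), which adjusts respondent weights until weighted sample margins match known population margins. A minimal stdlib-Python illustration with hypothetical demographics and target shares; this is a sketch of the general technique, not Pew's actual weighting procedure:

```python
# Raking (iterative proportional fitting) sketch: adjust unit weights so
# weighted sample margins match known population margins.
# Hypothetical respondents and margins, for illustration only.
sample = [
    {"gender": "f", "edu": "college"},
    {"gender": "f", "edu": "college"},
    {"gender": "f", "edu": "no_college"},
    {"gender": "m", "edu": "college"},
    {"gender": "m", "edu": "no_college"},
]

# Assumed population shares for each attribute (each margin sums to 1).
targets = {
    "gender": {"f": 0.5, "m": 0.5},
    "edu": {"college": 0.4, "no_college": 0.6},
}

weights = [1.0] * len(sample)

for _ in range(50):  # alternate over attributes until margins converge
    for attr, margin in targets.items():
        total = sum(weights)
        factors = {}
        for level, share in margin.items():
            observed = sum(w for w, r in zip(weights, sample)
                           if r[attr] == level)
            factors[level] = share * total / observed
        # Rescale every respondent's weight by their level's factor.
        weights = [w * factors[r[attr]] for w, r in zip(weights, sample)]

# The weighted gender margin now matches the 50/50 target.
f_share = sum(w for w, r in zip(weights, sample)
              if r["gender"] == "f") / sum(weights)
print(round(f_share, 6))  # -> 0.5
```

Real panel weighting rakes over many more dimensions (race, ethnicity, partisanship, education, and so on) and typically trims extreme weights, but the fixed-point iteration is the same.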

Around nine-in-ten U.S. parents with K-12 children at home (93%) said their children have had some online instruction since the coronavirus outbreak began in February 2020, and 30% of these parents said it has been very or somewhat difficult for them to help their children use technology or the internet as an educational tool, according to an April 2021 Pew Research Center survey .

A bar chart showing that mothers and parents with lower incomes are more likely than fathers and those with higher incomes to have trouble helping their children with tech for online learning

Gaps existed for certain groups of parents. For example, parents with lower and middle incomes (36% and 29%, respectively) were more likely to report that this was very or somewhat difficult, compared with just 18% of parents with higher incomes.

This challenge was also prevalent for parents in certain types of communities – 39% of rural residents and 33% of urban residents said they have had at least some difficulty, compared with 23% of suburban residents.

Around a third of parents with children whose schools were closed during the pandemic (34%) said that their child encountered at least one technology-related obstacle to completing their schoolwork during that time. In the April 2021 survey, the Center asked parents of K-12 children whose schools had closed at some point about whether their children had faced three technology-related obstacles. Around a quarter of parents (27%) said their children had to do schoolwork on a cellphone, 16% said their child was unable to complete schoolwork because of a lack of computer access at home, and another 14% said their child had to use public Wi-Fi to finish schoolwork because there was no reliable connection at home.

Parents with lower incomes whose children’s schools closed amid COVID-19 were more likely to say their children faced technology-related obstacles while learning from home. Nearly half of these parents (46%) said their child faced at least one of the three obstacles to learning asked about in the survey, compared with 31% of parents with midrange incomes and 18% of parents with higher incomes.

A chart showing that parents with lower incomes are more likely than parents with higher incomes to say their children have faced tech-related schoolwork challenges in the pandemic

Of the three obstacles asked about in the survey, parents with lower incomes were most likely to say that their child had to do their schoolwork on a cellphone (37%). About a quarter said their child was unable to complete their schoolwork because they did not have computer access at home (25%), or that they had to use public Wi-Fi because they did not have a reliable internet connection at home (23%).

A Center survey conducted in April 2020 found that, at that time, 59% of parents with lower incomes who had children engaged in remote learning said their children would likely face at least one of the obstacles asked about in the 2021 survey.

A year into the outbreak, an increasing share of U.S. adults said that K-12 schools have a responsibility to provide all students with laptop or tablet computers in order to help them complete their schoolwork at home during the pandemic. About half of all adults (49%) said this in the spring 2021 survey, up 12 percentage points from a year earlier. An additional 37% of adults said that schools should provide these resources only to students whose families cannot afford them, and just 13% said schools do not have this responsibility.

A bar chart showing that roughly half of adults say schools have responsibility to provide technology to all students during pandemic

While larger shares of both political parties in April 2021 said K-12 schools have a responsibility to provide computers to all students in order to help them complete schoolwork at home, there was a 15-point change among Republicans: 43% of Republicans and those who lean to the Republican Party said K-12 schools have this responsibility, compared with 28% last April. In the 2021 survey, 22% of Republicans also said schools do not have this responsibility at all, compared with 6% of Democrats and Democratic leaners.

Even before the pandemic, Black teens and those living in lower-income households were more likely than other groups to report trouble completing homework assignments because they did not have reliable technology access. Nearly one-in-five teens ages 13 to 17 (17%) said they are often or sometimes unable to complete homework assignments because they do not have reliable access to a computer or internet connection, a 2018 Center survey of U.S. teens found.

A bar chart showing that in 2018, Black teens and those from lower-income households were especially likely to be impacted by the digital 'homework gap'

One-quarter of Black teens said they were at least sometimes unable to complete their homework due to a lack of digital access, including 13% who said this happened to them often. Just 4% of White teens and 6% of Hispanic teens said this often happened to them. (There were not enough Asian respondents in the survey sample to be broken out into a separate analysis.)

A wide gap also existed by income level: 24% of teens whose annual family income was less than $30,000 said the lack of a dependable computer or internet connection often or sometimes prohibited them from finishing their homework, but that share dropped to 9% among teens who lived in households earning $75,000 or more a year.



Katherine Schaeffer is a research analyst at Pew Research Center .



