
Online Assessment in Higher Education: A Systematic Review

  • Joana Heil, University of Mannheim
  • Dirk Ifenthaler, University of Mannheim; Curtin University, https://orcid.org/0000-0002-2446-6548

Online assessment is defined as a systematic method of gathering information about a learner and learning processes in order to draw inferences about the learner’s dispositions. Online assessments provide opportunities for meaningful feedback and interactive support for learners, and they may influence learner engagement and learning outcomes. The purpose of this systematic literature review is to identify and synthesize original research studies focusing on online assessments in higher education. Out of an initial set of 4,290 publications, a final sample of 114 key publications was identified according to predefined inclusion criteria. The synthesis yielded four main categories of online assessment modes: peer, teacher, automated, and self-assessment. The synthesized findings support the assumption that online assessments have promising potential for supporting and improving online learning processes and outcomes. Success factors for implementing online assessments include instructional support and clearly defined assessment criteria. Future research may focus on online assessments that harness formative and summative data from stakeholders and learning environments to facilitate learning processes in real time and to help decision-makers improve learning environments, i.e., analytics-enhanced assessment.

(*) indicates publications included in the systematic review.

*Abbakumov, D., Desmet, P., & Van den Noortgate, W. (2020). Rasch model extensions for enhanced formative assessments in MOOCs. Applied Measurement in Education, 33(2), 113–123.

*Acosta-Gonzaga, E., & Walet, N. R. (2018). The role of attitudinal factors in mathematical online assessments: A study of undergraduate STEM students. Assessment & Evaluation in Higher Education, 43(5), 710–726.

Admiraal, W., Huisman, B., & van de Ven, M. (2014). Self- and peer assessment in Massive Open Online Courses. International Journal of Higher Education, 3(3), 119–128. https://doi.org/10.5430/ijhe.v3n3p119

*Admiraal, W., Huisman, B., & Pilli, O. (2015). Assessment in Massive Open Online Courses. Electronic Journal of E-Learning, 13(4), 207–216.

Ahmed, A., & Pollitt, A. (2010). The support model for interactive assessment. Assessment in Education: Principles, Policy & Practice, 17(2), 133–167.

*Amhag, L. (2020). Student reflections and self-assessments in vocational training supported by a mobile learning hub. International Journal of Mobile and Blended Learning, 12(1), 1–16.

*ArchMiller, A., Fieberg, J., Walker, J. D., & Holm, N. (2017). Group peer assessment for summative evaluation in a graduate-level statistics course for ecologists. Assessment & Evaluation in Higher Education, 42(8), 1208–1220. https://doi.org/10.1080/02602938.2016.1243219

*Ashton, S., & Davies, R. S. (2015). Using scaffolded rubrics to improve peer assessment in a MOOC writing course. Distance Education, 36(3), 312–334. https://doi.org/10.1080/01587919.2015.1081733

*Azevedo, B. F., Pereira, A. I., Fernandes, F. P., & Pacheco, M. F. (2022). Mathematics learning and assessment using MathE platform: A case study. Education and Information Technologies, 27(2), 1747–1769. https://doi.org/10.1007/s10639-021-10669-y

*Babo, R., Babo, L., Suhonen, J., & Tukiainen, M. (2020). E-Assessment with multiple-choice questions: A 5-year study of students’ opinions and experience. Journal of Information Technology Education: Innovations in Practice, 19, 1–29. https://doi.org/10.28945/4491

*Bacca-Acosta, J., & Avila-Garzon, C. (2021). Student engagement with mobile-based assessment systems: A survival analysis. Journal of Computer Assisted Learning, 37(1), 158–171. https://doi.org/10.1111/jcal.12475

Baker, E., Chung, G., & Cai, L. (2016). Assessment, gaze, refraction, and blur: The course of achievement testing in the past 100 years. Review of Research in Education, 40, 94–142. https://doi.org/10.3102/0091732X16679806

Baleni, Z. (2015). Online formative assessment in higher education: Its pros and cons. Electronic Journal of e-Learning, 13(4), 228–236.

*Bekmanova, G., Ongarbayev, Y., Somzhurek, B., & Mukatayev, N. (2021). Personalized training model for organizing blended and lifelong distance learning courses and its effectiveness in higher education. Journal of Computing in Higher Education, 33(3), 668–683. https://doi.org/10.1007/s12528-021-09282-2

Bektik, D. (2019). Issues and challenges for implementing writing analytics at higher education. In D. Ifenthaler, J. Y.-K. Yau, & D.-K. Mah (Eds.), Utilizing learning analytics to support study success (pp. 143–155). Springer.

Bellotti, F., Kapralos, B., Lee, K., Moreno-Ger, P., & Berta, R. (2013). Assessment in and of serious games: An overview. Advances in Human-Computer Interaction, 2013, Article ID 136864. https://doi.org/10.1155/2013/136864

Bennett, R. E. (2015). The changing nature of educational assessment. Review of Research in Education, 39(1), 370–407. https://doi.org/10.3102/0091732x14554179

*Birks, M., Hartin, P., Woods, C., Emmanuel, E., & Hitchins, M. (2016). Students’ perceptions of the use of eportfolios in nursing and midwifery education. Nurse Education in Practice, 18, 46–51. https://doi.org/10.1016/j.nepr.2016.03.003

Black, P. J. (1998). Testing: friend or foe? The theory and practice of assessment and testing. Falmer Press.

Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21, 5–31. https://doi.org/10.1007/s11092-008-9068-5

*Bohndick, C., Menne, C. M., Kohlmeyer, S., & Buhl, H. M. (2020). Feedback in Internet-based self-assessments and its effects on acceptance and motivation. Journal of Further and Higher Education, 44(6), 717–728. https://doi.org/10.1080/0309877X.2019.1596233

Bonk, C. J., Lee, M. M., Reeves, T. C., & Reynolds, T. H. (Eds.). (2015). MOOCs and open education around the world. Routledge. https://doi.org/10.4324/9781315751108

Boud, D. (2000). Sustainable assessment: rethinking assessment for the learning society. Studies in Continuing Education, 22(2), 151–167. https://doi.org/10.1080/713695728

Carless, D. (2007). Learning-oriented assessment: conceptual bases and practical implications. Innovations in Education and Teaching International, 44(1), 57–66. https://doi.org/10.1080/14703290601081332

*Carnegie, J. (2015). Use of feedback-oriented online exercises to help physiology students construct well-organized answers to short-answer questions. CBE—Life Sciences Education, 14(3), ar25. https://doi.org/10.1187/cbe.14-08-0132

*Carpenter, S. K., Rahman, S., Lund, T. J. S., Armstrong, P. I., Lamm, M. H., Reason, R. D., & Coffman, C. R. (2017). Students’ use of optional online reviews and its relationship to summative assessment outcomes in introductory biology. CBE—Life Sciences Education, 16(2), ar23. https://doi.org/10.1187/cbe.16-06-0205

*Caspari-Sadeghi, S., Forster-Heinlein, B., Maegdefrau, J., & Bachl, L. (2021). Student-generated questions: developing mathematical competence through online assessment. International Journal for the Scholarship of Teaching and Learning, 15(1), 8. https://doi.org/10.20429/ijsotl.2021.150108

*Chaudy, Y., & Connolly, T. (2018). Specification and evaluation of an assessment engine for educational games: Empowering educators with an assessment editor and a learning analytics dashboard. Entertainment Computing, 27, 209–224. https://doi.org/10.1016/j.entcom.2018.07.003

*Chen, X., Breslow, L., & DeBoer, J. (2018). Analyzing productive learning behaviors for students using immediate corrective feedback in a blended learning environment. Computers & Education, 117, 59–74. https://doi.org/10.1016/j.compedu.2017.09.013

*Chen, Z., Jiao, J., & Hu, K. (2021). Formative assessment as an online instruction intervention: Student engagement, outcomes, and perceptions. International Journal of Distance Education Technologies, 19(1), 50–65. https://doi.org/10.4018/IJDET.20210101.oa1

*Chew, E., Snee, H., & Price, T. (2016). Enhancing international postgraduates’ learning experience with online peer assessment and feedback innovation. Innovations in Education and Teaching International, 53(3), 247–259. https://doi.org/10.1080/14703297.2014.937729

Conrad, D., & Openo, J. (2018). Assessment strategies for online learning: engagement and authenticity. Athabasca University Press. https://doi.org/10.15215/aupress/9781771992329.01

*Davis, M. C., Duryee, L. A., Schilling, A. H., Loar, E. A., & Hammond, H. G. (2020). Examining the impact of multiple practice quiz attempts on student exam performance. Journal of Educators Online, 17(2).

*Dermo, J., & Boyne, J. (2014). Assessing understanding of complex learning outcomes and real-world skills using an authentic software tool: A study from biomedical sciences. Practitioner Research in Higher Education, 8(1), 101–112.

Dochy, F. J. R. C., Segers, M., & Sluijsmans, D. (1999). The use of self-, peer and co-assessment in higher education: A review. Studies in Higher Education, 24(3), 331–350.

*Elizondo-Garcia, J., Schunn, C., & Gallardo, K. (2019). Quality of peer feedback in relation to instructional design: a comparative study in energy and sustainability MOOCs. International Journal of Instruction, 12(1), 1025–1040.

Ellis, C. (2013). Broadening the scope and increasing usefulness of learning analytics: the case for assessment analytics. British Journal of Educational Technology, 44(4), 662–664. https://doi.org/10.1111/bjet.12028

*Ellis, S., & Barber, J. (2016). Expanding and personalising feedback in online assessment: a case study in a school of pharmacy. Practitioner Research in Higher Education, 10(1), 121–129.

*Farrelly, D., & Kaplin, D. (2019). Using student feedback to inform change within a community college teacher education program’s ePortfolio initiative. Community College Enterprise, 25(2), 9–38.

*Faulkner, M., Mahfuzul Aziz, S., Waye, V., & Smith, E. (2013). Exploring ways that eportfolios can support the progressive development of graduate qualities and professional competencies. Higher Education Research and Development, 32(6), 871–887.

*Filius, R. M., de Kleijn, R. A. M., Uijl, S. G., Prins, F. J., van Rijen, H. V. M., & Grobbee, D. E. (2019). Audio peer feedback to promote deep learning in online education. Journal of Computer Assisted Learning, 35(5), 607–619. https://doi.org/10.1111/jcal.12363

*Filius, R. M., de Kleijn, R. A. M., Uijl, S. G., Prins, F. J., van Rijen, H. V. M., & Grobbee, D. E. (2018). Strengthening dialogic peer feedback aiming for deep learning in SPOCs. Computers & Education, 125, 86–100. https://doi.org/10.1016/j.compedu.2018.06.004

*Formanek, M., Wenger, M. C., Buxner, S. R., Impey, C. D., & Sonam, T. (2017). Insights about large-scale online peer assessment from an analysis of an astronomy MOOC. Computers & Education, 113, 243–262. https://doi.org/10.1016/j.compedu.2017.05.019

*Förster, M., Weiser, C., & Maur, A. (2018). How feedback provided by voluntary electronic quizzes affects learning outcomes of university students in large classes. Computers & Education, 121, 100–114. https://doi.org/10.1016/j.compedu.2018.02.012

*Fratter, I., & Marigo, L. (2018). Integrated forms of self-assessment and placement testing for Italian L2 aimed at incoming foreign university exchange students at the University of Padua. Language Learning in Higher Education, 8(1), 91–114. https://doi.org/10.1515/cercles-2018-0005

*Gamage, S. H. P. W., Ayres, J. R., Behrend, M. B., & Smith, E. J. (2019). Optimising Moodle quizzes for online assessments. International Journal of STEM Education, 6(1), 1–14. https://doi.org/10.1186/s40594-019-0181-4

*Gámiz Sánchez, V., Montes Soldado, R., & Pérez López, M. C. (2014). Self-assessment via a blended-learning strategy to improve performance in an accounting subject. International Journal of Educational Technology in Higher Education, 11(2), 43–54. https://doi.org/10.7238/rusc.v11i2.2055

*Garcia-Peñalvo, F. J., Garcia-Holgado, A., Vazquez-Ingelmo, A., & Carlos Sanchez-Prieto, J. (2021). Planning, communication and active methodologies: online assessment of the software engineering subject during the COVID-19 crisis. RIED-Revista Iberoamericana de Educación a Distancia, 24(2), 41–66. https://doi.org/10.5944/ried.24.2.27689

Gašević, D., Greiff, S., & Shaffer, D. (2022). Towards strengthening links between learning analytics and assessment: Challenges and potentials of a promising new bond. Computers in Human Behavior, 134, 107304. https://doi.org/10.1016/j.chb.2022.107304

Gašević, D., Joksimović, S., Eagan, B. R., & Shaffer, D. W. (2019). SENS: Network analytics to combine social and cognitive perspectives of collaborative learning. Computers in Human Behavior, 92, 562–577. https://doi.org/10.1016/j.chb.2018.07.003

Gašević, D., Jovanović, J., Pardo, A., & Dawson, S. (2017). Detecting learning strategies with analytics: Links with self-reported measures and academic performance. Journal of Learning Analytics, 4(2), 113–128. https://doi.org/10.18608/jla.2017.42.10

Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333–2351. https://doi.org/10.1016/j.compedu.2011.06.004

*Gleason, J. (2012). Using technology-assisted instruction and assessment to reduce the effect of class size on student outcomes in undergraduate mathematics courses. College Teaching, 60(3), 87–94. https://doi.org/10.1080/87567555.2011.637249

*González-Gómez, D., Jeong, J. S., & Cañada-Cañada, F. (2020). Examining the effect of an online formative assessment tool (O Fat) of students’ motivation and achievement for a university science education. Journal of Baltic Science Education, 19(3), 401–414. https://doi.org/10.33225/jbse/20.19.401

Gottipati, S., Shankararaman, V., & Lin, J. R. (2018). Text analytics approach to extract course improvement suggestions from students’ feedback. Research and Practice in Technology Enhanced Learning, 13(6). https://doi.org/10.1186/s41039-018-0073-0

*Guerrero-Roldán, A.-E., & Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. The Internet and Higher Education, 38, 36–46. https://doi.org/10.1016/j.iheduc.2018.04.005

*Hains-Wesson, R., Wakeling, L., & Aldred, P. (2014). A university-wide ePortfolio initiative at Federation University Australia: Software analysis, test-to-production, and evaluation phases. International Journal of EPortfolio, 4(2), 143–156.

*Hashim, H., Salam, S., Mohamad, S. N. M., & Sazali, N. S. S. (2018). The designing of adaptive self-assessment activities in second language learning using massive open online courses (MOOCs). International Journal of Advanced Computer Science and Applications, 9(9), 276–282.

*Hay, P. J., Engstrom, C., Green, A., Friis, P., Dickens, S., & Macdonald, D. (2013). Promoting assessment efficacy through an integrated system for online clinical assessment of practical skills. Assessment & Evaluation in Higher Education, 38(5), 520–535. https://doi.org/10.1080/02602938.2012.658019

*Herzog, M. A., & Katzlinger, E. (2017). The multiple faces of peer review in higher education. Five learning scenarios developed for digital business. EURASIA Journal of Mathematics, Science & Technology Education, 13(4), 1121–1143. https://doi.org/10.12973/eurasia.2017.00662a

*Hickey, D., & Rehak, A. (2013). Wikifolios and participatory assessment for engagement, understanding, and achievement in online courses. Journal of Educational Multimedia and Hypermedia, 22(4), 407–441.

*Holmes, N. (2018). Engaging with assessment: increasing student engagement through continuous assessment. Active Learning in Higher Education, 19(1), 23–34. https://doi.org/10.1177/1469787417723230

*Hughes, M., Salamonson, Y., & Metcalfe, L. (2020). Student engagement using multiple-attempt "Weekly Participation Task" quizzes with undergraduate nursing students. Nurse Education in Practice, 46, 102803. https://doi.org/10.1016/j.nepr.2020.102803

*Huisman, B., Admiraal, W., Pilli, O., van de Ven, M., & Saab, N. (2018). Peer assessment in MOOCs: the relationship between peer reviewers’ ability and authors’ essay performance. British Journal of Educational Technology, 49(1), 101–110. https://doi.org/10.1111/bjet.12520

*Hwang, W.-Y., Hsu, J.-L., Shadiev, R., Chang, C.-L., & Huang, Y.-M. (2015). Employing self-assessment, journaling, and peer sharing to enhance learning from an online course. Journal of Computing in Higher Education, 27(2), 114–133.

Ifenthaler, D. (2012). Determining the effectiveness of prompts for self-regulated learning in problem-solving scenarios. Journal of Educational Technology & Society, 15(1), 38–52.

Ifenthaler, D. (2023). Automated essay grading systems. In O. Zawacki-Richter & I. Jung (Eds.), Handbook of open, distance and digital education (pp. 1057–1071). Springer. https://doi.org/10.1007/978-981-19-2080-6_59

Ifenthaler, D., & Greiff, S. (2021). Leveraging learning analytics for assessment and feedback. In J. Liebowitz (Ed.), Online learning analytics (pp. 1–18). Auerbach Publications. https://doi.org/10.1201/9781003194620

Ifenthaler, D., Greiff, S., & Gibson, D. C. (2018). Making use of data for assessments: harnessing analytics and data science. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International Handbook of IT in Primary and Secondary Education (2nd ed., pp. 649–663). Springer. https://doi.org/10.1007/978-3-319-71054-9_41

Ifenthaler, D., Schumacher, C., & Kuzilek, J. (2023). Investigating students’ use of self-assessments in higher education using learning analytics. Journal of Computer Assisted Learning, 39(1), 255–268. https://doi.org/10.1111/jcal.12744

*James, R. (2016). Tertiary student attitudes to invigilated, online summative examinations. International Journal of Educational Technology in Higher Education, 13(1), 19. https://doi.org/10.1186/s41239-016-0015-0

*Jarrott, S., & Gambrel, L. E. (2011). The bottomless file box: electronic portfolios for learning and evaluation purposes. International Journal of EPortfolio, 1(1), 85–94.

Johnson, W. L., & Lester, J. C. (2016). Face-to-face interaction with pedagogical agents, twenty years later. International Journal of Artificial Intelligence in Education, 26(1), 25–36. https://doi.org/10.1007/s40593-015-0065-9

*Kim, Y. A., Rezende, L., Eadie, E., Maximillian, J., Southard, K., Elfring, L., Blowers, P., & Talanquer, V. (2021). Responsive teaching in online learning environments: using an instructional team to promote formative assessment and sense of community. Journal of College Science Teaching, 50(4), 17–24.

Kim, Y. J., & Ifenthaler, D. (2019). Game-based assessment: The past ten years and moving forward. In D. Ifenthaler & Y. J. Kim (Eds.), Game-based assessment revisited (pp. 3–12). Springer. https://doi.org/10.1007/978-3-030-15569-8_1

*Kristanto, Y. D. (2018). Technology-enhanced pre-instructional peer assessment: Exploring students’ perceptions in a statistical methods course. Online Submission, 4(2), 105–116.

*Küchemann, S., Malone, S., Edelsbrunner, P., Lichtenberger, A., Stern, E., Schumacher, R., Brünken, R., Vaterlaus, A., & Kuhn, J. (2021). Inventory for the assessment of representational competence of vector fields. Physical Review Physics Education Research, 17(2), 020126. https://doi.org/10.1103/PhysRevPhysEducRes.17.020126

*Kühbeck, F., Berberat, P. O., Engelhardt, S., & Sarikas, A. (2019). Correlation of online assessment parameters with summative exam performance in undergraduate medical education of pharmacology: A prospective cohort study. BMC Medical Education, 19(1), 412. https://doi.org/10.1186/s12909-019-1814-5

*Law, S. (2019). Using digital tools to assess and improve college student writing. Higher Education Studies, 9(2), 117–123.

Lee, H.-S., Gweon, G.-H., Lord, T., Paessel, N., Pallant, A., & Pryputniewicz, S. (2021). Machine learning-enabled automated feedback: Supporting students’ revision of scientific arguments based on data drawn from simulation. Journal of Science Education and Technology, 30(2), 168–192. https://doi.org/10.1007/s10956-020-09889-7

Lenhard, W., Baier, H., Hoffmann, J., & Schneider, W. (2007). Automatische Bewertung offener Antworten mittels Latenter Semantischer Analyse [Automatic scoring of constructed-response items with latent semantic analysis]. Diagnostica, 53(3), 155–165. https://doi.org/10.1026/0012-1924.53.3.155

*Li, L., & Gao, F. (2016). The effect of peer assessment on project performance of students at different learning levels. Assessment & Evaluation in Higher Education, 41(6), 885–900.

*Li, L., Liu, X., & Steckelberg, A. L. (2010). Assessor or assessee: How student learning improves by giving and receiving peer feedback. British Journal of Educational Technology, 41(3), 525–536. https://doi.org/10.1111/j.1467-8535.2009.00968.x

*Liu, E. Z.-F., & Lee, C.-Y. (2013). Using peer feedback to improve learning via online peer assessment. Turkish Online Journal of Educational Technology—TOJET, 12(1), 187–199.

*Liu, X., Li, L., & Zhang, Z. (2018). Small group discussion as a key component in online assessment training for enhanced student learning in web-based peer assessment. Assessment & Evaluation in Higher Education, 43(2), 207–222. https://doi.org/10.1080/02602938.2017.1324018

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459. https://doi.org/10.1177/0002764213479367

*López-Tocón, I. (2021). Moodle quizzes as a continuous assessment in higher education: An exploratory approach in physical chemistry. Education Sciences, 11(9), 500. https://doi.org/10.3390/educsci11090500

*Luaces, O., Díez, J., Alonso-Betanzos, A., Troncoso, A., & Bahamonde, A. (2017). Content-based methods in peer assessment of open-response questions to grade students as authors and as graders. Knowledge-Based Systems, 117, 79–87. https://doi.org/10.1016/j.knosys.2016.06.024

*MacKenzie, L. M. (2019). Improving learning outcomes: Unlimited vs. limited attempts and time for supplemental interactive online learning activities. Journal of Curriculum and Teaching, 8(4), 36–45. https://doi.org/10.5430/jct.v8n4p36

*Mao, J., & Peck, K. (2013). Assessment strategies, self-regulated learning skills, and perceptions of assessment in online learning. Quarterly Review of Distance Education, 14(2), 75–95.

*Martin, F., Ritzhaupt, A., Kumar, S., & Budhrani, K. (2019). Award-winning faculty online teaching practices: Course design, assessment and evaluation, and facilitation. The Internet and Higher Education, 42, 34–43. https://doi.org/10.1016/j.iheduc.2019.04.001

Martin, F., & Whitmer, J. C. (2016). Applying learning analytics to investigate timed release in online learning. Technology, Knowledge and Learning, 21(1), 59–74. https://doi.org/10.1007/s10758-015-9261-9

*Mason, R., & Williams, B. (2016). Using ePortfolios to assess undergraduate paramedic students: a proof of concept evaluation. International Journal of Higher Education, 5(3), 146–154. https://doi.org/10.5430/ijhe.v5n3p146

*McCarthy, J. (2017). Enhancing feedback in higher education: Students’ attitudes towards online and in-class formative assessment feedback models. Active Learning in Higher Education, 18(2), 127–141. https://doi.org/10.1177/1469787417707615

*McCracken, J., Cho, S., Sharif, A., Wilson, B., & Miller, J. (2012). Principled assessment strategy design for online courses and programs. Electronic Journal of E-Learning, 10(1), 107–119.

*McNeill, M., Gosper, M., & Xu, J. (2012). Assessment choices to target higher order learning outcomes: the power of academic empowerment. Research in Learning Technology, 20(3), 283–296.

*McWhorter, R. R., Delello, J. A., Roberts, P. B., Raisor, C. M., & Fowler, D. A. (2013). A cross-case analysis of the use of web-based eportfolios in higher education. Journal of Information Technology Education: Innovations in Practice, 12, 253–286.

*Meek, S. E. M., Blakemore, L., & Marks, L. (2017). Is peer review an appropriate form of assessment in a MOOC? Student participation and performance in formative peer review. Assessment & Evaluation in Higher Education, 42(6), 1000–1013.

*Milne, L., McCann, J., Bolton, K., Savage, J., & Spence, A. (2020). Student satisfaction with feedback in a third year Nutrition unit: A strategic approach. Journal of University Teaching and Learning Practice, 17(5), 67–83. https://doi.org/10.53761/1.17.5.5

Montenegro-Rueda, M., Luque-de la Rosa, A., Sarasola Sánchez-Serrano, J. L., & Fernández-Cerero, J. (2021). Assessment in higher education during the COVID-19 pandemic: A systematic review. Sustainability, 13(19), 10509.

Moore, M. G., & Kearsley, G. (2011). Distance education: a systems view of online learning. Wadsworth Cengage Learning.

*Mora, M. C., Sancho-Bru, J. L., Iserte, J. L., & Sanchez, F. T. (2012). An e-assessment approach for evaluation in engineering overcrowded groups. Computers & Education, 59(2), 732–740. https://doi.org/10.1016/j.compedu.2012.03.011

Newton, P. E. (2007). Clarifying the purposes of educational assessment. Assessment in Education: Principles, Policy & Practice, 14(2), 149–170. https://doi.org/10.1080/09695940701478321

*Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., & Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior, 76, 703–714. https://doi.org/10.1016/j.chb.2017.03.028

*Nicholson, D. T. (2018). Enhancing student engagement through online portfolio assessment. Practitioner Research in Higher Education, 11(1), 15–31.

*Ogange, B. O., Agak, J. O., Okelo, K. O., & Kiprotich, P. (2018). Student perceptions of the effectiveness of formative assessment in an online learning environment. Open Praxis, 10(1), 29–39.

*Ortega-Arranz, A., Bote-Lorenzo, M. L., Asensio-Pérez, J. I., Martínez-Monés, A., Gómez-Sánchez, E., & Dimitriadis, Y. (2019). To reward and beyond: Analyzing the effect of reward-based strategies in a MOOC. Computers & Education, 142, 103639. https://doi.org/10.1016/j.compedu.2019.103639

Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., . . . Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ, 372, n71. https://doi.org/10.1136/bmj.n71

Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. National Academy Press.

*Pinargote-Ortega, M., Bowen-Mendoza, L., Meza, J., & Ventura, S. (2021). Peer assessment using soft computing techniques. Journal of Computing in Higher Education, 33(3), 684–726. https://doi.org/10.1007/s12528-021-09296-w

*Polito, G., & Temperini, M. (2021). A gamified web based system for computer programming learning. Computers and Education: Artificial Intelligence, 2, 100029. https://doi.org/10.1016/j.caeai.2021.100029

*Reilly, E. D., Williams, K. M., Stafford, R. E., Corliss, S. B., Walkow, J. C., & Kidwell, D. K. (2016). Global times call for global measures: investigating automated essay scoring in linguistically-diverse MOOCs. Online Learning, 20(2), 217–229.

*Rogerson-Revell, P. (2015). Constructively aligning technologies with learning and assessment in a distance education master’s programme. Distance Education, 36(1), 129–147.

*Ross, B., Chase, A.-M., Robbie, D., Oates, G., & Absalom, Y. (2018). Adaptive quizzes to increase motivation, engagement and learning outcomes in a first year accounting unit. International Journal of Educational Technology in Higher Education, 15(1), 1–14. https://doi.org/10.1186/s41239-018-0113-2

*Sampaio-Maia, B., Maia, J. S., Leitao, S., Amaral, M., & Vieira-Marques, P. (2014). Wiki as a tool for Microbiology teaching, learning and assessment. European Journal of Dental Education, 18(2), 91–97. https://doi.org/10.1111/eje.12061

*Sancho-Vinuesa, T., Masià, R., Fuertes-Alpiste, M., & Molas-Castells, N. (2018). Exploring the effectiveness of continuous activity with automatic feedback in online calculus. Computer Applications in Engineering Education, 26(1), 62–74. https://doi.org/10.1002/cae.21861

*Santamaría Lancho, M., Hernández, M., Sánchez-Elvira Paniagua, Á., Luzón Encabo, J. M., & de Jorge-Botana, G. (2018). Using semantic technologies for formative assessment and scoring in large courses and MOOCs. Journal of Interactive Media in Education, 2018(1), 1–10. https://doi.org/10.5334/jime.468

*Sarcona, A., Dirhan, D., & Davidson, P. (2020). An overview of audio and written feedback from students’ and instructors’ perspective. Educational Media International, 57(1), 47–60. https://doi.org/10.1080/09523987.2020.1744853

*Scalise, K., Douskey, M., & Stacy, A. (2018). Measuring learning gains and examining implications for student success in STEM. Higher Education Pedagogies, 3(1), 183–195. https://doi.org/10.1080/23752696.2018.1425096

*Schaffer, H. E., Young, K. R., Ligon, E. W., & Chapman, D. D. (2017). Automating individualized formative feedback in large classes based on a directed concept graph. Frontiers in Psychology, 8, 260. https://doi.org/10.3389/fpsyg.2017.00260

*Schultz, M., Young, K., Gunning, T. K., & Harvey, M. L. (2022). Defining and measuring authentic assessment: a case study in the context of tertiary science. Assessment & Evaluation in Higher Education, 47(1), 77–94. https://doi.org/10.1080/02602938.2021.1887811

Schumacher, C., & Ifenthaler, D. (2021). Investigating prompts for supporting students' self-regulation—A remaining challenge for learning analytics approaches? The Internet and Higher Education, 49, 100791. https://doi.org/10.1016/j.iheduc.2020.100791

*Sekendiz, B. (2018). Utilisation of formative peer-assessment in distance online education: A case study of a multi-model sport management unit. Interactive Learning Environments, 26(5), 682–694. https://doi.org/10.1080/10494820.2017.1396229

*Senel, S., & Senel, H. C. (2021). Remote assessment in higher education during COVID-19 pandemic. International Journal of Assessment Tools in Education, 8(2), 181–199.

*Shaw, L., MacIsaac, J., & Singleton-Jackson, J. (2019). The efficacy of an online cognitive assessment tool for enhancing and improving student academic outcomes. Online Learning Journal, 23(2), 124–144. https://doi.org/10.24059/olj.v23i2.1490

Shute, V. J., Wang, L., Greiff, S., Zhao, W., & Moore, G. (2016). Measuring problem solving skills via stealth assessment in an engaging video game. Computers in Human Behavior, 63, 106–117. https://doi.org/10.1016/j.chb.2016.05.047

Stödberg, U. (2012). A research review of e-assessment. Assessment & Evaluation in Higher Education, 37(5), 591–604. https://doi.org/10.1080/02602938.2011.557496

*Stratling, R. (2017). The complementary use of audience response systems and online tests to implement repeat testing: a case study. British Journal of Educational Technology, 48(2), 370–384. https://doi.org/10.1111/bjet.12362

*Sullivan, D., & Watson, S. (2015). Peer assessment within hybrid and online courses: Students’ view of its potential and performance. Journal of Educational Issues, 1(1), 1–18. https://doi.org/10.5296/jei.v1i1.7255

*Taghizadeh, M., Alavi, S. M., & Rezaee, A. A. (2014). Diagnosing L2 learners’ language skills based on the use of a web-based assessment tool called DIALANG. International Journal of E-Learning & Distance Education, 29(2), n2.

*Tawafak, R. M., Romli, A. M., & Alsinani, M. J. (2019). Student assessment feedback effectiveness model for enhancing teaching method and developing academic performance. International Journal of Information and Communication Technology Education, 15(3), 75–88. https://doi.org/10.4018/IJICTE.2019070106

*Tempelaar, D. (2020). Supporting the less-adaptive student: The role of learning analytics, formative assessment and blended learning. Assessment & Evaluation in Higher Education, 45(4), 579–593.

Tempelaar, D. T., Rienties, B., Mittelmeier, J., & Nguyen, Q. (2018). Student profiling in a dispositional learning analytics application using formative assessment. Computers in Human Behavior, 78, 408–420. https://doi.org/10.1016/j.chb.2017.08.010

*Tenório, T., Bittencourt, I. I., Isotani, S., Pedro, A., & Ospina, P. (2016). A gamified peer assessment model for on-line learning environments in a competitive context. Computers in Human Behavior, 64, 247–263. https://doi.org/10.1016/j.chb.2016.06.049

*Thille, C., Schneider, E., Kizilcec, R. F., Piech, C., Halawa, S. A., & Greene, D. K. (2014). The future of data-enriched assessment. Research & Practice in Assessment, 9, 5–16.

*Tsai, N. W. (2016). Assessment of students’ learning behavior and academic misconduct in a student-pulled online learning and student-governed testing environment: A case study. Journal of Education for Business, 91(7), 387–392. https://doi.org/10.1080/08832323.2016.1238808

*Tucker, C., Pursel, B. K., & Divinsky, A. (2014). Mining student-generated textual data in MOOCs and quantifying their effects on student performance and learning outcomes. Computers in Education Journal, 5(4), 84–95.

*Tucker, R. (2014). Sex does not matter: Gender bias and gender differences in peer assessments of contributions to group work. Assessment & Evaluation in Higher Education, 39(3), 293–309. https://doi.org/10.1080/02602938.2013.830282

Turkay, S., & Tirthali, D. (2010). Youth leadership development in virtual worlds: A case study. Procedia - Social and Behavioral Sciences, 2(2), 3175–3179. https://doi.org/10.1016/j.sbspro.2010.03.485

*Turner, J., & Briggs, G. (2018). To see or not to see? Comparing the effectiveness of examinations and end of module assessments in online distance learning. Assessment & Evaluation in Higher Education, 43(7), 1048–1060. https://doi.org/10.1080/02602938.2018.1428730

*Vaughan, N. (2014). Student engagement and blended learning: Making the assessment connection. Education Sciences, 4(4), 247–264. https://doi.org/10.3390/educsci4040247

*Wadmany, R., & Melamed, O. (2018). "New Media in Education" MOOC: Improving peer assessments of students’ plans and their innovativeness. Journal of Education and E-Learning Research, 5(2), 122–130. https://doi.org/10.20448/journal.509.2018.52.122.130

*Wang, S., & Wang, H. (2012). Organizational schemata of e-portfolios for fostering higher-order thinking. Information Systems Frontiers, 14(2), 395–407. https://doi.org/10.1007/s10796-010-9262-0

*Wang, Y.-M. (2019). Enhancing the quality of online discussion—assessment matters. Journal of Educational Technology Systems, 48(1), 112–129. https://doi.org/10.1177/0047239519861

*Watson, S. L., Watson, W. R., & Kim, W. (2017). Primary assessment activity and learner perceptions of attitude change in four MOOCs. Educational Media International, 54(3), 245–260. https://doi.org/10.1080/09523987.2017.1384165

Webb, M., Gibson, D. C., & Forkosh-Baruch, A. (2013). Challenges for information technology supporting educational assessment. Journal of Computer Assisted Learning, 29(5), 451–462. https://doi.org/10.1111/jcal.12033

Webb, M., & Ifenthaler, D. (2018). Assessment as, for and of 21st century learning using information technology: An overview. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International Handbook of IT in Primary and Secondary Education (2nd ed., pp. 1–20). Springer.

Wei, X., Saab, N., & Admiraal, W. (2021). Assessment of cognitive, behavioral, and affective learning outcomes in massive open online courses: A systematic literature review. Computers & Education, 163, 104097.

*Wells, J., Spence, A., & McKenzie, S. (2021). Student participation in computing studies to understand engagement and grade outcome. Journal of Information Technology Education, 20, 385–403. https://doi.org/10.28945/4817

*West, J., & Turner, W. (2016). Enhancing the assessment experience: Improving student perceptions, engagement and understanding using online video feedback. Innovations in Education and Teaching International, 53(4), 400–410. https://doi.org/10.1080/14703297.2014.1003954

Whitelock, D., & Bektik, D. (2018). Progress and challenges for automated scoring and feedback systems for large-scale assessments. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International Handbook of IT in Primary and Secondary Education (2nd ed., pp. 617–634). Springer.

*Wilkinson, K., Dafoulas, G., Garelick, H., & Huyck, C. (2020). Are quiz-games an effective revision tool in anatomical sciences for higher education and what do students think of them? British Journal of Educational Technology, 51(3), 761–777. https://doi.org/10.1111/bjet.12883

*Wu, C., Chanda, E., & Willison, J. (2014). Implementation and outcomes of online self and peer assessment on group based honours research projects. Assessment & Evaluation in Higher Education, 39(1), 21–37. https://doi.org/10.1080/02602938.2013.779634

*Xian, L. (2020). The effectiveness of dynamic assessment in linguistic accuracy in EFL writing: An investigation assisted by online scoring systems. Language Teaching Research Quarterly, 18, 98–114.

*Yang, X., & Gao, H. (2018). Teaching business English course: Incorporating portfolio assessment-based blended learning and MOOC. Journal of Literature and Art Studies, 8(9), 1364–1369. https://doi.org/10.17265/2159-5836/2018.09.008

*Yang, T. C., Chen, S. Y., & Chen, M. C. (2016). An investigation of a two-tier test strategy in a university calculus course: Causes versus consequences. IEEE Transactions on Learning Technologies, 9(2), 146–156.

*Yeh, H.-C., & Lai, P.-Y. (2012). Implementing online question generation to foster reading comprehension. Australasian Journal of Educational Technology, 28(7), 1152–1175.

*Zhan, Y. (2021). What matters in design? Cultivating undergraduates’ critical thinking through online peer assessment in a confucian heritage context. Assessment & Evaluation in Higher Education, 46(4), 615–630. https://doi.org/10.1080/02602938.2020.1804826

*Zong, Z., Schunn, C. D., & Wang, Y. (2021). What aspects of online peer feedback robustly predict growth in students’ task performance? Computers in Human Behavior, 124, 106924. https://doi.org/10.1016/j.chb.2021.106924


The Role of Online Formative Assessment in Higher Education: Effectiveness and Student Satisfaction

  • First Online: 30 June 2022


  • Wendy Farrell
  • Juliana Pattermann

Part of the book series: Forschung und Praxis an der FHWien der WKW ((FPGHW))


It is critical to understand how to design courses that keep students engaged and help them achieve the desired learning outcome. This is especially true in an online classroom where student reaction is harder to gauge. Accordingly, this exploratory study attempts to understand how incorporating frequent formative assessments, or ongoing knowledge checks, in online classes can impact student engagement and achievement. Effectiveness was evaluated according to reaction and learning, the first two levels of the Kirkpatrick four-level model. Learning Analytics helped assess effectiveness. Results indicate that according to the first level of the model, reaction, students appreciate a more interactive class. Furthermore, participation in the formative assessments was analyzed in conjunction with the summative assessments, such as exams and graded assignments, to evaluate the second level, learning. Here too, the results show a significant positive impact of increased participation in achieving the desired learning outcomes. This study shows the importance of learning analytics, even with little data, to make informed decisions about course design. Furthermore, it shows the importance of ensuring formative assessments occur at regular intervals throughout online classes. It offers students a better experience and helps them better achieve the desired learning outcome.


Alliger GM, Tannenbaum SI, Bennett W et al (1997) A meta-analysis of the relations among training criteria. Pers Psychol 50:341–358. https://doi.org/10.1111/j.1744-6570.1997.tb00911.x


Arroway P, Morgan G, O’Keefe M, Yanosky R (2015) Learning analytics in higher education. EDUCAUSE Center for Analysis and Research, Louisville


Arthur W, Tubré T, Paul DS, Edens PS (2003) Teaching effectiveness: the relationship between reaction and learning evaluation criteria. Educ Psychol 23:275–285. https://doi.org/10.1080/0144341032000060110

Baleni ZG (2015) Online formative assessment in higher education: its pros and cons. Electron J e-Learn 13:228–236

Bassi F (2018) Dynamic clustering to evaluate satisfaction with teaching at university. Int J Educ Manag 32:1070–1081. https://doi.org/10.1108/IJEM-07-2017-0162

Bates R (2004) A critical analysis of evaluation practice: the Kirkpatrick model and the principle of beneficence. Eval Prog Plan 27:341–347. https://doi.org/10.1016/j.evalprogplan.2004.04.011

Beebe R, Vonderwell S, Boboc M (2010) Emerging patterns in transferring assessment practices from F2f to online environments. Electron J e-Learn 8:1–12

Berendt B, Littlejohn A, Kern P et al (2017) Big Data for monitoring educational systems. European Commission

Bers TH (2008) The role of institutional assessment in assessing student learning outcomes. New Direct High Educ 2008:31–39. https://doi.org/10.1002/he.291

Black P, Wiliam D (1998) Assessment and classroom learning. Assess Educ Princ Policy Pract 5:7–74. https://doi.org/10.1080/0969595980050102

Broos T, Verbert K, Langie G et al (2017) Small data as a conversation starter for learning analytics: exam results dashboard for first-year students in higher education. J Res Innovat Teach Learn 10:94–106. https://doi.org/10.1108/JRIT-05-2017-0010

Chang H-P (2010) Applying adaptive course caching and presentation strategies in M-learning environment. In: Proceedings of the 2010 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM). IEEE, Macao, China, pp 1314–1318.

Dipace A, Loperfido FF, Scarinci A (2018) From big data to learning analytics for a personalized learning experience. Res Educ Media 10:3–9. https://doi.org/10.1515/rem-2018-0009

Flodén J (2017) The impact of student feedback on teaching in higher education. Assess Eval High Educ 42:1054–1068. https://doi.org/10.1080/02602938.2016.1224997

Gikandi JW, Morrow D, Davis NE (2011) Online formative assessment in higher education: a review of the literature. Comput Educ 57:2333–2351. https://doi.org/10.1016/j.compedu.2011.06.004

Goodhue D, Lewis W, Thompson R (2006) PLS, small sample size, and statistical Power in MIS Research. In: Proceedings of the 39th Annual Hawaii International Conference on System Sciences (HICSS'06). IEEE, Kauia

Hair JF, Hult GTM, Ringle C, Sarstedt M (2016) A primer on partial least squares structural equation modeling (PLS-SEM) (2nd Edn.). Kindle Edition. Sage Publications.

Hargreaves E (2008) Assessment. In: McCulloch G, Crook D (eds) The Routledge international encyclopedia of education. Taylor and Francis, Hoboken, pp 37–38.

Hattie J, Timperley H (2007) The power of feedback. Rev Educ Res 77:81–112. https://doi.org/10.3102/003465430298487

Ifenthaler D (2015) Learning analytics. In: Spector JM (ed) The SAGE encyclopedia of educational technology. SAGE Publications, Thousand Oaks

Kirkpatrick D (1967) Evaluation of training. In: Craig RL, Bittel LR (eds) Training and development handbook. McGraw-Hill Book Company, New York

Leitner P, Khalil M, Ebner M (2017) Learning analytics in higher education – a literature review. In: Peña-Ayala A (ed) Learning analytics: fundaments, applications, and trends. Springer International Publishing AG, Basel, pp 1–23

Lindner MA (2019) Lernbegleitende Tests in der Hochschullehre als Feedback für Studierende und Lehrende: Die Rolle des Aufgabenformats. jlb. https://doi.org/10.35468/jlb-01-2019_05

Meyer DZ, Avery LM (2009) Excel as a qualitative data analysis tool. Field Methods 21:91–112. https://doi.org/10.1177/1525822X08323985

Ogange BO, Agak J, Okelo KO, Kiprotich P (2018) Student perceptions of the effectiveness of formative assessment in an online learning environment. openpraxis 10:29. https://doi.org/10.5944/openpraxis.10.1.705

Parker RE (2013) Evaluation considerations. In: Cutting-edge technologies in higher education. Emerald Group Publishing Limited, pp 113–141

Praslova L (2010) Adaptation of Kirkpatrick's four level model of training criteria to assessment of learning outcomes and program evaluation in Higher Education. Educ Assess Eval Acc 22:215–225. https://doi.org/10.1007/s11092-010-9098-7

Prvan T, Reid A, Petocz P (2002) Statistical laboratories using Minitab, SPSS and Excel: a practical comparison. Teach Statist 24:68–75. https://doi.org/10.1111/1467-9639.00089

Sclater N, Bailey P (2015) Code of practice for learning analytics.

Sclater N, Peasgood A, Mullan J (2016) Learning analytics in higher education: a review of UK and international practice. JISC, Bristol

Shute VJ (2008) Focus on formative feedback. Rev Educ Res 78:153–189. https://doi.org/10.3102/0034654307313795

Tempelaar D (2020) Supporting the less-adaptive student: the role of learning analytics, formative assessment and blended learning. Assess Eval High Educ 45:579–593. https://doi.org/10.1080/02602938.2019.1677855

Tempelaar D, Rienties B, Mittelmeier J, Nguyen Q (2018) Student profiling in a dispositional learning analytics application using formative assessment. Comput Hum Behav 78:408–420. https://doi.org/10.1016/j.chb.2017.08.010

Tempelaar DT, Rienties B, Giesbers B (2015) In search for the most informative data for feedback generation: learning analytics in a data-rich context. Comput Hum Behav 47:157–167. https://doi.org/10.1016/j.chb.2014.05.038

Turnbull D, Chugh R, Luck J (2021) Transitioning to E-Learning during the COVID-19 pandemic: how have Higher Education Institutions responded to the challenge? Educ Inf Technol 26:6401–6419. https://doi.org/10.1007/s10639-021-10633-w

Vella EJ, Turesky EF, Hebert J (2016) Predictors of academic success in web-based courses: age, GPA, and instruction mode. Qual Assur Educ 24:586–600. https://doi.org/10.1108/QAE-08-2015-0035

Vonderwell SK, Boboc M (2013) Promoting formative assessment in online teaching and learning. Techtrends Tech Trends 57:22–27. https://doi.org/10.1007/s11528-013-0673-x

Wong BT-M, Li KC, Choi SP-M (2018) Trends in learning analytics practices: a review of higher education institutions. Interact Technol Smart Educ 15:132–154. https://doi.org/10.1108/ITSE-12-2017-0065


Author information

Authors and affiliations

Betriebswirtschaft online, Management Center Innsbruck, Innsbruck, Austria

Wendy Farrell

Wirtschaft & Management, Management Center Innsbruck, Innsbruck, Austria

Juliana Pattermann


Corresponding author

Correspondence to Wendy Farrell.


About this chapter

Farrell, W., Pattermann, J. (2022). The Role of Online Formative Assessment in Higher Education: Effectiveness and Student Satisfaction. In: Rußmann, U., Aubke, F., Ortiz, D., Pezenka, I., Schulz, AC., Schweiger, C. (eds) Zukunft verantwortungsvoll gestalten. Forschung und Praxis an der FHWien der WKW. Springer Gabler, Wiesbaden. https://doi.org/10.1007/978-3-658-36861-6_11


Three approaches to improve your online teaching

When designing online courses and teaching remotely, teachers need to select the framework that supports learning goals. Here, three academics break online learning techniques into their key parts

Antoni Badia, Consuelo García and Julio Meneses


Many university teachers need help identifying the right conceptual framework for designing online courses and teaching in virtual learning environments (VLEs). A quick online search will suggest a multitude of pedagogic approaches and supposed innovative teaching methods, but many of these are not evidence-based. 

So, here we will summarise three research-based online teaching approaches that teachers can use to guide their work in VLEs. We explain the focus, roles of teacher and students, the online learning environment and learning resources for each approach.

The content-acquisition approach

The main aim of this approach is to provide subject content to students. It is designed to cover all the essential concepts and principles of the subject matter. The content should be accessible, clear, concise, relevant, interactive, well explained and up to date to facilitate the students’ comprehensive understanding of the subject matter.


Teacher role: The teacher is the primary source of knowledge. They are responsible for selecting and organising the course content, choosing the appropriate learning resources for students, designing the learning activities and assessing students’ learning outcomes.

Student role: The student is expected to acquire and learn the course content from the provided resources and complete the learning assignments. Each student’s acquired knowledge should accurately represent the course content. Most students’ learning time is devoted to self-paced learning.

Online learning environment: This should facilitate learners’ content acquisition. Thus, technology-based tools are mainly used to produce, provide access to and deliver learning resources and materials. The VLE will contain lesson plans, activities, grades and class recordings. The technology used needs to deliver live online classes, with teachers’ real-time explanations, to many students worldwide. A prototype example of the content-acquisition approach can be seen in most massive open online courses (Moocs).

Learning resources: Learning resources may be selected and designed to accurately represent the nature of the content and enhance students’ learning experience. Various kinds of resources may be used to represent and deliver the content, such as multimedia (video and audio tools), authoring tools, interactive resources (computer simulations, games, virtual labs), mobile resources and adaptive resources (intelligent tutoring systems).

The knowledge-building approach

The key feature of this approach is the knowledge-building processes through which students expand their individual understanding.

Teacher role: The teacher’s primary role is to guide, help and supervise students’ learning. The course design includes learning objectives, content, activities, scaffolding and support, and assessments. Course implementation involves supervision to ensure the correct application of learning skills and adequate task completion, correcting students’ misunderstandings, and monitoring and assessing the learning process and learning outcomes.

Student role: The student should actively engage in an individual knowledge-building process and activate high-level thinking processes. This is encouraged through enquiry-based assignments in which students must find solutions to project-based challenges or work through real-world or theoretical examples. Such assignments require a deep understanding of the content, challenging students to explore information, apply critical thinking, ask questions, design enquiries, interpret evidence, form explanations and communicate findings. Students must make connections, and apply and use their knowledge in authentic learning situations.

Online learning environment: Technology should help students to develop the best individual knowledge-building processes, learn content and complete activities. These environments integrate learning tools that support students’ learning and work on specific higher-order-thinking processes such as solving authentic problems. The Web-based Inquiry Science Environment (WISE) provides a good example of an online learning environment of this kind.

Learning resources: Generally referred to as instructional scaffolding, resources combine content and learning guides, question prompts, model examples and conceptual, procedural, strategic and metacognitive supports to guide learning.

The collaborative learning approach

The critical feature of this approach is students’ collaboration in learning in virtual environments, usually using synchronous and written communication. 

Teacher role: The teacher aims to facilitate high-quality social participation among students in collaborative learning activities. To achieve this, the teacher should create a learning environment that promotes relationships of trust and mutual commitment and provide learning resources that guide students’ social participation and collaboration in groups and in the community. The teacher should encourage communication, solve student conflicts, control learning periods and monitor students’ learning pace. Content and activities, such as online debates and team presentations, should be designed to facilitate collaboration and learning among equals.

Student role: Students engage naturally and proactively in community participation and group work and collaborate with peers to complete learning assignments, solve problems and share knowledge and perspectives. They participate in discussion and group work, and use critical thinking and creativity.

Online learning environment: Digital tools should be used that facilitate group-work organisation, promote group communication and collaboration, support knowledge exchange, and elaborate on shared outcomes. One example of such an online environment is Moodle, particularly the chat, forum, wiki and glossary functions.

Learning resources: Participants are expected to take an active role, building on the initial selection of learning resources the teacher provides according to their learning needs. For example, a shared learning task for students may be solving a problem or tackling a learning challenge collaboratively. In that situation, the students should search for and select new information related to the specific issue or challenge of the learning assignment and share it with their peers.

Teachers may use a single teaching approach to design an online course, but it is more likely they will take features from all three and combine them to best fit the different elements of the course and its learning objectives.

Antoni Badia is director of the bachelor’s degree in primary education, and Julio Meneses is director of learning and teaching analysis at the eLearning Innovation Center; both are at the Open University of Catalonia. Consuelo García is vice-rector for teaching and educational innovation at the Valencian International University.


For more teaching and learning insight from the authors, see “Emotions in response to teaching online: Exploring the factors influencing teachers in a fully online university”, as published in Innovations in Education and Teaching International.

Reflecting on ASU+GSV Summit 2024: Navigating the AI Landscape in Education

The ASU+GSV Summit 2024 , a cornerstone event in the edtech conference calendar, once again brought together a diverse array of thought leaders, innovators, educators, investors, and other stakeholders from across the globe to explore the intersection of education and technology. Against the backdrop of a rapidly evolving technological landscape, this year's Summit delved deep into the transformative potential of artificial intelligence (AI) in shaping the future of learning, even adding the new AIR Show as a free, educator-centric event to showcase AI tools, use cases, and actionable best practices. 

The Instructure team was honored to participate in these conversations and learn from all the sessions, meetings, and casual run-ins with brilliant people working to understand, define, and support the future of learning. From the ubiquitous presence of AI-driven products to the pressing concerns surrounding data safety and ethical implications, this year's Summit was a catalyst for robust dialogue and forward-thinking initiatives aimed at reimagining the educational paradigm. Here are seven key themes and takeaways from this year's ASU+GSV Summit.

Thoughtful AI Integration: AI is dominating the conversation about education technology and making its way into the classroom and back office. Many AI tools rely on GPT-like chat interfaces for tutoring and content generation, while others push the boundaries of innovation. Amidst this progress, concerns loom large around data safety and the social implications of AI integration, and educators are asking for additional training to most effectively leverage these tools.

Policy Gaps in Education: Despite AI's growing presence, educational institutions are grappling with a lack of solid policies. A staggering 79% of K12 districts lack clear guidelines, as revealed by an EdWeek Market Brief survey. Higher ed institutions are in a similar situation: a recent Inside Higher Ed poll found that only one in five colleges and universities had published AI policies. The call for guidance in navigating AI's ethical and practical dimensions is clear, but there is also recognition that policies need to remain flexible as our understanding and use of AI evolves.

Co-designing AI with Stakeholders: It's imperative to involve educators, students, and parents in the co-design process of AI tools. This collaborative approach helps to ensure that AI is contextual, safe, and effective. Moreover, underserved communities must be prioritized, and efforts should consider fostering more person-to-person interactions alongside technological advancements.

Revamping Education for Career Readiness: The traditional 'Portrait of a Graduate' framework is under scrutiny in light of evolving job markets and increasingly tech-enabled learning. There's a pressing need to revamp educational systems to better align with career readiness. K12 and higher education institutions must work in tandem to bridge the gap and prepare students effectively for evolving job landscapes.

Mobility and Universal Taxonomies: Mobility is key in preparing individuals for careers in rapidly evolving industries. Universal taxonomies are being considered as tools to aid learners in navigating the dynamic job market. However, flexibility is paramount to accommodate the nuances of different sectors and individual organizations.

Funding Challenges and Program Priorities: The looming ESSER funding cliff is a pressing concern for K12 organizations, with programs like social-emotional learning and summer learning at highest risk of defunding. Core programs and high-dosage tutoring are most in demand amongst administrators, but edtech providers will increasingly need to differentiate their solutions and demonstrate efficacy. Higher ed institutions are facing reductions due to the end of stimulus funds, too, in addition to budget challenges related to declining enrollments, and are also looking closely at technology spending.

Focus on Evidence-Based Approaches: In line with the ESSER funding cliff, evidence matters more and more in edtech. In their session on Designing with Evidence, Digital Promise emphasized innovative, rapid-cycle approaches (like those offered by LearnPlatform) to support more effective product development. This highlights a shift toward more agile and responsive methods of evaluating educational interventions that help educators take action and make decisions.

Coming back from a conference like the ASU+GSV Summit is always invigorating. For the Instructure team, it reinforces our mission to inspire everyone to learn together and our core values of openness, relationships, equality, ownership, and simplicity. We're excited to continue tackling these big challenges, especially those related to AI, collaboratively with our institutional and edtech provider partners. Want to learn more about how we're getting started? Check out this blog on AI and effective edtech evaluation.

