Online Assessment in Higher Education: A Systematic Review
Keywords: Online assessment, feedback, systematic review, formative assessment, summative assessment, learning analytics
Online assessment is defined as a systematic method of gathering information about a learner and the learning process in order to draw inferences about the learner’s dispositions. Online assessments provide opportunities for meaningful feedback and interactive support for learners, and may influence learner engagement and learning outcomes. The purpose of this systematic literature review is to identify and synthesize original research studies focusing on online assessments in higher education. Out of an initial set of 4,290 publications, a final sample of 114 key publications was identified according to predefined inclusion criteria. The synthesis yielded four main categories of online assessment modes: peer, teacher, automated, and self-assessment. The findings support the assumption that online assessments hold promising potential for supporting and improving online learning processes and outcomes. Success factors for implementing online assessments include instructional support and clearly defined assessment criteria. Future research may focus on online assessments that harness formative and summative data from stakeholders and learning environments to facilitate learning processes in real time and help decision-makers improve learning environments, i.e., analytics-enhanced assessment.
(*) indicates publications included in the systematic review.
*Abbakumov, D., Desmet, P., & Van den Noortgate, W. (2020). Rasch model extensions for enhanced formative assessments in MOOCs. Applied Measurement in Education, 33(2), 113–123.
*Acosta-Gonzaga, E., & Walet, N. R. (2018). The role of attitudinal factors in mathematical online assessments: A study of undergraduate STEM students. Assessment & Evaluation in Higher Education, 43(5), 710–726.
Admiraal, W., Huisman, B., & van de Ven, M. (2014). Self- and peer assessment in Massive Open Online Courses. International Journal of Higher Education, 3(3), 119–128. https://doi.org/10.5430/ijhe.v3n3p119
*Admiraal, W., Huisman, B., & Pilli, O. (2015). Assessment in Massive Open Online Courses. Electronic Journal of E-Learning, 13(4), 207–216.
Ahmed, A., & Pollitt, A. (2010). The support model for interactive assessment. Assessment in Education: Principles, Policy & Practice, 17(2), 133–167.
*Amhag, L. (2020). Student reflections and self-assessments in vocational training supported by a mobile learning hub. International Journal of Mobile and Blended Learning, 12(1), 1–16.
*ArchMiller, A., Fieberg, J., Walker, J. D., & Holm, N. (2017). Group peer assessment for summative evaluation in a graduate-level statistics course for ecologists. Assessment & Evaluation in Higher Education, 42(8), 1208–1220. https://doi.org/10.1080/02602938.2016.1243219
*Ashton, S., & Davies, R. S. (2015). Using scaffolded rubrics to improve peer assessment in a MOOC writing course. Distance Education, 36(3), 312–334. https://doi.org/10.1080/01587919.2015.1081733
*Azevedo, B. F., Pereira, A. I., Fernandes, F. P., & Pacheco, M. F. (2022). Mathematics learning and assessment using MathE platform: A case study. Education and Information Technologies, 27(2), 1747–1769. https://doi.org/10.1007/s10639-021-10669-y
*Babo, R., Babo, L., Suhonen, J., & Tukiainen, M. (2020). E-Assessment with multiple-choice questions: A 5-year study of students’ opinions and experience. Journal of Information Technology Education: Innovations in Practice, 19, 1–29. https://doi.org/10.28945/4491
*Bacca-Acosta, J., & Avila-Garzon, C. (2021). Student engagement with mobile-based assessment systems: A survival analysis. Journal of Computer Assisted Learning, 37(1), 158–171. https://doi.org/10.1111/jcal.12475
Baker, E., Chung, G., & Cai, L. (2016). Assessment, gaze, refraction, and blur: The course of achievement testing in the past 100 years. Review of Research in Education, 40, 94–142. https://doi.org/10.3102/0091732X16679806
Baleni, Z. (2015). Online formative assessment in higher education: Its pros and cons. Electronic Journal of e-Learning, 13(4), 228–236.
*Bekmanova, G., Ongarbayev, Y., Somzhurek, B., & Mukatayev, N. (2021). Personalized training model for organizing blended and lifelong distance learning courses and its effectiveness in higher education. Journal of Computing in Higher Education, 33(3), 668–683. https://doi.org/10.1007/s12528-021-09282-2
Bektik, D. (2019). Issues and challenges for implementing writing analytics at higher education. In D. Ifenthaler, J. Y.-K. Yau, & D.-K. Mah (Eds.), Utilizing learning analytics to support study success (pp. 143–155). Springer.
Bellotti, F., Kapralos, B., Lee, K., Moreno-Ger, P., & Berta, R. (2013). Assessment in and of serious games: An overview. Advances in Human-Computer Interaction, 2013, Article 136864. https://doi.org/10.1155/2013/136864
Bennett, R. E. (2015). The changing nature of educational assessment. Review of Research in Education, 39(1), 370–407. https://doi.org/10.3102/0091732x14554179
*Birks, M., Hartin, P., Woods, C., Emmanuel, E., & Hitchins, M. (2016). Students’ perceptions of the use of eportfolios in nursing and midwifery education. Nurse Education in Practice, 18, 46–51. https://doi.org/10.1016/j.nepr.2016.03.003
Black, P. J. (1998). Testing: friend or foe? The theory and practice of assessment and testing. Falmer Press.
Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21, 5–31. https://doi.org/10.1007/s11092-008-9068-5
*Bohndick, C., Menne, C. M., Kohlmeyer, S., & Buhl, H. M. (2020). Feedback in Internet-based self-assessments and its effects on acceptance and motivation. Journal of Further and Higher Education, 44(6), 717–728. https://doi.org/10.1080/0309877X.2019.1596233
Bonk, C. J., Lee, M. M., Reeves, T. C., & Reynolds, T. H. (Eds.). (2015). MOOCs and open education around the world. Routledge. https://doi.org/10.4324/9781315751108
Boud, D. (2000). Sustainable assessment: rethinking assessment for the learning society. Studies in Continuing Education, 22(2), 151–167. https://doi.org/10.1080/713695728
Carless, D. (2007). Learning-oriented assessment: conceptual bases and practical implications. Innovations in Education and Teaching International, 44(1), 57–66. https://doi.org/10.1080/14703290601081332
*Carnegie, J. (2015). Use of feedback-oriented online exercises to help physiology students construct well-organized answers to short-answer questions. CBE—Life Sciences Education, 14(3), ar25. https://doi.org/10.1187/cbe.14-08-0132
*Carpenter, S. K., Rahman, S., Lund, T. J. S., Armstrong, P. I., Lamm, M. H., Reason, R. D., & Coffman, C. R. (2017). Students’ use of optional online reviews and its relationship to summative assessment outcomes in introductory biology. CBE—Life Sciences Education, 16(2), ar23. https://doi.org/10.1187/cbe.16-06-0205
*Caspari-Sadeghi, S., Forster-Heinlein, B., Maegdefrau, J., & Bachl, L. (2021). Student-generated questions: developing mathematical competence through online assessment. International Journal for the Scholarship of Teaching and Learning, 15(1), 8. https://doi.org/10.20429/ijsotl.2021.150108
*Chaudy, Y., & Connolly, T. (2018). Specification and evaluation of an assessment engine for educational games: Empowering educators with an assessment editor and a learning analytics dashboard. Entertainment Computing, 27, 209–224. https://doi.org/10.1016/j.entcom.2018.07.003
*Chen, X., Breslow, L., & DeBoer, J. (2018). Analyzing productive learning behaviors for students using immediate corrective feedback in a blended learning environment. Computers & Education, 117, 59–74. https://doi.org/10.1016/j.compedu.2017.09.013
*Chen, Z., Jiao, J., & Hu, K. (2021). Formative assessment as an online instruction intervention: Student engagement, outcomes, and perceptions. International Journal of Distance Education Technologies, 19(1), 50–65. https://doi.org/10.4018/IJDET.20210101.oa1
*Chew, E., Snee, H., & Price, T. (2016). Enhancing international postgraduates’ learning experience with online peer assessment and feedback innovation. Innovations in Education and Teaching International, 53(3), 247–259. https://doi.org/10.1080/14703297.2014.937729
Conrad, D., & Openo, J. (2018). Assessment strategies for online learning: engagement and authenticity. Athabasca University Press. https://doi.org/10.15215/aupress/9781771992329.01
*Davis, M. C., Duryee, L. A., Schilling, A. H., Loar, E. A., & Hammond, H. G. (2020). Examining the impact of multiple practice quiz attempts on student exam performance. Journal of Educators Online, 17(2).
*Dermo, J., & Boyne, J. (2014). Assessing understanding of complex learning outcomes and real-world skills using an authentic software tool: A study from biomedical sciences. Practitioner Research in Higher Education, 8(1), 101–112.
Dochy, F. J. R. C., Segers, M., & Sluijsmans, D. (1999). The use of self-, peer and co-assessment in higher education: A review. Studies in Higher Education, 24(3), 331–350.
*Elizondo-Garcia, J., Schunn, C., & Gallardo, K. (2019). Quality of peer feedback in relation to instructional design: a comparative study in energy and sustainability MOOCs. International Journal of Instruction, 12(1), 1025–1040.
Ellis, C. (2013). Broadening the scope and increasing usefulness of learning analytics: the case for assessment analytics. British Journal of Educational Technology, 44(4), 662–664. https://doi.org/10.1111/bjet.12028
*Ellis, S., & Barber, J. (2016). Expanding and personalising feedback in online assessment: a case study in a school of pharmacy. Practitioner Research in Higher Education, 10(1), 121–129.
*Farrelly, D., & Kaplin, D. (2019). Using student feedback to inform change within a community college teacher education program’s ePortfolio initiative. Community College Enterprise, 25(2), 9–38.
*Faulkner, M., Mahfuzul Aziz, S., Waye, V., & Smith, E. (2013). Exploring ways that eportfolios can support the progressive development of graduate qualities and professional competencies. Higher Education Research and Development, 32(6), 871–887.
*Filius, R. M., de Kleijn, R. A. M., Uijl, S. G., Prins, F. J., van Rijen, H. V. M., & Grobbee, D. E. (2019). Audio peer feedback to promote deep learning in online education. Journal of Computer Assisted Learning, 35(5), 607–619. https://doi.org/10.1111/jcal.12363
*Filius, R. M., de Kleijn, R. A. M., Uijl, S. G., Prins, F. J., van Rijen, H. V. M., & Grobbee, D. E. (2018). Strengthening dialogic peer feedback aiming for deep learning in SPOCs. Computers & Education, 125, 86–100. https://doi.org/10.1016/j.compedu.2018.06.004
*Formanek, M., Wenger, M. C., Buxner, S. R., Impey, C. D., & Sonam, T. (2017). Insights about large-scale online peer assessment from an analysis of an astronomy MOOC. Computers & Education, 113, 243–262. https://doi.org/10.1016/j.compedu.2017.05.019
*Förster, M., Weiser, C., & Maur, A. (2018). How feedback provided by voluntary electronic quizzes affects learning outcomes of university students in large classes. Computers & Education, 121, 100–114. https://doi.org/10.1016/j.compedu.2018.02.012
*Fratter, I., & Marigo, L. (2018). Integrated forms of self-assessment and placement testing for Italian L2 aimed at incoming foreign university exchange students at the University of Padua. Language Learning in Higher Education, 8(1), 91–114. https://doi.org/10.1515/cercles-2018-0005
*Gamage, S. H. P. W., Ayres, J. R., Behrend, M. B., & Smith, E. J. (2019). Optimising Moodle quizzes for online assessments. International Journal of STEM Education, 6(1), 1–14. https://doi.org/10.1186/s40594-019-0181-4
*Gámiz Sánchez, V., Montes Soldado, R., & Pérez López, M. C. (2014). Self-assessment via a blended-learning strategy to improve performance in an accounting subject. International Journal of Educational Technology in Higher Education, 11(2), 43–54. https://doi.org/10.7238/rusc.v11i2.2055
*Garcia-Peñalvo, F. J., Garcia-Holgado, A., Vazquez-Ingelmo, A., & Carlos Sanchez-Prieto, J. (2021). Planning, communication and active methodologies: online assessment of the software engineering subject during the COVID-19 crisis. RIED-Revista Iberoamericana de Educación a Distancia, 24(2), 41–66. https://doi.org/10.5944/ried.24.2.27689
Gašević, D., Greiff, S., & Shaffer, D. (2022). Towards strengthening links between learning analytics and assessment: Challenges and potentials of a promising new bond. Computers in Human Behavior, 134, 107304. https://doi.org/10.1016/j.chb.2022.107304
Gašević, D., Joksimović, S., Eagan, B. R., & Shaffer, D. W. (2019). SENS: Network analytics to combine social and cognitive perspectives of collaborative learning. Computers in Human Behavior, 92, 562–577. https://doi.org/10.1016/j.chb.2018.07.003
Gašević, D., Jovanović, J., Pardo, A., & Dawson, S. (2017). Detecting learning strategies with analytics: Links with self-reported measures and academic performance. Journal of Learning Analytics, 4(2), 113–128. https://doi.org/10.18608/jla.2017.42.10
Gikandi, J. W., Morrow, D., & Davis, N. E. (2011). Online formative assessment in higher education: A review of the literature. Computers & Education, 57(4), 2333–2351. https://doi.org/10.1016/j.compedu.2011.06.004
*Gleason, J. (2012). Using technology-assisted instruction and assessment to reduce the effect of class size on student outcomes in undergraduate mathematics courses. College Teaching, 60(3), 87–94. https://doi.org/10.1080/87567555.2011.637249
*González-Gómez, D., Jeong, J. S., & Canada-Canada, F. (2020). Examining the effect of an online formative assessment tool (O Fat) of students’ motivation and achievement for a university science education. Journal of Baltic Science Education, 19(3), 401–414. https://doi.org/10.33225/jbse/20.19.401
Gottipati, S., Shankararaman, V., & Lin, J. R. (2018). Text analytics approach to extract course improvement suggestions from students’ feedback. Research and Practice in Technology Enhanced Learning, 13(6). https://doi.org/10.1186/s41039-018-0073-0
*Guerrero-Roldán, A.-E., & Noguera, I. (2018). A model for aligning assessment with competences and learning activities in online courses. Internet and Higher Education, 38, 36–46. https://doi.org/10.1016/j.iheduc.2018.04.005
*Hains-Wesson, R., Wakeling, L., & Aldred, P. (2014). A university-wide ePortfolio initiative at Federation University Australia: Software analysis, test-to-production, and evaluation phases. International Journal of EPortfolio, 4(2), 143–156.
*Hashim, H., Salam, S., Mohamad, S. N. M., & Sazali, N. S. S. (2018). The designing of adaptive self-assessment activities in second language learning using massive open online courses (MOOCs). International Journal of Advanced Computer Science and Applications, 9(9), 276–282.
*Hay, P. J., Engstrom, C., Green, A., Friis, P., Dickens, S., & Macdonald, D. (2013). Promoting assessment efficacy through an integrated system for online clinical assessment of practical skills. Assessment & Evaluation in Higher Education, 38(5), 520–535. https://doi.org/10.1080/02602938.2012.658019
*Herzog, M. A., & Katzlinger, E. (2017). The multiple faces of peer review in higher education: Five learning scenarios developed for digital business. EURASIA Journal of Mathematics, Science & Technology Education, 13(4), 1121–1143. https://doi.org/10.12973/eurasia.2017.00662a
*Hickey, D., & Rehak, A. (2013). Wikifolios and participatory assessment for engagement, understanding, and achievement in online courses. Journal of Educational Multimedia and Hypermedia, 22(4), 407–441.
*Holmes, N. (2018). Engaging with assessment: increasing student engagement through continuous assessment. Active Learning in Higher Education, 19(1), 23–34. https://doi.org/10.1177/1469787417723230
*Hughes, M., Salamonson, Y., & Metcalfe, L. (2020). Student engagement using multiple-attempt "Weekly Participation Task" quizzes with undergraduate nursing students. Nurse Education in Practice, 46, 102803. https://doi.org/10.1016/j.nepr.2020.102803
*Huisman, B., Admiraal, W., Pilli, O., van de Ven, M., & Saab, N. (2018). Peer assessment in MOOCs: the relationship between peer reviewers’ ability and authors’ essay performance. British Journal of Educational Technology, 49(1), 101–110. https://doi.org/10.1111/bjet.12520
*Hwang, W.-Y., Hsu, J.-L., Shadiev, R., Chang, C.-L., & Huang, Y.-M. (2015). Employing self-assessment, journaling, and peer sharing to enhance learning from an online course. Journal of Computing in Higher Education, 27(2), 114–133.
Ifenthaler, D. (2012). Determining the effectiveness of prompts for self-regulated learning in problem-solving scenarios. Journal of Educational Technology & Society, 15(1), 38–52.
Ifenthaler, D. (2023). Automated essay grading systems. In O. Zawacki-Richter & I. Jung (Eds.), Handbook of open, distance and digital education (pp. 1057–1071). Springer. https://doi.org/10.1007/978-981-19-2080-6_59
Ifenthaler, D., & Greiff, S. (2021). Leveraging learning analytics for assessment and feedback. In J. Liebowitz (Ed.), Online learning analytics (pp. 1–18). Auerbach Publications. https://doi.org/10.1201/9781003194620
Ifenthaler, D., Greiff, S., & Gibson, D. C. (2018). Making use of data for assessments: harnessing analytics and data science. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International Handbook of IT in Primary and Secondary Education (2nd ed., pp. 649–663). Springer. https://doi.org/10.1007/978-3-319-71054-9_41
Ifenthaler, D., Schumacher, C., & Kuzilek, J. (2023). Investigating students’ use of self-assessments in higher education using learning analytics. Journal of Computer Assisted Learning, 39(1), 255–268. https://doi.org/10.1111/jcal.12744
*James, R. (2016). Tertiary student attitudes to invigilated, online summative examinations. International Journal of Educational Technology in Higher Education, 13(1), 19. https://doi.org/10.1186/s41239-016-0015-0
*Jarrott, S., & Gambrel, L. E. (2011). The bottomless file box: electronic portfolios for learning and evaluation purposes. International Journal of EPortfolio, 1(1), 85–94.
Johnson, W. L., & Lester, J. C. (2016). Face-to-Face interaction with pedagogical agents, twenty years later. International Journal of Artificial Intelligence in Education, 26(1), 25–36. https://doi.org/10.1007/s40593-015-0065-9
*Kim, Y. A., Rezende, L., Eadie, E., Maximillian, J., Southard, K., Elfring, L., Blowers, P., & Talanquer, V. (2021). Responsive teaching in online learning environments: using an instructional team to promote formative assessment and sense of community. Journal of College Science Teaching, 50(4), 17–24.
Kim, Y. J., & Ifenthaler, D. (2019). Game-based assessment: The past ten years and moving forward. In D. Ifenthaler & Y. J. Kim (Eds.), Game-based assessment revisited (pp. 3–12). Springer. https://doi.org/10.1007/978-3-030-15569-8_1
*Kristanto, Y. D. (2018). Technology-enhanced pre-instructional peer assessment: Exploring students’ perceptions in a statistical methods course. Online Submission, 4(2), 105–116.
*Küchemann, S., Malone, S., Edelsbrunner, P., Lichtenberger, A., Stern, E., Schumacher, R., Brünken, R., Vaterlaus, A., & Kuhn, J. (2021). Inventory for the assessment of representational competence of vector fields. Physical Review Physics Education Research, 17(2), 020126.
*Kühbeck, F., Berberat, P. O., Engelhardt, S., & Sarikas, A. (2019). Correlation of online assessment parameters with summative exam performance in undergraduate medical education of pharmacology: A prospective cohort study. BMC Medical Education, 19(1), 412. https://doi.org/10.1186/s12909-019-1814-5
*Law, S. (2019). Using digital tools to assess and improve college student writing. Higher Education Studies, 9(2), 117–123.
Lee, H.-S., Gweon, G.-H., Lord, T., Paessel, N., Pallant, A., & Pryputniewicz, S. (2021). Machine learning-enabled automated feedback: Supporting students’ revision of scientific arguments based on data drawn from simulation. Journal of Science Education and Technology, 30(2), 168–192. https://doi.org/10.1007/s10956-020-09889-7
Lenhard, W., Baier, H., Hoffmann, J., & Schneider, W. (2007). Automatische Bewertung offener Antworten mittels Latenter Semantischer Analyse [Automatic scoring of constructed-response items with latent semantic analysis]. Diagnostica, 53(3), 155–165. https://doi.org/10.1026/0012-1924.53.3.155
*Li, L., & Gao, F. (2016). The effect of peer assessment on project performance of students at different learning levels. Assessment & Evaluation in Higher Education, 41(6), 885–900.
*Li, L., Liu, X., & Steckelberg, A. L. (2010). Assessor or assessee: How student learning improves by giving and receiving peer feedback. British Journal of Educational Technology, 41(3), 525–536. https://doi.org/10.1111/j.1467-8535.2009.00968.x
*Liu, E. Z.-F., & Lee, C.-Y. (2013). Using peer feedback to improve learning via online peer assessment. Turkish Online Journal of Educational Technology—TOJET, 12(1), 187–199.
*Liu, X., Li, L., & Zhang, Z. (2018). Small group discussion as a key component in online assessment training for enhanced student learning in web-based peer assessment. Assessment & Evaluation in Higher Education, 43(2), 207–222. https://doi.org/10.1080/02602938.2017.1324018
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459. https://doi.org/10.1177/0002764213479367
*López-Tocón, I. (2021). Moodle quizzes as a continuous assessment in higher education: An exploratory approach in physical chemistry. Education Sciences, 11(9), 500. https://doi.org/10.3390/educsci11090500
*Luaces, O., Díez, J., Alonso-Betanzos, A., Troncoso, A., & Bahamonde, A. (2017). Content-based methods in peer assessment of open-response questions to grade students as authors and as graders. Knowledge-Based Systems, 117, 79–87. https://doi.org/10.1016/j.knosys.2016.06.024
*MacKenzie, L. M. (2019). Improving learning outcomes: Unlimited vs. limited attempts and time for supplemental interactive online learning activities. Journal of Curriculum and Teaching, 8(4), 36–45. https://doi.org/10.5430/jct.v8n4p36
*Mao, J., & Peck, K. (2013). Assessment strategies, self-regulated learning skills, and perceptions of assessment in online learning. Quarterly Review of Distance Education, 14(2), 75–95.
*Martin, F., Ritzhaupt, A., Kumar, S., & Budhrani, K. (2019). Award-winning faculty online teaching practices: Course design, assessment and evaluation, and facilitation. The Internet and Higher Education, 42, 34–43. https://doi.org/10.1016/j.iheduc.2019.04.001
Martin, F., & Whitmer, J. C. (2016). Applying learning analytics to investigate timed release in online learning. Technology, Knowledge and Learning, 21(1), 59–74. https://doi.org/10.1007/s10758-015-9261-9
*Mason, R., & Williams, B. (2016). Using ePortfolio’s to assess undergraduate paramedic students: a proof of concept evaluation. International Journal of Higher Education, 5(3), 146–154. https://doi.org/10.5430/ijhe.v5n3p146
*McCarthy, J. (2017). Enhancing feedback in higher education: Students’ attitudes towards online and in-class formative assessment feedback models. Active Learning in Higher Education, 18(2), 127–141. https://doi.org/10.1177/1469787417707613
*McCracken, J., Cho, S., Sharif, A., Wilson, B., & Miller, J. (2012). Principled assessment strategy design for online courses and programs. Electronic Journal of E-Learning, 10(1), 107–119.
*McNeill, M., Gosper, M., & Xu, J. (2012). Assessment choices to target higher order learning outcomes: the power of academic empowerment. Research in Learning Technology, 20(3), 283–296.
*McWhorter, R. R., Delello, J. A., Roberts, P. B., Raisor, C. M., & Fowler, D. A. (2013). A cross-case analysis of the use of web-based eportfolios in higher education. Journal of In-formation Technology Education: Innovations in Practice, 12, 253–286.
*Meek, S. E. M., Blakemore, L., & Marks, L. (2017). Is peer review an appropriate form of assessment in a MOOC? Student participation and performance in formative peer review. Assessment & Evaluation in Higher Education, 42(6), 1000–1013.
*Milne, L., McCann, J., Bolton, K., Savage, J., & Spence, A. (2020). Student satisfaction with feedback in a third year Nutrition unit: A strategic approach. Journal of University Teaching and Learning Practice, 17(5), 67–83.
Montenegro-Rueda, M., Luque-de la Rosa, A., Sarasola Sánchez-Serrano, J. L., & Fernández-Cerero, J. (2021). Assessment in higher education during the COVID-19 pandemic: A systematic review. Sustainability, 13(19), 10509.
Moore, M. G., & Kearsley, G. (2011). Distance education: a systems view of online learning. Wadsworth Cengage Learning.
*Mora, M. C., Sancho-Bru, J. L., Iserte, J. L., & Sanchez, F. T. (2012). An e-assessment approach for evaluation in engineering overcrowded groups. Computers & Education, 59(2), 732–740. https://doi.org/10.1016/j.compedu.2012.03.011
Newton, P. E. (2007). Clarifying the purposes of educational assessment. Assessment in Education: Principles, Policy & Practice, 14(2), 149–170. https://doi.org/10.1080/09695940701478321
*Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., & Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior, 76, 703–714. https://doi.org/10.1016/j.chb.2017.03.028
*Nicholson, D. T. (2018). Enhancing student engagement through online portfolio assessment. Practitioner Research in Higher Education, 11(1), 15–31.
*Ogange, B. O., Agak, J. O., Okelo, K. O., & Kiprotich, P. (2018). Student perceptions of the effectiveness of formative assessment in an online learning environment. Open Praxis, 10(1), 29–39.
*Ortega-Arranz, A., Bote-Lorenzo, M. L., Asensio-Pérez, J. I., Martínez-Monés, A., Gómez-Sánchez, E., & Dimitriadis, Y. (2019). To reward and beyond: Analyzing the effect of reward-based strategies in a MOOC. Computers & Education, 142, 103639. https://doi.org/10.1016/j.compedu.2019.103639
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., . . . Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ, 372, n71. https://doi.org/10.1136/bmj.n71
Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. National Academy Press.
*Pinargote-Ortega, M., Bowen-Mendoza, L., Meza, J., & Ventura, S. (2021). Peer assessment using soft computing techniques. Journal of Computing in Higher Education, 33(3), 684–726. https://doi.org/10.1007/s12528-021-09296-w
*Polito, G., & Temperini, M. (2021). A gamified web based system for computer programming learning. Computers and Education: Artificial Intelligence, 2, 100029. https://doi.org/10.1016/j.caeai.2021.100029
*Reilly, E. D., Williams, K. M., Stafford, R. E., Corliss, S. B., Walkow, J. C., & Kidwell, D. K. (2016). Global times call for global measures: investigating automated essay scoring in linguistically-diverse MOOCs. Online Learning, 20(2), 217–229.
*Rogerson-Revell, P. (2015). Constructively aligning technologies with learning and assessment in a distance education master’s programme. Distance Education, 36(1), 129–147.
*Ross, B., Chase, A.-M., Robbie, D., Oates, G., & Absalom, Y. (2018). Adaptive quizzes to increase motivation, engagement and learning outcomes in a first year accounting unit. International Journal of Educational Technology in Higher Education, 15(1), 1–14. https://doi.org/10.1186/s41239-018-0113-2
*Sampaio-Maia, B., Maia, J. S., Leitao, S., Amaral, M., & Vieira-Marques, P. (2014). Wiki as a tool for Microbiology teaching, learning and assessment. European Journal of Dental Education, 18(2), 91–97. https://doi.org/10.1111/eje.12061
*Sancho-Vinuesa, T., Masià, R., Fuertes-Alpiste, M., & Molas-Castells, N. (2018). Exploring the effectiveness of continuous activity with automatic feedback in online calculus. Computer Applications in Engineering Education, 26(1), 62–74. https://doi.org/10.1002/cae.21861
*Santamaría Lancho, M., Hernández, M., Sánchez-Elvira Paniagua, Á., Luzón Encabo, J. M., & de Jorge-Botana, G. (2018). Using semantic technologies for formative assessment and scoring in large courses and MOOCs. Journal of Interactive Media in Education, 2018(1), 1–10. https://doi.org/10.5334/jime.468
*Sarcona, A., Dirhan, D., & Davidson, P. (2020). An overview of audio and written feedback from students’ and instructors’ perspective. Educational Media International, 57(1), 47–60. https://doi.org/10.1080/09523987.2020.1744853
*Scalise, K., Douskey, M., & Stacy, A. (2018). Measuring learning gains and examining implications for student success in STEM. Higher Education Pedagogies, 3(1), 183–195. https://doi.org/10.1080/23752696.2018.1425096
*Schaffer, H. E., Young, K. R., Ligon, E. W., & Chapman, D. D. (2017). Automating individualized formative feedback in large classes based on a directed concept graph. Frontiers in Psychology, 8, 260. https://doi.org/10.3389/fpsyg.2017.00260
Schumacher, C., & Ifenthaler, D. (2021). Investigating prompts for supporting students' self-regulation—A remaining challenge for learning analytics approaches? The Internet and Higher Education, 49, 100791. https://doi.org/10.1016/j.iheduc.2020.100791
*Schultz, M., Young, K., Gunning, T. K., & Harvey, M. L. (2022). Defining and measuring authentic assessment: a case study in the context of tertiary science. Assessment & Evaluation in Higher Education, 47(1), 77–94. https://doi.org/10.1080/02602938.2021.1887811
*Sekendiz, B. (2018). Utilisation of formative peer-assessment in distance online education: A case study of a multi-model sport management unit. Interactive Learning Environments, 26(5), 682–694. https://doi.org/10.1080/10494820.2017.1396229
*Senel, S., & Senel, H. C. (2021). Remote assessment in higher education during COVID-19 pandemic. International Journal of Assessment Tools in Education, 8(2), 181–199.
*Shaw, L., MacIsaac, J., & Singleton-Jackson, J. (2019). The efficacy of an online cognitive assessment tool for enhancing and improving student academic outcomes. Online Learning Journal, 23(2), 124–144. https://doi.org/10.24059/olj.v23i2.1490
Shute, V. J., Wang, L., Greiff, S., Zhao, W., & Moore, G. (2016). Measuring problem solving skills via stealth assessment in an engaging video game. Computers in Human Behavior, 63, 106–117. https://doi.org/10.1016/j.chb.2016.05.047
Stödberg, U. (2012). A research review of e-assessment. Assessment & Evaluation in Higher Education, 37(5), 591–604. https://doi.org/10.1080/02602938.2011.557496
*Stratling, R. (2017). The complementary use of audience response systems and online tests to implement repeat testing: a case study. British Journal of Educational Technology, 48(2), 370–384. https://doi.org/10.1111/bjet.12362
*Sullivan, D., & Watson, S. (2015). Peer assessment within hybrid and online courses: Students’ view of its potential and performance. Journal of Educational Issues, 1(1), 1–18. https://doi.org/10.5296/jei.v1i1.7255
*Taghizadeh, M., Alavi, S. M., & Rezaee, A. A. (2014). Diagnosing L2 learners’ language skills based on the use of a web-based assessment tool called DIALANG. International Journal of E-Learning & Distance Education, 29(2), n2.
*Tawafak, R. M., Romli, A. M., & Alsinani, M. J. (2019). Student assessment feedback effectiveness model for enhancing teaching method and developing academic performance. International Journal of Information and Communication Technology Education, 15(3), 75–88. https://doi.org/10.4018/IJICTE.2019070106
*Tempelaar, D. (2020). Supporting the less-adaptive student: The role of learning analytics, forma-tive assessment and blended learning. Assessment & Evaluation in Higher Education, 45(4), 579–593.
Tempelaar, D. T., Rienties, B., Mittelmeier, J., & Nguyen, Q. (2018). Student profiling in a dispositional learning analytics application using formative assessment. Computers in Human Behavior, 78, 408–420. https://doi.org/10.1016/j.chb.2017.08.010
*Tenório, T., Bittencourt, I. I., Isotani, S., Pedro, A., & Ospina, P. (2016). A gamified peer assessment model for on-line learning environments in a competitive context. Computers in Human Behavior, 64, 247–263. https://doi.org/10.1016/j.chb.2016.06.049
*Thille, C., Schneider, E., Kizilcec, R. F., Piech, C., Halawa, S. A., & Greene, D. K. (2014). The future of data-enriched assessment. Research & Practice in Assessment, 9, 5–16.
*Tsai, N. W. (2016). Assessment of students’ learning behavior and academic misconduct in a student-pulled online learning and student-governed testing environment: A case study. Journal of Education for Business, 91(7), 387–392. https://doi.org/10.1080/08832323.2016.1238808
*Tucker, C., Pursel, B. K., & Divinsky, A. (2014). Mining student-generated textual data in MOOCs and quantifying their effects on student performance and learning outcomes. Computers in Education Journal, 5(4), 84–95.
*Tucker, R. (2014). Sex does not matter: Gender bias and gender differences in peer assessments of contributions to group work. Assessment & Evaluation in Higher Education, 39(3), 293–309. https://doi.org/10.1080/02602938.2013.830282
Turkay, S., & Tirthali, D. (2010). Youth leadership development in virtual worlds: A case study. Procedia - Social and Behavioral Sciences, 2(2), 3175–3179. https://doi.org/10.1016/j.sbspro.2010.03.485
*Turner, J., & Briggs, G. (2018). To see or not to see? Comparing the effectiveness of examinations and end of module assessments in online distance learning. Assessment & Evaluation in Higher Education, 43(7), 1048–1060. https://doi.org/10.1080/02602938.2018.1428730
*Vaughan, N. (2014). Student engagement and blended learning: Making the assessment connection. Education Sciences, 4(4), 247–264. https://doi.org/10.3390/educsci4040247
*Wadmany, R., & Melamed, O. (2018). "New Media in Education" MOOC: Improving peer assessments of students’ plans and their innovativeness. Journal of Education and E-Learning Research, 5(2), 122–130. https://doi.org/10.20448/journal.509.2018.52.122.130
*Wang, S., & Wang, H. (2012). Organizational schemata of e-portfolios for fostering higher-order thinking. Information Systems Frontiers, 14(2), 395–407. https://doi.org/10.1007/s10796-010-9262-0
*Wang, Y.-M. (2019). Enhancing the quality of online discussion—assessment matters. Journal of Educational Technology Systems, 48(1), 112–129. https://doi.org/10.1177/0047239519861
*Watson, S. L., Watson, W. R., & Kim, W. (2017). Primary assessment activity and learner perceptions of attitude change in four MOOCs. Educational Media International, 54(3), 245–260. https://doi.org/10.1080/09523987.2017.1384165
Webb, M., Gibson, D. C., & Forkosh-Baruch, A. (2013). Challenges for information technology supporting educational assessment. Journal of Computer Assisted Learning, 29(5), 451–462. https://doi.org/10.1111/jcal.12033
Webb, M., & Ifenthaler, D. (2018). Assessment as, for and of 21st century learning using information technology: An overview. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International Handbook of IT in Primary and Secondary Education (2nd ed., pp. 1–20). Springer.
Wei, X., Saab, N., & Admiraal, W. (2021). Assessment of cognitive, behavioral, and affective learning outcomes in massive open online courses: A systematic literature review. Computers & Education, 163, 104097.
*Wells, J., Spence, A., & McKenzie, S. (2021). Student participation in computing studies to understand engagement and grade outcome. Journal of Information Technology Education, 20, 385–403. https://doi.org/10.28945/4817
*West, J., & Turner, W. (2016). Enhancing the assessment experience: Improving student perceptions, engagement and understanding using online video feedback. Innovations in Education and Teaching International, 53(4), 400–410. https://doi.org/10.1080/14703297.2014.1003954
Whitelock, D., & Bektik, D. (2018). Progress and challenges for automated scoring and feedback systems for large-scale assessments. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International Handbook of IT in Primary and Secondary Education (2nd ed., pp. 617–634). Springer.
*Wilkinson, K., Dafoulas, G., Garelick, H., & Huyck, C. (2020). Are quiz-games an effective revision tool in anatomical sciences for higher education and what do students think of them? British Journal of Educational Technology, 51(3), 761–777. https://doi.org/10.1111/bjet.12883
*Wu, C., Chanda, E., & Willison, J. (2014). Implementation and outcomes of online self and peer assessment on group based honours research projects. Assessment & Evaluation in Higher Education, 39(1), 21–37. https://doi.org/10.1080/02602938.2013.779634
*Xian, L. (2020). The effectiveness of dynamic assessment in linguistic accuracy in EFL writing: An investigation assisted by online scoring systems. Language Teaching Research Quarterly, 18, 98–114.
*Xiao, Y., & Gao, H. (2018). Teaching business English course: Incorporating portfolio assessment-based blended learning and MOOC. Journal of Literature and Art Studies, 8(9), 1364–1369. https://doi.org/10.17265/2159-5836/2018.09.008
*Yang, T. C., Chen, S. Y., & Chen, M. C. (2016). An investigation of a two-tier test strategy in a university calculus course: Causes versus consequences. IEEE Transactions on Learning Technologies, 9(2), 146–156.
*Yeh, H.-C., & Lai, P.-Y. (2012). Implementing online question generation to foster reading comprehension. Australasian Journal of Educational Technology, 28(7), 1152–1175.
*Zhan, Y. (2021). What matters in design? Cultivating undergraduates’ critical thinking through online peer assessment in a Confucian heritage context. Assessment & Evaluation in Higher Education, 46(4), 615–630. https://doi.org/10.1080/02602938.2020.1804826
*Zong, Z., Schunn, C. D., & Wang, Y. (2021). What aspects of online peer feedback robustly predict growth in students’ task performance? Computers in Human Behavior, 124, 106924. https://doi.org/10.1016/j.chb.2021.106924
As a condition of publication, the author agrees to apply the Creative Commons – Attribution International 4.0 (CC-BY) License to OLJ articles. See: https://creativecommons.org/licenses/by/4.0/.
This license allows anyone to reproduce OLJ articles at no cost and without further permission, provided they attribute the author and the journal. This permission includes printing, sharing, and other forms of distribution.
Author(s) hold copyright in their work and retain publishing rights without restriction.