The Validity and Instructional Value of a Rubric for Evaluating Online Course Quality: An Empirical Study

Ji Eun Lee, Mimi Recker, Min Yuan


This study investigated the validity and instructional value of a rubric developed to evaluate the quality of online courses offered at a midsized public university. This rubric was adapted from an online course quality rubric widely used in higher education, the Quality Matters rubric. We first examined the reliability and preliminary construct validity of the rubric using quality ratings for 202 online courses and eliminated twelve problematic items. We then examined the instructional value of the rubric by investigating causal relationships among (1) course quality scores, (2) online interactions among students, instructors, and content, and (3) student course performance (course passing rates). A path analysis model, using data from 121 online courses enrolling 5,240 students, showed that only rubric items related to learner engagement and interaction had a significant, positive effect on online interactions, while only student-content interaction significantly and positively influenced course passing rates.
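The path model summarized above (course quality → online interaction → passing rate) can be sketched as a chain of regressions, which is how path coefficients for observed variables are estimated. The sketch below uses synthetic data and illustrative variable names, not the study's actual measures or coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical synthetic data mirroring the three constructs in the model.
# The true path weights (0.6 and 0.5) are arbitrary illustrations.
quality = rng.normal(size=n)                                   # course quality score
interaction = 0.6 * quality + rng.normal(scale=0.8, size=n)    # student-content interaction
pass_rate = 0.5 * interaction + rng.normal(scale=0.8, size=n)  # course passing rate

def ols_slope(x, y):
    """Slope of y regressed on x (with intercept), via least squares."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

a = ols_slope(quality, interaction)    # path: quality -> interaction
b = ols_slope(interaction, pass_rate)  # path: interaction -> passing rate
indirect = a * b                       # mediated effect of quality on passing

print(f"quality -> interaction: {a:.2f}")
print(f"interaction -> passing: {b:.2f}")
print(f"indirect effect:        {indirect:.2f}")
```

With enough data, the two estimated slopes recover the generating weights, and their product gives the indirect (mediated) effect of quality on passing rates; full path-analysis software adds standard errors and fit indices on top of this same logic.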


Online course quality, rubric, online interactions, rubric reliability, rubric validity




Anderson, T. (2008). The theory and practice of online learning. Athabasca, Canada: AU Press.

Anderson, T. D., & Garrison, D. R. (1998). Learning in a networked world: New roles and responsibilities. In C. C. Gibson (Ed.), Distance learners in higher education (pp. 97–112). Madison, WI: Atwood Publishing.

Author. (2003).

Author. (2015).

Author. (2016).

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1289.

Blackboard Inc. (2017). Blackboard exemplary course program rubric. Retrieved from

Borokhovski, E., Tamim, R., Bernard, R. M., Abrami, P. C., & Sokolovskaya, A. (2012). Are contextual and designed student-student interaction treatments equally effective in distance education? Distance Education, 33(3), 311–329.

Britto, M., Ford, C., & Wise, J. M. (2013). Three institutions, three approaches, one goal: Addressing quality assurance in online learning. Online Learning Journal, 17(4).

California Community College (2016). Course design rubric for the online education initiative. Retrieved from

California State University (2015). Quality learning and teaching. Retrieved from

Cohen, L., Manion, L., & Morrison, K. (2007). Research methods in education (6th ed.). New York, NY: Routledge.

Custard, M., & Sumner, T. (2005). Using machine learning to support quality judgments. D-Lib Magazine, 11(10). Retrieved from

Egerton, E. O., & Posey, L. (2007). Quality standards inventory. Retrieved from

Han, L., Neilands, T. B., & Dolcini, M. M. (2001). Factor analysis of categorical data in SAS. Retrieved from

Hixon, E., Barczyk, C., Ralston-Berg, P., & Buckenmeyer, J. (2016). The impact of previous online course experience on students' perceptions of quality. Online Learning, 20(1), 25–40.

Hoey, R. (2017). Examining the characteristics and content of instructor discussion interaction upon student outcomes in an online course. Online Learning, 21(4), 268–281.

Illinois Central College (2017). Quality online course initiative. Retrieved from

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers & Education, 95, 270–284.

Ke, F. (2013). Online interaction arrangements on quality of online interactions performed by diverse learners across disciplines. Internet and Higher Education, 16(1), 14–22.

Kuo, Y., Walker, A. E., Belland, B. R., & Schroder, K. E. E. (2013). A predictive study of student satisfaction in online education programs. International Review of Research in Open and Distance Learning, 14(1), 1–39.

Lee, J. (2014). An exploratory study of effective online learning: Assessing satisfaction levels of graduate students of mathematics education associated with human and design factors of an online course. International Review of Research in Open and Distance Learning, 15(1), 111–132.

Lee, Y., & Choi, J. (2011). A review of online course dropout research: Implications for practice and future research. Educational Technology Research and Development, 59(5), 593–618.

Liu, I. F., Chen, M. C., Sun, Y. S., Wible, D., & Kuo, C. H. (2010). Extending the TAM model to explore the factors that affect intention to use an online learning community. Computers & Education, 54(2), 600–610.

McFarland, J., Hussar, B., Wang, X., Zhang, J., Wang, K., Rathbun, A., Barmer, A., Forrest Cataldi, E., & Bullock Mann, F. (2018). The Condition of Education 2018 (NCES 2018-144). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

Moore, M. G. (1989). Three types of interaction. American Journal of Distance Education, 3, 1–7.

Murray, M., Pérez, J., Geist, D., & Hedrick, A. (2012). Student interaction with online course content: Build it and they might come. Journal of Information Technology Education: Research, 11, 125–140.

New Mexico State University (2011). Online course design rubric. Retrieved from

Palomar College (2012). Online course best practices checklist. Retrieved from

Quality Matters (2018). Quality Matters higher education rubric. Retrieved from

Raubenheimer, J. E. (2004). An item selection procedure to maximize scale reliability and validity. SA Journal of Industrial Psychology, 30(4), 59–64.

Reigeluth, C. (1999). Instructional design theories and models: A new paradigm of instructional theory. Mahwah, NJ: Lawrence Erlbaum Associates.

Roblyer, M., & Wiencke, W. (2003). Design and use of a rubric to assess and encourage interactive qualities in distance courses. American Journal of Distance Education, 17(2), 77–98.

Sher, A. (2009). Assessing the relationship of student-instructor and student-student interaction to student learning and satisfaction in web-based online learning environment. Journal of Interactive Online Learning, 8(2), 102–120.

Southern Regional Education Board (2006). Checklist for evaluating online courses. Retrieved from

Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-Learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education, 50(4), 1183–1202.

Swan, K., Matthews, D., Bogle, L., Boles, E., & Day, S. (2012). Linking online course design and implementation to learning outcomes: A design experiment. The Internet and Higher Education, 15, 81–88.

University of North Dakota (2016). Rubric for evaluating online courses. Retrieved from

Yong, A. G., & Pearce, S. (2013). A beginner’s guide to factor analysis focusing on exploratory factor analysis. Tutorials in Quantitative Methods for Psychology, 9(2), 79–94.


Copyright (c) 2020 Ji Eun Lee, Mimi Recker, Min Yuan
