Correlation between Grades Earned and Time in Online Courses
DOI: https://doi.org/10.24059/olj.v21i4.1013
Keywords: Online Learning, Analytics, Higher Education
Abstract
Online education is rapidly becoming a significant method of course delivery in higher education. Consequently, instructors are analyzing student performance in an attempt to better scaffold student learning. Learning analytics can provide insight into online students’ course behaviors. Archival data from 167 graduate-level education students enrolled in 4 different programs and 9 different online courses was analyzed to determine whether there was a correlation between grades and time spent in specific areas within the course: total time in the course, the course modules, the document repository, and synchronous online sessions. Data was analyzed by total time in course, time in modules, time in the document repository, and time in online synchronous discussions, as well as by program. Time spent in each component did not correlate with the specific letter grade, but did correlate with earning an A versus not earning an A. The sample was composed of students from four graduate education programs: Educational Leadership, Reading, Instructional Design, and Special Education. Variations were found between programs, but the differences did not significantly correlate with the grade earned in the course. A logistic regression revealed that, of all the predictor variables, only time spent in synchronous online sessions was a significant predictor of receiving an A in the course. This is important information for instructors when providing scaffolding for students.
References
Allen, E., & Seaman, J. (2010). Learning on demand: Online education in the United States 2009. Needham, MA: Sloan Consortium.
Avella, J. T., Kebritchi, M., Nunn, S. G., & Kanai, T. (2016). Learning analytics methods, benefits, and challenges in higher education: A systematic literature review. Online Learning, 20(4). Retrieved from http://olj.onlinelearningconsortium.org/index.php/olj/article/view/790/201
Bhardwaj, B. K., & Pal, S. (2011). Data mining: A prediction for performance improvement using classification. International Journal of Computer Science and Information Security, 9(4), 136-140.
Campus Computing. (2010). The 2010 national survey of information technology in U.S. higher education. Retrieved from http://www.campuscomputing.net/sites/www.campuscomputing.net/files/Green-CampusComputing2010.pdf
Campus Computing. (2015). The 2015 campus computing survey. Retrieved from http://www.campuscomputing.net/item/2015-campus-computing-survey-0
Dawson, S., McWilliam, E., & Tan, J. P.-L. (2008). Teaching Smarter: How mining ICT data can inform and improve learning and teaching practice. Paper presented at ASCILITE 2008, Melbourne, Australia.
Jo, I., Kim, D., & Yoon, M. (2015). Constructing proxy variables to measure adult learners’ time management strategies in LMS. Educational Technology & Society, 18(3), 214-225.
Joo, Y., Jang, M., & Lee, H. (2007). An in-depth analysis of dropout factors based on cyber university student’s dropout experiences. The Journal of Educational Information Media, 13(3), 209-234.
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54(2), 588-599. doi: 10.1016/j.compedu.2009.09.008
Macfadyen, L. P., & Dawson, S. (2012). Numbers are not enough. Why e-learning analytics failed to inform an institutional strategic plan. Educational Technology & Society, 15(3), 149-163.
Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, D.C.: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development.
Mitchell, T. R. (1985). An evaluation of the validity of correlational research conducted in organizations. Academy of Management Review, 10(2), 192–205.
Picciano, A. G. (2012). The evolution of big data and learning analytics in American higher education. Journal of Asynchronous Learning Networks, 16(3), 9-20.
Simon, M., & Goes, J. (2013). Dissertation and scholarly research: Recipes for success. Seattle, WA: Dissertation Success LLC.
Tinto, V. (1998). Learning communities: Building gateways to student success. The National Teaching and Learning Forum, 7(4). Retrieved from http://www.ntlf.com/html/lib/suppmat/74tinto.htm
West, D., & Heath, D. (2016). Let’s talk learning analytics: A framework for implementation in relation to student retention. Online Learning, 20(2). Retrieved from http://olj.onlinelearningconsortium.org/index.php/olj/article/view/792/202
Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, H. S. (2005). What makes the difference? A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107(8), 1836-1884.
License
As a condition of publication, the author agrees to apply the Creative Commons Attribution 4.0 International (CC BY) License to OLJ articles. See: https://creativecommons.org/licenses/by/4.0/.
This license allows anyone to reproduce OLJ articles at no cost and without further permission, as long as they attribute the author and the journal. This permission includes printing, sharing, and other forms of distribution.
Author(s) hold copyright in their work and retain publishing rights without restriction.