An Instructor Learning Analytics Implementation Model

Holly McKee

Abstract


With the widespread use of learning analytics tools, there is a need to explore how these technologies can enhance teaching and learning. Little research has been conducted on the human processes necessary for meaningful adoption of learning analytics. The research problem is a lack of evidence-based guidance on how instructors can effectively implement learning analytics to support students and improve learning outcomes. The goal was to develop and validate a model to guide instructors in the implementation of learning analytics tools. Using design and development research methods, an implementation model was constructed and internally validated. Themes emerged in two categories, adoption and caution. Six themes fell under adoption: LA as evidence, reaching out, frequency, early identification/intervention, self-reflection, and aligning LA with pedagogical intent. Three themes fell under caution: skepticism, fear of overdependence, and questions of usefulness. The model should enhance instructors’ use of learning analytics by enabling them to better take advantage of available technologies to support teaching and learning in online and blended learning environments. Researchers can further validate the model by studying its usability (i.e., usefulness, effectiveness, efficiency, and learnability), as well as how instructors’ use of the model to implement learning analytics in their courses affects retention, persistence, and performance.


Keywords


Learning Analytics






DOI: http://dx.doi.org/10.24059/olj.v21i3.1230