What We Learned When We Compared Discussion Posts from One MOOC Hosted on Two Platforms

Authors

Rebecca M. Quintana, Juan D. Pinto, Yuanru Tan

DOI:

https://doi.org/10.24059/olj.v25i4.2897

Keywords:

MOOCs, discussion forums, user interface design, learning experience design

Abstract

We compared discussion posts from a data science ethics MOOC that was hosted on two platforms. We characterized one platform as “open” because learners can respond to discussion prompts while viewing and responding to others. We characterized the other platform as “locked” because learners must respond to a discussion prompt before they can view and respond to others. Our objective was to determine whether these platform differences are consequential and have the potential to impact learning. We analyzed direct responses to two discussion prompts, one each from modules two and six of an eight-module course. We used conventional content analysis to derive codes directly from the data. Posts on the “open” platform were characterized by a failure to completely address the prompt and showed evidence of persuasion tactics and reflective activity. Posts on the “locked” platform were characterized by an apparent intent to complete the task and an assertive tone. Posts on the “locked” platform also showed a diversity of ideas across the corpus of responses. Our findings show that MOOC platform interfaces can lead to qualitative differences in discussion posts in ways that have the potential to impact learning. Our study provides insight into how “open” and “locked” platform designs have the potential to shape the ways that learners respond to discussion prompts in MOOCs. Our study offers guidance for instructors making decisions about MOOC platform choice and the activities situated within a learning experience.

Author Biographies

Rebecca M. Quintana, University of Michigan

Center for Academic Innovation

Learning Experience Design Lead

Juan D. Pinto, University of Illinois at Urbana-Champaign

PhD Student

Yuanru Tan, University of Wisconsin

PhD Student

Published

2021-12-01

Issue

Section

Special Conference Issue: AERA Online Teaching and Learning SIG