USING RUBRICS AND CONTENT ANALYSIS FOR EVALUATING ONLINE DISCUSSION: A CASE STUDY FROM AN ENVIRONMENTAL COURSE

Authors

  • Maha Bali
  • Adham R. Ramadan

DOI:

https://doi.org/10.24059/olj.v11i4.1713

Keywords:

Action Research, Asynchronous Discussion, Computer Conferencing, Content Analysis, Environment, Online Discussion, Rubrics

Abstract

This paper presents a case study of using course-specific rubrics, combined with content analysis and with instructor and student feedback, to assess learning via online discussion. Student feedback was gathered through Small Group Instructional Diagnosis, and instructor feedback was collected through formal interviews. Content analysis used emergent coding, with different assessment criteria for each phase of the online discussion. Student participation was high, and a number of students felt they learned beyond what was discussed in class. Some students, however, were overloaded by the large number of postings and by the repetitiveness of some phases of the discussion. The instructor was pleased to find that students who were quiet in class were active in the online discussion; however, he found that student contributions demonstrated insufficient reflection and critical thinking. Content analysis showed that students met, on average, 59–82% of the essential assessment criteria in their postings, and that their contributions improved significantly as the online discussion progressed. Only a limited number of postings, however, reflected critical thinking. The use of assessment criteria in evaluating online discussion is therefore recommended, as content analysis gave insight beyond student and instructor perceptions. The insights gleaned from this methodology indicate its usefulness for assessing online discussion activities more objectively and with respect to specific learning objectives.


Published

2019-02-11

Issue

Vol. 11 No. 4 (2007)

Section

Empirical Studies