Using Online Tools to Develop Higher Order Learning Among Tertiary Students

DOI:

https://doi.org/10.24059/olj.v26i3.2667

Keywords:

higher order thinking, online, taxonomy, scaffolding, higher education

Abstract

It is widely recognised that developing higher-order thinking skills is a fundamental goal of higher education, and a variety of online tools can support this development. This paper explores a process of scaffolding the writing of higher-order questions, enabled through peer learning activities. Data collected over two years across five cohorts show a statistically significant overall improvement in the number of higher-order questions students produced by the end of each unit. The findings point to a viable peer teaching tool that can easily be embedded into existing programmes to develop the critical thinking skills higher education students need.

Published

2022-09-01

Issue

Vol. 26 No. 3 (2022)

Section

Section II