Student Satisfaction with Online Learning: Is it a Psychological Contract?

Charles Dziuban, Patsy Moskal, Jessica Thompson, Lauren Kramer, Genevieve DeCantis, Andrea Hermsdorfer

Abstract


The authors explore the possible relationship between student satisfaction with online learning and the theory of psychological contracts. The study incorporates latent trait models using the image analysis procedure and the computation of Anderson and Rubin factor scores, with contrasts for students who are satisfied, ambivalent, or dissatisfied with their online learning experiences. The findings identify three underlying satisfaction components: engaged learning, agency, and assessment. The factor score comparisons indicate that students in the general satisfaction categories show important differences in engaged learning and agency, but not in assessment. These results lead the authors to hypothesize that predetermined but unspecified expectations (i.e., psychological contracts) held for online courses by both students and faculty members are important advance organizers for clarifying student satisfaction.

Keywords


Student satisfaction; online learning; psychological contract


References


Abrami, P. C., & d’Apollonia, S. (1991). Multidimensional students’ evaluations of teaching effectiveness—generalizability of “N=1” research: Comment on Marsh. Journal of Educational Psychology, 83(3), 411-415. doi: 10.1037/0022-0663.83.3.411

Akdemir, O., & Koszalka, T. A. (2008). Investigating the relationships among instructional strategies and learning styles in online environments. Computers & Education, 50(4), 1451-1461. doi: 10.1016/j.compedu.2007.01.004

Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Newburyport, MA: Sloan Consortium.

Anderson, T. W., & Rubin, H. (1956). Statistical inference in factor analysis. Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, 5, 111-150.

Arbaugh, J.B. (2007). An empirical verification of the community of inquiry framework. Journal of Asynchronous Learning Network, 11(1), 73-85.

Arbaugh, J. B. (2001). How instructor immediacy behaviors affect student satisfaction and learning in web-based courses. Business Communication Quarterly, 64(4), 42-54. doi: 10.1177/108056990106400405

Argyris, C. (1960). Understanding organizational behavior. Homewood, IL: Dorsey.

Bangert, A. W. (2006). Identifying factors underlying the quality of online teaching effectiveness: An exploratory study. Journal of Computing in Higher Education, 17(2), 79-99. doi: 10.1007/BF03032699

Bangert, A. W. (2008). The development and validation of the student evaluation of online teaching effectiveness. Computers in the Schools, 25(1), 35-47. doi: 10.1080/07380560802157717

Battalio, J. (2007). Interaction online: A reevaluation. Quarterly Review of Distance Education, 8(4), 339-352.

Bolliger, D. U. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3(1), 61-67.

Bordelon, D. E. (2012). Where have we been? Where are we going? The evolution of American higher education. Procedia-Social and Behavioral Sciences, 55(5), 100-105. doi: 10.1016/j.sbspro.2012.09.483

Bordia, S., Hobman, E. V., Restubog, S. L. D., & Bordia, P. (2010). Advisor-student relationship in business education project collaborations: A psychological contract perspective. Journal of Applied Social Psychology, 40(9), 2360-2386. doi: 10.1111/j.1559-1816.2010.00662.x

Bowker, G. C., & Star, S. L. (1999). Sorting things out: Classification and its consequences. Cambridge, MA: The MIT Press.

Carnaghan, C., & Webb, A. (2007). Investigating the effects of group response systems on student satisfaction, learning, and engagement in accounting education. Issues in Accounting Education, 22(3), 391-409. doi: 10.2308/iace.2007.22.3.391

Craig, S. C., & Martinez, M. D. (2005). Ambivalence and the structure of political opinion. New York: Palgrave Macmillan.

Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297-334. doi:10.1007/BF02310555

Diamond, J. (1999). Guns, germs, and steel: The fates of human societies. New York: W. W. Norton & Company, Inc.

Dziuban, C., & Dziuban, J. (1998). Reactive behavior patterns in the classroom. Journal of Staff, Program & Organization Development, 15(2), 85-31.

Dziuban, C., McMartin, F., Morgan, G., Morrill, J., Moskal, P., & Wolf, A. (2013). Examining student information seeking behaviors in higher education. Journal of Information Fluency, 2(1), 36-54.

Dziuban, C., & Moskal, P. (2011). A course is a course is a course: Factor invariance in student evaluation of online, blended and face-to-face learning environments. The Internet and Higher Education, 14(4), 236-241. doi: 10.1016/j.iheduc.2011.05.003

Dziuban, C., Moskal, P., Brophy-Ellison, J., & Shea, P. (2007). Student satisfaction with asynchronous learning. Journal of Asynchronous Learning Networks, 11(1).

Dziuban, C. D., Moskal, P. D., & Dziuban, E. K. (2000). Reactive behavior patterns go online. The Journal of Staff, Program & Organizational Development, 17(3), 155-179.

Dziuban, C.D., Moskal, P.D., Juge, F., Truman-Davis, B., Sorg, S. & Hartman, J. (2003). Developing a web-based instructional program in a metropolitan university. In B. Geibert & S. H. Harvey (Eds.), Web-wise learning: Wisdom from the field (pp. 47-81). Philadelphia, PA: Xlibris Publications.

Dziuban, C., Moskal, P., Kramer, L., & Thompson, J. (2013). Student satisfaction with online learning in the presence of ambivalence: Looking for the will-o’-the-wisp. Internet and Higher Education, 17, 1-8. doi: 10.1016/j.iheduc.2012.08.001

Dziuban, C. D., & Shirkey, E. C. (1974). When is a correlation matrix appropriate for factor analysis? Some decision rules. Psychological Bulletin, 81(6), 358-361. doi: 10.1037/h0036316

Dziuban, C. D., & Shirkey, E. C. (1993, November). S.D. 50—A sequential psychometric criterion for the number of common factors. Paper presented at the Annual Conference of the Florida Educational Research Association, Destin, FL.

Feldman, K. A. (1976). The superior college teacher from the student’s view. Research in Higher Education, 5, 243-288. doi: 10.1007/BF00991967

Feldman, K. A. (1993). College students’ views of male and female college teachers: Part II—Evidence from students’ evaluation of their classroom teachers. Research in Higher Education, 34(2), 151-191. doi:10.1007/BF00992161

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87-105. doi: 10.1016/S1096-7516(00)00016-6

González-Gómez, F., Guardiola, J., Martín Rodríguez, Ó., & Montero Alonso, M. Á. (2012). Gender differences in e-learning satisfaction. Computers & Education, 58(1), 283-290. doi: 10.1016/j.compedu.2011.08.017

Greenwald, A. G., & Gilmore, G. M. (1997). Grading leniency is a removable contaminant of student ratings. American Psychologist, 52(11), 1209-1217. doi: 10.1037/0003-066X.52.11.1209

Guttman, L. (1953). Image theory for the structure of quantitative variates. Psychometrika, 18, 277-296. doi:10.1007/BF02289264

Guttman, L. (1954). Some necessary conditions for common factor analysis. Psychometrika, 19, 149-161. doi:10.1007/BF02289162

Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. San Diego, CA: Academic Press.

Hendrickson, A. E., & White, P. O. (1964). Promax: A quick method for rotation to oblique simple structure. British Journal of Statistical Psychology, 17(1), 65-70. doi: 10.1111/j.2044-8317.1964.tb00244.x

Hill, B. D. (2011). The sequential Kaiser-Meyer-Olkin procedure as an alternative for determining the number of factors in common-factor analysis: A Monte Carlo simulation (Doctoral dissertation). Oklahoma State University.

Hochberg, Y. (1988). A sharper Bonferroni procedure for multiple tests of significance. Biometrika, 75(4), 800-802. doi:10.1093/biomet/75.4.800

Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

Kaiser, H. F. (1958). The varimax criterion for analytic rotation in factor analysis. Psychometrika, 23(3), 187-200. doi:10.1007/BF02289233

Kaiser, H., & Caffrey, J. (1965). Alpha factor analysis. Psychometrika, 30(1), 1-14. doi:10.1007/BF02289743

Kaiser, H. F., & Rice, J. (1974). Little Jiffy, Mark IV. Educational and Psychological Measurement, 34(1), 111-117. doi: 10.1177/001316447403400115

Ke, F., & Kwak, D. (2013). Constructs of student-centered online learning on learning satisfaction of a diverse online student body: A structural equation modeling approach. Journal of Educational Computing Research, 48(1), 97-122. doi: 10.2190/EC.48.1.e

Keengwe, J., Diteeyont, W., & Lawson-Body, A. (2012). Student and instructor satisfaction with e-learning tools in online learning environments. International Journal of Information and Communication Technology Education (IJICTE), 8(1), 76-86. doi:10.4018/jicte.2012010108

Kim, C., Damewood, E., & Hodge, N. (2000). Professor attitude: Its effect on teaching evaluations. Journal of Management Education, 24(4), 458-473. doi:10.1177/105256290002400405

Kuo, Y. C., Walker, A. E., Belland, B. R., & Schroder, K. E. (2013). A predictive study of student satisfaction in online education programs. The International Review of Research in Open and Distance Learning, 14(1), 16-39.

Lakoff, G. (1987). Women, fire, and dangerous things. Chicago: University of Chicago Press.

Liu, G. Z., & Hwang, G. J. (2010). A key step to understanding paradigm shifts in e‐learning: towards context‐aware ubiquitous learning. British Journal of Educational Technology, 41(2), E1-E9. doi: 10.1111/j.1467-8535.2009.00976.

Long, W. A. (2011). Your predictable adolescent. Charleston, SC: BookSurge Publishing.

Mahmood, A., Mahmood, S. T., & Malik, A. B. (2012). A comparative study of student satisfaction level in distance learning and live classroom at higher education level. Turkish Online Journal of Distance Education (TOJDE), 13(1), 128-136.

Marsh, H. W., & Roche, L.A. (1997). Making students’ evaluations of teaching effectiveness effective: The critical issues of validity, bias, and utility. American Psychologist, 52(11), 1187-1197. doi: 10.1037/0003-066X.52.11.1187

McKeachie, W.J. (1997). Student ratings: The validity of use. American Psychologist, 52(11), 1218-1225.

Norberg, A., Dziuban, C. D., & Moskal, P. D. (2011). A time-based blended learning model. On the Horizon, 19(3), 207-216. doi: 10.1108/10748121111163913

Raja, U., Johns, G., & Ntalianis, F. (2004). The impact of personality on psychological contracts. The Academy of Management Journal, 47(3), 350-367. doi: 10.2307/20159586

Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88.

Roberts, C. (2007). The unnatural history of the sea. Washington, DC: Island Press.

Rousseau, D. M. (1990). Normative beliefs in fund-raising organizations: Linking culture to organizational performance and individual responses. Group & Organization Management, 15(4), 448-460. doi: 10.1177/105960119001500408

Rousseau, D. M. & Tijoriwala, S. A. (1998). Assessing psychological contracts: Issues, alternatives and measures. Journal of Organizational Behavior, 19, 679-695. doi: 10.1002/(SICI)1099-1379(1998)19:1+<679::AID-JOB971>3.0.CO;2-N

Rubin, B., Fernandes, R., & Avgerinou, M. D. (2013). The effects of technology on the community of inquiry and satisfaction with online courses. The Internet and Higher Education, 17, 48-57. doi: 10.1016/j.iheduc.2012.09.006

Sax, L. J., Gilmartin, S. K., & Bryant, A. N. (2003). Assessing response rates and nonresponse bias in web and paper surveys. Research in Higher Education, 44(4), 409-432. doi:10.1023/A:1024232915870

Shirky, C. (2010). Cognitive surplus: Creativity and generosity in a connected age. New York: Penguin.

Shirky, C. (2008). Here comes everybody: The power of organizing without organizations. New York: Penguin.

Spies, A. R., Wilkin, N. E., Bentley, J. P., Bouldin, A. S., Wilson, M. C., & Holmes, E. R. (2010). Instrument to measure psychological contract violation in pharmacy students. American Journal of Pharmaceutical Education, 74(6), 1-11.

Stevens, J.P. (2002). Applied multivariate statistics for the social sciences (4th ed.). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Stewart, L., Hong, E., & Strudler, N. (2004). Development and validation of an instrument for student evaluation of the quality of web-based instruction. The American Journal of Distance Education, 18(3), 131-150. doi: 10.1207/s15389286ajde1803_2

Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance education, 22(2), 306-331. doi:10.1080/0158791010220208

Tolstoy, L. (2004). Anna Karenina. (R. Pevear & L. Volokhonsky, Trans.). New York, NY: Penguin. (Original work published 1878).

Wade-Benzoni, K. A., Rousseau, D. M., & Li, M. (2006). Managing relationships across generations of academics: Psychological contracts in faculty-doctoral student collaborations. International Journal of Conflict Management, 17(1), 4-33. doi: 10.1108/10444060610734154

Wang, M. C., Dziuban, C. D., Cook, I. J., & Moskal, P. D. (2009). Dr. Fox rocks: Using data-mining techniques to examine student ratings of instruction. In M. C. Shelley, L. D. Yore, & B. Hand (Eds.), Quality research in literacy and science education: International perspectives and gold standards (pp. 383-398). Dordrecht, Netherlands: Springer. doi:10.1007/978-1-4020-8427-0_19

Watts, D. J. (2011). Everything is obvious. New York: Crown Publishing Group, Random House.

Weigert, A. J. (1991). Mixed emotions: Certain steps toward understanding ambivalence. Albany: State University of New York Press.

Young, B. R., & Dziuban, E. (2000). Understanding dependency and passivity: Reactive behavior patterns in writing centers. Writing Center Journal, 21(1), 67-87.




DOI: http://dx.doi.org/10.24059/olj.v19i2.496