The Impact of Multimedia in Course Design on Students’ Performance and Online Learning Experience: A Pilot Study of an Introductory Educational Computing Course

The creation of multimedia assets for online courses is a time-intensive endeavor, and faculty have limited access to instructional designers for this and other course design functions. This study sought to determine whether multimedia use in course design contributes positively to student performance or to students' perception of the online learning experience, after controlling for course design expertise. A total of 142 students were enrolled between 2016 and 2018 in sections of an Introductory Educational Computing Course designed either according to Quality Matters standards, as determined by an informal internal review, or according to instructor preferences. The 84 students who participated in the sections designed according to Quality Matters standards were surveyed about their perceptions. While it may be no surprise that multimedia use did not directly impact student performance, based on end-of-course point totals, it did positively influence student perceptions of the online learning experience. A performance gap between ethnicities was not observed, as evidenced by end-of-course total points. This may be salient given the prevalence of such performance gaps in most educational settings. Course policies and instructional strategies perceived by students as helpful may be one contributing factor to this absence of a performance gap. Furthermore, the use of multimedia in course design was found to reduce cognitive load, as suggested by the amount of time students spent inside the learning management system. The paper concludes with a discussion of what these findings mean for multimedia use in course design and the student online learning experience.


The Impact of Multimedia in Course Design on Students' Performance and Online Learning Experience: A Pilot Study of an Introductory Educational Computing Course
Of the many tasks with which higher education instructional designers are immersed, the production of multimedia is the most time-intensive. According to a 2010 study published by the consulting firm of Bryan Chapman, producing one hour of multimedia can range from an average of 49 development hours for simple text, graphics, and assessment questions built with template-driven rapid authoring tools to an average of 217 hours for highly interactive simulations that take advantage of audio, video, and animations. Aside from multimedia production, the overall process of developing a learning solution or an academic course also includes conducting a needs assessment, developing a course outline, working with subject matter experts, designing the course in a learning management system or course authoring tool, identifying appropriate approaches, and collaborating with a variety of stakeholders (Association for Talent Development, 2015). This process is in addition to the managing, training, and support functions that instructional designers undertake daily. Because instructional designers fill so many roles and functions, higher education faculty have limited access to them for course design and technology integration assistance. In a survey of 780 higher education instructional designers conducted by Intentional Futures (2016), 41% of those surveyed came from institutions with a staff of one to five instructional designers.
Institutions with instructional designers on staff spanned small, medium, and large enrollments, and the range of staff sizes was similar across them. Given this context, measuring the impact of multimedia in course design on student performance and on perceptions of the online learning experience is relevant both to the time instructional designers allocate across their various functions and to faculty who develop and design online content in learning management systems. This paper will discuss the conclusions of past studies on multimedia use in course design as it pertains to student performance and perceptions of the online learning experience. A discussion of the research questions, methods, and the student participants will follow. The paper will conclude with the results of the study, followed by implications of multimedia use in course design.

Review of Related Literature
The use of multimedia in online course design in higher education is considered one of many best practices. According to Meyer, Rose, and Gordon (2014), flexibility and individuality are vital components to removing barriers to learning. Their universal design for learning framework proposes that barriers to learning would be minimized if instructors provided multiple means of engaging learners and representing instructional content, along with allowing students multiple means of expressing what they have learned. Multimedia integration is often used and recommended for these purposes.
Bledsoe and Simmerok (2013) used videos, audio clips, and photos to produce a virtual comprehensive counseling center environment for online graduate psychology students. They found that students had an opportunity to practice statistical concepts, collaborate on a research project, and demonstrate knowledge through tests and quizzes. The authors concluded that the multimedia environment enhanced students' understanding of the course content.
The CHLOE Report (2017), jointly published by Quality Matters (QM), an organization that promotes quality standards for course design in all delivery formats, and Eduventures, a provider of research analysis and advisory services to higher education, surveyed 104 chief online officers. They found that lecture capture, authoring tools, and video platforms were perceived as important multimedia technologies. The Online Learning Consortium's course design scorecard references required course design content, such as a course welcome and a course orientation, which are often created with multimedia. The same is true of the course design rubric published by the California Community Colleges' Online Education Initiative (2016). Given that the integration of a variety of multimedia is strongly encouraged in online course design, a study of its use and its impact on students' performance and online learning experience is timely.
Across many disciplines, courses infused with multimedia are perceived as more effective than those without. For example, in comparison to voiceover lectures, Chen and Wu (2015) found that video recordings of lectures through lecture capture and picture-in-picture lectures produced superior student performance based on an instructor-created learner performance test sheet. As a result, they concluded that video lecture type is a worthwhile consideration for online learning. In comparison to publisher-provided multimedia content through MyMathLab, Hegeman (2015) found that college algebra students performed significantly better on online and handwritten assessments when the instructor was in the role of content provider through instructor-generated video lectures and recorded handwritten solutions with oral explanations. Similarly, Vazquez and Chiang (2016) found that students who accessed multimedia pre-lectures scored higher on comprehension and retention than students with access only to textbooks in an economics course. Stanley and Zhang (2018), in their study on student-produced videos and academic performance, found that students who engaged in the course by producing their own videos had better learning gains than those who did not. While these studies report improved performance with multimedia, Lang (2016) concluded there was no difference in student performance in a computer education course using video lectures versus text tutorials for the same content. Liaw et al. (2015) likewise concluded there was no difference in student performance when acute nursing care students completed a hands-on patient simulation versus a web-based simulation of the same content. Research on the impact of multimedia on students' performance and online learning experience is thus inconclusive. Quality assurance measures for course design in the learning management system were not controlled for in these studies but could influence student performance. Specifically, best practices for curricular alignment between learning objectives, materials, and assessments, as advocated by Quality Matters, were not addressed in these studies.
Online courses that do not integrate multimedia are often perceived as not accommodating diverse learning styles, and higher education instructors are strongly encouraged to accommodate a variety of learning styles in the development and design of their courses. Cakiroglu (2014), in a study analyzing the effect of distance learners' learning styles and study habits on learning performance, used Kolb's Learning Style Inventory to measure learning styles and a researcher-developed achievement test to measure performance. Cakiroglu found that some learning styles performed better than others and concluded that learning styles might have an impact on the effectiveness of instructional strategies in online courses. Also using Kolb's Learning Style Inventory, Tan and Laswad (2015) examined the impact of learning styles on the academic performance of introductory accounting students, whose performance was measured using multiple-choice and constructed-response questions. Because some learning styles performed better on some of the assessments, they concluded that learning styles should be considered when designing assessments so as not to diminish their validity and fairness. Likewise using Kolb's inventory, Chen (2015) found differences in student performance based on learning styles and concluded that accommodating learners with different needs is essential to learning, particularly in the online classroom. Multimedia is often used to accommodate diverse learning styles; avoiding its use suggests that the curriculum may not be accessible to all learners.
Because the production of multimedia can be a time-intensive process and access to instructional design services is limited in most higher education institutions, identifying the impact of multimedia in course design on student performance and their perception of the online learning experience is essential. According to Garrison et al., the online learning experience includes teaching presence, social presence, and cognitive presence. Online instructors often facilitate these three types of presence through the use of multimedia. These elements of the online learning experience are believed to influence student performance as measured by course grades (Chen and Wu, 2015; Hegeman, 2015; Vazquez and Chiang, 2016), as well as student satisfaction as measured by perceptual surveys (Dixson, 2010; Young and Bruce, 2011; Leppink et al., 2013). These three types of presence translate into learner engagement strategies. Martin and Bolliger (2018) found that learner-to-learner, learner-to-content, and learner-to-instructor interactions were equally important and that students rated instructor interaction strategies the highest. Krause, Portolese, and Bonner (2017) found that students enjoyed the instructor's use of multimedia for instructor-to-learner communication even though they were unlikely to use the same media for their own communication. These three types of interaction often utilize a variety of multimedia, making the identification of the impact of multimedia on student performance and the online learning experience salient.
Research on the impact of multimedia in course design on student performance and perceptions of the online learning experience is growing. However, there has been very little discussion of the expertise required of course designers to infuse multimedia and apply best course design practices in a learning management system in ways that contribute to successful student outcomes. This study seeks to determine the following:
1. Does course design expertise contribute positively to student performance as determined by total points at the end of the course?
2. Does multimedia use in course design positively contribute to student performance by total points at the end of the course?
3. Does student perception of the online experience influence their performance based on total points at the end of the course?
This study proposes to add to the growing body of literature by assessing the impact of multimedia in well-designed online courses delivered through a learning management system.

Methods
This study used a mixed-methods approach to determine the impact of multimedia in course design on student performance and student perception of the online learning experience. Impact and perception were measured using a 60-item, five-point Likert-scale survey, administered anonymously through the learning management system, that also included open-ended questions. Some questions were used as written from published surveys, while others were modified to fit the context of the study (Dixson, 2010; Young & Bruce, 2011; Hadie & Yusoff, 2016; Leppink, Paas, Van Der Vleuten, et al., 2013). This provided evidence for the validity of some of the items, and the scaled Likert responses followed acceptable practices for social science research (Fowler, 2009). Because the survey assessed a learning objective, students earned 100 points for completing the anonymous survey, which made up approximately 10% of the course grade. Student responses to the Likert-scaled survey were compared to end-of-course total points to determine whether course design expertise and multimedia use positively contributed to student performance. Two nonequivalent-group designs were applied to several online sections of the same Educational Computing Level 1 course offered from Spring 2016 to Spring 2018.

Study One Design
Study one consisted of eight sections of Educational Computing Level 1 delivered through the learning management system. Four sections of the course were QM-designed, as determined through an informal internal assessment, and were compared to four sections of the same course designed according to instructor preferences. A course template for each of the two design types was created, and each course section was copied from its respective template during the two-year study period. The online course design met Quality Matters standards based on an informal internal review using the fifth edition rubric and was later reviewed using the sixth edition rubric. The reviewer was a QM-certified peer reviewer, having completed a minimum of 16 hours of training for the purpose of assessing courses according to QM standards. All essential standards were met, and the course earned 94 of 100 points in the informal QM internal review. The course design of each section explicitly articulated the alignment of learning objectives, instructional materials, and course activities, among other course design components. T-tests were used to analyze whether students performed better in the Quality Matters courses or in the courses built according to instructor preferences, based on end-of-course total points rather than letter grades determined by the grading scale provided in the course syllabi.
A total of 142 students participated in the study one design. Students were predominantly female (81%), Hispanic (36%) and white (41%), and enrolled in online programs (72%). The ethnicity and online-program aspects of the student demographics represent the general student demographics of the university used in this project. Sixty percent of the students were between the ages of 19 and 29, 26% were between 30 and 39, 7% were between 40 and 49, and 1% were between 50 and 59. Eighty-four students were enrolled in the Quality Matters course sections, and 58 were enrolled in the non-Quality Matters sections. To determine the effect of course design on end-of-course point totals, a t-test was used to test for a statistically significant difference between the mean point totals of the two groups.
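The group comparison described above can be sketched as an independent-samples t-test. The sketch below uses simulated score arrays as placeholders for the study's actual end-of-course point totals; only the group sizes (84 and 58) come from the paper.

```python
# Illustrative sketch of the study-one analysis: an independent-samples
# t-test on end-of-course point totals. The scores are simulated
# placeholders, not the actual study data.
from scipy import stats
import numpy as np

rng = np.random.default_rng(0)
qm_scores = rng.normal(loc=908, scale=60, size=84)      # QM-designed sections (n = 84)
non_qm_scores = rng.normal(loc=848, scale=90, size=58)  # instructor-preference sections (n = 58)

# Student's t-test assuming equal variances, df = n1 + n2 - 2 = 140
t_stat, p_value = stats.ttest_ind(qm_scores, non_qm_scores)
print(f"t({len(qm_scores) + len(non_qm_scores) - 2}) = {t_stat:.2f}, p = {p_value:.3f}")
```

With real data, checking the equal-variance assumption (or using Welch's test via `equal_var=False`) would be a reasonable precaution, since the paper reports a wider grade range in the non-QM sections.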

Study Two Design
Study two involved four sections of Educational Computing Level 1 in the learning management system. Two sections were enriched with a variety of multimedia content, i.e., embedded instructor-created videos of assignment directions, embedded instructional content videos, software simulations, and images. For comparison, the course content of two sections was built using formatted text, i.e., written assignment directions and text transcripts of required video content with a text link to each required video. Both groups used the same textbook, with images and step-by-step instructions, and adhered to Quality Matters standards based on the same informal internal review described in the study one design.
A total of 84 students participated in the study two design. Students in the multimedia-rich sections totaled 46, while 38 students made up the text-based sections. Similar to the study one design, students were predominantly female (80%), Hispanic (37%) and white (41%), and enrolled in online programs (66%). Sixty percent of the students were between the ages of 19 and 29, 26% were between 30 and 39, 6% were between 40 and 49, and 1% were between 50 and 59.

Multiple Regression and T-Test
Academic performance is only one aspect of a student's online learning experience. Another aspect is their perception of, and overall satisfaction with, a course. A survey with both Likert-scale and open-ended questions was administered in study two during the last week of the eight-week course. Some questions were used as written from published surveys, while others were modified to fit the context of the study (Dixson, 2010; Young & Bruce, 2011; Hadie & Yusoff, 2016; Leppink, Paas, Van Der Vleuten et al., 2013). Once the questions were identified, they were combined into scales, against which end-of-course total points were analyzed using a t-test and multiple regression. The five-point Likert-scale answer choices included two positive options, two negative options, and a middle neutral option. The survey design follows quantitative research norms in the social sciences (Fowler, 2009). Each scale was assessed for reliability using Cronbach's alpha, and questions were added to or deleted from a scale to strengthen its reliability score. The resulting four scales and their reliability coefficients appear below:
• Course Design Scale: Cronbach's alpha = .94
• Community and Learner Engagement Scale: Cronbach's alpha = .70
• Teaching Style Scale: Cronbach's alpha = .85
• Cognitive Load Scale: Cronbach's alpha = .83
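The reliability check described above can be illustrated with a short Cronbach's alpha computation. The response matrix below is a hypothetical stand-in (five respondents, four items), not the study's survey data.

```python
# Illustrative Cronbach's alpha for a Likert-scale survey scale.
# `responses` is a hypothetical (respondents x items) matrix of 1-5
# ratings, not the study's data.
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = responses.shape[1]                          # number of items in the scale
    item_vars = responses.var(axis=0, ddof=1)       # sample variance of each item
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of summed scale scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```

Dropping or adding items and recomputing alpha, as the authors describe, is simply a matter of slicing columns out of (or appending columns to) the response matrix before calling the function.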

Content Analysis of Open-Ended Survey Responses
A content analysis of student responses to the open-ended questions made up the qualitative approach used to determine the influences on student perceptions of the online experience. Of the 60 survey items, the following three open-ended questions were analyzed:
1. What parts of this course were most useful?
2. What parts of this course need improvement?
3. Is there anything about this course that you'd like us to know that we didn't ask? Please provide additional comments.
Similar responses to these open-ended questions were grouped and a classification assigned. Axial themes were then identified and reported.

Results
Several variables were evaluated after the implementation of the study one design, the study two design, the four scales, and the content analysis of the open-ended survey responses. The study one design compared course sections of an informally and internally evaluated course designed according to Quality Matters standards to sections designed according to instructor preferences, to determine whether course design expertise improved student performance based on end-of-course total points. As a point of interest, performance gaps between the sexes, age groups, and ethnicities on the same measure were reported given the availability of the data. The study two design compared a course rich with multimedia to a course designed with formatted text; end-of-course total points were used to determine whether students in a given course design performed higher. In addition, students in the study two design were given a survey to assess their perceptions of the online course design, and the results were used to assess the influence of their perceptions on their performance, based on end-of-course total points.

Study One Design Results
Of the 142 students whose end-of-course point totals were used in the study, 59% were enrolled in courses that adhered to Quality Matters (QM) standards based on an informal internal review using the fifth edition rubric, later scored using the sixth edition rubric. The remaining 41% of the participants were enrolled in a course designed according to instructor preferences. Students in the QM-designed courses earned higher end-of-course point totals (x̄ = 907.80) than those in the non-QM courses (x̄ = 847.93), t(140) = 2.47, p = .02. However, there were no statistically significant performance gaps between age groups [F(3, 131) = 1.895, p = .134] or ethnicities [F(2, 134) = 1.343, p = .257] in either group. Performance differences between males and females were statistically significant [F(1, 140) = 5.862, p = .017], perhaps due to the large discrepancy between the number of males and females in the sample.
The reduction in score variance is also worth noting. The majority of students in the QM course earned a final letter grade ranging from B- to A+, while student scores in the non-QM courses ranged from D+ to A+. This means the variance in end-of-course total points is smaller for the QM course, as graphically displayed in Figure 1.

Study Two Results
Of the 84 students who participated in courses with two different visual designs, 55% were enrolled in a multimedia-rich design and the remaining 45% in a text-based design. The multimedia-rich course contained images, text, embedded media, hyperlinks, and interactive simulations. The text-based course included written assignment directions, images needed for assignment completion, and text transcripts of required video content with a text hyperlink to the required videos. Although participants in the text-based course design scored slightly lower than those in the multimedia-rich course, both of which adhered to Quality Matters standards based on an informal internal review using the fifth edition rubric and were later scored using the sixth edition, the difference was not statistically significant. The mean score for participants in the multimedia-rich course was 920.76 and the mean score for participants in the text-based course was 897.09, t(82) = -0.98, p = .33.

Results of Perception Survey Scales
A 60-item survey was administered to study two participants to ascertain their overall perceptions of the online learning experience. The scales were course design, community and learner engagement, teaching style, and cognitive load. Of the four scales administered, participants in the multimedia-rich course design experienced a lower cognitive load; a higher score on this scale indicates a lower level of cognitive load. The mean for the multimedia-rich course was 2.96 and the mean for the text-based course was 2.68, t(81) = 2.31, p = .02.
A regression was used to predict the influence of student perceptions on end-of-course point totals. Of the four scales, community and learner engagement most influenced end-of-course point totals, as shown in Model 2, which consists of course design plus community and learner engagement. The adjusted R² changed by 7% between Models 1 and 2, in comparison to the minimal change between the other models, indicating that community and learner engagement accounts for roughly 7-8% of the variance associated with end-of-course point totals.

Results of Course Activity Overview Report
The course activity overview is a report run from within the learning management system. The report includes the total and average time spent per active student and the total amount and type of activity each student had in the course. This report was gathered for each of the four course sections in the study two design. Students enrolled in the text-based course design sections spent more time in the course on average than students enrolled in the multimedia-rich course design sections: students in the multimedia-rich course logged an average of 37 hours in the learning management system, while students in the text-based course logged an average of 51 hours.

Results of Content Analysis of Open-Ended Survey Responses
Qualitative data responses from the three open-ended questions were grouped according to their similarity in meaning.For example, the following responses to the question, "What parts of this course were most useful to you?" were grouped because they conveyed similar thoughts: • "On the critical and other assignments, having the description of the assignment, and the rubric on the same page so there won't be any need to miss vital information." • "The instructor was very organized and stated everything that was required for submission." • "The instructor provided everything needed to complete the assignments, which was very helpful." Once comments were grouped based on similar thoughts or meanings, a classification was assigned.For example, the three responses above were given the designation "organization" because they conveyed the benefit of organizing content well in an online course.Similarly, responses to the question "What parts of this course need improvement?"could also be grouped under the designation organization.The following responses illustrate this occurrence: • "I had some trouble with getting all the information for each assignment in one place." • "I didn't like the way the discussion board was set up." 
• "The discussion board was unorganized … it was hard to see who's [sic] post was their response or their reply."
Axial themes that influenced student perceptions of the online course experience emerged from the open-ended questions. These themes were identified based on patterns present in student responses and the classifications assigned to those patterns. Student perceptions of the online experience were most influenced by the following factors.
The organization of the courses in this study influenced student perceptions of the online course experience. Students in the online courses in this study experienced a weekly course structure. Comments regarding course organization focused on expectations, finding information, and navigating the course. A sense of clear expectations was perceived when students felt they understood assignment requirements: "The professor was very organized and stated everything that was required prior to submission." Course organization is not perceived positively when students cannot find what they need: "Important information was located in multiple places … and I missed a few things." Because each online class differs from the next, it is important to explain the course components: "In my other classes it is set up a little differently."
Instructor and social presence also influenced student perceptions of the online course experience in this study. Interactions between the instructor and students, as well as among the students themselves, were noted and requested. The course included text and verbal discussions to foster interactions among students. Although one student commented that the verbal conversations "made [her] feel more connected to peers and made [her] classmates feel like real people," other students wanted more interaction in the course: "I would have liked more opportunities to interact with the teacher and other students." Instructor communication with students via email and instructional videos was noted as helpful. One student stated, "The weekly emails of what was expected and how to do it is what I found most helpful," and another commented that the "… video explanation of the assignments was helpful …"
The instructional strategies used in the courses in this study also influenced student perceptions of the online course experience. Strategies such as modeling assignment expectations, allowing resubmissions based on instructor feedback, and scaffolding the large assignment were used, and students found each of these strategies helpful. Many students mentioned that the "… instructional step-by-step videos for each project …" were helpful. Several students expressed, "The way that the critical assignment was split up into sections. It helped us to contribute to the critical assignment piece by piece." Other students commented that they "… liked how we were able to use the instructor's feedback and resubmit assignments to help get a better grade." The use of these strategies may be one reason there were no statistically significant performance gaps among students based on sex, age, or ethnicity.
Assignments in this study were noted as influencing the online course experience when students perceived them as relevant and doable. Students defined relevance as skills they could use beyond the classroom: "I feel like this course will benefit me in the future despite going into education or not." Students defined doable as something that can be completed with the resources provided; one suggested the course "… provide specific website building templates that are compatible to uploading excel worksheets, PowerPoint and so on."

Discussion
This study was conducted to measure the impact of multimedia in course design on student performance and their perception of the online learning experience. Given the limited time instructional designers have available to dedicate to the creation of multimedia assets in course design, it is essential to identify the impact of that time investment on student performance on learning outcomes and on students' perception of their overall online learning experience in the course.
It is not a surprise that students perform better in a course designed according to internationally recognized, research-based course design standards such as Quality Matters. While we may know this intuitively, quality online course design does not always occur. The fact that students performed better in a course designed according to Quality Matters is worthy of mention because online courses are generally fully designed before students enter them. If courses are not designed according to best practices, students are likely set up to perform less than optimally before they even begin.
It may also not be surprising that there was no statistically significant difference in student performance between the text-based and multimedia-rich visual course designs. Once course design expertise was controlled for in both visual designs through adherence to Quality Matters standards, as determined through informal internal review, students in both visual designs performed equally well. It is important to note that the strength of multimedia in course design lies in its ability to reduce cognitive load and facilitate community and learner engagement. Student perceptions of community and learner engagement were, in this study, the strongest predictors of high student performance. The likelihood that students will perceive the course well, and thus perform well, is influenced by the strategic use of multimedia to reduce cognitive load and facilitate community and learner engagement. The degree to which this is planned can support student performance before the course even begins.
An unexpected finding of this study, worthy of further investigation, is the absence of performance gaps, particularly between ethnicities. The California Assessment of Student Performance and Progress documents performance gaps in test scores between ethnicities in K-12 schools, and educational researchers have discussed inequitable outcomes in graduation rates between ethnicities in higher education settings (Steele, 1992; Graduation Rates by Sector, Gender, and Race/Ethnicity, 2006; Flores & Park, 2013). The absence of performance gaps between ethnicities is in stark contrast to the norm. The intersection between course policies, such as the ability to resubmit assignments for up to full credit, and instructional strategies, such as modeling and scaffolding assignment expectations, offers potentially replicable practices that could produce similar results for others. The performance gap between males and females was another unexpected finding. While further study is warranted, the overrepresentation of females in the study may be a contributing factor to the results, rather than an ontological difference between males and females.
The realization that student perception of community and learner engagement in an online course is a strong indicator of high student performance is salient. It places a moral burden on course designers to do this work well. Course design has the potential to exacerbate or mitigate societal inequities, as well as larger university outcomes related to student performance, retention, persistence, and graduation. In a society where the benefits of earned degrees are not afforded to everyone equally regardless of race, gender, socioeconomic status, sexual orientation, and ability, it is essential to design quality online courses that set adult learners up for success before the course begins.

Limitations and Future Research
As a pilot study, the mixed-methods research design employed was appropriate. Creswell (2003) noted that mixed studies can potentially reduce the biases present when using a single method. To find the axial themes within students' open-ended responses, the discovery methods used included several readings of the written responses, identifying themes that emerged, and constructing typologies (Taylor & Bogdan, 1998). Quantitative data from the Likert-scale questions on the student survey were analyzed through multiple regression to determine influences. Mertens (1998) noted that surveys allow the collection of data from large numbers of people. While student responses may potentially be less than genuine within the confines of a graded anonymous assignment, the administration of the survey and the results were consistent across all course sections. Furthermore, the methods chosen support the aims of the study.
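To make the quantitative procedure concrete, the multiple regression described above can be sketched as follows. This is a minimal illustrative sketch, not the study's actual data or analysis code: the predictor names (community, engagement), the simulated Likert responses, and the coefficient values are all hypothetical assumptions chosen for demonstration.

```python
# Illustrative sketch of a multiple regression of end-of-course points on
# Likert-scale survey predictors. All data here are simulated; variable
# names and effect sizes are hypothetical, not drawn from the study.
import numpy as np

rng = np.random.default_rng(0)
n = 84  # matches the number of survey respondents reported in the study

# Hypothetical Likert-scale (1-5) survey responses for two predictors
community = rng.integers(1, 6, n)
engagement = rng.integers(1, 6, n)

# Simulated end-of-course total points driven by the two predictors plus noise
points = 60 + 4.0 * community + 3.0 * engagement + rng.normal(0, 2, n)

# Design matrix with an intercept column; fit ordinary least squares
X = np.column_stack([np.ones(n), community, engagement])
coef, *_ = np.linalg.lstsq(X, points, rcond=None)

print({"intercept": round(coef[0], 1),
       "community": round(coef[1], 1),
       "engagement": round(coef[2], 1)})
```

With simulated data of this size, the fitted coefficients land close to the generating values, which is the sense in which regression weights are read as the "influence" of each perception measure on performance.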
The number of students enrolled in the courses during the research period is appropriate for a small pilot study, even though the results are not generalizable beyond its setting. Future research might invite more professors to assess a course designed to meet Quality Matters standards in an informal internal review and compare student outcomes to previously taught sections of the same course designed according to their preferences. In addition, if the instructional strategies employed here were replicated in future studies and performance compared between ethnicities, the scalability of the practices and their potential to reduce performance gaps could also be determined. If such opportunities were extended to faculty across the nation, findings could be confirmed or disconfirmed with broader generalizability.
As with all perception surveys, participants may not be fully aware of their thoughts and feelings as they pertain to the questions asked. A George Burns quote was used by Mertens (1998) to describe this phenomenon:
If you were to go around asking people what would make them happier, you'd get answers like a new car, a bigger house, a raise in pay, winning a lottery, a facelift, more kids, less kids, a new restaurant to go to-probably not one in a hundred would say a chance to help people.And yet that may bring the most happiness of all (Mertens, 1998, p. 106).
In this study, however, this phenomenon does not appear to have been in effect, as there were no contradictions between the quantitative and qualitative data analyzed.

Figure 1. Reduction of Score Variance for Participants in Quality Matters Courses

The 26% decrease in the average logged time between the different visual designs of the course is statistically significant (p = .01).