Video-Based Feedback on Student Work: An Investigation into the Instructor Experience, Workload, and Student Evaluations

This exploratory study uses qualitative and quantitative data to analyze instructor experiences in adding video feedback to written notes in online courses. This study asks whether instructors feel more "connected" when providing video feedback, report increased workloads, and see improvements in their performance evaluations in video feedback courses. The results reveal that video feedback requires more time than written feedback (i.e., non-video feedback), generates varied instructor experiences concerning social presence, and has little to no impact on instructor performance evaluations. The study concludes that more research is needed to fully understand the instructor experience when using videos, especially in environments where part-time, adjunct instruction is the norm.

Social presence is one way to understand this phenomenon. When social presence occurs online, communicators tend to feel connected through intimacy and immediacy (Crim, 2006), and increased social presence can allow both parties to feel depth in their interaction (Tu & McIsaac, 2002). Video feedback may be a channel for this, making the experience meaningful and valuable to both (Borup et al., 2012; Wade, 2016). Additionally, the rapport established through video feedback (Thompson & Lee, 2012) could lead to higher student evaluations of instructors. However, whether video feedback generates effective social presence and higher student evaluations of instructor performance remains in question.
Whereas previous research focused on the impact of video feedback on students' sense of connectedness with their instructors (Borup et al., 2014; Parton et al., 2010; Wade, 2016), few studies have focused on instructor experience. Those that mention instructor experience have relied on a small pool of instructors: Wise et al. (2004) collected information from two instructors, Jusoff and Khodabandelou (2009) interviewed four, Mathisen (2012) interviewed six, and Wade (2016) had five instructors in her project. Other scholars reported on personal experiences as participant observers in providing video feedback (Henderson & Phillips, 2015; Lamey, 2015; Mayhew, 2017; Silva, 2012). It is also unclear if these instructors were full-time or adjunct.
Furthermore, few studies have tackled the impact of video feedback on instructor workload. Studies mentioning workload report a range of outcomes, from requiring more grading time (McCarthy, 2015), to less or roughly half the time (Griffiths & Graham, 2009; Henderson & Phillips, 2015; Hyde, 2013; Mathisen, 2012), to the same amount of time as written feedback (Jones et al., 2012). It is important to note that no studies included a systematic and controlled use of time logs for tracking video feedback versus written feedback. Silva (2012) used time logs, but the methodology was not transparent about the time spent between reading, commenting, and creating written and video feedback. Wood et al. (2011) provided a time log for audio feedback but did not use written feedback for comparison. No studies were found that looked specifically at the connection between providing video feedback and student evaluations of instructors.
Analyzing the instructor's perspective seems to be an afterthought compared with student sentiment and learning. This project fills the gap by examining the instructor experience of video feedback through the lens of positive or negative social presence, its impact on workload, and its influence on student evaluations of instructor performance.

Review of Relevant Literature
Social Presence
Short et al. (1976) introduced the idea of social presence by explaining it as a "salience" or "sense of being there" within a communication medium. Social presence has since expanded to include building community (Tu & McIsaac, 2002; Wise et al., 2004), projecting a "real" self (Garrison et al., 2009), emotional connectedness (Akcagoglu & Lee, 2016; Swan & Shih, 2005), and awareness of others in a virtual setting (Tu, 2002). While most research has approached the idea of social connection from the perspective of the student (Hostetter & Busch, 2013; Swan & Shih, 2005; Tao, 2009; West & Turner, 2006), the instructor is the second part of the "social" equation, so their sense of "connection" matters as well if students are to feel an authentic bond and realize the potential learning dividends. Following the work of Garrison et al. (2009) and Swan and Shih (2005), we define social presence as a mutual "connectedness" instructors and students feel towards each other within an online environment.
However, online courses pose a significant challenge: materials, content, and the process of learning are broadly asynchronous. Crim (2006) posited that online students were likely to be passive observers rather than active engagers due to lack of face-to-face interaction, missing out on social presence. Since students often learn privately and with limited social interaction, they may not feel the "salience" of an instructor or other learners. This can then be a challenge for instructors who wish to both connect and engage with students. To address this lack of connectedness, researchers point to the importance of building social presence through affective bonding and developing meaningful relationships. Krause et al. (2017) suggest the use of multimedia tools provides students with the impression of social presence, adding value to the online learning experience. Clark et al. (2015) found student perceptions of teaching and social presence were significantly higher with the use of videos and that all students interviewed in their study mentioned social presence as a positive component of their online course.
Leaving video feedback might be an enhancement over text-based exchanges, where there are few nonverbal or emotional cues. However, delivering video critiques is generally one-way communication without reciprocal student feedback (Lamey, 2015). Without genuine "interaction," it is difficult to call it entirely "social." Hughes et al. (2007) discovered that a deficiency in the perception of social presence could lead to disappointment and a decrease in affective learning. Attempts at building social presence may lead to negative feelings if both parties do not feel it is a real, reciprocal exchange. As discussed later, it can also lead to frustration for instructors, which might result in "negative social presence," something not well documented in previous research. Attention now turns to the assumed value of feedback, video feedback specifically, and its links to social presence.

Social Presence and Video Feedback
Traditionally, written feedback corrects student work, provides direction for improvement, and helps facilitate a relationship between instructor and student. This improves student satisfaction and feelings of social connection, which is thought to enhance the learning experience (Carless, 2006; Gunawardena & Zittle, 1997; McCarthy, 2015). For Hattie and Timperley (2007), ideal feedback explains how students can meet goals and improve, and remarks on their progress. High-quality feedback allows instructors to identify growth opportunities, guides learners towards increased knowledge and expertise (Gaytan & McEwen, 2007; Wiggins, 2012), and helps develop meaningful and productive instructor-student relationships (Wade, 2016). Additionally, supportive and forward-moving feedback reinforces instructor credibility (Witt & Kerssen-Griep, 2011), demonstrates instructors' care for students and their success (Borup et al., 2014), encourages student self-reflection (Hyde, 2013), and motivates students to become engaged in learning (Griffiths & Graham, 2009). Some scholars argue that all of this can occur in more meaningful ways through video (McCarthy, 2015; Wade, 2016).
Scholars argue that video feedback builds social bonds and is preferred over written feedback (Borup et al., 2014; Lamey, 2015; McCarthy, 2015; Parton et al., 2010; Wade, 2016; West & Turner, 2016). Studies show that students find video feedback more detailed, conversational, and connecting, as they can see their instructor in person and observe tone, inflection, expressions, eye contact, and personality (Anson et al., 2016; Killingback et al., 2019; Lamey, 2015; Mathisen, 2012; Wade, 2016; West & Turner, 2016). Others found videos allowed for increased bonding between students and instructors (Mathisen, 2012; Parton et al., 2010). Finally, there was a persistent pattern of students reporting that video feedback was more specific and constructive (Henderson & Phillips, 2015; Mayhew, 2016) and that it would help them improve their performances (West & Turner, 2016). Little research has investigated the impact of videos on student performance or on student evaluations of instructors. Studies have not focused on precise impacts on learning (Jung et al., 2002; Richardson & Swan, 2003; Spears, 2012). Positive links exist between social presence and student perceptions of their learning, but most studies do not analyze actual grades (Hostetter & Busch, 2013; Swan & Shih, 2005; Wise et al., 2004). None of these studies examined the impact of video feedback specifically. Finally, most research addresses student perception of video feedback, quality of feedback, and social presence (Spears, 2012; Wade, 2016), but little of the scholarship looks at faculty connectedness to their students, reactions to being required to provide such feedback, or the impact on workload and evaluations of their teaching. Attention now turns to instructor perspectives, as it is contended that the literature does not sufficiently address their experiences when providing video feedback.

Instructor Perspectives on Video Feedback
As stated, most research does not address instructor perspectives (Borup et al., 2015; Wade, 2016). Research focuses on the positives, including how videos improve instructor tone and build connections with students (Lamey, 2015). Jones et al. (2012) found that video feedback, combined with screencasts, allowed instructors to facilitate fluidity in commenting and highlighting text. Mathisen (2012) reported that instructors found their video feedback to be more efficient and of higher quality. Mayhew (2017) asserted that instructors saw video feedback as a richer medium for building student relationships by bridging interpersonal communication gaps, managing interpretations, reducing distance, and addressing improvements. These positive experiences reported by instructors demonstrate the potential benefits of stronger social presence in video feedback.
There are, however, also concerns about the instructors' reactions to using video feedback. While Wade (2016) found that some instructors reported positivity about making videos, some expressed concerns that they had to "look presentable" to make the videos, perhaps meaning that there was some self-doubt about being assessed on a new criterion: appearance (p. 72). Borup et al. (2014) found that instructors reported frustrations about conveying their emotions on camera unintentionally after years of written feedback. The instructor's morale and emotions are especially important when recording videos (Jones et al., 2012), as resulting negative feelings could be conveyed to students.
Another challenge with video feedback is a lack of consensus on whether it takes more or less time for instructors. Wade (2016) broadly reported it took instructors 30 minutes to two hours to grade each paper and leave videos. McCarthy (2015) reported it took 10 to 15 minutes to give feedback with audio, 20 minutes to give feedback in writing, and 20 to 25 minutes with videos. Silva (2012) compared video and written feedback. For the first paper, using only video feedback averaged 12 minutes whereas reading and writing comments averaged 20 minutes. For the second paper, using only video feedback averaged 20 minutes, while reading and commenting averaged 30 minutes.
Overwhelmingly, there is a lack of precision in accounting for and reporting the time spent recording video feedback as compared to the time spent providing it in a written form. Borup et al. (2014) said, "video appeared to be more time consuming and less convenient than providing text feedback" (p. 245). By contrast, Mathisen (2012) reported video feedback took "one quarter of the time" of written feedback for one instructor and that it was "much easier" for another (p. 107). Similarly, Lamey (2015) and Hyde (2013) vaguely reported that it took instructors "less time" when using only videos than leaving written feedback. After some initial technical issues, Jones et al. (2012) reported that it took tutors a similar amount of time to leave video feedback as written, saying this form of feedback "is no longer or shorter than that for traditional marking" (p. 594). None of these studies explained the time-tracking process or reported on the employment status of their instructors, creating questions about the validity and generalizability of findings.
Instructor skepticism might also arise if students do not take the feedback seriously or show improvement. Instructors in Wade's (2016) study reported concern that video feedback was "time consuming" and that there was no way of knowing if students understood it. Wei and Yanmei (2018) found that when students did not study task-specific written feedback and were not able to achieve better learning outcomes, instructors felt demoralized. Silva (2012) also expressed frustration when a student did not implement her feedback. Furthermore, Henderson and Phillips (2015) noted that "large volumes of feedback may be redundant, with only a proportion of the feedback being received by the student" (p. 62). To summarize, while existing research has focused on the benefits and drawbacks of video feedback, little has been written on instructors' potential frustration with being required to do added work while experiencing one-way "social" presence, spending too much time leaving video feedback, and not seeing positive results from the time invested. This raises the question of whether it is worth requiring instructors to provide video feedback when there is no evidence of what students are gaining from the experience or whether it does indeed increase social presence. For part-time adjunct instructors, who make up a high percentage of all faculty positions (Flaherty, 2018), there seem to be few incentives to use their valuable teaching hours on social bonding. This can then lead to instructors experiencing negativity around their attempts at social presence.

Student Evaluations of Teaching (SETs)
While few studies have examined the relationship between quality of feedback and SETs, considerable scholarship has investigated the criteria students use to evaluate instructors. Hornstein (2017) claims students are often "dispassionate evaluators" (p. 3), and Bassett et al. (2017) found a similar pattern of students not putting much effort into their SETs. Liu (2012) also explained that first-year college students (the majority of the student population in this project's study) tend to provide the lowest rankings of instructors. Engagement, however, seems to be a key factor in student assessment, and personality, which is more evident via video, can influence engagement. With videos, instructors may be able to communicate more effectively with students and increase their likeability, and one purpose of analyzing End of Course Survey (EOCS) data in this study was to test whether that was the case. Hornstein (2017) also reports that positive evaluations of instructors on competence, likeability, communication skills, and humor are correlated with higher student ratings. Williams and Ceci (1997) found that when instructors showed more enthusiasm through nonverbal cues, they received higher rankings. Additionally, studies have found a relationship between higher SET scores and being seen as "open" (Kim & MacCann, 2018) or "motivating, helpful, explanatory, and accessible" (Phipps et al., 2006, p. 241). Therefore, it is arguable that videos could help instructors demonstrate passion for the topic through engagement with course content or use of pitch, tone, and volume to be more open and relatable, potentially increasing their likeability, which could, in turn, translate into higher student engagement and then higher evaluation scores.
To respond to the gaps in existing literature on the instructor experience and address the concerns raised here, this study aims to examine instructor reactions to being required to provide video feedback, evaluate the impact of video feedback on the instructors' workload, and examine the differences in SETs in courses with and without the use of video feedback.

Methods
This exploratory research study followed a mixed-method strategy, conducting short-answer anonymous surveys with instructors, collecting quantitative data on workload, and comparing the EOCS results in courses with and without the use of video feedback. This study was part of a larger project that investigated student performance as illustrated through grades earned and student sentiment as indicated through EOCS results.
At the university where this data was collected, all courses are five weeks long and adjunct instructors (the majority of the workforce) have a 12-hour workweek contract. Students take the course analyzed as part of their degree plan to fulfill the oral and interpersonal communication general education requirement. Participants were informed of the study at the beginning of the course and could opt-out of video feedback by contacting the instructor at any time during the course. Like in all courses at this university, the EOCS was optional and did not affect final grades.
Upon obtaining Institutional Review Board approval, researchers invited online instructors who regularly teach this course to participate in the project. Initially, ten instructors volunteered, but one dropped out. All instructors had between two and seven years of experience teaching this course, and three were co-authors of this paper. Eight of the nine instructors were part-time adjunct faculty and one was permanent full-time. Participating instructors received continuing learning credits as incentives for participation and signed informed consent. The three instructors who were both participants and authors were open to leaving videos and did not feel strongly either way. They likely represented the general sentiment of instructors: curiosity and a willingness to try a new feedback system, mixed with concern about the amount of time invested. Like the other instructors, they had some expectations going into the study based on their personal experiences. Ultimately, all feedback was anonymous, with the exception of one instructor-researcher who recognized her own contributions in the data, and all researchers made an effort to go into the project with an open mind.
Data were collected on two sections with video feedback and two sections without video feedback for a total of four courses taught by each of the participating instructors, with a total of 36 courses reviewed. Instructors were asked to log all grading time and record the number of papers marked in each session. Data were also compiled using EOCS results for those same 36 courses.
The standard institutional grading guidelines at this institution include offering a variation of praise, constructive criticism, and suggestions for future improvement for written assignments, both in the margins of the paper itself and in the "Summary Feedback" area of the gradebook. Due to the comparative nature of the study, instructors were advised to provide two types of feedback for two assignments in each of the four courses taught: written feedback in the non-video, text-only (T) sections, and mixed text and video feedback in the video (V) sections. For the T courses, instructors were asked to offer in-text notes within the paper and summative text-based feedback in the "Summary Feedback" area of the gradebook that followed the standard institutional grading guidelines. For the V courses, instructors were asked to offer in-text notes within the paper, record three- to four-minute feedback videos, and use pre-written templates for "A" through "F" work in the "Summary Feedback" area. Using these templates ensured that no work was added to providing "Summary Feedback" in V courses and that the measured difference was only between writing summative text-based feedback (T courses) and creating the videos (V courses).
Two assignments were selected for this study. Instructors provided video feedback on the week 1 paper, a two-page paper in which students explored assigned topics, and the week 3 paper, a five-page draft of the final paper. These two papers were selected with the assumption that they would have the most impact on the student experience. In week 1, a video could help establish an early bond with each student and provide specific information on the expectations for all written work in the class. In week 3, feedback on final paper drafts would help students improve their final papers. By weeks 4 and 5, it would then be possible to assess the impact that this investment in feedback had on student sentiment, as accounted for in the EOCS results.

Open-Ended Instructor Survey
Online open-ended surveys of the instructors were conducted for their ability to provide a broad overview of instructor sentiment and to allow future research to set up themes to explore through in-depth interviews. Instructors were asked the following eight questions to gauge their reactions to leaving videos:
Q1: What did you think of the video feedback element?
Q2: What did you think of the time allotment?
Q3: Did leaving video feedback change the way you approached feedback? If yes, explain.
Q4: Did using video feedback change your tone? If yes, explain.
Q5: Did using video feedback make you focus on advice for improvement?
Q6: Did you feel more connected with students in your classes where you left video feedback?
Q7: Did you feel you were more supportive in any way when using video?
Q8: What do you see as the pros and cons?
All answers were compiled into a single document and systematically analyzed. First, the number of positive versus negative sentiments was counted based on the language used. Positive and negative sentiment was often explicit, with some saying the experience was "wonderful" while others said, "I am not too fond of it." Mixed responses were noted as such. Next, keywords were highlighted, and those phrases were used to compile a picture of instructor sentiment on each question.

Measuring Workload
Instructors logged time spent through online time-tracking software. Using an Excel spreadsheet, they also recorded the grading week/grading session, start time, end time, total minutes, and number of papers graded. Results accounted for each instructor and class individually, determining the time spent on each grading session and time spent per paper, in both T and V courses.
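The paper does not publish its spreadsheet formulas, but the per-paper workload figure described above can be sketched as a short computation. The session entries and field layout below are fabricated for illustration; only the arithmetic (session minutes from start/end times, divided across papers graded) reflects the method described:

```python
from datetime import datetime

# Fabricated example log rows: (start, end, papers graded in session)
sessions = [
    ("2020-01-06 09:00", "2020-01-06 10:30", 6),
    ("2020-01-07 13:15", "2020-01-07 14:00", 3),
]

fmt = "%Y-%m-%d %H:%M"
total_minutes = 0.0
total_papers = 0
for start, end, papers in sessions:
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    total_minutes += delta.total_seconds() / 60  # session length in minutes
    total_papers += papers

minutes_per_paper = total_minutes / total_papers
print(f"{total_minutes:.0f} min across {total_papers} papers "
      f"({minutes_per_paper:.1f} min/paper)")
```

Repeating this per instructor and per course type (T versus V) yields the comparison reported in the results.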

EOCS Results
Data regarding EOCS were drawn from summary reports for each section. At the end of each course, all students were asked to complete evaluations voluntarily and provide anonymous feedback about their experiences, including assessing instructors on various categories. These reports provide the number of students responding and the percentage of responses at each score for each EOCS question. Responses were scored on a five-point Likert scale from 0 to 4. Using the total number of responses for each survey and the percentage of responses at each level for each question, the responses for each question in each section were then reconstructed. Four of the sixteen questions were singled out for their focus on evaluating the quality of assignment feedback and recommending the instructor and the course:
• The instructor's feedback aligns with her/his communicated expectations.
• The instructor provides useful feedback for improving students' quality of work.
• I would recommend this instructor to another student.
• I would recommend this course to another student.
Data were then collected, and t-tests were conducted to determine if leaving video feedback impacted EOCS scores.
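The reconstruction-plus-test procedure above can be sketched in a few lines. The section summaries below are fabricated, and Welch's t statistic is computed by hand to keep the sketch dependency-free; the study does not specify which t-test variant was used:

```python
import math
from statistics import mean, variance

def reconstruct_responses(n_respondents, pct_by_level):
    """Rebuild individual 0-4 Likert scores from a summary report's
    respondent count and per-level percentages."""
    responses = []
    for level, pct in pct_by_level.items():
        responses.extend([level] * round(n_respondents * pct / 100))
    return responses

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    return (mean(a) - mean(b)) / math.sqrt(
        variance(a) / len(a) + variance(b) / len(b))

# Fabricated summaries for one EOCS question in a V and a T section:
video_section = reconstruct_responses(20, {4: 60, 3: 25, 2: 10, 1: 5, 0: 0})
text_section = reconstruct_responses(18, {4: 50, 3: 30, 2: 10, 1: 5, 0: 5})

print(f"t = {welch_t(video_section, text_section):.2f}")
```

Running this per question and per section pair gives the score comparisons reported in the results.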

Qualitative Survey Results: Short-Answer Questionnaire
All participating instructors were asked to complete an open-ended survey following project completion. Eight out of nine instructors completed a survey. Results are presented below.

Q1: What did you think of the video feedback element?
For general impressions, six out of eight instructors were positive about using video feedback. They described videos as a "wonderful addition," "interesting," "real," and "healthy." Some commented on nonverbal elements, mentioning an important "visual" dimension that allowed them to create a "warm and friendly" atmosphere and explain things "in person." There were two negative comments, but only one participant (not a study researcher) demonstrated complete negativity. In a mixed review, an instructor mentioned the connection was one-sided and resulted in feeling "personally insulted" when the student did not follow advice. That instructor also felt disappointed when student work did not appear to improve. The instructor who disliked the experience expressed dissatisfaction with video feedback due to time and technology challenges.

Q2: What did you think of the time allotment?
In terms of reflecting on time, there was at least one negative comment in six out of the eight replies. Most mentioned that providing video feedback took too much time, with one instructor responding that it is still worthwhile because it is "better for the student," and another noting that an adjustment to the grading plan for the week resolved the time issue. One instructor was concerned that a five-minute video could equal a half-hour upload time, and another pointed out that it took away from time s/he could have spent with the student in other areas. Another participant stated that making videos "requires faculty to make notes of student strengths and weaknesses in the assignment in order to provide constructive feedback the student can use to understand strengths and make improvements to future work." This demonstrates there is often preparatory time needed to make videos, and instructors expressed some frustration with feeling their work was not valued or acted upon even after the extra time invested.

Q3: Did leaving video feedback change the way you approached feedback? If yes, explain.
Half of the respondents indicated a positive change, two were negative, and two were mixed. For those who felt a change in approach, responses ranged from a felt obligation to be positive to improvements in grading practices. One participant expressed feeling "obligated to have a smile on my face, even with potentially negative news…[to] soften the blow of negative feedback." Another valued being able to explain points in more detail than writing them out. Some mentioned paying more attention, having a "vested interest" in student success, and seeing an improvement in their T feedback.
Other participants did not change their approach to feedback. They stated that their practice is to engage with students throughout all other areas of the course and to be straightforward in both T and V feedback. One participant explained that they used the "sandwich style" of T feedback (offering praise, constructive criticism, and positive suggestions, per institutional guidelines) and then applied a similar approach to videos, thus employing the same standard of practice via a different medium. In all, no-change participants felt video feedback was not different, merely another way of engaging with students.
Among negative experiences, a common frustration was that students did not act on suggested improvements in the V courses, calling into question whether instructors' video feedback was impactful. Two instructors said there was no evidence students were viewing or reacting to the video feedback. One said, "I was spending more time creating substantial feedback, but there wasn't evidence it was being utilized or valued." Here, two instructors again had a "negative" experience with social presence, as they did not see returns on the invested time.

Q4: Did using video feedback change your tone? If yes, explain.
There were mixed reviews, with five of the eight instructors stating it did not change their delivery. One instructor worried about sounding "worn out" or "bored" by the end of a video session. This instructor indicated feeling bad about delivering negative news but being unable to offer a fully sympathetic connection because the exchange was one-sided. Adding to this, the instructor felt "emotional distress when I had to tell someone they receive a failing grade." Others said it was easier to be supportive via video through smiling, encouraging, or empowering students, where "a smiling face and kind tone are better received than written feedback, especially if the grade is low." One instructor mentioned that they were able to fully demonstrate personal concern and investment.

Q5: Did using video feedback make you focus on advice for improvement?
Here instructors were more positive, with five of the eight stating that it allowed them to focus on how the students could improve. One instructor said they created "mini lessons" to help students overcome their struggles, that they could offer advice rather than just focus on what was wrong, and that they looked more closely at each assignment.
Two of the instructors were more negative in their responses, saying videos did not make a difference in their advice, as they felt it would either be taken or not regardless of delivery style. One instructor felt comments were repetitive in both video feedback weeks. The instructor said that for both T and V feedback, students did not follow the advice, reiterating the feeling that students did not appreciate the effort and that there was no return on the time invested in leaving the videos.

Q6: Did you feel more connected with students in your classes where you left video feedback?
When asked about their feelings of being "connected" to students, three of the eight reported a change, four reported none, and one reported mixed feelings. Of the four instructors who indicated no connection, one said that any connection felt was short-lived, lasting only the first few minutes of leaving feedback. Others were frustrated, with one saying they felt unheard by and disconnected from students. Another respondent shared that no one responded to the videos or commented that they helped, so the instructor felt disappointed that students were not engaged. These instructors felt negative about the experience because of a false sense of "connection." Those who responded positively and indicated feeling connected to their students reported receiving more texts and emails when they left video feedback and felt the students appeared more connected with the instructor.

Q7: Did you feel you were more supportive in any way when using video?
Instructors were more united in stating that the videos allowed them to be more supportive, with only one of the eight disagreeing. Some said they smiled more to be positive and encouraging, that they felt advocacy for the student, that it was "much more personal," and that they could be "up beat [sic] and energetic." However, this same instructor said, "This was the only way I felt I was more supportive since written feedback lacks nonverbals." Commenting more on the dynamic of leaving V feedback, one instructor said, "It was like I was talking with them, not typing to them," which lends support to emotional connectedness. Another said, "I had to sometimes force a smile and I think that smile actually forced my brain to be more positive and encouraging. I wanted them to see that they had an advocate who would work for them." So, for some, leaving the videos led to a change in how they felt about the experience or their tone. Reiterating this point, an instructor said, "I think what the video did was provide me an opportunity for them to see me and hear me and know that it is not just words, that I am really invested in them as students." Here, written (T) communication is seen as less powerful than spoken.
To sum up, these instructors felt the medium altered the experience, changing the tone of the message and potentially its reception.

Q8: What do you see as the pros and cons?
Responses split between those who enjoyed the experience and those who did not. One respondent remarked, "for me, it is all cons," and another said, "I do not see any cons." Four replies listed more pros than cons, three listed more cons than pros, and one was an even split.
For the positives, instructors mentioned feeling more connected, real, and positive, and that they offered actionable advice. One indicated full appreciation for the practice and expressed a desire for video feedback to become a standard part of their future grading process. Another said the videos "forced me to be more positive, with a smile on my face. That smile did translate into being more positive and offering more forward-looking advice." The most persistent complaint concerned the workload increase, with six mentioning time as a challenge. One said the con was "the time it took and the lack of student satisfaction." Others described video feedback as physically and emotionally taxing, redundant, a waste of time, and unlikely to lead to change. Another said, "The time angle can be a problem as it takes longer to video them. I have to video it on my computer then upload it." Two instructors raised doubts about whether students were watching the videos. One said: "I am not sure students were watching it, just as I am not sure they read the [standard] feedback. Student [sic] who do well do not need video feedback, and students who do poorly and NEED it do not read it."
Similarly frustrated, another said: "… it felt like my time in class was wasted giving redundant feedback no one listened to. I became offended about it after doing this a few times and tried less in the class because I felt like they didn't care."
Two other instructors mentioned facing persistent technological issues. Because of the added time and frustration with technology, one instructor was frank in expressing uncertainty that they would "want to teach under these conditions." While some instructors enjoyed the project, others had a bad experience making the videos and struggled to see any value in it. Calculations of the time spent leaving videos are offered next to better understand whether, and by how much, leaving videos adds to instructor workload.

Workload
One of the critiques offered in the literature review was that there was limited formal tracking of time spent grading in classes with and without video-based feedback. To address this, participants in this study tracked the time they spent providing feedback on two papers in each of four courses, for a total of eight grading sessions per instructor.
There was a great deal of variation in how much time instructors spent grading and giving feedback and a considerable difference in the number of students who submitted papers in each class. Therefore, an accounting of how much total time was spent and time spent per paper is offered to give a more realistic picture of how much time leaving videos adds to instructor workload. It is also important to note that two instructors faced significant technological issues. One instructor was not able to get the learning management system (LMS) video function to work, so they recorded each video and uploaded a link for students. Another instructor abandoned the plan of using the internal video system and offered students video feedback by sending an attached video file for one of the assignments. These challenges led to a significant increase in hours for both participants. Others used the video software embedded within Canvas, the LMS used by the school, without reported technical issues.

Week 1 Papers
As Table 1 indicates, it took significantly more time to provide students with V feedback than T feedback on Week 1 papers. Across all 18 V courses combined, leaving V feedback (which included preparation, creation, and upload time) took almost 37 more hours than T feedback. Divided across the 18 V courses, this added an average of over one hour to each instructor's grading session, or two hours of work for each course (though even these numbers varied significantly). Table 2 offers the average number of minutes instructors spent on each paper and the variation in time spent on the Week 1 papers. For T feedback, instructors spent between 3.42 and 14 minutes on each paper. For V feedback, they spent between 9 and 28 minutes. The average went from 8.78 to 14.34 minutes per paper (a 63% increase). Given that instructors at this institution see about 27 Week 1 papers in any given class, Table 2 indicates video feedback would add an average of 149.85 minutes (27 x 5.55) to each V grading session, or roughly 2.5 hours (again a 63% increase).
Instructors were asked to leave 3- to 4-minute videos. Many instructors left shorter videos (see Table 3). On average, in Week 1, instructors left videos that were 2.64 minutes long. Combined video lengths per class ranged from 19.25 to 94 minutes, with an average of 51.63 minutes of video left in Week 1.

Week 3 Papers
The Week 3 papers were longer (roughly five pages versus two pages in Week 1), and therefore the grading sessions were typically longer. In the T courses, it took 70.9 hours to leave feedback on papers, averaging 3.94 hours for each grading session. In the V courses, it took 95.35 hours, averaging 5.3 hours for each instructor. Videos therefore added 24.4 hours to total instructor time, an average of an additional 1.36 hours per instructor (a 35% increase). It should be noted that fewer papers are submitted in Week 3, as students drop or stop participating. Because of this, it is essential to consider the time spent per paper. On average, it took 11.84 minutes to grade and give feedback per paper in the T classes and 15.98 minutes in the V classes (see Table 5). Videos thus added 4.14 minutes per paper, again a 35% increase. As the table demonstrates, instructor grading time varied significantly.
Roughly 23 students end up submitting the Week 3 paper. Under that assumption, video feedback would add about 95 minutes (23 x 4.14) to the instructor workload that week, or 1.58 hours, again a 35% increase (see Table 5).
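As a check on the arithmetic above, the per-paper averages from Tables 2 and 5 can be plugged into a short script. This is only an illustrative sketch: the function name and structure are the author's own, the submission counts of 27 and 23 are the averages stated in the text, and the Week 1 difference comes out to 5.56 minutes rather than the text's rounded 5.55 because the inputs are themselves rounded averages.

```python
# Reproduce the per-paper and per-session workload arithmetic reported above.
# All input figures are the rounded averages stated in the text (Tables 2 and 5).

def added_workload(t_minutes, v_minutes, papers):
    """Return the extra minutes per paper, the percent increase over T
    feedback, and the projected extra minutes per grading session for a
    given number of submitted papers."""
    extra = v_minutes - t_minutes
    pct = extra / t_minutes * 100
    return extra, pct, extra * papers

# Week 1: 8.78 min/paper (T) vs. 14.34 min/paper (V), ~27 submissions
extra1, pct1, session1 = added_workload(8.78, 14.34, 27)
print(f"Week 1: +{extra1:.2f} min/paper ({pct1:.0f}% increase), "
      f"+{session1:.1f} min (~{session1 / 60:.1f} h) per session")

# Week 3: 11.84 min/paper (T) vs. 15.98 min/paper (V), ~23 submissions
extra3, pct3, session3 = added_workload(11.84, 15.98, 23)
print(f"Week 3: +{extra3:.2f} min/paper ({pct3:.0f}% increase), "
      f"+{session3:.1f} min (~{session3 / 60:.1f} h) per session")
```

Running the sketch reproduces the 63% and 35% increases and the roughly 2.5-hour and 1.6-hour additions per grading session reported above.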

EOCS Results
There were 329 student responses to the surveys across both course types (V = 174 students, T = 154 students), yielding an n of 328 for each test in this section. Independent samples t-tests comparing answers in V and T courses found no significant difference at the .05 level for any of the questions (see Table 7). The use of video feedback does not appear to change student perceptions of the instructor or the course.
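For readers unfamiliar with the procedure, an independent samples t-test of the kind used here can be sketched as follows. The ratings and variable names below are hypothetical stand-ins for Likert-style survey responses; they are not the study's data.

```python
# Sketch of the comparison procedure: an independent samples t-test
# between ratings from video-feedback (V) and text-only (T) courses.
# The ratings are hypothetical 5-point Likert responses.
from scipy import stats

v_ratings = [5, 4, 4, 5, 3, 4, 5, 4, 4, 3]  # V courses (hypothetical)
t_ratings = [4, 4, 5, 3, 4, 4, 5, 4, 3, 4]  # T courses (hypothetical)

t_stat, p_value = stats.ttest_ind(v_ratings, t_ratings)

# A p-value above .05 means the group difference is not significant
# at the .05 level, which is the pattern the study reports.
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```

With group means this close, the test returns a p-value well above .05, mirroring the null result found on all four EOCS questions.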

Discussion
Overall, this study found that leaving videos led to mixed reactions from instructors (similar to results from Wade, 2016), increased workload (in line with findings from Borup et al., 2014, and McCarthy, 2015), and did not significantly impact EOCS scores. Within the qualitative data, there were mixed feelings about the experience, with some instructors demonstrating excitement and others disappointment. About half of the instructors enjoyed leaving videos and found value in the idea, while the other half felt the videos were frustrating or meaningless. When listing the pros and cons, which reflect the instructors' overall attitudes about the experience, four listed more pros than cons, three listed more cons, and one was even. Those who viewed the experience as "warm" or "real" may have felt a personal connection with their students through video feedback, extending the findings of McCarthy (2015), Thompson and Lee (2012), and West and Turner (2015), who found similar results. Those who were disappointed felt a lack of social connection to their students, or a negative social presence, as their efforts at social connection felt lost. This extends the concerns raised by Silva (2012) and Wei and Yanmei (2018), who reported being frustrated that students did not act upon the feedback given to them. One of the assumptions of social presence is the idea of interaction and mutual connectedness (Biocca et al., 2003; Hwang & Park, 2007). In this study, some instructors felt otherwise, describing a one-way transaction instead of "interaction." Supporting the work of Lamey (2015), instructors noted that the videos were "one-sided," that instructors could not see students' reactions, and that they questioned whether the students were even watching them. Some instructors also acknowledged that their recommendations were not "utilized or valued," and described providing feedback on failing papers as "distressing."
On the positive side, similar to Henderson and Phillips (2015), instructors acknowledged that video feedback provided a space to foster students' improvement. Instructors expressed that they could provide verbal mini-lessons or offer specific direction with more ease than T feedback, and through this, felt more connected and invested.
Providing video feedback could also be overwhelming and tiring for the instructor. Instructors who feel frustrated by the video-making and posting process, or discouraged by a lack of student reaction, may have negative experiences with attempts at social presence. Dissatisfaction with providing videos could translate into an unfavorable experience with presence, which could be subtly conveyed through the instructor's tone (Hughes et al., 2007). It is therefore relevant to recognize that this video-based social presence might not always be positive. Instructors expressed concerns that their recorded verbal messages might come off as harsh or too flippant, and that nonverbal signals can produce unintended impressions, creating barriers or distance that might not exist in T feedback. Finally, concerns that students might not understand the feedback or find it "useful" align with findings from Wade (2016).
In line with Borup et al. (2014) and McCarthy (2015), the quantitative data indicated that leaving videos requires considerably more time, adding to instructor workload and creating considerable variation in the time spent providing feedback. This study found an increase of one to two hours of work for instructors per week. Technology concerns (also raised in Borup et al., 2014; McCarthy, 2015; and Wade, 2016) further complicated the process and created additional frustration and workload issues. For example, being unable to get the LMS video system to work in a reasonable amount of time led one instructor to work beyond the 12-hour contract and to find alternative delivery methods. It took an average of 10.26 minutes per paper in the T courses and 15.15 minutes in the V courses, even though the videos averaged only 2.59 minutes in length. When considering a requirement to provide video feedback, it might be necessary to assess the time grading typically takes. Within a 12-hour weekly pay period, video feedback required up to 2 hours per week of additional grading. For those who struggled to provide timely feedback, or who had issues with technology, the added time commitments were significant. Therefore, leaving videos consumes valuable work time and could lead instructors to cut time in other important areas (e.g., preparing content or participating in discussions) to compensate.
Finally, there was no significant relationship between leaving videos and EOCS results, which may extend studies that found no clear link between leaving videos and improved learning outcomes (see Mathisen, 2012; Mayhew, 2016) and studies on the fickleness of student evaluators (see Bassett et al., 2017; Hornstein, 2017). Within the quantitative EOCS results, no connection was found between leaving videos and improved scores on the four survey questions analyzed. This lack of improvement may also have demoralized instructors. After putting many more hours into feedback, failing to see a significant impact could hurt one's morale and thus cause "negative" feelings about any attempts at social presence. Experiencing increased workload without seeing concrete results in instructor evaluation scores appears to have been distressing as well.

Limitations and Future Directions
One limitation of this exploratory study is the relatively small number of courses and instructors evaluated. A total of nine instructors monitored 36 courses, a small percentage of the number of sections taught. Future research should include a larger number of participating instructors, allowing for expanded data collection.
Instructor teaching style may also have created biased opinions about video feedback. Beginning with an already small pool of potential participants, it is possible that instructors who volunteered held predetermined biases about the study, constraining the results and limiting their generalizability. Future studies should employ random sampling of volunteers and assess a larger number of courses to allow for better representation in the results.
Feedback was a significant element in this study, yet its evaluation was confined to its presence or absence, limiting the results. Researchers focused exclusively on comparing courses that received V feedback with those that received only T feedback. Since there was no intent to assess variations in verbal and nonverbal cues, students' reactions to the feedback, or the quality of that feedback, it was not possible to examine the exact impact V feedback had on students' perceptions of effectiveness. Future research should conduct in-depth interviews and a more systematic data analysis.
Finally, this study did not assess student access to video feedback. Participating instructors raised questions about whether students were viewing the videos, so it was not possible to establish whether student performance was affected. Understanding how and when students utilize feedback may give insight into how to create useful content that promotes connectedness and learning.

Conclusion
This study examined instructors' personal reactions to leaving V and T feedback, their workload, and the effect on student evaluations in V and T courses. Individual instructor experiences varied. Some experienced positive social presence and embraced the value of video feedback in their courses. Others expressed negative social presence and frustration that their messages were ignored, seeing the extra work as meaningless. Results showed a slight to substantial increase in the work time devoted to providing video feedback, which may lead to negative teaching experiences. This could be especially frustrating for adjuncts, who are paid for a set number of hours. Contingent workers are already stretched, with 89% reporting working for at least two universities and 27% for at least three (Douglas-Gabriel, 2019, para. 9). Furthermore, EOCS evaluations showed no consistent improvements. The combination of little change to instructor scores and the additional workload makes it difficult to justify recommending video feedback; however, instructors who enjoy using V feedback should not be dissuaded from adding it to their pedagogy. If the goal is improved student learning through V feedback, though, instructors need to see some validation of their work in order to motivate both institutions and instructors to adopt this practice.