Moving Assessment Online: Experiences within a School of Pharmacy

The COVID-19 pandemic required academic institutions to quickly transition to online learning and adjust assessment procedures. This study examines how a school of pharmacy creatively approached the challenge of online assessment while maintaining the standards necessary to prepare practice-ready student pharmacists. To conduct traditional exams, instructors deployed two methods using testing software: a video conferencing approach that mimicked pre-pandemic, on-campus proctored exams, or open-book, internet access-enabled exams that maintained academic integrity and rigor through various testing strategies. To assess students' clinical skills, faculty used a combination of techniques such as physical examinations, patient interviews, and patient presentations. To understand the student experience with these assessments, students completed a 12-item questionnaire. Overall, online video proctoring maintained consistency in exam structure and administration but required extensive instruction for both students and proctors. Students preferred unproctored, open-book, internet access-enabled, standard time exams over proctored, closed-book, internet access-disabled, extended time exams. Changes to testing procedures, whether proctored or unproctored, appeared to increase student stress.


Moving Assessment Online: Experiences within a School of Pharmacy
When the COVID-19 pandemic required all courses to move online, faculty around the world needed to create alternatives to existing assessment methods while maintaining the same academic rigor and exam integrity. In pharmacy fields, the emphasis was on ensuring that all students could continue to progress through the curriculum while abiding by the Accreditation Council for Pharmacy Education (ACPE) standards (Accreditation Council for Pharmacy Education, 2015). Flexibility and innovation were emphasized by ACPE to maintain the standards that produce a practice-ready pharmacist (Accreditation Council for Pharmacy Education, n.d.; Engle, 2020; J. Engle, personal communication, March 17, 2020). This manuscript focuses on the adjustments made to assessments at an individual pharmacy school during the pandemic, and on students' responses and preferences concerning these changes.
This study, with original analysis of student preferences regarding assessments, took place within an ACPE-accredited school of pharmacy located at two campuses in Virginia. Prior to the COVID-19 pandemic, courses were conducted as live, in-person classes using video teleconferencing (VTC) technology to connect the two campuses. The curriculum includes didactic-based courses as well as clinical courses that focus more strongly on application and interaction. Written assessments were timed and closed-book, and faculty proctored examinations consisting of multiple choice, matching, and essay questions administered using ExamSoft (www.examsoft.com), a secure assessment software platform. ExamSoft was also used to administer some components of Objective Structured Clinical Examinations (OSCEs). In addition to these components, during an OSCE, students demonstrated clinical skills through a live, interactive scenario with a simulated patient that may have included performing an interview, vitals assessments, physical examinations, or writing a clinical note.

Transitioning Assessment to the Remote Context
In didactic courses, instructors decided whether to implement proctored or unproctored exams. In clinical courses, instructors used simulated experiences to support online assessment. Below, we provide illustrative examples of these three approaches.

Didactic-based Courses: Proctored Assessments.
To provide a consistent method for proctoring students electronically, the school of pharmacy created a semi-standardized process using Zoom, an online video conferencing platform, to create a proctored exam environment. Students were informed of the process, and time was dedicated to practice so that students gained familiarity with the expectations and could confirm that Zoom was working properly. Students connected to a Zoom meeting, hosted by a faculty or staff proctor, on a secondary device. Ten proctors were recruited so that each Zoom meeting averaged 8 to 10 students. In each meeting, students were instructed to show proctors a 360° view of their workspace. Students could contact their proctor using the chat function if they had questions during the exam.
In one class, all students successfully uploaded their first exam. The course coordinator then held separate debrief meetings with the proctors and students to identify issues that could be remedied for subsequent proctored assessments. Students cited concerns about the stability and reliability of their internet connections and feared they would be penalized for weak connectivity. Background noises and distractions from other students' testing environments (e.g., dogs barking, family members talking) were also concerns. For future proctored exams, students were instructed to mute their microphones and turn down the volume on their monitoring device. To respect privacy, students in the Zoom meetings were instructed to turn their heads while the proctors scanned individual testing areas. The revised process was shared with other faculty, and other courses then used this method of assessment, tailored to their own course-specific needs.
Once students had used the Zoom proctoring method, it became familiar and accepted for the rest of the semester, but a major disadvantage was that it was time- and labor-intensive. Proctors differed for each exam; therefore, pre-assessment meetings with clear instructions were necessary to ensure all proctors were familiar with the process.
Didactic-based Courses: Unproctored Assessments.
Faculty could conduct unproctored assessments, if they preferred, which gave students the ability to use their notes and the internet while taking the assessment. Since this allowed open use of resources, faculty members used other mechanisms, such as timing and differing question types, to evaluate the students' knowledge base and critical thinking skills. Time limits, both for the assessment as a whole and per question, were set to prevent students from searching through their resources for each answer. Generally, each question was allotted two minutes, keeping in mind that knowledge-based questions may take less time than application-based questions. Varied question types and unique questioning strategies were also used to maintain exam integrity. All questions and answer choices were randomized. Some faculty disabled the backward navigation feature in ExamSoft to prevent students from working together or from allotting time to search for answers.

Application and Interaction-based Courses.
For clinical courses, faculty adopted creative methods to simulate experiences for online assessment. Below we provide three examples from different courses.
The first-year Physical Assessment final OSCE traditionally required the student to perform an interview, vitals assessments, and physical examinations on a simulated patient within 20 minutes. In the remote context, however, students no longer had access to the assessment tools needed to perform many of the physical exams (e.g., the monofilament used in the diabetic foot examination) or to standardized simulation patients. For the online final OSCE, students were advised to recruit an adult within their residence to serve as their patient. Students signed into a Zoom meeting at a designated time, during which they were assigned the physical examinations to demonstrate for the OSCE. Students then had 40 minutes to set up their workspace, record in a single take, and upload the video to the learning management software. This window accounted for the time needed to conduct the three physical assessments and for internet or technology issues that might occur when uploading the file.
In the second-year Patient Centered Care (PCC) course, the OSCE was redesigned to be administered exclusively through ExamSoft and used the standard Zoom proctoring format as previously described. Students were given a patient case and journal article one week prior to the OSCE. Immediately prior to the assessment, students were given a patient interview script and lab report through the learning management software. Next, each student started the ExamSoft assessment, which blocked internet access but not notes already on their computer. The exam required each student to write a clinical note and answer seven multiple choice or short answer questions pertaining to the article or the patient. Changing the questions from verbal (as was the case pre-pandemic) to written required creative restructuring and time extensions. To reduce student anxiety with the process, a practice OSCE occurred one week prior, during class time, and followed the same operating procedure.
The third-year PCC course simulated a pharmacist-provider situation, background, assessment, and recommendation (SBAR) presentation followed by a written final exam. To maintain a simulated environment, students moved through three sequential Zoom-proctored meetings, using their laptops to connect to Zoom and their tablets for additional tasks. In the first meeting, students received patient case information and were given time to formulate an SBAR. Students then proceeded to a second Zoom meeting, where they presented the SBAR to the proctor and answered patient-case questions. Each Zoom meeting was recorded so that graders could rewatch the videos as needed. The final Zoom meeting was for the proctored, written final exam administered through ExamSoft, which all students took at the same time. One drawback of this approach was that because students started the first part of the exam at different times, a time gap existed for some between the second and final Zoom rooms. This gap created the potential for students who completed the SBAR portion early to share that information with other students, although this was explicitly discouraged. Additionally, internet connection and technology concerns delayed some students, creating a stressful environment as these delays encroached on the scheduled time of the final written exam. Building in more time between the SBAR portion of the exam and the final Zoom meeting could have alleviated some of the stress, for both coordinators and students, that resulted from unanticipated delays.

Understanding the Student Experience
Upon completion of the spring semester, all students enrolled in didactic courses received a survey regarding student preferences for the various online assessment methods used after the transition to online learning. Data collection was anonymous and all survey responses were collected in aggregate, based on a study protocol approved by our Institutional Review Board (IRB).
The survey was sent to N = 251 students who were in the first, second, or third year of the professional program: first year (n = 74), second year (n = 84), and third year (n = 93). Across the three classes, 54% (n = 135) responded to at least one question. Each class was roughly equally represented in the survey results (30-37%), with about half of each class responding (49-60%). The majority of students (69%) felt their assessment performance remained the same (46%) or improved (23%) following the move to the online assessment environment, while 31% felt their assessment scores declined.
The first set of questions asked students, "Based on your experience with assessments during the pandemic, please pick your preference for taking a written assessment off-campus." As Table 1 shows, most students preferred unproctored and open-book exams over proctored and closed-book exams. However, these preferences seemed to shift by program year: among first-year students, almost all preferred unproctored (86%) or open-book exams (82%), while among third-year students, only about half preferred each of these methods. In addition, most students (75%) preferred assessments using non-adjusted, standard time with predominantly multiple-choice questions over assessments with an extended timeframe composed mostly of short answer or essay questions. This preference did not seem to shift systematically by program year. In open-ended comments regarding proctored exams, several students reported that stress was heightened by the increased management of items not related to exam content. For example, students were expected to manage increased communication, such as calendar notifications with Zoom links and emails from course coordinators and Zoom proctors, as well as Zoom software and internet connectivity on the day of each assessment. For unproctored exams, since coordinators used different testing strategies, students felt they had to adjust how they studied and prepared for their exams. Both types of changes to assessment strategies appeared to increase student stress.
In addition to the first set of questions shown in Table 1, a second set asked students the extent to which they agreed with the three statements shown in Figure 1. Results for these questions did not seem to vary by program year. A near-majority of students (47%) agreed or strongly agreed that "A proctored exam is necessary to maintain academic integrity." However, the majority of students disagreed that "the removal of backward navigation is necessary to maintain academic integrity." Open-ended comments revealed that students used backward navigation as a test-taking strategy, and its removal created anxiety related to the assessment.

Figure 1.
Academic Integrity Survey Results. Statements rated: "A proctored examination is necessary to maintain academic integrity." "The removal of backward navigation is necessary to maintain academic integrity." "Compared to the first OSCE of the semester (pre-pandemic), the simulated experience for the online OSCE assessment was appropriate."
Finally, a majority of students (69%) agreed or strongly agreed that the simulated experience for the online OSCE was appropriate. In open-ended comments, first-year students reported the final OSCE was more comfortable and less stressful than the midterm. For the second-year class, the response was positive, with students grateful for the practice case and the consistency between the OSCEs. Third-year respondents commented that instructions were appropriate; however, they remarked on the complexity of the instructions and on issues related to exam timing. Each class expressed similar concerns regarding technical difficulties, for example, that losing their Zoom connection during the exam might flag them as cheating or decrease the time they had to complete the exam. Some students requested a better method to ask questions during the exam and more time to complete it.

Discussion
Assessment best practices recommend efforts aimed at preventing dishonesty and emphasize that consistency creates a culture of academic integrity (Ray et al., 2018). In this study, the Zoom proctoring method was used as a temporary solution since commercial proctoring software was not approved for purchase. Overall, we found that while Zoom proctoring created a similar testing environment between in-person and online testing, it also posed a variety of challenges. First, successful implementation required coordinators to create a consistent approach, train proctors, and familiarize students with the process. The time and personnel needed may hinder its sustainability. Second, the approach posed technology issues for both students and faculty. For students, Zoom monitoring required managing both Zoom and ExamSoft during the exam, which could be confusing. Students also worried that losing their Zoom connection during the exam might flag them as cheating or reduce their exam time. For faculty and staff who proctored assessments, internet stability was also a concern, as some were disconnected in the middle of proctoring an exam. Exacerbating these issues, virtual learning removed the on-campus technology support that is often used during exams to troubleshoot examination or technology problems. To combat potential technology issues, the college employed a variety of approaches. Having a practice exam or OSCE improved student confidence in managing Zoom and ExamSoft. During the exam, students were instructed not to attempt to fix their internet if they lost connection while being proctored in Zoom. Since completed ExamSoft exams cannot be altered between student submission and file upload, upload deadlines were extended with no impact on academic integrity. When faculty or staff were disconnected from a Zoom-proctoring meeting, they were expected to reconnect and alert the coordinators, who were available to help by stepping into the Zoom meeting. In addition to technology challenges, some students felt Zoom proctoring was an invasion of privacy and feared peers would look at their workspace despite being asked to turn their heads during workspace scans. These students could request to have their space checked before their peers entered the Zoom meeting.
In contrast to proctored exams, unproctored exams relieved the stress of internet access and software management but introduced an additional concern related to testing strategies. During the pandemic, certain exam features (e.g., timed questions, shortened exam times, removal of backward navigation, and increased question complexity) were used for unproctored exams. Students felt these features required them to adjust both their learning and testing strategies. In response, students were reminded that the North American Pharmacist Licensure Examination (NAPLEX) does not allow backward navigation (Carter, 2020). Further, the time allotted for each question and for the exam as a whole was closely evaluated and adjusted, and question stems were shortened to decrease excessive reading burden.
Overall, students preferred unproctored, open-book, internet access-enabled and standard time exams, and they felt the simulated experiences for the online OSCEs were appropriate. However, any type of mid-semester or mid-curriculum adjustments to assessment appeared to increase student stress.
While this study provides a discussion of assessment strategies during the COVID-19 pandemic at one school of pharmacy, it has limitations. Primarily, this study does not assess the impact of these changes on exam scores and overall course grades. Since some exams were modified in format from the previous year, a direct comparison of student performance was not considered appropriate. The survey and information included in this study capture only preferences and perceptions of the proctoring and assessment modifications rather than more substantive outcome data.