Realistic Job Preview as an Alternative Tool to Improve Student Readiness for Online Learning

Readiness for online learning has been established as a key component of student success in online classes. In addition, the COVID-19 pandemic has underscored how vital being prepared for online learning can be. This paper highlights an orientation technique widely used in the business field, namely the Realistic Job Preview (RJP), as a method to prepare students for what online learning might be like. We propose that an RJP would help students adapt to their new role as online learners. For the purposes of this study, we developed a video providing a realistic preview of online learning, following recommendations from RJP research. We then conducted a mixed-methods study to examine student perceptions of our realistic preview video and an online readiness self-assessment. Overall, our findings provide strong evidence for the use of RJP as a strategy to improve student readiness for online learning.

Research shows that students experiencing online learning for the first time go through a significant role adjustment (Amemado, 2013; Garrison, Cleveland-Innes, & Fung, 2004). As Cleveland-Innes, Garrison, and Kinsel (2009) put it, "Differences in the required activities of online learning, in comparison to classroom based face-to-face, result in new, required expectations and behaviors for learners. These new activities cluster into a pattern that is seen as the 'role' of online learner" (p. 5).
Since its introduction to the business literature by Weitz (1956), six decades of research have established the RJP as an effective tool for producing positive outcomes by providing an accurate description of both the favorable and unfavorable aspects of an employee's new role (Breaugh, 2009; Rynes, 1991; Wanous, 1992). Positioning online learning as role acquisition, we propose the RJP as a valuable tool for preparing students for online learning environments.
The structure of this paper is as follows: First, we discuss online readiness, RJP, and related research. Next, we describe our creative application of the RJP technique to orient students to online learning. Finally, we present findings from a mixed-methods study that explored student reactions to an online readiness self-assessment and a video that provided a realistic preview of online learning.

Online Learning Readiness
To our knowledge, the first mention of online learning readiness appeared in Warner, Christie, and Choy (1998). After providing evidence of poor levels of readiness among learners for flexible delivery, including online delivery, the article made the following recommendation: "There is a need for pre-course student screening to ensure that they have the skills necessary to succeed in their chosen courses. This is especially so in courses which are delivered flexibly. Screening should address both the skill and dispositional abilities of students" (p. 11).
As stated earlier, self-assessments remain the most common tool for determining student readiness for online learning. However, some caution regarding their use is warranted. A systematic review of these readiness tools by Farid (2014) revealed that most suffer from poor psychometric qualities. Additionally, this research noted that while a few assessments published in the literature have good psychometric qualities, universities rarely use these, preferring instead to create their own in-house instruments. Finally, Hall (2008) cautioned that the difficulties students have in recognizing their skills, expertise, and character often lead to inflated self-assessments, thereby rendering these questionnaires unreliable.
In sum, these concerns highlight a need to explore alternative approaches to enhancing student readiness for online learning.

Realistic Job Preview
Over the past several decades, the RJP has played a significant role in research and practice (Landis, Earnes, & Allen, 2013). During this time, it has been used extensively by businesses as part of the recruiting and/or onboarding process for new employees. Baur et al. (2014) describe the RJP as a technique that presents applicants with a realistic view of what they can expect from a job in a particular organization. Care is taken to show both positive and negative information without distortion. Breaugh and Billings (1988) identify five key elements of an RJP. They state that it:
1. is an accurate portrayal of both positive and negative aspects of the job;
2. deals with jobs specifically, rather than with a broad-brush overview;
3. describes the various aspects of the job, rather than focusing on a few elements;
4. is received from a credible source, such as incumbents or someone who has performed or supervised the job; and
5. covers information that is important for the applicant to know before deciding to accept the job offer.
The objective of the RJP is to create realistic expectations in the recruit or new employee about the organization and the job in question. More realistic expectations, in turn, have been shown to have a positive impact on performance, work attitudes, and employee retention (Baur et al., 2014; Phillips, 1998). A basic RJP model can be seen in Figure 1.

Figure 1
Model of the Impact of a Realistic Job Preview (RJP)

Parallels to Online Learning
Given the analogous relationship between a recruit to a new role in a business and a student to a new role in a learning environment, it is anticipated that the dynamics and results of the RJP model would be similar in the two settings. Several theories have been used to understand the effectiveness of the RJP approach: met expectations, self-selection, and ability to cope (Breaugh, 2009). Below, we discuss these theoretical foundations and draw parallels to online learning.
One explanation for why RJPs may increase retention and performance is the met expectations hypothesis, which proposes that an RJP provides a complete picture of what to expect on the job (i.e., both positive and negative aspects), leading to higher satisfaction among new employees and thus higher retention. This is particularly important, as job recruits typically have inflated expectations (Billsberry, 2007). In the context of online learning, the met expectations hypothesis is especially relevant. Students' mismatched expectations of online learning have been well established in the literature (Dobbs et al., 2009; Mortagy & Boghikian-Whitby, 2010; Wyatt, 2003). Prior research has identified misconceptions about course difficulty and workload, as well as a lack of experience and preparedness for online learning, as contributors to attrition among online students (Bocchi, Eastman, & Swift, 2004; Rovai & Downey, 2010). These inflated expectations are then likely to result in disappointment. Having a more accurate picture of online learning would reduce the likelihood of exaggerated expectations, leading to higher satisfaction and retention. This is further supported by research showing that student expectations are a key factor in retention (Baxter, 2012).
A second explanation for the effectiveness of the RJP approach is known as the self-selection hypothesis. This theory suggests that employees who do not perceive themselves as being a good fit in terms of their needs and/or abilities would likely withdraw from being considered for a position (Breaugh, 2009). Research examining person-job fit has shown that applicants who perceive congruence between the job and their knowledge, skills, and abilities (KSAs) are more likely to remain in the selection process and accept the new role (Carless, 2005). This exercise of evaluating one's KSAs has also been discussed with regard to online learning. Abdous (2019) noted that students taking the time to gain self-awareness of their needs and abilities increases their commitment and engagement with the online course. Accordingly, we posit that online students who receive a realistic preview of online learning would gain a better understanding of whether online learning would be a good fit for them.
A third explanation, ability to cope, focuses on how employees cope with job demands. Specifically, this theory suggests that an RJP reduces job dissatisfaction and turnover by making applicants aware of likely problems before joining. Then, when these problems do arise, new employees are not caught off-guard and thus better able to cope with them. In the online learning context, Abdous (2019) noted that "students' unpreparedness to take courses online hampers their ability to cope with the demands of their new learning environment" (p. 162). One particular aspect of this theory is that new hires, having been made aware of potential challenges, would "rehearse methods of handling these problems" before they encounter them (Breaugh, 2009, p. 205). Similarly, having completed an RJP, online students would be able to more effectively prepare for challenges and cope with the demands of this new role.
Despite the parallels between preparing recruits for jobs and preparing students for their new roles in new learning environments, little research has examined RJP in an educational setting. We aim to contribute to this small body of research. The purpose of this study is to examine the use of the realistic job preview in an online learning context, highlighting student perceptions of this innovative approach, especially as it compares to today's most commonly used tool, the self-assessment. Specifically, our research questions are:
1. Is a realistic preview video an effective method to prepare students for online learning?
2. What are students' perceptions of the realistic preview video versus the online readiness self-assessment?
3. How do the realistic preview video and the online readiness self-assessment differ in their contribution to preparing students for online learning?

Participants
Our study's sample consisted of undergraduate business students at a university in the southern United States. These students were enrolled in at least one online course at the time of the study. After the semester started, the researchers contacted business school faculty teaching online courses and asked that the opportunity to participate in the study be provided to their students. Students were self-selected, voluntary participants. After accounting for incomplete and duplicate responses, the final sample included 146 students. The average age of participants was 30.6 years, with ages ranging from 19 to 59. Participants had taken a median of six current and previous online courses.

Data Collection
Data were collected via an electronic survey. The process involved two steps. In the first step, participants completed an online readiness self-assessment and watched the realistic preview video created by the research team. To control for order effects, the order of presentation of the online readiness self-assessment and the video varied randomly by participant. Immediately after the first step, participants responded to the electronic survey, answering questions that prompted them to compare the two methods: the realistic preview video and the online readiness self-assessment.
Online readiness self-assessment. For this study, participants completed an online readiness self-assessment called the Test of Online Learning Success (TOOLS, https://www.kirtland.edu/static/online-learning-self-assessment/). TOOLS is a self-report survey with 45 questions using a 5-point Likert scale. Items are grouped into five subscales that measure areas related to online readiness: computer skills, independent learning, dependent learning, need for online learning, and academic skills. Upon completion, students receive a personalized report with recommendations based on their scores.
Although several online readiness assessments exist, we selected TOOLS for this study since prior research has established its construct validity and reliability (Alem et al., 2014; Kerr, Rynearson, & Kerr, 2006). Additionally, while some universities use their own self-assessment surveys, they often note being inspired by the TOOLS survey (e.g., University of Arkansas, https://online.uark.edu/students/readiness-quiz.php).

Developing a realistic preview video.
A video providing a realistic preview of online learning was developed by our research team for the purposes of this study. An RJP can take one of four formats: written, face-to-face discussion, audio only, or audiovisual. Recognizing that today's learners are more effectively engaged via video-based learning (Global Research and Insights, 2018) and that video has become a predominant way for online instructors to introduce content online, we opted to develop our realistic preview in video format.
In creating the video, we followed the recommendations of Breaugh and Billings (1988) for the qualities of an effective RJP. This meant presenting both positive and negative aspects of online learning along with information important for students to know before enrolling in an online course. Prior research emphasized the importance of source credibility in realistic job previews (Buda, 2003;Popovich & Wanous, 1982). Therefore, instead of having online faculty, administrators, or staff discuss what to expect in online courses, we had students with past experience with online learning present their stories.
For the video, we recruited students (N = 12) who had successfully completed at least one online course and were willing to share their perspectives on what being a successful online student entails. Video participants were selected based on our belief that the student population at the university would be able to easily relate to them. Interviews with each student were recorded and later transcribed. We then conducted a simple content analysis of the transcriptions to identify emerging topics, repetitions, commonalities, and differences across participating students' accounts of their online learning experiences. Next, we organized the most salient themes into a coherent audiovisual story. With the help of a graphic designer, we enhanced the video with images and animations.
The realistic preview of online learning video that we created (4 minutes, 22 seconds long) can be viewed at this link: https://bit.ly/OLJrealisticpreview.

Electronic survey. Participants responded to an electronic survey immediately after completing the online readiness self-assessment and watching the realistic preview video. The survey measured how helpful each instrument might be in preparing someone for an online course. Table 1 lists all survey items.

Table 1
Survey Items

Q1. Which provides better information about what to expect when I take an online course?
Q2. Which provides clear expectations about the basic skills (e.g., time management, organization skills, study habits, etc.) necessary for my success in an online course?
Q3. Which does a better job of helping me identify areas in which I need to improve in order to position me for success in an online course?
Q4. Which does a better job of helping me to determine whether an online course is right for me?
Q5. Which is easier to use?
Q6. Which would improve my performance in an online course?
Q7. Which would improve the likelihood that I would not drop an online class?
Q8. Before taking your first online course would you prefer:
• Watching the video
• Completing the self-assessment
• Both watching the video and completing the self-assessment

Q1-Q7 were evaluated using a 7-point bipolar scale, with the extreme left anchor being Video, the extreme right anchor being Self-Assessment, and the center point being Neither. For analysis purposes, Video was coded as -3, Self-Assessment was coded as +3, and Neither was coded as 0. Q8 asked participants to provide their preference by selecting a choice from four options, and Q9-Q10 were open-ended questions.
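As a minimal illustration of this coding scheme, consider the sketch below. The scale-position labels and the helper function are our own hypothetical constructions, not artifacts from the study; only the numeric codes (-3 through +3) come from the survey design described above.

```python
# Map each position on the 7-point bipolar scale to its numeric code.
# The label names are hypothetical; the codes follow the survey design:
# extreme Video anchor = -3, Neither (midpoint) = 0, extreme Self-Assessment anchor = +3.
SCALE = {
    "video_3": -3, "video_2": -2, "video_1": -1,
    "neither": 0,
    "self_1": 1, "self_2": 2, "self_3": 3,
}

def code_response(choice: str) -> int:
    """Translate a labeled scale position into its numeric analysis code."""
    return SCALE[choice]

# A negative coded value indicates a preference for the video;
# a positive value indicates a preference for the self-assessment.
print(code_response("video_3"))  # -3
print(code_response("neither"))  # 0
```

Coding the anchors symmetrically around zero lets the sign of a mean or mode directly signal which instrument participants preferred, which is how Table 3 is later interpreted.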

Data Analysis
For Q1-Q7, data analysis consisted of calculations of descriptive statistics and intercorrelations. For Q1-Q8, frequency data for ratings were translated into percentages for ease of interpretation. The two open-ended questions (Q9-Q10) were analyzed using thematic analysis, employing an inductive approach in which the themes identified were driven by the content of the comments (Braun & Clarke, 2006). This approach helps prevent the analysis from being shaped by researcher bias stemming from familiarity with prior research. Following the steps outlined by Braun & Clarke (2006), two members of our team read through the entire data set before coding the responses. Next, they developed initial codes individually. In this process, each comment was coded. Later steps included comparing and contrasting codes, discussing differences, and reaching agreement on coded items. A third member of the team reviewed coding decisions for 5% of the coded comments. Any discrepancies were discussed and adjustments made as necessary. Finally, through analysis of the list of codes and their relationships, themes were developed. Themes were then reviewed to form a coherent pattern.
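The quantitative side of this analysis (descriptive statistics, Pearson intercorrelations, and the coefficients of determination reported later) follows standard formulas. The sketch below illustrates them on hypothetical coded responses; the sample values are purely illustrative and do not come from the study's data.

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists of codes."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical coded responses (-3..+3) for two survey items.
q1 = [-3, -2, 0, 1, -3, 2, -1, 0]
q3 = [-2, -1, 1, 2, -3, 3, 0, 1]

r = pearson_r(q1, q3)
r_squared = r ** 2  # coefficient of determination: shared variance between two items

print(f"mean={statistics.fmean(q1):.2f}, median={statistics.median(q1)}")
print(f"r={r:.2f}, r^2={r_squared:.2f}")
```

Squaring each significant correlation in this way is what yields the 0.03-0.26 range of shared variance discussed with Table 2.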

Quantitative Results
Evaluation of electronic survey. A correlation matrix was calculated for the quantitative variables (Q1-Q7) to determine the intercorrelation of participant responses to the survey questions (Table 2). As can be seen, there is a high degree of intercorrelation among questions Q1-Q7. This suggests these questions are measuring the same underlying construct, i.e., instrument effectiveness. At the same time, the coefficients of determination of the significant correlations range only from 0.03 to 0.26, suggesting individual items are measuring different aspects of the construct.
Questions 1-7. Table 3 provides descriptive statistics for each of the seven questions that were rated. For Q1, Q2, and Q5, participants indicated a preference for the video (mode = -3.00). For Q3, Q4, Q6, and Q7, participants indicated a preference for the self-assessment (mode = 3.00). Note. A negative value of the mean, median, and mode indicates a preference for the video. A positive value indicates a preference for the self-assessment.
Below, we present charts for each of the survey items 1-7 listed in Table 1, each measured on the bipolar scale discussed earlier. We also present a summary of these figures in Table 4. However, it is important to note that Table 4 combines +1 to +3 as preference for the self-assessment and -1 to -3 as preference for the video. Therefore, the summary table does not display the variance of the full 7-point bipolar scale as seen in each chart. See Figures 2-8 for a more detailed understanding of student preferences on each item.

Note. Due to rounding, percentages may not always appear to add up to 100%.
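The collapsing of the 7-point scale into Table 4's three categories, and the rounding that explains why percentages may not sum to exactly 100%, can be sketched as follows. The sample responses are hypothetical, for illustration only.

```python
from collections import Counter

def collapse(code: int) -> str:
    """Collapse a -3..+3 bipolar code into Table 4's three categories."""
    if code < 0:
        return "video"
    if code > 0:
        return "self-assessment"
    return "neither"

# Hypothetical coded responses for one survey item.
responses = [-3, -1, 0, 2, 3, -2, 1, -1, 0, 3, -3]
counts = Counter(collapse(c) for c in responses)
total = len(responses)

# Rounding each category's percentage independently means the
# displayed values may not sum to exactly 100.
percentages = {k: round(100 * v / total) for k, v in counts.items()}
print(percentages)
```

Collapsing discards the intensity information in the ±1 through ±3 codes, which is why the text directs readers to Figures 2-8 for the full distribution.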

Figure 2
Better Information on What to Expect

Figure 3
Clearer Expectations on Basic Skills Necessary for Success
Looking first at Q1 and Q2 (Figures 2 and 3), it is clear that participants believe the video does a better job at addressing general expectations about online courses. Students featured in our realistic preview video discussed in detail what to expect in online classes and what skills they found necessary for success. We provide further evidence on how participants valued the specific content of the video in our discussion of the qualitative results for Q9.

Figure 4
Identifying Areas to Improve

Figure 5
Determining if Online is Right for Me

When Q3 and Q4 are considered (Figures 4 and 5), it is clear that participants believe the self-assessment is more effective in helping them determine if online learning is a fit for them. This finding does not come as a surprise since the self-assessment focused on skill assessment with immediate feedback. The first sentence that appears at the top of the self-assessment also evidences this: "Take our Student Online Self-Assessment Survey to find out if you're a good candidate for online classes." On the other hand, the purpose of the video was to provide a realistic preview of online courses. Any comparison between online course requirements for success and personal qualities is incidental.

Figure 6
Ease of Use

Q5, Q6, and Q7 provide a general view of instrument effectiveness. Participants view the video as being much easier to use (Figure 6). This is likely because it requires little effort on their part other than a five-minute investment in time. Appreciation for the ease of videos is consistent with findings in a comprehensive review of video usage in educational settings (Kay, 2012). Further details comparing the video and self-assessment on ease of use are discussed in the qualitative results.

Figure 7 indicates that approximately twice as many students believe the self-assessment will do a better job at helping them improve their performance in an online course (Q6). This is likely because the TOOLS (Test of Online Learning Success) self-assessment provides the student with a personalized report with recommendations based on their scores, whereas the video is not intended to be prescriptive in nature.

Figure 7
Improving my Performance

Figure 8
Improving Likelihood of Not Dropping
Ratings for Q7 were similar for watching the video and completing the self-assessment (Figure 8). This suggests that students vary in what causes them to drop an online course. Some may leave primarily due to deficits in the skills necessary to succeed in online courses, whereas others may drop because of the environment of the course itself. Providing a realistic preview of online courses helps students avoid mismatched expectations, allows them to self-select out prior to enrolling in an online course, and increases their ability to cope with online course demands.

Figure 9
Preferred Approach

Participants were asked which approach they would prefer before taking their first online course (Q8). Results clearly indicated that they would prefer both watching the video and completing the self-assessment. This was followed by just watching the video and just completing the self-assessment (Figure 9). The preference for utilizing both techniques highlights the fact that each approach provides a unique contribution in helping students determine if taking an online course is right for them. Using both approaches will become more important as higher education increasingly moves toward online instruction as an option for students.

Qualitative Results
Qualitative data from students' open-ended responses to the question of what they liked about the realistic preview video and the readiness self-assessment revealed several themes. Our analysis showed that there are benefits distinct to each method (i.e., realistic preview video vs. self-assessment) as well as benefits common across both approaches. A summary of themes and related participant comments is presented in Table 5.

Table 5
Themes and Comments from Participants
Benefits of realistic preview video. Three main themes emerged in explaining what participants liked about the realistic preview video: (a) peers as authors of real experience, (b) honest and complete picture, and (c) engaging and interactive medium. To summarize, the themes addressed by participants include who should deliver the message ("peers as authors of real experience"), what the message should include ("honest and complete picture"), and how it should be delivered ("engaging and interactive medium").

Peers as authors of real & relevant experience.
One of the most prevalent themes focused on the source of the message. Participants' repeated use of the words "actual students" and "real students" demonstrate their desire to "get the information straight from the students" who had relevant experience. Participants' comments made it evident that they preferred to hear from their peers over faculty or university administrators, noting this approach is "better than teachers or advisers suggesting students need to do something." To most participants, the video's appeal was not just that it featured "real students," it was that the video allowed participants to hear from peers with the same institutional affiliation. As an example, one student liked the "fact that they were [my university's] students." Several other students noted the value of hearing from "students like me." This added level of familiarity or connection seemed to enhance the audience's receptiveness to the message.
Participants liked that the video provided "feedback from people who have actually taken an online course." For them, listening to the stories of "people who have taken that [online learning] path" made the video content "relevant" and "relatable." Moreover, the fact that the video depicted a diverse group of peers, "real people from different life situations" and from a "range of different ages," was affirming.

Honest and complete picture. In addition to valuing the credibility of the source of the message, participants praised the message content as being "honest" and showing a complete picture. For them, the video "talked about the truth of online classes" without painting a rosy picture. It provided "honest answers from both spectrums" giving "both the pros and cons of online classes." In the voice of one participant, the video "said exactly how it is." Other participants expressed similar sentiments:

The honesty of the people in the video is what was most appealing. That could really help somebody who's debating about whether or not they should go ahead and take online courses.

People telling their real experiences with taking online classes.
Everything mentioned in the video about online classes is very true and accurate. I honestly think it would provide enough information for students that are considering taking online classes for the first time to know what they are getting themselves into.
Engaging and interactive medium. A third theme distinct to the video was related to its message delivery. Participants noted the benefits of the audiovisual approach, suggesting that it "keeps the attention of the viewer" and is "more interesting than reading." A number of participants lauded the engaging nature of the video:

feels more interactive

has people speak to you so it feels like you are there in person

feels like you are communicating with [the] person

pulls me in

In addition to the audiovisual format, aspects of the production quality seemed to matter. Participants "liked the graphics throughout the video" and appreciated the "pace of the audio/visual and text," specifically commenting on its "upbeat" format.
Benefits of self-assessment. Two main themes emerged from participants expressing what they liked about the readiness self-assessment: (a) personalized feedback and (b) increased self-awareness.

Personalized feedback.
A major characteristic of the readiness self-assessment is that the results were personalized to the individual completing the assessment. Participants liked that the survey was "not generalized" and that it provided an opportunity where "you input your own situation," resulting in specific and "personal results in black and white." The uniqueness of each individual's situation mattered. As examples, participants noted:

It helps you to determine if an online class is right for you based on your answers and results.
Your personal skills and competencies are different and this can help you understand if you need online classes.
It asked what I think works best for me, taking an interest in my study habits.
Increased self-awareness. The personalized approach increased participants' self-awareness. Participants noted that the survey:

Forces you to be honest with yourself

Made me really think about myself

[Provided] insight into your own habits and how they can help you or hurt you

Teaches you a lot about yourself

[The self-assessment] helps with understanding myself because a video cannot tell me specifically if I need this or not.
A primary mechanism through which the self-assessment increased self-awareness was by helping participants, in their words, "identify strengths and weaknesses." The TOOLS online readiness questionnaire provided an overall score (from 0 to 225) as well as scores (from 0 to 5) for each of the five sub-areas: computer skills, independent learning, dependent learning, need for online learning, and academic skills. Participants particularly liked receiving quantitative measures to gauge their readiness for online learning. One participant noted:

The resources provided at the end of the assessment was most appealing. You can see where you're at on a scale range in regards to if online classes would be a good fit for you.
Participants valued the developmental approach of the self-assessment reflected in the sub-area scores. For them, the results were not just a measure of readiness for online learning; they provided specific aspects to improve on to prepare for online learning. As some stated:

[The self-assessment] let's you know where you need to focus and improve on.
It can pinpoint areas that I might need to work on to be successful in an online learning environment.
It gives you an insight [on] what you should be working on a little bit more.
[It] helps identify weaknesses and strong points that I have. I can then use those strengths and weaknesses to make a decision if I should move forward with taking online classes.
Common themes across both methods. While some themes were distinct between the realistic preview video and readiness self-assessment, a few themes were consistent across both approaches: (a) prompted reflection and introspection, (b) detailed, and (c) ease of use.

Prompted reflection & introspection.
Participants noted that both the video and self-assessment prompted reflection and introspection, albeit in different ways. For the video, a key method of prompting reflection was hearing from the experiences of others. In the words of some participants:

It also helps hearing it from other students. Whether that's a psychological aspect or not, hearing it from another individual, that it takes discipline and consistent effort and more time, is realistic.
Listening to other students is usually taken more seriously than reading an evaluation.
It's good to actually hear people speak of their own experiences and points of view rather than just your own.

Specifically, hearing from others helped them address blind spots in their approach to online learning, providing, as one student put it, "information I did not know I needed." Another student noted that "[the realistic preview] put certain things into perspective that [I] may not have thought about before watching the video." The self-assessment also appeared to prompt introspection, primarily through the specific questions of the assessment. As one participant said, it "made you think a bit harder." Other participants corroborated this:

Probing questions allowed the survey participant to think actively about whether online is a good fit.
[The self-assessment] asks specific questions that I would normally not think about on my own.
Overall, participants found the self-assessment "very comprehensive and introspective," and the results led participants to "step back and reassess what extra effort he or she needs to put in to online education." It appeared that asking participants to assess their own skills and abilities brought a sense of honesty to the introspection. One participant noted, "[the self-assessment] forces you to be honest with yourself. If you answer incorrectly then you are only lying to yourself." On the other hand, one student noted that on the self-evaluation "I can lie to myself easily."

Detailed. For both the video and the self-assessment, study participants appreciated an in-depth approach to determining online readiness. According to them, the video "provides details," "goes into depth," and "goes through many scenarios of what is needed" rather than just addressing one aspect of online learning. For the self-assessment, participants valued the "amount of detail" and that "expectations are more defined and in detail."

Ease of use. It was clear that ease of use was a positive factor in both the video and the self-assessment.
For the video, participants felt that the audiovisual approach added an element of ease, suggesting it was "easy to watch" and "easy to listen to." The aspect of watching versus listening was also brought up, suggesting there is value in having graphical text with a voiceover. For example, one participant explained that the video was "easy to follow if you had to just listen and can't watch." The length of the video (4 minutes, 22 seconds) was an element of ease, "giving information quickly" and allowing participants to "get information when busy." The "low stress" nature of the video was also highlighted.
For the self-assessment, participants emphasized the length and structure of the questionnaire, especially the simplicity of the questions asked. They also appreciated that it was "short and to the point" and had "easy questions." The overall process was "easy to follow along" and "very simple and straight-forward." While both tools were viewed as easy to use, the quantitative data showed that two-thirds of study participants found the video easier to use (see Figure 6).

Discussion and Conclusions
The 2018 report on Distance Education in the United States notes that enrollment in online programs has increased for 14 consecutive years (Seaman, Allen, & Seaman, 2018). Despite this growth in enrollment, online programs continue to struggle with engaging and retaining students (Berry, 2019) and experience higher dropout rates than face-to-face courses (Thaiupathump, Bourne, & Campbell, 1999; Willging & Johnson, 2004).
Preparing students for online learning is a key factor in addressing this challenge. Adopting innovative approaches to preparing students is essential, especially considering 74 percent of reporting public institutions rated online education as critical to their long-term strategy (Allen & Seaman, 2009).
In the previous special issue on the COVID-19 Emergency Transition to Remote Learning, Jaggars (2021) suggests that lessons from the past year will influence student support practices for years to come. Indeed, the pandemic has especially highlighted the critical role of readiness for online learning.
As we move beyond emergency remote instruction and look ahead to the future, instructors and institutions of higher education will need to leverage evidence-based practices from the literature to ensure high-quality online learning experiences (Means & Neisler, 2021). By incorporating principles from the RJP to improve readiness for online learning, our study provides one such example of an innovative research-based solution that can support students' online learning needs well into the future.
In this paper, we introduced RJP, a technique from the business field, as a way to prepare and orient students for their new role as online learners. Drawing upon well-established principles from the RJP literature, we developed a video providing a realistic preview of online learning. The video featured students who had previously taken online courses explaining to peers new to online learning what to expect from these learning environments. As one study participant put it, an RJP approach "could really help somebody who's debating about whether or not they should go ahead and take online courses." Our study compared the realistic preview video with one of the most widely used orientation techniques, the online readiness self-assessment. We asked participants to directly compare the two tools and share what they liked about each. Overall, participants felt that both tools contributed to their readiness for online learning, but did so in different ways. For them, the video was the preferred tool for learning general expectations and gaining clarity on the basic skills necessary to succeed in online courses. They especially appreciated hearing this information from credible and honest peers who had taken online courses and were willing to openly share both the pros and cons of online learning. In terms of ease of use, participants also preferred the video.
As for the self-assessment, participants felt it was helpful for identifying specific areas to improve before taking an online course and for determining whether online learning is appropriate for them. They appreciated the personalized feedback provided by the self-assessment results, which, in turn, increased their self-awareness and allowed them to pinpoint areas to improve. Participants also believed the self-assessment would contribute more to improving their performance in an online course.
In sum, the results of our study suggest that students may benefit from both approaches to online learning readiness. Perhaps the most compelling argument for the value of both methods is that 70% of participants would prefer to complete both the realistic preview video and the self-assessment before taking their first online course.

Recommendations
Our research showed that the elements of the video most appealing to our participants were those in line with recommendations from RJP experts. We encourage universities to create realistic previews for online learners, adopting the guidelines established in RJP research such as Breaugh (2009). Specifically, we recommend the following:
1. Share both the pros and cons of online learning. It is important to remember that an RJP is not a marketing tool. RJP experts warn against presenting overly flattering views. A sense of authenticity is created by including negative information along with the positive (Breaugh & Starke, 2009). Our study showed that an honest and complete picture was a key aspect of participants' positive responses to the video.
2. Use a credible, relatable source to convey the information. Using students from our university increased students' receptiveness to the message, as they liked hearing from "students like me." Universities may consider creating in-house realistic previews featuring their own students to enhance feelings of relatedness and familiarity.
3. Use a combination of readiness approaches. Our research shows that between readiness self-assessments and realistic preview videos, it is not one or the other: students prefer to complete both before taking their first online course. Different approaches may appeal to different students. Practically, it is also important to consider which readiness approach students would actually complete on a voluntary basis. Our participants noted that the video is "more likely to be watched" because it is "low stress" and has "the ease of listening and viewing via stream."
4. Address the topic in detail. An RJP should avoid a surface-level approach to the preview. It is important to provide sufficient detail for students to gain an in-depth understanding of expectations. At the same time, if a video format is used, length must also be taken into consideration. Research on online videos has shown that increasing the length of a video runs the risk of "trying to cover everything and eventually losing the interest" of students (Ou et al., 2019).
5. Present the RJP before students take their first online course. Having students complete an RJP beforehand allows them to self-select out of online courses should they decide online learning is not right for them. However, self-selecting out may not always be an option, so it is critical to ensure students are prepared.
6. Use the RJP as reinforcement as well as a decision aid. While the RJP can help students decide about enrolling in online courses, it can also reinforce what experienced online learners already know. As one participant noted, "Even though I thought, I know this stuff, I did like to have it reinforced. I have taken online classes for years now. If it was my first online class this would have been extremely helpful."

Limitations
While this study provided important insights, some limitations must be acknowledged. The first stems from the self-selected nature of the survey participants and the small sample size; as such, results cannot be generalized beyond the current study. Another limitation is that all participants were business majors. Further research could examine student perceptions of the realistic preview video versus the online readiness self-assessment with participants from fields outside of business, as it would be interesting to see how students from other disciplines respond. Lastly, it would have been useful to have some participants use a readiness self-assessment other than TOOLS to see whether variations in the self-assessment instrument affect how students compare it to the video.