Shifting Teaching and Learning in Online Learning Spaces: An Investigation of a Faculty Online Teaching and Learning Initiative

This article presents results from a study of a year-long professional development initiative, directed by a teaching and learning center at a research university, that focused on both the technological and the pedagogical supports for online and blended course delivery. The purpose of this mixed methods study was two-fold. The first purpose was to investigate the pedagogical changes that occurred as a result of the professional development, which included a year-long faculty learning community, by exploring the influences on those changes. The second purpose was to understand how perceptions of the diffusion of innovations (DOI) characteristics influenced the level of adoption of online/blended teaching by faculty participants. A survey was used to measure the perceived characteristics of innovation as defined in the theoretical framework. Following the survey, one-on-one interviews linked to the DOI theoretical framework were conducted to better understand those characteristics. The results presented herein focus on the barriers, challenges, and successes of adopting e-learning pedagogy in these online and blended learning environments.


Review of Related Literature
Recent trends in higher education indicate that distance learning courses are in high demand, with over 31.6% of enrolled undergraduate, graduate, and noncredit students in the United States taking at least one course in a fully online format (Allen & Seaman, 2018). Given the popularity of online and blended courses and programs, 63.3% of chief academic officers in institutions of higher education in the United States have integrated online or blended learning into their long-term strategic planning (Allen & Seaman, 2018; Chen, Lambert, & Guidry, 2010). With more than 6.3 million university students enrolled in online courses (Allen & Seaman, 2018), it has become accepted that "institutions must ensure that online students receive high-quality instruction, support services, and other fringe benefits enjoyed by traditional face-to-face students" (Chen et al., 2010, p. 1229). Universities are attempting to meet this need through an array of professional development opportunities for their instructors that focus on various aspects of teaching and learning in a distance learning environment. Yet while universities address the growing complexities of distance education, studies of these institutional efforts remain limited.
The existing literature often provides numerical figures that depict how many faculty members adopted a given online teaching practice as a result of professional development. Alternatively, researchers tend to list barriers or lessons learned that are disconnected from existing innovation adoption or implementation theories. In contrast, some existing research focuses on the design and implementation of a given professional development program without considering how these design decisions influenced adoption decisions by faculty members. Seldom does the existing research use theory to support the investigation of these practices. Hence, there is a dearth of literature at the nexus of theory, the experiences of instructors, and professional development for online teaching and learning in higher education.
The existing literature base contains several studies of the adoption of online teaching among higher education instructors in specific fields, where the focus is on the nuances of that field, such as agriculture (e.g., Drape, 2013) or nursing (e.g., Cash & Tate, 2012). The literature, however, rarely examines the adoption of distance education through a theoretical framework. Additionally, the details of how professional development influenced faculty members' teaching approaches are seldom told. When this story of adoption is told, it is usually captured in a single survey, as in the work of Shea (2007), who used a survey to capture motivating and demotivating factors for teaching online.
Of the studies reviewed that focused specifically on course instructors' professional development in higher education around online learning, only four explicitly noted a theoretical framework that grounded the study. For example, Barker (2003) researched faculty development that used change theory to leverage faculty buy-in. Additionally, Shipman (2017) used the Substitution, Augmentation, Modification, and Redefinition (SAMR) model, which focuses on technology's impact on teaching and learning, to identify challenges and barriers to technology use in university classrooms. Shea, Pickett, and Li (2005) used the DOI theory as a lens to analyze satisfaction with online learning among 913 faculty members in the State University of New York (SUNY) Learning Network. A study by Wingo, Ivankova, and Moss (2017) took a different approach and used the Technology Adoption Model (TAM) to organize a review of the research on faculty perceptions about teaching online. These theory-driven approaches to understanding the experiences of higher education instructors with professional development for online and blended learning, though useful, remain limited in the current literature.
Nevertheless, there remains a paucity of, and yet an increasing interest in, research focused on how universities support e-learning efforts to improve online and blended teaching and learning. For example, Mohr and Shelton (2017) conducted a four-round Delphi study of higher education leaders of online learning initiatives to determine best practices for online faculty professional development. Mohr and Shelton found that professional development topics should include training in faculty roles, classroom design, learning processes, and legal issues. This research is compelling but does not bring to light the lived experiences of the stakeholders.
A limited number of existing studies of online professional development focus on training faculty for blended course delivery (Childre & Van Rie, 2015; Linder, 2017; Littlefield, 2012; Varkonyi, 2012), training faculty for online course delivery (Barker, 2003; Gunay, 2013; Keengwe & Georgina, 2011), understanding factors that influence faculty satisfaction with asynchronous teaching and learning (Fredericksen, Pickett, Shea, Pelz, & Swan, 2000), and student engagement in online learning (Chen et al., 2010). Few studies have addressed both online and blended course delivery (Powell, 2010). Some studies take an anecdotal approach and explain how a given training was conducted and what worked or did not work in that training (Linder, 2017; Terantino & Agbehonou, 2012). Nevertheless, these studies lack a theory to drive the investigation.
Alas, the ever-changing nature of online and blended learning, coupled with a broad conception of professional development, makes comparing studies difficult. For example, studies of professional development around distance education in higher education institutions include on-demand training (Sullivan, Burns, Gradel, Shi, Tysick, & van Putten, 2013), traditional seated courses (Linder, 2017; Littlefield, 2012; Powell, 2010), workshops (Keengwe & Georgina, 2011), and faculty mentorship programs (Barker, 2003; Childre & Van Rie, 2015). Despite these efforts, there is a lack of empirical research connecting faculty experiences and perceptions of e-learning professional development with the resultant shifts in their attitudes and teaching approaches with regard to online and blended learning.
Given the lack of empirical research published in peer-reviewed journals on this topic, it is likely that most e-learning program evaluations are reported internally within a given university and not shared with the outside world. Another complication is that professional development opportunities might be constrained to a specific college or department rather than implemented university-wide. The few published works that exist typically take the approach of anecdotally explaining how a given training was conducted and what worked or did not work (e.g., Linder, 2017; Terantino & Agbehonou, 2012), or of understanding motivators and demotivators to teaching online (Shea, 2007). Success is typically gauged with an internally developed, self-reported survey instrument that has not been analyzed for validity or reliability. Concomitantly, these studies are often devoid of a theoretical approach. Thus, there is a need to disseminate research on e-learning professional development that is theoretically driven, situated in institutions of higher education, and captures the lived experiences of the stakeholders. This multilayered approach is taken in the current study.

Theoretical Framework
The diffusion of innovations (DOI) theory was used to guide the current research. The primary focus of diffusion research is to understand the adoption of a given innovation (Rogers, 1962). This theory was chosen because it is prominent in research studies situated in instructional technology as well as in general postsecondary faculty development (Drape, Westfall-Rudd, Doak, Guthrie, & Mykerezi, 2013; Grosz, 2012; Huun & Hughes, 2014; Jordan et al., 2012; Lewis & Slapak-Barski, 2014; Martin, Parker, & Allred, 2013; Molina, 2013; Soffer, Nachmias, & Ram, 2010). The theory has also been used to understand technology initiatives such as massive open online courses (MOOCs) (Claffey, 2015), technology policy diffusion (DeRousie, 2014), team-based learning (Freeman, 2012), mobile campuses (Han & Han, 2014), personalized learning (Karmeshu & Nedungadi, 2012), the adoption of online education by traditional liberal arts colleges (Hollis, 2016), and technology in the education systems of developing countries (Richardson, 2009, 2011), as well as changes in organizational culture (Shiflett, 2013). Additionally, the DOI theory has been applied to determining barriers to the continued growth of online teaching based on faculty satisfaction across the entire SUNY Learning Network (Shea, Pickett, & Li, 2005). According to Meyer (2004), Rogers' theoretical model has been used in thousands of studies across many fields (e.g., sociology, marketing, public health, economics), including education and technology.
Rogers (2003) defined an innovation as "an idea, practice, or object that is perceived as new by an individual or other unit of adoption" (p. 12) and noted that "diffusion is the process in which an innovation is communicated through certain channels over time among the members of a social system" (p. 5), a process comprising four fundamental elements: innovation, communication channels, time, and social system. This definition indicates a critical point: the newness of the "idea, practice, or object" is not objectively measured but rather based on the perception of the adopter. DOI seeks to explain the processes through which ideas, practices, or objects are communicated and thereby adopted by members of a particular social system.
There are five characteristics of an innovation that explain differences in adoption rates: relative advantage, compatibility, complexity, trialability, and observability. These five attributes account for most of the variance (between 49% and 87%) in the rate of adoption of an innovation (Rogers, 1962). These attributes were subsequently operationalized, modified, and expanded by Moore and Benbasat (1991), who generated three additional adoption constructs (see Table 1): image (the degree to which the use of a system is perceived to enhance one's image or status in one's social system); voluntariness (the degree to which use of the innovation is perceived as being of free will); and result demonstrability (the ability to show the results of using an innovation).
While Rogers (1962) provided a general approach to the theory, Moore and Benbasat (1991) focused specifically on the adoption of information technology innovations. As such, Moore and Benbasat created an instrument to measure the eight characteristics. Given the increasing demand for online and blended courses, the limited body of literature on e-learning professional development in higher education, and the need to use theory to understand this innovation in higher education, this study is both timely and needed.

Method
A mixed methods sequential explanatory design (Creswell & Plano-Clark, 2018) was used in this study so that quantitative results could be further explored through the collection and analysis of qualitative interview data. An initial survey was used to measure the perceived characteristics of innovation as defined in the theoretical framework. Following the survey, one-on-one interviews linked to the DOI theoretical framework were conducted to better understand those characteristics. The research questions guiding this study were:
1. What pedagogical changes occurred as a result of the professional development and subsequent year-long faculty learning community?
2. How did the perceptions of the diffusion of innovations characteristics influence the level of adoption of online/blended teaching by participants?

Project Background
The University of Kentucky launched the eLearning Innovation Initiative (eLII) in 2014. The eLII provided funding for the creation of new online or blended degree programs and the innovative redesign of large-lecture courses. Recruitment for this training initiative occurred via email: the Center for the Enhancement of Learning and Teaching (CELT) emailed all faculty and instructors at the university with an open call for applications, and participation was open to anyone who wanted to take part. Thirty-six faculty members received eLII professional development funding and agreed to participate in two training initiatives. Phase 1 of the initiative was a week-long, face-to-face professional development workshop that occurred in the summer. Phase 2 required faculty members to participate in monthly, face-to-face faculty learning communities (FLCs) for one year. These FLCs consisted of eight to ten faculty members and were led by an instructional coach from CELT. The year-long FLCs were designed as opportunities for small groups of faculty members to come together monthly to share their experiences with their own online and blended efforts. Each FLC was tasked with creating a resource that would be of service to the other learning communities. This practice allowed each group to work on a given topic while discussing the challenges and successes experienced by individual faculty members.

Participants
After Institutional Review Board approval, all 36 course instructors who participated in the training were emailed a link to the DOI survey on January 8, 2015. Of the possible participants, 31 of 36 completed the online survey, yielding an 86.1% response rate. The last question on the survey linked to a second survey in which participants were asked to volunteer for an interview. Thirteen of the 31 survey completers indicated their willingness to be interviewed. The interviews were conducted via Uberconference and ranged from 30 to 45 minutes.

Measures
Survey instrument. The survey used to measure the DOI characteristics was a slightly altered version of the Moore and Benbasat (1991) instrument (see Appendix A). It used a 4-point Likert-type scale and consisted of eight scales with a total of 25 items. Items were reworded for the eLII professional development program such that "personal work stations" was replaced with "skills gained from the eLII professional development." The original instrument was developed and tested by Moore and Benbasat in three stages (item creation, scale development, and instrument testing) across two pilot rounds and two field test rounds, yielding a parsimonious instrument with "a high degree of confidence in their content and construct validity" (p. 210).
In addition to the survey, three 5-point Likert-type scaled questions were used for participants to self-assess their level of adoption of the training techniques. In this study, this score is referred to as an innovation score. Participants rated their level of adoption of digital technology, blended learning, and online learning. The scale ranged from 1 (last to adopt) to 5 (first to adopt). Each participant received one innovation score that was calculated by averaging answers to the three items.
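To make the scoring rule concrete, the following minimal sketch computes an innovation score as described above: the mean of the three 5-point self-ratings. The function name and example values are illustrative; the study did not publish its scoring code.

```python
# Minimal sketch of the innovation-score calculation described above.
# The function name and example values are hypothetical.

def innovation_score(digital: int, blended: int, online: int) -> float:
    """Average the three 5-point self-ratings (1 = last to adopt, 5 = first to adopt)."""
    for rating in (digital, blended, online):
        if not 1 <= rating <= 5:
            raise ValueError("Each rating must fall on the 1-5 scale.")
    return (digital + blended + online) / 3

# A participant who self-rates 5 on digital technology and 4 on both
# blended and online teaching receives a score of about 4.33.
print(innovation_score(5, 4, 4))
```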
Semi-structured interviews. Interviews were conducted to further explore survey responses by eliciting concrete examples of participants' experiences. This additional investigation allowed the exploration of latent themes and underlying trends that may not have been immediately evident. Questions for the semi-structured interview protocol (see Appendix B) were designed to explore the constructs on the Moore and Benbasat (1991) survey. Hence, the interview questions were designed to better understand the eight theory-driven constructs detailed in Table 1.

Data Analysis
Analysis of the quantitative data began with tests of reliability to determine whether this population responded to the survey differently than the populations used in the construction of the original instrument. ANOVAs were then conducted to determine if and how the characteristics of innovation accounted for innovation uptake. Next, t-tests were run to determine whether the instructors who completed the survey and were then interviewed differed on the eight perceived characteristics of innovation from those who only completed the survey; this was done to ascertain whether selection bias existed for the individuals interviewed.
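As a rough illustration of this quantitative pipeline, the sketch below computes coefficient alpha for one scale, a one-way ANOVA across adopter groups, and an independent-samples t-test comparing interviewed with survey-only participants. The data are randomly generated stand-ins; only the sequence of tests mirrors the description above.

```python
# Illustrative sketch of the quantitative steps described above, run on
# randomly generated stand-in data (the study's real data are not public).
import numpy as np
from scipy import stats

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for a (respondents x items) score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

rng = np.random.default_rng(seed=1)
scale_items = rng.integers(1, 5, size=(31, 3)).astype(float)  # one 3-item, 4-point scale
print(f"alpha = {cronbach_alpha(scale_items):.2f}")

# One-way ANOVA: does a DOI characteristic score differ across adopter groups?
early = rng.normal(3.2, 0.4, size=10)
moderate = rng.normal(3.0, 0.4, size=12)
late = rng.normal(2.9, 0.4, size=9)
f_stat, p_val = stats.f_oneway(early, moderate, late)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.3f}")

# Independent-samples t-test: interviewed (n = 13) vs. survey-only (n = 18).
interviewed = rng.normal(3.1, 0.4, size=13)
survey_only = rng.normal(3.0, 0.4, size=18)
t_stat, p_val = stats.ttest_ind(interviewed, survey_only)
print(f"t-test: t = {t_stat:.2f}, p = {p_val:.3f}")
```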
The quantitative analysis was followed by the analysis of the interviews. Analysis of the qualitative data began with an a priori coding scheme that was restricted to the eight characteristics defined by the DOI framework (see Table 1). As a first step, one coder coded all data within the eight constructs. After coding for these constructs, the team expanded the codebook to include codes related to perceptions of the professional development as they related to the theoretical framework. As a second step, using inductive coding, one researcher coded all the transcripts, and a second and third researcher confirmed all codes. This process allowed the team to capture deep, rich details about the professional development as it related to the theory-driven characteristics.

Results
Internal consistency reliability was investigated for the eight individual characteristics using coefficient alpha (see Table 3). Most characteristics had a Cronbach's alpha greater than 0.80, with only visibility (α = 0.79) and trialability (α = 0.69) falling below this level. That trialability showed the lowest internal consistency of all constructs is similar to what was reported by Moore and Benbasat (1991). The internal consistency reliability for the entire instrument was considered suitable (α = 0.92).
A one-way ANOVA was used to compare the effect of innovation score on the DOI characteristics for the 31 participants who completed the survey to determine if there were group differences. There was not a significant effect of innovation level on any characteristic at the p < 0.05 level. These results indicate that the survey did not accurately capture the degree to which the participants adopted this innovation, which could be attributed to the small sample size (see Cohen, 1992). Table 4 provides the innovation score for each of the 13 interview participants. The results of an independent samples t-test were used to determine if interview participants differed from the rest of the population on the scales (see Table 5). No statistically significant differences were found between interview participants (n = 13) and participants who completed the survey but did not interview (n = 18). Thus, it is believed that selection bias was not an issue.
The qualitative results reported below are constrained to only those faculty members who completed the survey and participated in the interviews (n = 13). Based on the interview results, participants most frequently discussed their experiences related to relative advantage, compatibility, and trialability. Faculty also shared experiences regarding online teaching in general and the professional development specifically. Their innovation scores were taken into consideration when interpreting the interviews. The three adoption classifications previously used were carried forward into this analysis and were determined by rounding each participant's innovation score: scores that rounded to 5 were considered early adopters, rounded scores of 4 were moderate adopters, and rounded scores of 3 were late adopters. These classifications were considered acceptable based on the idea of a normal distribution, or bell curve, of innovation adoption discussed by Rogers (2003). The following sections outline how the perceived characteristics of innovations were discussed by participants in the interviews.
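Before turning to those sections, note that the three-tier classification above reduces to a simple rounding rule over the innovation score. A minimal sketch follows, with the caveat that the article does not specify how half-point ties were rounded.

```python
# Sketch of the adopter classification described above: round the
# innovation score and map 5 -> early, 4 -> moderate, 3 -> late.
# Note: Python's round() uses banker's rounding at .5 ties; the
# article does not specify its tie-breaking rule.

def classify_adopter(score: float) -> str:
    labels = {5: "early adopter", 4: "moderate adopter", 3: "late adopter"}
    return labels.get(round(score), "unclassified")

print(classify_adopter(4.33))  # moderate adopter
print(classify_adopter(4.67))  # early adopter
```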

Relative Advantage
While most participants (10 out of 13) found the professional development and FLC personally advantageous, two instructors (both moderate adopters) stated that the week-long professional development was not beneficial. Instructors who did benefit noted advantages related to social factors, convenience, and personal satisfaction.
The desire to increase student engagement was brought up by three participants across different adoption levels. One participant noted an effort to increase instructor presence in discussion boards, stating, "I respond to them more frequently. I just want to make sure the students realize that I'm responding, and they don't feel like I left them hanging" (Instructor M). Another participant said, "I think that we learned things that will allow my students to be more engaged" (Instructor K). Another social factor that was mentioned was the willingness to use web conferencing technologies to hold meetings. Instructor M noted, "I've been more open to it, but I've only had one or two students taking me up on Skype meetings or virtual meetings."
Moderate and late adopters (n = 9) seemed satisfied with the specific pedagogical lessons gleaned from the professional development. One participant was particularly satisfied with the training regarding the alignment between learning outcomes and course activities, which included assessments. Instructor G stated, "We really talked about ...what those outcomes are and what's going to really work best in an online environment and what's going to work best in a face-to-face environment." A late adopter, Instructor L, shared, "the workshop really gave me insight into ways that I can use a lot of different modes of delivery. When I'm delivering a single topic, I'm using video, I'm using some writing, I'm using Prezi presentations, I'm using discussions, I'm using open-ended quizzes…all just to deliver one idea."
Four participants, spread across all adoption levels, found that learning how to leverage a learning management system (i.e., Canvas) was the most advantageous element of the professional development. "For grading and project submittals, I do a lot more of online submittals and online grading and doing assessments and rubrics through Canvas. But I also use the anonymous survey tool in Canvas to get reflective feedback from the students" (Instructor J). Similarly, Instructor D said, "My face-to-face [courses] continue to improve because I can now put the very important key pieces of material or expectations in a user-friendly manner online so the students have access to it 24/7." Regardless of the mode of implementation, faculty members found learning about tools and how to deliver content beneficial.

Compatibility
Participants (n = 13) discussed the level of compatibility of the professional development with their needs, teaching styles, and pedagogical preferences. These participants discussed how networking, with either new or veteran colleagues, proved to be helpful. Instructor C remarked, "We get to network together and share practices on how to do things better. I enjoy that part." Similarly, Instructor H noted that "hearing how other people have gone about it and attending some of the meetings that we have had within our faculty learning communities have been pretty good because we were able to talk about what worked and what isn't working in others' courses."
Consistency with teaching approach. More than 61% (8 out of 13) of participants noted that components of the professional development and subsequent FLC were incongruent with their preferred teaching approach. Instructor E remarked that "The pedagogical instruction was completely disconnected from the way I teach. It was all directed at lecture teachers. I'm not a lecture style teacher." Another participant shared similar feelings in saying that "It's not really helping so much because...the challenge I have is with the large class size. And, my teaching style involves mostly interaction with my students. I don't do lecturing" (Instructor C). According to four of the participants, the focus of the initial professional development was how to convert lecture-based instruction into an asynchronous online learning environment. This approach was incompatible for instructors who were not going to teach in an asynchronous format and created a schism between participant needs and training objectives. Instructor B, an early adopter, highlighted this issue by stating, "The professional development was more focused on asynchronous teachings, but all of my courses are synchronous so there's a little bit of disconnect there." Late adopters also noticed this disconnect. As Instructor M noted, "They threw together synchronous and asynchronous. I think those crowds are a bit different."
Various benefits of the training were also recognized. Both early and moderate adopters (n = 8) found the range of topics beneficial, noting that exposure to different technologies allowed them to find the tool that would best address their teaching needs. Instructor C noted how "the workshop actually opened my eyes. I can see it as a good way of helping me to make the online course more interactive. In addition to the content, how I can use it to bring more interaction with the participants was useful." Another participant pointed out, "They presented all kinds of different options...You can pick what you need and what works for you. That really worked well for me" (Instructor F). One late adopter discussed how her teaching strategies improved as a result of learning new online teaching strategies and techniques: "I think it really helped my teaching style. I try to use technology and social media in the classroom to gain awareness" (Instructor J).

Consistent with expectations. Several participants (n = 7) expected more individualized and tailored instruction to assist with the design of their own courses. Instructor L stated, "It was not really tailored to individual needs." Additionally, Instructor K shared, "For me, I'm a very hands-on learner and so not being able to actually implement what we're learning didn't really work for me. But for people who learned by watching someone else do something, this may have been helpful for them...but it wasn't for me." This less hands-on approach led some participants to feel less confident in executing the delivery strategies that were discussed. For example, Instructor E shared that "I just don't know how to do it myself. So, I feel like I'm back at square one with just a lot more knowledge about what's out there." Likewise, another participant commented, "Some sessions just kind of talk about technology and we didn't actually try it. I prefer trying it" (Instructor F).
In addition to the less hands-on training approach, moderate and late adopters tended to feel that a one-size-fits-all approach was utilized. Instructor H commented, "I probably would have benefited from having us grouped by level of experience or level of interest in certain topics...I probably could have utilized my time a little bit better if there had been stronger sessions offered for different things." Two participants perceived that prerequisite knowledge was presumed. "I felt like sometimes the [professional development] instructors almost assumed prior knowledge-at least for me... I think there were too many assumed knowledges about what you knew for teaching online" (Instructor M). Instructor M continued by stating, "I think that if the talks or workshops have been individualized to certain interest groups, and more hands-on...that would have been a lot more helpful." Conversely, four early adopters, like Instructor D, articulated that "I think participating in that kind of hands-on, pretty intense professional development helped me find the things that I could implement and find the things that could apply to me specifically and then go to it." These two opposing viewpoints might point to a disparity between the training needs of early adopters versus moderate and late adopters.

Trialability
Nearly half (46%) of the participants indicated they practiced using some online tools, skills, and strategies presented in the training. Some participants (n = 7) reported that trying to use new tools and techniques was vital. For example, "I think we had class time to practice and ask questions. Some things that interest me, I would practice more than others. I also think I didn't have a clear enough understanding of what I wanted to know and what I needed to practice" (Instructor J).
Additionally, Instructor A stated, "I brought my laptop. I did everything as we were learning. I was able to try out as we were learning it." As an example, Instructor F created a blog during the training. "I put all the proctoring websites that I've used on a blog and shared with the other faculty. So, that was very productive, and I actually got to do it hands-on." Five participants commented that they ended up practicing on their own. Instructor M commented "I think I actually practiced with students or other faculty. I've done that with a few faculty or a couple of faculty where I'm able to show what I've created or show them how I created it and how to put it online. That's how I'm able to practice it." Along the same line, Instructor H stated that "implementing Adobe Connect and just doing that trial and error, trying to see what works... I didn't do that with the eLII staff. I did that on my own with our information technologist over in my own college. But I definitely practiced." One participant even practiced with family members. "I tested out Adobe Connect with my wife who just acted like a pretend student. That tool is really easy" shared Instructor E.
Practicing on their own after the training was also noted by Instructor F, who commented, "I learned to use Adobe Captivate and I practiced that on my own." Likewise, Instructor A remarked, "I tried a lot of different things...I have a lot of accounts to try to find out more and see what would really work. It took me getting in there, signing up for it and everything to really start playing around with it to really understand what was going to work best." Independent experimentation and exploration of new tools was more common for early and moderate adopters.

Ease of Use
Most participants (9 out of 13) found the skills gained from the professional development easy to implement. The remaining participants claimed that implementation would be either too difficult or too time-consuming; one instructor noted that they did not gain any skills and did not have an opportunity to use them. When participants were asked to comment on the ease of implementation, two participants shared how selectiveness is important when thinking about what to implement in blended and online courses. Instructor D said, "I think one thing I did take away from it is that you can't do all of it. You must pick one thing and try to make it work this time. And if it doesn't, then try something different. So, I find that every time I try a new platform or a new app that it seems to work, but I can't do everything." Along the same line, Instructor G commented, "I try to be selective in the type of things that I'm going to try to implement in my classes. If I don't think I can do it, or I think that I'm not going to be able to figure it out and do it well with my students, then I don't do it. I think that's probably the better way that I handle it." This selective approach speaks to how participants came to classify implementation as either "easy" or "difficult."
Six participants, representing each adoption level, commented that incorporating video and web components into a course would be difficult and time-consuming. Synchronous video components, such as using Adobe Connect, or recording and editing lectures using Echo 360 or Camtasia, were specifically mentioned as challenges. "It's such a simple thing, but I didn't learn how to use it during our training. I think that it's such a basic thing that we should have known. We really should have learned how to use it," remarked Instructor K. Another participant asserted, "Everything is very time-consuming. Even though Captivate is cool, there's so much to it, and as I try to explore it takes a lot of time" (Instructor C). Similarly, Instructor J shared his experiences with video creation: "Well, I think that it was challenging: creating, adding, and coming up with video stuff. I just didn't understand. Maybe I didn't have a clear idea of what were the best or most effective practices, but I didn't know enough." Another participant shared how initial difficulty resulted in long-term benefits. "What I've learned about all of this, any time you create something digital, you have to keep at it! So, I don't mind putting a lot of work into something that I can use every semester over and over," proclaimed Instructor F.

Voluntariness
Of the 13 interviewed instructors, only three (23%) reported being required to teach online or hybrid courses. Each of the three was classified as a moderate adopter. The requirement to teach online appeared to be most closely associated with rank and title: participants with full faculty rank did not express administrative pressure, while individuals of lower rank felt that demands from their superiors made participation involuntary. One participant discussed how her rank as lecturer contributed to the requirement to teach online. "The Dean asked me to develop the online class. So now that it's developed, I guess I'm kind of required to teach it. I'm a lecturer, so a lot of this distance learning falls on the lecturers," commented Instructor F. Likewise, another lecturer expressed how her contract called for her to teach online during the summer: "I'm on a twelve-month contract as opposed to a nine-month contract. The first time they [the department] needed somebody to teach online was during a summer when people weren't around. So, basically, it was given to me" (Instructor G).
The remaining ten participants reported that they teach blended courses on a voluntary basis. Instructor I stated, "I'm a tenured faculty member so there would not be any requirement per se to teach online. There are certainly opportunities provided from my department. I'm interested in experimenting and trying to figure out new and compelling ways to incorporate [technology]." Similarly, Instructor H shared, "There is no requirement to do that [teach online]. It's encouraged, but it's not required. Honestly, it wouldn't work for all of our classes." Many of the participants commented that they were just interested in learning more about online and hybrid teaching practices.

Image
Like voluntariness, image appeared to be unrelated to adoption level. Participants were either neutral (n = 8) about whether implementing the skills affected their image or reputation, or positive (n = 5) that the training improved their reputation and image among peers. For example, Instructor K said, "There's not a perceived difference between people who participated in the training. I don't think people in my department even know that I participated in it." Likewise, Instructor H shared, "In my division, honestly, it's not really a big deal. I mean I think people are like, 'Oh, that's cool. Tell me how it goes.' But it's not this prestige thing." In contrast, another participant shared, "I'm sure that the faculty who are not part of the eLII process see it as perhaps a good thing and something that we should be doing. We should be training new cohorts of faculty" (Instructor I).
On a similar note, Instructor A commented, "I'd say on the university level, it's perceived as what's going to push the university forward and progress the university." Another participant shared how her involvement in this professional development led to speaking engagements. Instructor G shared, "From my perspective people are perceived pretty well. As a result of my involvement with this program, I've been invited to give professional development sessions not only for my own college, but also for other colleges around the university for the eLII program. I reviewed some of the new rounds of eLII grants because of my experience. So, it seems like we're perceived in a positive manner."
Those participants who reported a positive impact on their image (n = 5) tended to note the knowledge gained and the status of being an early adopter of online teaching. Instructor D remarked, "The perception is that we're the most tech-savvy people. However, it seems that I've always been the person that if anybody has problems with clickers or with Blackboard or with Echo 360 or with any of other technology, they'll come find me." Similarly, Instructor M stated, "I think people probably perceive it positively." Instructor F shared a similar experience: "My chair sent another faculty to me who had a question about recording lectures and that kind of thing. So, I guess we are perceived a little bit as the experts in the area." When asked about their improved image, the same five participants indicated positive perceptions about peers who participated in the professional development. "All of them are pretty motivated regarding wanting to be better teachers online, so I think of them positively in that sense. They are motivated to be good teachers," commented Instructor K. Likewise, Instructor C shared that "It's nice to know others are so excited about teaching because we are research school. And so most of the time we're excited about research, but the teaching part is so fun on each side. So, I'm very happy to see that so many of us also have a heart for how our students learn and how can I do a better job for them and for me."

Visibility
Participants (n = 11) discussed being more aware of instructors teaching online as a result of the training. Instructor B commented, "I hear about what some people do, but I have no idea whether it is connected with eLII or not...Sure we kick around stuff in our departments, and some of those folks were involved in eLII stuff, but they were doing this stuff already anyway." On a more global level, Instructor C asserted, "I hear about more people teaching online now I think just because that's where the market is going, and we're going to have to respond to that." Instructor A shared her experience:
I've seen it [online learning] across our department...I would say a positive outcome is the fact that if other people want to do it. This friend of mine over in [another department], we talk all the time. She tells me about how she is implementing flipped learning. She does more of the traditional flipped classroom where she does the lectures outside of class and then they do the problem working inside of class.

Discussion
Findings from the current study illustrate some of the changes that occurred as a result of the year-long professional development initiative at a single research university. The results suggest that early adopters benefited from wide exposure to tools and required a much less formal, hands-on approach. In contrast, instructors who were moderate or late adopters of online and blended learning benefited from a step-by-step training approach that walked them through the integration of digital tools based on their specific teaching needs.
The current study is a tale of a single university and provides details on the barriers, challenges, and successes of a small group of instructors. Nevertheless, this study demonstrated the benefits of combining qualitative and quantitative approaches when the sample size is small. In this case, the quantitative results (i.e., the survey) provided a baseline at a point in time, but the data were inadequate for commenting on group and individual differences. Likely due to this limited sample size, no statistically significant findings emerged regarding differences by innovation level. However, the qualitative data illustrated nuanced differences and gave voice to the experiences of the instructors.
As detailed in the literature review, few studies situated in higher education institutions focus on online and blended learning and use a theory to ground the methodology. The current study was grounded in Rogers' (1962, 2003) innovation model and Moore and Benbasat's (1991) conceptualization of the perceived characteristics of innovation. Using this theory to guide the current inquiry helped to better understand how innovation characteristics influenced one another in the context of preparing instructors at a research university to teach in online and blended environments. Results of this study indicate that faculty members most frequently mentioned experiences that fell within the perceived characteristics of relative advantage, compatibility, and trialability. The characteristics of voluntariness and image were interpreted as having little influence on adoption levels. That voluntariness did not influence innovation adoption is likely because instructors at research institutions, on the whole, do not choose which courses they will teach or the format in which those courses will be taught, with the caveat that rank (i.e., lecturer, assistant, associate, or full) might provide an individual with leverage in these decisions. That image did not influence adoption rates is likely a result of the siloed nature of research institutions. At these types of institutions, instructors rarely interact across departments and might never interact with others across colleges. Thus, an instructor at a research university might be unaware of what is happening outside of his/her own department.
The current study furthers the research that has been conducted on faculty development for online and blended learning in institutions of higher education. For example, a study conducted by Shea, Pickett, and Li (2005) focused on satisfaction with online teaching among instructors across 33 unique and diverse campuses, including community colleges, technical colleges, four-year colleges, doctoral universities, and university centers. Although those findings were also theoretically situated in the DOI, those researchers focused on satisfaction with online learning within a network. In the current study, the findings focus on the story of one research-intensive university and the pedagogical changes that resulted around the eight perceived characteristics. The current study also took a more theoretical approach than previous research by using the perceived characteristics of the DOI theory as the measurable constructs, both qualitatively and quantitatively. Thus, by focusing on accepted theoretical constructs in the research design, the study was able to go deeper into the theoretical levers that may impact the adoption of online teaching and learning, not just overall satisfaction.

Limitations
Limitations of the current study include a lack of distinction between online and blended delivery, which may have contributed to a sense of mismatch between the purpose of the training and faculty expectations. There were also no presurvey data from faculty participants; the inclusion of presurvey information would have been helpful in determining whether the training helped increase an individual's self-reported innovation level. Changes in faculty perceptions of the innovation characteristics may also have differed between the initial week-long training and the follow-up meetings. Lastly, the relatively small sample size hindered the use of advanced quantitative analysis.

Conclusion
It is important to note that networking through the initial professional development, and later in the faculty learning communities, was an unexpected beneficial aspect of the professional development training. The creation of the learning communities with small groups of participants allowed faculty members with differing expertise to support one another through the learning process over a longer period beyond the initial week-long training. This direct application of skills and networking with peers may result in increases to some innovation characteristics (e.g., results demonstrability, relative advantage) in the context of a specific endeavor.
The research presented in this article details how one research university used professional development training to increase the quality, and quantity, of online and blended courses. As research-intensive universities shift more resources from the brick-and-mortar classroom into online and blended learning environments, professional development for course instructors will be imperative. This research highlighted one approach taken to that professional development as well as the method taken to evaluate its outcomes. The lessons learned can be of service to future instructors, learners, and leaders.

Appendix A

eLearning Innovation Initiative DOI Survey (excerpt; unless otherwise noted, items were rated on a 4-point scale: Strongly Disagree, Disagree, Agree, Strongly Agree)

…
They have a higher profile.
They are a status symbol in my organization.

Q4 Please rate how much you personally agree or disagree with these statements regarding implementing the skills you gained in the eLearning Innovation Initiative professional development.
The skills are compatible with all aspects of my work.
The skills fit well with the way I like to work.
The skills fit into my work style.

Q5 Please rate how much you personally agree or disagree with these statements regarding implementing the skills you gained in the eLearning Innovation Initiative professional development.
Using the skills is clear and understandable.
I believe it is easy for me to do what I want to do with the skills.
Overall, I believe it is easy for me to implement the skills.
Learning the skills is easy for me.

Q6 Please rate how much you personally agree or disagree with these statements regarding the demonstrability of implementing the skills gained in the eLearning Innovation Initiative professional development.
I would have no difficulty telling others how I implemented the skills I learned.
I believe I could communicate to others the consequences of implementing the skills.
The results of implementing the skills are apparent to me.
I would have no difficulty explaining why implementing the skills may or may not be beneficial.

Q7 Please rate how much you personally agree or disagree with these statements regarding visibility.
In my organization, I see other eLearning Innovation Initiative professional grant recipients using the skills I gained.
People who use the skills from the eLearning Innovation Initiative grant are not very visible in my organization.

Q8 Please rate how much you personally agree or disagree with these statements regarding the skills you gained in the eLearning Innovation Initiative professional development.
Before deciding whether to use any of the skills, I was able to adequately practice those skills.
I was permitted to use the skills on a trial basis long enough to see what I could do.

Q9 Please rate your adoption level on a scale from 1 to 5, with 1 being the last person to adopt and 5 being the first person to adopt.
How would you rate your adoption level using digital technology?
How would you rate your adoption level with regards to teaching blended courses?*
How would you rate your adoption level with regards to teaching fully online courses?**

* Blended courses are courses that have traditional face-to-face on-campus instruction in which some on-campus activities have been replaced by online learning activities.
** Fully online courses are courses that have all content and course activities online; there is no traditional face-to-face on-campus instruction.