The Development and Validation of the Distance Doctoral Program Integration Scale

Research indicates academic integration and social integration are predictors of doctoral student persistence at any program stage. However, researchers have not defined, operationalized, and measured academic or social integration consistently. Further, no instruments exist that measure academic and social integration of doctoral students in distance programs. This research aimed to define distance doctoral program integration and develop and analyze the structure, validity, and reliability of the Distance Doctoral Program Integration Scale. Instrument development followed a multi-step process, including expert review, pilot test, and exploratory factor analysis. Instrument reliability was assessed using Cronbach’s alpha and test-retest. The results indicated a three-factor structure (i.e., faculty integration, student integration, and curriculum integration). The 32-item instrument is valid and reliable, measuring program integration of doctoral students studying at a distance.

social integration, capture the idea that both personal and institutional variables, and the interaction of the two, influence a doctoral student's choice to persist. The purpose of this study was to apply Tinto's constructs to doctoral education by defining, operationalizing, and developing an instrument to measure academic integration and social integration of doctoral students in Distance Education (DE) programs.
Academic integration refers to interaction among students and faculty within the formal academic domain, and social integration refers to interaction among students and faculty outside the formal academic domain (Tinto, 1993). Tinto's (1975, 1993) model and constructs of academic integration and social integration may be considered the most respected, tested, confirmed, and widely cited work on integration and persistence (Kember, 1989, 1995; Simpson, 2003). Researchers have used Tinto's (1975, 1993) work on student integration as foundational in the development of models for doctoral student persistence (Wao & Onwuegbuzie, 2011) and DE student persistence (Rovai, 2003).
Further, researchers have noted that academic integration and social integration may not be mutually exclusive constructs for doctoral students (Braxton & Lien, 2000; Braxton, Sullivan, & Johnson, 1997). At the doctoral level, research suggests academic and social circles become the same (Lovitts, 2001; Tinto, 1993). In other words, doctoral students' academic and social interactions are often intertwined with many of the same students, faculty, and staff, making differentiation difficult (Lovitts, 2001; Tinto, 1993, 2017). There are currently validated instruments that measure aspects of academic integration and social integration. For example, the College Persistence Questionnaire (CPQ) contains items that measure aspects of academic and social integration of traditional undergraduate college students (Davidson et al., 2009; Davidson, Beck, & Grisaffe, 2015). The Classroom Community Scale (CCS) contains items closely aligned with academic and social integration but was designed specifically for single classes (Rovai, 2002a). The Doctoral Student Connectedness Scale (DSCS) (Terrell et al., 2009) also contains items that closely align with academic and social integration but was designed for students in their program's dissertation stage. None of these instruments was found suitable for fully measuring the identified elements of academic and social integration of doctoral students in DE programs.
This study's impetus was the lack of understanding of academic integration and social integration of doctoral students in DE programs. The following research questions guided this study: (a) What are the underlying factors that explain the integration of distance doctoral students? (b) Is the instrument valid and reliable for measuring integration of distance doctoral students?

Review of Relevant Literature
A thorough literature review indicated that doctoral students' integration in DE programs comprises elements of academic integration and social integration, thereby forming the foundation for instrument development. The literature suggested that doctoral students' academic integration in DE programs includes satisfaction levels with the academic program, student-faculty academic interactions, and student-student academic interactions. The literature suggested that doctoral students' social integration in DE programs includes satisfaction levels with the nature and quality of student-student and student-faculty nonacademic interactions within the doctoral program. We, therefore, used this literature as the basis for initial instrument development.

The Importance of Integration for Doctoral Students in DE Programs
The doctoral journey is marked with "challenges and demands of doctoral study" (Smith, Maroney, Nelson, Label, & Abel, 2006, p. 17). The doctoral journey is unique because development as an independent scholar is essential, albeit one of the most difficult parts of the process (Gardner, 2008; Walker, Golde, Jones, Bueschel, & Hutchings, 2008). Gardner (2008) described the paradox in doctoral education: "If someone holds your hand too much, you'll never learn to think for yourself, and if someone doesn't hold your hand enough you'll fall flat on your face" (p. 327). Success in the doctoral journey is rooted in developing as an independent scholar and in integration throughout the process. Lovitts (2005) posited that three factors affect doctoral degree completion: individual factors, the microenvironment (e.g., department, program, peers, and faculty), and the macroenvironment (e.g., the culture of education and the discipline). Within the microenvironment, the actions both faculty and students take to integrate doctoral students into the program and department, and with peers and faculty, are significant. Golde (2005) agreed that integration leads to doctoral student persistence. In fact, integration-related reasons are among the most commonly cited reasons doctoral students leave their programs (Lovitts, 2001).
During the doctoral journey, students integrate within and throughout their program. Students begin to develop academic and social circles during initial coursework. As they progress, interactions become much more localized and influenced by the faculty and student communities existing in their respective fields of study (Tinto, 1993). These interactions are often intertwined with many of the same students, faculty, and staff. During the dissertation, the sphere of integration shrinks significantly, generally to the few faculty involved in the dissertation process (Tinto, 1993). The ability to integrate and develop positive working relationships within the program at this stage is so critical to persistence "that it may hinge largely if not entirely upon the behavior of a specific faculty member" (Tinto, 1993, p. 237). Understanding the elements of academic integration and social integration of doctoral students in DE programs is vital for researchers, faculty members, and administrators, given that integration is predictive of persistence, and persistence is a problem.

Academic Integration for Doctoral Students in DE Programs
Though the definition and measurement of academic integration have varied even within doctoral studies, critical lines of doctoral education research for both distance and residential students have consistently described academic integration as important in understanding doctoral student persistence (Bair, 1999; Ivankova & Stick, 2007; Lovitts, 2001; Rockinson-Szapkiw et al., 2016; Rovai, 2003; Spaulding & Rockinson-Szapkiw, 2012; Tinto, 1993; Wao & Onwuegbuzie, 2011; Wyman, 2012). The level of academic integration has been linked to satisfaction, and the literature suggests higher satisfaction levels positively influence doctoral student persistence (Bair, 1999; Ivankova & Stick, 2007) and time to degree (Wao & Onwuegbuzie, 2011). For doctoral students, academic integration happens, and is important, in all phases of their program (e.g., coursework, comprehensive examinations, or dissertation) (Bair, 1999; Golde, 2000; Rockinson-Szapkiw & Spaulding, 2014; Tinto, 1993). Drawing from a thorough review of the literature, distance doctoral student academic integration was defined as the student's satisfaction with (a) the academic program, (b) student-faculty academic interactions, and (c) student-student academic interactions. Items were developed to encapsulate these elements.

Academic program
Doctoral students' satisfaction with the academic program has been positively associated with doctoral student persistence in both the traditional and distance environments (e.g., Bair, 1999; Girves & Wemmerus, 1988; Ivankova & Stick, 2007; Lindsay, Kerawalla, & Floyd, 2018; Lovitts, 2001; Rockinson-Szapkiw et al., 2016; Rovai, 2002b; Wao & Onwuegbuzie, 2011). Indicators of academic program satisfaction in the two environments are very similar. For example, in her meta-synthesis of nearly 30 years of residential doctoral student persistence and attrition research, Bair (1999) identified the aspects of academic program satisfaction most closely related to persistence as the perceived academic quality and the relevancy of the curriculum and instruction to the student's work. Research has since supported Bair's (1999) findings.
In their mixed methods study of factors related to residential doctoral student time-to-degree, Wao and Onwuegbuzie (2011) found that students who were satisfied with their courses, the sequencing of those courses, and the degree to which the coursework prepared them for the dissertation tended to have shorter completion times. Likewise, persistence increased, and time-to-degree decreased, when residential students were interested in their coursework and dissertation topic (e.g., a good fit with personal interests, application to future job goals, application to real life, or other similar reasons) (Bair, 1999; Earl-Novell, 2006; Golde, 2005; Hoskins & Goldberg, 2005; Lindsay et al., 2018; Spaulding & Rockinson-Szapkiw, 2012; Wao & Onwuegbuzie, 2011). Research indicates similar findings in the distance environment. Doctoral students in DE programs who perceived higher levels of learning, course relevance, and course usefulness indicated greater academic program satisfaction or academic integration (Ivankova & Stick, 2007; Rovai, 2002b).

Student-faculty academic interactions
In her meta-synthesis, Bair (1999) also identified that "the single most frequently-occurring finding…was that successful degree completion is related to the degree and quality of contact between a doctoral student and her or his advisor(s) or other faculty in the student's doctoral program" (pp. 67-68). Positive academic-focused relationships with faculty can decrease time-to-degree (Maher, Ford, & Thompson, 2004; Wao & Onwuegbuzie, 2011). This need for positive student-faculty interactions coincides with Moore's (1989) suggestion that positive academic-based faculty interaction is essential and desirable in the DE setting.

Student-student academic interactions
Similar to the need for positive academic-based student-faculty interactions, Moore (1989) suggested positive, academic-based student-student (or peer) interaction is very important in the distance environment. Academic-based peer interactions are those related to program completion (e.g., coursework, comprehensive examinations, or dissertation) (Bair, 1999;Lovitts, 2001;Rovai, 2014;Wao & Onwuegbuzie, 2011). These academic interactions can be formal or informal (whether online or in the classroom) and can occur on a regular or irregular basis (Lovitts, 2001).
Academic interactions among peers occur using means similar to those previously described for student-faculty interactions (Moore, 1993, 2019; Simonson et al., 2012). However, the frequency of interaction does not necessarily correlate with higher satisfaction levels with interaction. In DE, interaction quality is more important than quantity (Picciano, 2002; Rovai, 2014; Simonson et al., 2012), and students with low interaction frequencies may still be satisfied with their interaction levels (Picciano, 2002).
Social Integration for Doctoral Students in DE Programs

Exacerbating the issue is that doctoral students' social integration is closely intertwined, and even blurred, with academic integration (Lovitts, 2001; Tinto, 1993). Researchers have referenced academic-related factors (e.g., timeliness of faculty feedback, course-related conversations outside the classroom, and interactions within the doctoral department) when describing the social integration of doctoral students (Bair, 1999; Golde, 2000; Terrell et al., 2009; Wao & Onwuegbuzie, 2011). However, researchers do agree that, for doctoral students, social integration is a consequence of both academic and nonacademic interactions (Bair, 1999; Golde, 2005; Ivankova & Stick, 2007; Lovitts, 2001; Rockinson-Szapkiw et al., 2016; Spaulding & Rockinson-Szapkiw, 2012; Terrell et al., 2009; Terrell et al., 2012; Tinto, 1993; Wao & Onwuegbuzie, 2011; Wyman, 2012). Drawing from a thorough review of the literature, distance doctoral student social integration was defined as the student's satisfaction with the nature and quality of student-student and student-faculty nonacademic interactions within the program. Items were developed to encapsulate these elements.
Social integration of doctoral students is developed "through informal, casual interactions between and among graduate students and graduate faculty in a variety of contexts" (Lovitts, 2001, p. 42). In DE programs, these interactions can stem from any peer and faculty interactions (Rockinson-Szapkiw et al., 2016; Rovai, 2002a; Terrell et al., 2009; Terrell et al., 2012). Hill (1996) posited it is important to understand the contexts of interactions. Positive feelings of interactions "may not be defined in a geographical sense [and may] consist of groupings of people who…may never physically meet each other" (Hill, 1996, p. 433), such as in the DE environment.
Peer and faculty interactions help develop positive relationships and feelings of being connected to others in the distance environment (Garrison, Anderson, & Archer, 2000; Ivankova & Stick, 2007; Moore, 2019; Rockinson-Szapkiw et al., 2016; Terrell et al., 2009; Terrell et al., 2012). The literature suggests that when interactions are positive, students are connected or integrated with fellow students and faculty within the program (Lovitts, 2001; Rockinson-Szapkiw et al., 2016; Terrell et al., 2009; Rovai, 2002a, 2002b, 2014; Tinto, 1993). Ivankova and Stick (2007) posited that good social integration is indicated when doctoral students in DE programs feel supported by, and perceive encouragement from, peers and faculty within the bounded system of a course or through participation in online activities (academic or nonacademic).
Indicators of poor social integration of doctoral students include terms such as a lack of understanding, a lack of encouragement, feelings of competitiveness and competition, neglect, and personal issues with dissertation committees and chair advisors (Bair, 1999; Bowen & Rudenstine, 1992; Girves & Wemmerus, 1988; Ivankova & Stick, 2007; Lovitts, 2001; Rovai, 2002a; Terrell et al., 2009; Wao & Onwuegbuzie, 2011; Wyman, 2012). Lovitts (2001) found terms related to feelings of isolation (e.g., lack of cohesion, social deprivation, isolated, and little personal contact) to be "the most frequently cited integration-related reasons" (p. 177) leading to doctoral students' decisions to exit a program. Lovitts (2001, 2005) also noted that feelings of isolation and disconnectedness from faculty and peers, especially during the dissertation phase, were indicators that social integration was not present. Terrell et al. (2009) suggested that doctoral students in DE programs who do not interact face-to-face with peers and faculty on campus may experience exacerbated feelings of isolation and disconnectedness.

Participants
Participants in this study consisted of a snowball sample of 282 DE students enrolled in education doctorate programs in late 2018 across multiple higher education institutions. Snowball sampling included emails sent to industry professionals with access to potential participants and invitations to participate posted to professional organization listservs (e.g., AERA, VACES). The researchers used snowball sampling to access participants from multiple institutions as a means to increase participant demographic variability and increase generalizability (Warner, 2013). The researchers limited participants to those enrolled in education doctorate programs (either EdD or PhD) as a means to minimize the effects of variability across multiple doctoral program disciplines (Gall, Gall, & Borg, 2007).
This study focused on distance education programs considered online because at least 80% of the program was delivered online (Allen & Seaman, 2014). This definition of distance education is consistent with the Integrated Postsecondary Education Data System's (IPEDS) two categories of distance education enrollment used in Seaman, Allen, and Seaman's (2018) distance education report. The term distance education included exclusively distance education (100% enrollment in online courses) and some but not all distance education (e.g., enrollment in courses of mixed modalities, including some online courses).

Instrumentation and procedures
A literature review suggested that integration, regardless of setting or program level, is inclusive of both academic and social integration. However, research demonstrating the links between persistence, academic integration, and social integration is sometimes not clear (e.g., Braxton & Lien, 2000; Braxton et al., 1997), and these two constructs are closely intertwined at the doctoral level (Lovitts, 2001; Tinto, 1993). Thus, drawing from the literature on social and academic integration of doctoral students in DE programs, including previously developed instruments such as the CPQ (Davidson et al., 2009; Davidson et al., 2015), the CCS (Rovai, 2002a), and the DSCS (Terrell et al., 2009), 50 items were developed for the instrument.
All items were positively worded and asked respondents to rate their level of satisfaction as very high (5), high (4), medium (3), low (2), or very low (1). The score for each identified subscale was computed by adding the item points and averaging them; higher scores reflect stronger integration. The initial instrument items were assessed for content and face validity by a subject matter expert (SME) panel (Warner, 2013). The SME panel comprised four experts who had published on doctoral persistence, online persistence, or online education. All had experience teaching within online doctoral programs. The SME panel review consisted of two reviews.
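As a concrete illustration, the scoring rule just described (sum the item points for a subscale, then average them) can be sketched in a few lines. The item groupings and ratings below are hypothetical, not the actual DDPIS scoring key.

```python
# Sketch of the subscale scoring rule: item points are summed and averaged,
# so each subscale score stays on the same 1-5 satisfaction metric as the
# individual items, and higher scores reflect stronger integration.
def subscale_score(responses, item_ids):
    """Mean of the 1-5 ratings for the items in one subscale."""
    ratings = [responses[i] for i in item_ids]
    return sum(ratings) / len(ratings)

# Example respondent: item number -> rating (5 = very high ... 1 = very low).
responses = {1: 4, 2: 5, 3: 3, 4: 4, 5: 2, 6: 3}
faculty_items = [1, 2, 3]   # hypothetical grouping, for illustration only
student_items = [4, 5, 6]   # hypothetical grouping, for illustration only

faculty = subscale_score(responses, faculty_items)  # (4 + 5 + 3) / 3 = 4.0
student = subscale_score(responses, student_items)  # (4 + 2 + 3) / 3 = 3.0
```

Averaging rather than simply summing keeps subscales with different item counts directly comparable.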
During the first review, the SME panel examined the instrument items against the following criteria: content validity, face validity, clarity, conciseness, and reading level (Worthington & Whittaker, 2006). The experts rated each item on each criterion using a five-point scale (1 = very poor; 5 = very good). The experts also provided open responses explaining their item ratings and suggesting improvements. Mean scores for each item were computed and comments were analyzed. Any item that did not achieve a mean score of at least 4 out of 5 was adjusted or deleted, and suggestions were used to modify items. The items were then provided to the reviewers again. During the second review, the SME panel selected the instrument items that, in aggregate, appeared to fully measure distance doctoral student integration. The panelists reached a consensus that 34 of the items did so.
The 34-item scale was then assessed in a pilot study conducted with sample participants (n = 8) to assess face validity and item relevancy and to obtain an estimated time to complete (Warner, 2013). Feedback and evaluation from the pilot participants indicated the 34-item scale was ready for further evaluation.
Snowball sampling (Gall et al., 2007) was used to recruit participants for the next step. Following Dillman, Smyth, and Christian's (2009) recommendations, an email invitation to participate in the study was sent to students and faculty associated with distance doctoral education programs at nine institutions. Additionally, an invitation to complete an online survey consisting of demographic questions, program experience questions, and the 34-item scale was sent to doctoral students via professional organization listservs. Within the email invitation, the participation criterion was defined as enrollment in an education doctorate program in which 80% or more of the coursework was completed online. Initially, 322 students responded to the survey.
Thirty-four cases were deleted because they were missing a large amount of data. An additional five cases were disqualified because the respondents indicated they were not in a distance doctoral program. There were also 15 cases with data that appeared to be missing completely at random; we chose to retain these cases by imputing the missing data using mean substitution (Tabachnick & Fidell, 2007). The final sample consisted of 282 cases with valid and complete responses. This sample size was well within the acceptable limits for an exploratory factor analysis (EFA) (Comrey & Lee, 1992; Kass & Tinsley, 1979; Warner, 2013).
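A minimal sketch of the mean-substitution step follows, assuming item-level ratings with `None` marking a missing response; the data are invented for illustration and are not the study's responses.

```python
# Mean substitution (Tabachnick & Fidell, 2007): each missing rating is
# replaced by the mean of the observed ratings for that item across cases.
def impute_item_means(cases):
    """cases: list of equal-length response lists; None marks a missing rating.
    Returns a new list of cases with missing values replaced by item means."""
    n_items = len(cases[0])
    means = []
    for j in range(n_items):
        observed = [case[j] for case in cases if case[j] is not None]
        means.append(sum(observed) / len(observed))
    return [[means[j] if case[j] is None else case[j] for j in range(n_items)]
            for case in cases]

cases = [[5, 4, None],
         [3, None, 2],
         [4, 4, 4]]
completed = impute_item_means(cases)
# The missing third item for the first case becomes (2 + 4) / 2 = 3.0,
# and the missing second item for the second case becomes (4 + 4) / 2 = 4.0.
```

Mean substitution preserves each item's mean but shrinks its variance slightly, which is one reason it is generally reserved for small amounts of data missing completely at random.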

Results
Data were assessed and found suitable for analysis. Inspection of the correlation matrix indicated many of the coefficients were greater than the threshold of .3 (Tabachnick & Fidell, 2007). The Kaiser-Meyer-Olkin Measure of Sampling Adequacy was .961, exceeding the .6 critical value (Kaiser, 1974). Bartlett's Test of Sphericity was statistically significant (χ2 = 8001.279, p < .001), supporting the factorability of the correlation matrix and the assumption of multivariate normality (Tabachnick & Fidell, 2007). Thus, to investigate the instrument's validity and structure, a maximum likelihood EFA with oblique rotation was conducted. Maximum likelihood is the preferred extraction method when data are suitable and generally normally distributed (Fabrigar, Wegener, MacCallum, & Strahan, 1999). The correlation matrix (see Table 1) contained numerous underlying correlations greater than .3, supporting the use of oblique rotation (Fabrigar et al., 1999; Tabachnick & Fidell, 2007). The decision to retain a three-factor solution was based on inspection of the eigenvalues, inspection of Cattell's (1966) scree plot, parallel analysis, interpretability criteria, and consideration of conceptual understanding of the literature.
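To make the factorability check concrete, Bartlett's sphericity statistic can be sketched from its standard formula; it measures how far a correlation matrix R departs from an identity matrix (no inter-item correlations). The toy matrices below are illustrative only and are not the study's 34-item correlation matrix.

```python
import numpy as np

def bartlett_sphericity(R, n):
    """Return (chi-square statistic, degrees of freedom) for Bartlett's test
    of sphericity on a p x p correlation matrix R computed from n cases."""
    p = R.shape[0]
    _, logdet = np.linalg.slogdet(R)            # log determinant of R
    chi2 = -(n - 1 - (2 * p + 5) / 6) * logdet  # standard Bartlett statistic
    df = p * (p - 1) / 2
    return chi2, df

# Uncorrelated items (identity matrix) give chi2 = 0; correlated items give
# a large positive statistic, supporting factorability of the matrix.
R_identity = np.eye(3)
R_correlated = np.array([[1.0, 0.8, 0.7],
                         [0.8, 1.0, 0.6],
                         [0.7, 0.6, 1.0]])
chi2_null, df = bartlett_sphericity(R_identity, 282)
chi2_corr, _ = bartlett_sphericity(R_correlated, 282)
```

A p-value would come from referring the statistic to a chi-square distribution with df degrees of freedom; a statistic as large as the reported 8001.279 is far beyond the p < .001 threshold.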
All but two of the 34 items (items 9 and 30) loaded on one of the three factors. Many items loaded strongly on a primary factor (i.e., above .5; Tabachnick & Fidell, 2013). Two items (items 1 and 4) had communalities (h2) below .4 (see Table 2); however, the pattern matrix (see Table 2) indicated these items' loadings were above the higher cutoff threshold of .5 (Kahn, 2006), so they were retained. Therefore, the decision was made to retain 32 items (items 9 and 30 were removed). The factors were named (a) faculty integration, (b) student integration, and (c) curriculum integration. Mean scores for each factor are also in Table 2.
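The retention logic just described (drop an item only when no factor loading reaches the cutoff) can be sketched as follows; the loading values are invented for illustration and are not the published Table 2 values.

```python
# Illustrative item-retention rule: keep an item if its strongest loading
# across the three factors meets the .5 cutoff (Kahn, 2006); otherwise drop it.
def retained_items(loadings, cutoff=0.5):
    """loadings: {item_id: (f1, f2, f3)} -> sorted list of retained item ids."""
    return sorted(i for i, row in loadings.items()
                  if max(abs(v) for v in row) >= cutoff)

loadings = {1:  (0.62, 0.10, 0.05),   # loads cleanly on factor 1 -> kept
            9:  (0.21, 0.18, 0.34),   # no loading reaches .5 -> dropped
            30: (0.30, 0.41, 0.12),   # no loading reaches .5 -> dropped
            12: (0.08, 0.71, 0.11)}   # loads cleanly on factor 2 -> kept
kept = retained_items(loadings)       # [1, 12]
```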
The internal consistency of the 32-item instrument was assessed using Cronbach's alpha. The alpha coefficient for the full instrument was .966, indicating excellent reliability (George & Mallery, 2003). The alpha coefficients for the faculty integration and student integration factors were .937 and .957, respectively, both indicating excellent reliability. The alpha coefficient for the curriculum integration factor was .899, indicating good reliability. Test-retest reliability was also calculated approximately four weeks after the initial round of participation using data from 109 participants (Warner, 2013). The Pearson correlation for the full instrument was r(107) = .855, p < .01. For the faculty integration factor, r(107) = .780, p < .01; for the student integration factor, r(107) = .810, p < .01; and for the curriculum integration factor, r(107) = .842, p < .01. These results exceeded the .70 reliability criterion suggested by Warner (2013), providing further evidence that the instrument is reliable.
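The two reliability indices reported above can be sketched in a few lines of plain Python; the toy data are illustrative only (the published coefficients come from the 282-case and 109-case samples).

```python
from statistics import mean, pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    items: list of per-item score lists (same respondents, same order)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]      # per-respondent totals
    item_var = sum(pvariance(scores) for scores in items) # sum of item variances
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

def pearson_r(x, y):
    """Pearson correlation, e.g., between time-1 and time-2 (retest) scores."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Perfectly parallel items give alpha near 1; a perfectly stable retest gives r = 1.
alpha = cronbach_alpha([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]])
retest = pearson_r([3.0, 4.2, 4.8, 2.5], [3.1, 4.0, 4.9, 2.7])
```

Alpha rises with the number of items and with inter-item correlation, which is one reason the 32-item total scale shows a higher alpha than any of its three subscales.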

Discussion
This study examined the dimensionality, validity, and reliability of an instrument created to measure distance doctoral students' integration. In this study, the instrument was developed, refined, and tested with 282 students enrolled in doctoral programs in education that were offered online. Evidence from the exploratory factor analysis and internal consistency analysis demonstrated that the 32-item self-report instrument has both validity and reliability. The final scale was found to have three dimensions. These results surprised the researchers.
The researchers designed the instrument to measure elements of academic integration (satisfaction with the academic program, student-faculty academic interactions, and student-student academic interactions) and social integration (satisfaction with the nature and quality of student-faculty and student-student nonacademic interactions). The curriculum-related items loaded as expected. However, the remaining items loaded differently than expected. Using interpretability criteria (O'Rourke & Hatcher, 2013), it was clear that all faculty-related items loaded on one factor, all student-related items loaded on a second factor, and all curriculum items loaded on a third factor. These loadings indicated that what mattered was who the interaction was with, not the type of interaction.
The identified dimensions appear to describe the integration of distance doctoral students more accurately than the academic integration and social integration described in the literature. The literature described that, at the doctoral level, academic integration and social integration become intertwined (Lovitts, 2001; Tinto, 1993). However, the results of this study indicate the term intertwined may not go far enough; perhaps a better term is conjoined. Merriam-Webster's (2018) thesaurus recommends the use of conjoining to describe how separate items "come together as a single unit" (para. 1). In this research, items designed to separately measure academic integration and social integration conjoined according to who the interaction was with (faculty or peers), not the interaction type (academic or social). These findings suggest the terms academic integration and social integration, as used in the literature, do not adequately explain the integration of doctoral students studying at a distance.
Therefore, in lieu of the separate terms academic integration and social integration, we suggest the term program integration, and we offer the following as a more accurate definition of program integration for doctoral students in distance programs: the satisfaction level with faculty integration, student integration, and curriculum integration. We also suggest the following definitions for the three identified dimensions. Faculty integration is the satisfaction level with the nature and quality of academic and nonacademic student-faculty interactions that take place during the distance doctoral program. Student integration is the satisfaction level with the nature and quality of academic and nonacademic student-student interactions that take place during the distance doctoral program. Curriculum integration is the satisfaction level with the quality and relevancy of the curriculum in the distance doctoral program. Accordingly, we named the instrument the Distance Doctoral Program Integration Scale (DDPIS).

Implications
This research may help narrow the gap in understanding the program integration of doctoral students in DE programs. The literature is clear that there is a link between integration and the persistence of doctoral students in DE programs (Bair, 1999; Golde, 2005; Ivankova & Stick, 2007; Lovitts, 2001; Rockinson-Szapkiw et al., 2016; Spaulding & Rockinson-Szapkiw, 2012; Rovai, 2003; Terrell et al., 2009; Terrell et al., 2012; Tinto, 1975, 1993; Wao & Onwuegbuzie, 2011; Wyman, 2012). This instrument may be used to further understand the importance of program integration and may also help decision makers identify and mitigate program integration issues at any stage of the doctoral student's journey, thereby increasing persistence.

Limitations
This study is empirically significant and has practical value; however, it is not without limitations. EFA is an exploratory method. While multiple methods of factor extraction and interpretability criteria (O'Rourke & Hatcher, 2013) were used to identify the best factor solution, "decisions about number of factors and rotational scheme are based on pragmatic … criteria" (Tabachnick & Fidell, 2007, p. 611). Although it is not likely, given the multiple high, statistically significant variable loadings, the thorough review of the literature informing item development, and the SME review, spurious correlations could still be a limitation of this study (Tabachnick & Fidell, 2007). The sample size also presented a potential limitation. For factor analysis, many (Comrey & Lee, 1992; Kahn, 2006; Warner, 2013) recommend a sample size of at least 300; the n of 282 could be considered small and could reduce the reliability of the correlation coefficients. This study also used a convenience sample delimited to a specific population (i.e., doctoral students in an asynchronous online school of education program with at least 80% of the program delivered at a distance). Narrowing the sample to a specific population is a delimitation that reduced the ability to generalize results (Warner, 2013) to the larger population of doctoral students across various disciplines (e.g., science, technology, engineering, and math [STEM] degrees). These limitations provide impetus for future research.

Recommendations for Future Research
The results of this study indicate the DDPIS is a valid and reliable instrument for measuring distance doctoral student program integration. Given the exploratory nature of this study, there is certainly a need to continue research on the DDPIS and doctoral program integration. Recommendations for future research include the following:
• Conduct a confirmatory factor analysis (CFA) on the DDPIS to confirm the factor structure.
• Conduct research to increase generalizability. The eventual goal is for the DDPIS to be a valid and reliable instrument for doctoral students in additional non-STEM and STEM DE programs.
• Conduct a longitudinal study to determine if the DDPIS is able to predict persistence and time-to-degree of doctoral students in DE programs.
• Conduct prediction studies to determine integration differences of doctoral students in various program stages.
• Conduct studies using the DDPIS in targeted populations to see how demographic variables may be associated with integration and persistence.
This research provided strong evidence that academic integration and social integration may actually be conjoined. It was evident through this research that distance doctoral program integration is important to doctoral students and is inclusive of the factors of faculty, student, and curriculum integration. For distance doctoral students, in addition to satisfaction with the curriculum, the level of satisfaction with their interactions with both faculty and peers, regardless of whether those interactions are academic or social, is what appears important at all stages of the doctoral journey.
The DDPIS was developed to measure integration of distance doctoral students at any stage of their program. As students navigate a doctoral program, their needs and abilities to integrate may change (Tinto, 1993). For example, in the early stage of their program, students attempt to find their place as they try to integrate into their program's communities (Tinto, 1993). Later in the program, integration tends to become more localized within smaller communities and eventually narrows to the few (e.g., student cohort, committee, and chair) involved in the dissertation process (Tinto, 1993). Therefore, the DDPIS may be used as a formative assessment at any stage to provide information about integration and address integration-related issues that may lead to attrition.
Universities have a responsibility to identify factors that promote doctoral student persistence (Bair, 1999), and the DDPIS has substantial utility for faculty and administrators of distance doctoral programs in identifying program integration issues or at-risk students. Armed with the ability to identify integration shortfalls associated with program persistence, universities can develop and implement policies and targeted initiatives that promote doctoral student program integration. Research indicates students who are satisfied with their integration are more likely to persist.

Tables

Table 1. Correlation Matrix of DDPIS Items (n = 34)
Table 2. Factor Loadings, Communalities (h2), and Means of DDPIS Items
Removed items: 9 (How the dissertation process is preparing you, or will prepare you, for your goals) and 30 (Your level of trust in the faculty).
Note. 1 = Faculty Integration, 2 = Student Integration, 3 = Curriculum Integration, h2 = communalities. Sorted by size and only the highest loadings for each factor retained for ease in viewing.