Examining How Online Professional Development Impacts Teachers’ Beliefs About Teaching Statistics

With online learning becoming a more viable option for teachers to develop their expertise, our report shares one such effort focused on improving the teaching of statistics. We share design principles and learning opportunities in an online course developed specifically to serve as a wide-scale online professional development opportunity for educators, thus deemed a massive open online course for educators (MOOC-Ed). In this report we focus on a subset of 412 participants who identified themselves as classroom teachers. We use multiple data sources, quantitative and qualitative, to characterize changes in teachers’ beliefs and perspectives about statistics and to identify triggers in the course that appear to influence teachers’ sense making about issues related to teaching statistics. We discuss implications about specific course experiences that served as triggers for critical reflection and change.

Innovations in online learning environments and changes in K-12 mathematics curricula have created new opportunities to think creatively about how technological solutions could be used to provide professional development for teachers. Indeed, in 2013 Marrongelle, Sztajn, and Smith proclaimed it was "incumbent on the field to capitalize on emerging technologies in the design and delivery of effective professional development" and emphasized the need for "research that focused on teacher learning in these environments" (p. 208). The past several decades have seen an increased emphasis on student-centered, investigative approaches to learning and teaching content within science, technology, engineering, and mathematics (STEM) classrooms (Granger, Bevis, Saka, Southerland, Sampson, & Tate, 2012; National Research Council, 2000). Changes in mathematics standards over the past twenty years have given the topic of statistics a prominent place in secondary curricula in the U.S. and many other countries.
Across the globe, platforms, tools, and internet access have paved the way for many Massive Open Online Courses (MOOCs) and other distance course offerings related to STEM content, especially statistics. Options abound for courses in which a learner can develop knowledge in statistics. Two examples include the Data to Insight course at the University of Auckland in New Zealand (www.futurelearn.com/courses/data-to-insight) and a five-course sequence developed at Duke University in the U.S. (www.coursera.org/specializations/statistics). However, online courses designed for learning to teach STEM content, particularly teaching statistics, are relatively rare. Franklin et al. (2015) call for greater attention to the statistical education of teachers, including practicing teachers. Professional development (PD) for secondary mathematics teachers to develop their statistical content and pedagogy is being offered across the world, typically in small local settings in schools or districts. While such efforts may effectively impact the practices of teachers in these small settings, the need for preparing teachers to teach statistics is much bigger than what can be addressed only by local programs. For example, in Germany, Biehler (2016) led the development and implementation of PD for secondary teachers that started on a smaller scale and expanded to reach many more mathematics teachers in Germany. Two efforts to offer MOOCs on learning to teach statistics, with very different approaches, have been developed in the U.S. The design of these courses and lessons learned have been shared by Lee and Stangl (2015). One of these courses, Teaching Statistics with Data Investigations (TSDI), is the focus of this paper.
With an online solution at a much larger scale, methods for examining impacts must also evolve. While research on face-to-face PD can examine teachers' development in situ and their local classroom practices, PD done at a distance online adds challenges for examining such development. We offer a glimpse at one effort to use participants' online activity, forum discussions, and self-reported changes on surveys to measure impact.
Specifically, our research questions are: (1) Which resources and experiences in the course seem to trigger critical reflection? (2) What evidence is there that engaging in the MOOC-Ed impacted teachers' beliefs and perspectives about teaching statistics, which could in turn impact teaching practices?

Review of Related Literature
The intent of this section is to provide background critical to the domain of STEM teacher education, especially statistics teacher education. We then focus the literature review on broader issues of designing online professional learning experiences and on how to frame our study to examine impacts of an online PD course for teaching statistics.

Teaching Beliefs, Perspectives, and Practices
The success of reform movements in STEM education is contingent on changes in teachers' classroom practice (Milner, Sondergeld, Demir, Johnson, & Czerniak, 2012). Many researchers in STEM education agree that understanding teachers' beliefs is critical to integrating reforms in classrooms (e.g., Yasar, Baker, Robinson-Kurpius, Krause, & Roberts, 2006) as teachers' beliefs are an important factor in influencing their practice (Grossman, 1990). According to Stipek, Givvin, Salmon, and MacGyvers (2001), most teachers believe mathematics is a static body of knowledge that involves rules and procedures that lead to one right answer, whereas inquiry-oriented mathematics teachers view mathematics as dynamic and as a tool for problem solving. They found that teachers' beliefs were associated with their classroom practices in predicted directions (i.e., more traditional beliefs were associated with more traditional practices). Capps and Crawford (2012) found that even well-qualified, highly motivated teachers had difficulty enacting reform-based teaching in science; in particular, teachers held limited views of inquiry-based instruction and the nature of science, and these perspectives were reflected in their practice. However, there is evidence to suggest that teachers are able to shift from a perspective that learning is about rules and procedures to one of inquiry, investigation, and critical thinking about key STEM concepts (e.g., Seung, Park, & Narayan, 2011). De Vries, Jansen, and Van De Grift (2013) found that the more teachers engaged in continuing PD, the more student-centered they became, shifting from more traditional orientations.
Beliefs and perspectives that teachers may hold specifically related to statistics include ideas about the nature of statistics, about themselves as learners of statistics, and about what they perceive as important goals for students' learning of statistics (e.g., Eichler, 2011; Pierce & Chick, 2011). Teachers' views of themselves as learners of statistics often include memories of lessons focused on graphing or using formulas to generate statistical measures, often without the aid of technology (Lovett & Lee, 2017). Such experiences may lead teachers to believe statistics is about performing a set of procedures. However, teachers may also feel that reasoning with context-rich data and the uncertainty of statistical claims can make statistics difficult to learn and teach (e.g., Lovett & Lee, 2017; Leavy, Hannigan, & Fitzmaurice, 2013). One's confidence to teach statistics is then influenced by beliefs and perspectives about statistics, prior experiences in learning and teaching statistics, and understanding of statistical content (Lovett & Lee, 2017; Harrell-Williams, Sorto, Pierce, Lesser, & Murphy, 2015).
Teachers' beliefs and confidence levels would likely lead to different teaching practices. For example, if a teacher believes that statistics is a way of quantifying data and that procedures for computing statistical measures lead to such quantification, they may be quite confident in teaching statistics and their teaching practices may favor a focus on statistical procedures. Such teaching would likely have less emphasis on the rich contexts of data, the process of ensuring good data is collected and available, and making claims about data that are uncertain in nature (Pierce & Chick, 2011). Eichler (2011) posited that the focus of teachers' intended curriculum in statistics can be considered on a continuum from traditionalists (focused on procedures absent of context), to those wanting students to be prepared to use statistics in everyday life (focused on engaging in an investigative process that is tightly connected to contexts of real data). A goal in statistics teacher PD is to move teachers along this continuum towards a focus on investigative processes, which requires impacting teachers' beliefs about the nature of statistics and learning goals for students related to statistics. Seaton and colleagues (2015) found that teachers (university and K-12) were enrolling in content-focused MOOCs on the edX platform and that they were highly engaged as participants in discussion forums. The teachers, representing only 4% of MOOC participants, contributed 22% of posts in forums. This suggests that an online community in a MOOC may attract and support teachers as they learn new content and pedagogy. Designing PD in a MOOC context, though, should be based on effective practices for teachers' learning, on and offline.

Designing Online Professional Development
The Conference Board of Mathematical Sciences (2012) recommends that PD engages teachers in solving problems and deeply exploring content in a professional learning community, analyzing authentic student work, and participating in collaborative task design. PD that includes accessible, personalized, and self-directed elements can provide increased opportunities for sustained, collaborative, and meaningful work among teachers that can affect their knowledge, beliefs, and practice (e.g., Vrasidas & Zembylas, 2004). Online PD that addresses the varied needs and abilities of its participants has been shown to be effective in changing teachers' instructional practice (e.g., Renninger, Cai, Lewis, Adams, & Ernst, 2011). Many designers of online PD emphasize that activities should be meaningful, accessible, and relevant so participants can apply their professional learning to their individual educational context (e.g., Luebeck, Roscoe, Cobbs, Diemert, & Scott, 2017; Vrasidas & Zembylas, 2004). While research on the impacts of MOOCs often examines click logs as an indicator of whether or not educators are accessing important learning material, Jacobsen's (2019) work clearly illustrates how busy professional educators who appear to have "dropped out" of a PD MOOC indeed accessed and utilized selected resources they perceived as relevant to their educational context, and that this selective use in turn had an impact on their teaching perspectives and practices.
Active learning experiences and peer interactions are hallmarks of most PD experiences for teachers and can help build a community among participants. Just as communities can form in face-to-face PD, online PD should facilitate an online community. Designers of online courses should build infrastructure to support active learning and peer interaction across geographic and time zone boundaries. Within online PD for educators, asynchronous discussion forums, for example, provide opportunities for participants to reflect on practice, exchange ideas, and discuss ways to improve on their own schedules with colleagues with whom they may not otherwise interact (e.g., Treacy, Kleiman, & Peterson, 2002). Researchers have highlighted benefits of such communities that are not always afforded in traditional face-to-face PD. For example, Mackey and Evans (2011) argued that online communities provide members with "extended access to resources and expertise beyond the immediate school environment" (p. 11), thereby offering ongoing PD and the potential for increased application in classrooms. In order to maximize benefits, designers of online PD programs must be creative in building the infrastructure necessary to support such communities, as participants have the challenge of not being physically in the same place when engaging in online activities.

Online Course Context for the Study
In recognizing the potential for MOOCs to serve as large-scale teacher PD, we are part of teams that have created MOOCs for Educators (MOOC-Eds) to assist teachers in developing new strategies for improving teaching and forming local and global communities of educators. While MOOC-Eds have not had the "massive," large-scale enrollment of other MOOCs, they do reach larger numbers of educators than typical online PD courses. MOOC-Eds are intended to attract professional educators who are specifically looking to engage in a free, open online course that is marketed to educators beyond specific geographical boundaries. Thus, the MOOC-Ed effort at the Friday Institute for Educational Innovation at North Carolina State University includes a collection of courses built using research-based design principles of effective PD and online learning (Garet et al., 2001; Darling-Hammond et al., 2009) that emphasize: (a) self-directed learning, (b) peer-supported learning, (c) job-connected learning, and (d) learning from multiple voices (Kleiman, Wolf, & Frye, 2015).
In accordance with suggestions from Sztajn (2011) on aspects of PD that are necessary to understand and interpret research results based on PD, we provide details about the intent, learning goals, and specific designs of the TSDI course. The overarching goal of the course is to engage participants in thinking about statistics teaching and learning in ways that are likely different from their current practices in middle school through college-level introductory statistics (http://go.ncsu.edu/tsdi). The course did not focus on a particular grade level or specific statistical content. A major goal was for teachers to be introduced to and use a framework to consider statistics as a four-phase investigative process (pose, collect, analyze, interpret) that incorporates statistical habits of mind, and views learning statistics from a developmental perspective (Franklin et al., 2007).
The course consisted of an orientation unit and five content units, each with seven components. The course was open for about 15 weeks to allow flexibility for participants to engage while managing their busy professional lives. On September 21, 2015 the Orientation and Unit 1 opened. The Orientation unit included an overview video, a survey for participants to self-assess their confidence in teaching statistics (i.e., the SETS survey), and a forum in which they could introduce themselves and learn about other participants. The remaining units opened at weekly intervals over the following 4 weeks, with earlier units always remaining accessible. This allowed participants to start and engage in course material at their own pace. Once Unit 5 opened, the entire course remained active for seven more weeks. Upon closure, participants could still access material and discussion forums in a read-only format (no new posts allowed), though this activity was not included in our analysis.
Each unit began with an Introduction video of the instructor highlighting critical aspects of teaching and learning statistics that participants could learn about in the unit. The Essentials included materials to read or watch that were created by the course development team or compiled from open online resources (open journal articles, lesson plans, data, videos). Each unit included video of students and teachers engaged in statistics lessons. Teacher educators have shown how impactful video cases depicting learning and teaching in classrooms can be in focusing teachers' learning about pedagogical issues (e.g., Wilson, Lee, & Hollebrands, 2011; Sherin & Van Es, 2005). When classroom video was not available but rich examples existed in the statistics education literature, animated illustrations of real students' work were created (using tools like Go Animate or Powtoon) that represented students' statistical reasoning and use of technology tools. Such animations have been shown to be an effective way to include artifacts of practice in teacher education materials (e.g., Herbst, Chazan, Chen, Chieu, & Weiss, 2011; Chazan, 2018). The teachers and students in videos also brought in multiple voices that are closest to the practice of teaching.
Self-directed and job-connected learning opportunities often included a selection of statistics tasks for different grade levels (to provide choice) to engage teachers in doing statistics in ways likely different than what they have experienced before (Franklin et al., 2015; Stein & Smith, 1998). These tasks included Dive into Data experiences for participants to use free technology tools (e.g., Gapminder, Tuva, CODAP, GeoGebra simulations) or import data into their favorite data analysis tools. These active learning experiences allowed teachers to experience investigative statistics tasks using tools accessible in their schools and connected them to relevant and free sources of data. For example, in Unit 4, Dive into Data used the Census at School website and asked teachers to download data and engage in a cycle of statistical investigation. Extensions included extra material (e.g., datasets, lesson plans, brief articles, applets, videos) to explore content and resources of interest that may be useful in their own teaching context. Again, these extension materials provided opportunities for self-directed learning.
The design principle of learning from multiple voices also guided the decision for each unit to include a video of an Expert Panel discussion with the instructor and three experts in statistics education. The conversations in these videos brought forth practical experiences and research-based suggestions in a conversational tone where listeners could feel they were part of the conversation. Peer-supported learning is a cornerstone of the MOOC-Ed experience to provide focused and ample opportunities for participants to connect with and support one another (e.g., Borko, 2004). Each unit contains two discussion forums: (a) a forum focused on discussing a specific Pedagogical Investigation about aspects of teaching statistics (e.g., analyzing statistics tasks, considering students' approaches to statistics tasks through video clips), and (b) a forum where participants Discuss with Colleagues about unit materials or other ideas related to teaching statistics.
Because of its importance in the course, we provide details about a critical framework integrated across the course. Frameworks can assist teachers in applying content and strategies learned in PD to their own instructional practices (Franke, Carpenter, Levi, & Fennema, 2001; Boston & Smith, 2011). Building upon an existing framework (Franklin et al., 2007), the development team incorporated recent research on students' statistical thinking and productive statistical habits of mind (e.g., Burrill & Biehler, 2011; Wild & Pfannkuch, 1999). A habit of mind is developed when a person approaches situations in similar ways so they develop a more general heuristic over time (Cuoco, Goldenberg, & Mark, 1996). The new framework, Students' Approaches to Statistical Investigations (SASI), required a variety of learning materials and opportunities for participants to develop an understanding of its importance and the potential ways it could influence their classroom practices. Both a static and an interactive version of a diagram were created to communicate the investigative cycle, reasoning in each phase at three levels of sophistication, and an indication of productive habits of mind (Figure 1). Two brief documents described the framework and how to apply it to task design. In a video, the instructor illustrated the framework using example student work, and other videos featured expert discussions and interviews, including one expert statistics educator illustrating the development of the concept of mean across levels of sophistication. Participants could also engage in a simulation task and watch two animated video illustrations of students' work that highlighted how students approach an investigation using different levels of sophistication. See Appendix for a list of URLs to these openly accessible resources.

Theoretical Framing of the Study
While making changes in teachers' statistics teaching practices is a major goal, our research is framed by an integrated model for teacher learning in PD proposed by Clarke and Hollingsworth (2002). Their model represents a change process for teachers as including reflection and enactment among an external domain of PD experiences and a teacher's professional world that includes domains of personal, practice, and consequence. The external domain includes information and resources often experienced through a PD, including interactions with others. In our study, the external domain includes learning opportunities (through a variety of resources) within the course and the discussion forums within each unit. The personal domain includes one's knowledge, beliefs, and attitudes. The practice domain includes any professional experimentation a teacher may do in their classroom, with content or instructional strategies, and the domain of consequence is concerned with salient outcomes that result in sustained practice and impacts in a teacher's classroom.
Because of the massive size of our online PD about teaching statistics, we are most concerned with the reflections and enactments between the external domain (experiences and resources in the course) and the reflections and enactments we can discern concerning teachers' beliefs and perspectives about statistics and teaching statistics in the personal domain. To aid us in considering how the MOOC-Ed experiences may impact teachers' beliefs, perspectives, and practices related to statistics, we draw upon Mezirow's (2009) theory of transformational learning in adult education, consistent with constructivist assumptions about learning. Mezirow (2009) describes how meaning schemes (comprised of knowledge, expectations, beliefs and perspectives, and feelings) are used by an individual to interpret their experiences, and through reflection on these experiences, one may transform their understandings. Peters (2014) illustrated how this theory could be used to understand statistics teachers' development of an understanding of variation. In the context of our study, our intent is that a teacher might transform their meaning schemes for teaching statistics by rejecting prior conceptions of what it means to teach statistics. Transforming meaning schemes often begins with a stimulus, a disorienting dilemma, which requires one to question current understandings and beliefs that have been formed from previous experiences (Mezirow, 2009). Specifically, we are interested in what stimuli and experiences within the TSDI course may act as triggers to evoke disorienting dilemmas (or cognitive dissonance) for teachers where they engage in critical reflection and question their current understandings or perspectives.

Participant Demographics
Though the course has been offered multiple times, this paper focuses on the Fall 2015 section. To attract a broad audience, the free course was advertised through websites and listservs of many different educational organizations (NCTM, ASA, CAUSEweb, IASE), social media posts, emails to past participants in any MOOC-Ed, state-level leaders in mathematics education in the U.S., and personal contacts. For the purpose of the research reported in this paper, we are only interested in the potential ways the course experiences could be impacting the beliefs and perspectives of K-12 classroom teachers. Of the course's total enrollees (n = 829), over half self-classified as classroom teachers (n = 489). In this study, we focus on these 489 teachers. The enrolled classroom teachers resided in 46 different states and 29 different countries, with most teachers in the U.S. (n = 380) and New Zealand (n = 48). The majority of the 489 classroom teachers were female (67.5%) and 72.8% had a master's degree or above. Their years of experience in education, however, were fairly evenly distributed, creating a diverse pool of participants with varied teaching experiences that influenced their starting perspectives and growth opportunities during the course. Of those 489 self-identified classroom teachers, we were able to use additional registration data (e.g., organization type and name) to infer that 412 enrollees seemed to be actively working in K-12 contexts. For example, some enrollees identifying as a classroom teacher also identified their organization type as a college/university and provided a community college as their organization.

Data Sources and Analysis Methods
In our research, we needed data from a variety of sources to help us measure the impact of the online learning opportunities for a broad range of active and passive teacher participants. Aside from registration data, five other data sources were used: (a) click logs; (b) discussion forum posts; (c) end-of-unit surveys; (d) an end-of-course survey; and (e) a follow-up survey sent six months after the course to participants who had engaged in any aspect of it. The purpose of the follow-up survey was to inquire about how they may have applied their learning and what they considered the most impactful ideas from the course.
Course activity was tracked through click logs that allowed us to examine trends in participants' engagement. We limited data to those click logs made by classroom teachers that occurred between September 21, 2015 (opening of Orientation Unit) and December 31, 2015 when the course closed. All registration and click log data were merged and displayed in a dashboard that allowed investigators to visualize participants' engagement over time and with certain types of resources. Descriptive statistics and graphical displays were used to examine overall engagement patterns.
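The study's actual log schema and dashboard tooling are not described. As a minimal sketch of the kind of filtering and aggregation described above, the following assumes hypothetical column names (`user_id`, `role`, `unit`, `timestamp`) and toy data; the real analysis would use the course platform's own log format.

```python
import pandas as pd

# Hypothetical click-log records: one row per logged action.
# Column names and values are illustrative, not the study's schema.
clicks = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3, 3, 3, 4],
    "role": ["teacher"] * 7 + ["admin"],
    "unit": ["Orientation", "Unit 1", "Unit 1", "Unit 2",
             "Unit 1", "Unit 2", "Unit 5", "Unit 1"],
    "timestamp": pd.to_datetime([
        "2015-09-21", "2015-09-25", "2015-10-01", "2015-10-10",
        "2015-09-30", "2015-10-12", "2015-12-01", "2016-01-15",
    ]),
})

# Restrict to classroom teachers and the course's open window
# (September 21 through December 31, 2015), mirroring the
# filtering described in the text.
window = clicks[
    (clicks["role"] == "teacher")
    & (clicks["timestamp"] >= "2015-09-21")
    & (clicks["timestamp"] <= "2015-12-31")
]

# Count unique teachers active in each unit: the basis for
# per-unit participation trends over time.
per_unit = window.groupby("unit")["user_id"].nunique()
print(per_unit)
```

Grouped counts like `per_unit` could then feed descriptive statistics and the graphical displays of engagement over time.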
Our qualitative analysis initially focused on teachers' discussions in forums. Because the needs of a community college classroom teacher may differ from those of a K-12 teacher, we focused our qualitative analysis of discussion forum data on posts made by those we had inferred were K-12 teachers. There were 2,097 total posts made by all participants in the course (after removing the instructional team), across 12 forums. We eliminated the introduction forum in the Orientation unit and the project discussion forum, leaving 10 forums across the five units. Of the remaining posts, 977 were made by 206 participants classified as classroom teachers. Since we were only interested in the beliefs and perspectives of K-12 classroom teachers, only these 977 posts were analyzed, with each post considered a unit of analysis. The posts were first analyzed using open coding (Strauss & Corbin, 1998) guided by our focus on cognitive dissonance and critical reflection that may lead to change in beliefs, perspectives, and practices related to teaching statistics. Posts were tagged for evidence of which course elements seemed to be triggering critical reflection and for anything in a teacher's written post that might indicate a reflection on, or shift in, their perspectives or beliefs related to teaching statistics. We documented which triggers were the most prevalent and only kept triggers that were associated with many instances of critical reflection. The occurrences of triggers were quite skewed, with some occurring an abundance of times and a few occurring only once or twice, making the major triggers for impacting changes straightforward to identify. Codes for describing perspectives and beliefs about teaching statistics were sorted and collapsed into broader themes.
In accordance with Loizzo, Ertmer, Watson, and Watson (2017), to more deeply understand aspects of the external domain that triggered critical reflection and impacts on the personal and practice domains, we examined open-ended responses to the end-of-unit and end-of-course surveys, as well as the follow-up survey. The themes generated from the analysis of the discussion forum data (related to changes in beliefs and perspectives, and to triggers that seemed to prompt such critical reflection) were used as initial codes to examine K-12 classroom teachers' open-ended responses on the end-of-unit, end-of-course, and follow-up surveys to questions about what they appreciated most in a unit and what they considered to be the most impactful learning experiences. While we looked for confirming and disconfirming evidence of themes and triggers, disconfirming evidence was not evident, and no new themes or triggers were documented.

Results
We first briefly describe teachers' participation in the MOOC-Ed (external domain) to help situate our findings. We then present our results related to the four elements of the course that teachers identified as triggering critical reflection. We discuss each element and provide evidence to illustrate the critical reflection it triggered. Then, we discuss ways that engagement with, and triggers from, elements of the external domain seemed to impact teachers' perspectives and beliefs about teaching statistics in the personal domain.

Teachers' Participation
The purpose of this section is to briefly describe how classroom teachers chose to participate in the course and engage with resources (external domain). The click log data used in this analysis included all 489 enrollees who self-classified as classroom teachers at any level at registration.
Overall, a majority of enrolled classroom teachers (n = 370, 75.6%) engaged in various aspects of the course (e.g., accessing a page, viewing a video, downloading a document, posting in a forum). While some started in Orientation, others started in Unit 1. There were 293 classroom teachers who engaged in Unit 1, with an assumed intent to engage in PD through accessing learning material. Participants did not have to view Orientation or earlier units to access later ones, though almost all traversed the course linearly once they engaged in Unit 1. Figure 2 shows the sharp drop in teachers' participation between Units 1 and 2. By Unit 5, 31.4% (n = 92) of classroom teachers who began Unit 1 were still engaging in the course. Over half of classroom teachers who began the course posted to a discussion forum (n = 206, 57.5%). The frequency of posts per teacher was a skewed distribution, with 57% of teachers posting 1-3 times (typically in Orientation and Units 1-2), 38% of teachers posting 4-14 times across several units, and 11 very active teachers posting 15-45 times. The levels of engagement in discussion forums by classroom teachers were highest in Units 1-3.
The examination of the click log data provides a strong indication of how classroom teachers took advantage of learning opportunities in the course through accessing resources and participating in discussion forums, with about a third of them finishing the course. A deeper dive into the qualitative data highlights which of the learning experiences in the course (external domain) seemed to trigger pedagogical dilemmas for them.

Course Features Triggering Critical Reflection
Four elements from the external domain emerged as frequently cited triggers of critical reflection. We briefly discuss each trigger and use examples from classroom teachers to illustrate the types of dilemmas or critical reflection they engaged in.

SASI framework.
By far, the SASI framework (and all documents and multimedia associated with it, see Appendix) was the most dominant trigger for change. For example, in Unit 5, upon reflecting on why their confidence to teach statistics had increased, some teachers noted how the framework triggered changes. Triggers are bolded. A teacher posted, The most important point that I got from this course is being able to develop habits of mind that will help students to build conceptual frameworks for statistics. … We should be interested in the students' reasonings (as opposed to the result).
In the same discussion thread, a teacher responded, "I have found the frameworks for statistical thinking presented in the videos and materials to be very helpful in articulating the essence of statistics to my students." These teachers view statistics as more than a set of procedures and describe how the SASI framework impacted their perception of teaching statistics. Also in Unit 5, another teacher reflected on how the framework will help to improve her lessons.
I feel more confident as well. It is my first time teaching stats and I was overwhelmed with ideas of how to approach it. This MOOC has supplied us with a framework to base our classwork on. I am developing a set of tasks for my class using the A-B-C levels as a way for me to differentiate instruction because I have a wide variety of ability. I knew I wanted to go in this direction but … the framework gave me the perfect guidelines to do this.
More specifically, this participant indicated that this framework guided her in developing several tasks to differentiate instruction and support students at different levels of statistical sophistication. Another participant indicated, "The SASI framework instilled in me a new mind-set. It showed me the study Statistics under a different light. It allowed me to view it from a different angle and really excited me to start applying and implementing it." Engaging with the SASI framework in the course not only led to teachers expressing a different perception of statistics, it supported them in imagining ways to change their practice.

Expert panel videos.
The discussions among experts within the expert panel videos were another main trigger that assisted teachers in reconsidering prior experiences in learning and teaching statistics. In Unit 2, a teacher began a discussion thread detailing a dilemma about prior teaching practices because of points made by the expert panel in a video. The extensive post began:

I had a "lightbulb moment." Although I have been teaching HS math for 24 years, I have never actually taught "statistics" as defined by the members of the expert panel. I have taught units that I THOUGHT were statistics, but I was merely providing students with a few mathematical tools that statisiticians [sic] can use (e.g. finding a mean, making a histogram, calculating a standard deviation, etc.) ...

Twelve participants joined that discussion, 10 of whom were teachers. They echoed that they were "guilty" of teaching statistics this way and that their own prior experiences in learning statistics had treated the subject in a procedural manner for computing measures and creating graphs. Similar discussions and replies about this issue were also started by several others. Completing the first shift in perspective, teachers also recognized that attending to and engaging in all parts of an investigation would give students opportunities to make sense of how statistics is used to answer questions and how important data collection (or experimental design) is to the process. Many admitted they spent little time on this with students and aimed to improve.
In their reflections in discussions and on surveys, several teachers referred to a Unit 3 video where one expert illustrates developing the concept of mean through tasks at different levels of sophistication.
Wow-that whole idea around how to introduce the idea of variability as seen in the 'Number in your family activity' at level A through to C is fantastic. Loved the video of [Expert Name]. I can see what an advantage it is when they get to high school level to have been introduced to the concept [of mean] in this way.
The expert panel videos evoked critical reflection and many opportunities for teachers to consider different perspectives and learn how statistics learning and teaching could be conceived of as something different from their own experiences as teachers and learners.
Classroom-based videos. The videos of students and teachers engaged in statistics tasks, both recordings of real classrooms and animated videos depicting real students' work, also triggered critical reflection about how students and teachers engage in statistics, helping teachers envision a different outcome for their students if they changed their practices. In Unit 4, several teachers discussed the use of hands-on projects and experiments.
I loved the Gummy Bears In Space Video. It was short, and to the point but I loved the activity … The students in this video were able to conduct their own experiment, collect data, and really analyze what was going on... A common theme I am seeing with statistics is that it is very project based friendly and can be an extremely engaging classroom!

Another teacher in Unit 4 shared a reflection after watching two animated videos of representations of students' work with technology on a sample of messy Census at School data, describing how they envisioned using such an approach with their students.

I had several "aha" moments throughout these two videos. It occurred to me that cleaning up data is a valuable lesson that students must know in order to correctly interpret their findings and draw conclusions to answer their questions. If my students were to work with Census at School data to investigate a question of interest to them, I think they would struggle with cleaning up their data to interpret their results... I would think my students would accept the data as is, and begin to draw conclusions using the raw messy data. I think this tool would be a great resource for teaching this type of lesson, and showing students how to make sure their data is meaningful in accordance to the context.

These quotes represent typical posts where teachers reflected on and discussed videos of students and teachers engaging in statistics and made connections to their own classroom practices.
Dive into Data activities. The use of technology in the Dive into Data activities for investigating real data that were multivariable and sometimes "messy" served as an additional trigger that seemed to impact teachers' perspectives. Technology experiences directly influenced their ideas that engaging in statistics is enhanced by using dynamic technology tools and real-world messy data. As illustrated in quotes from teachers in the section above on the impact of viewing videos of students' work with data, experiences that triggered reflection on the usefulness of technology came from learning opportunities that included videos of students using technology, discussions in expert panel videos, and opportunities to Dive into Data themselves.
Two prominent triggers were using the Gapminder tool in Unit 1 and engaging with Census at School for gathering and sampling data from students in Unit 4. In a Unit 5 discussion, teachers were prompted to discuss course impacts and share ideas for their classrooms. One teacher posted, "I loved the Gapminder site! I spent three very engaging days doing activities with the site and my students were simply shocked at some of the numbers. What an eye-opener!" Another indicative post mentioned Census at School:

The School Census [sic] data is very interesting and serves as a great resource for teaching. This type of data is applicable to our students and since it is real data, not simply some fabricated textbook example, it has more power to influence learning and thinking.
The teacher discussing Gapminder used this new resource and implemented it in his classroom. We cannot tell whether the teacher discussing Census at School intends to use it with students, but the activity seemed to trigger the notion that using real data is an important aspect of statistics.
On a follow-up survey that asked participants to name the most valuable thing they learned, teachers often identified one or more of the four triggers above. The following is an example of a teacher reflecting on the MOOC-Ed holistically and identifying several triggers.
The most valuable aspect of the MOOC was obtaining resources for the improved use of technology to make instruction come to life and be more meaningful to students. I was able to see the statistical process in action and now have an idea of what it should look like in the classroom.

For this teacher, a combination of learning about new technologies to use in statistics (Dive into Data) and engaging with videos that showed students and teachers using technology in statistical investigations seemed to make a lasting impact.

Impact on Perspectives and Beliefs
In accordance with our guiding framework, we are interested in ways that engagement with, and triggers from, elements of the external domain impact teachers' perspectives and beliefs in the personal domain. Here we describe evidence of impact on teachers' perspectives and beliefs related to teaching statistics. Because comments related to these themes appeared in discussion forums in Units 1-2, on unit and end-of-course surveys, and on the follow-up survey from participants who had engaged only in early units, the impacts on perspectives and beliefs seemed to occur both for classroom teachers who completed the course and for those who engaged only in early units. It is beyond the scope of this paper to include a deeper analysis of differences between these groups of participants.
We found four major ideas related to how teachers' beliefs and perspectives about teaching statistics may have changed:
• viewing statistics as more than computations and procedures,
• engaging in statistics is enhanced with technology,
• engaging in statistics requires real data, and
• statistical thinking develops across a continuum.
Each perspective is described below, highlighting teachers' beliefs and the implied changes they would need to make in their teaching practices.
We noticed a shift in thinking about statistics as more than computations and procedures that began in discussion forums in Unit 1 and expanded in later units; it was also evident in responses to surveys. There were two aspects to this shift in perspective. The first can be characterized as a realization that the statistics teachers had experienced and tended to teach was too focused on procedures. This was illustrated above by the teacher who had a "lightbulb moment" when listening to an expert panel video. Further, teachers recognized that a procedural approach to statistics was not aligned with their experiences in the TSDI course. For example, one teacher posted that she

used to teach statistics like a pure mathematics course with a focus more on the process rather than the investigative side. This course has opened my eyes to the variety of statistical methods you can demonstrate using data investigations.
This shift in beliefs about statistics appeared in teachers' responses to the follow-up survey, where one teacher suggested that, "The MOOC prompted me to rethink what sorts of questions I ask students, shifting more to statistical reasoning questions and away from statistical processes." One teacher summarized what she learned in the course.
The statistics that I got in high school and higher education was only based on direct teaching of formulas and drill learning. After going through all the simulations, videos, and technological tools that are provided here I came to realize what statistics really is. It is much more than just the ability to read graphs or compute numerical results, but it is more about quantitative reasoning, figuring/analyzing the messy data, and building critical arguments.
The second theme that emerged is that teachers recognized that engaging in statistics is enhanced with technology. For some teachers, using statistical software was also intertwined with using real data. One teacher expressed this perception on a follow-up survey: "I use more technology throughout my semester to help intergrate [sic] my lessons that help intertwine real world applications." Another teacher joined a Unit 4 discussion started by a fellow participant to express gratitude (subject: "a Big thank you") for the course, focusing on how a particular Dive into Data experience in Unit 2 had made an impact for her.

I have really enjoyed getting to know the Tuva labs website [an online graphing tool] and exploring some of the activity worksheets. I created box plots from the Pixar and Dreamworks data and got the students to try and discuss the different comparisons using the SASI levels of sophistication with median, range, IQR and LQ and UQ.
For this teacher, a combination of learning about new technologies to use in statistics and applying her understanding of the SASI framework was assisting her in creating new experiences for her students. There were several posts where teachers explicitly described how they were using technology to assist their own learning of new approaches and how they hoped to use these approaches in their classrooms. For example, a teacher in Unit 4 described:

Last year, I created an account with tuvalabs, but never looked into it. So I took the data from census at schools and was able to upload into tuva labs. There, I was able to create dot plots, bar graphs, histograms, and more. The stats section is coming up here at the end of November, and I'm excited to have my students be able to use this free resource.
While she had previously accessed Tuva, it was her experience using Tuva with Census at School data in the TSDI course that gave her the knowledge needed to plan to implement it with students in her practice.
A third theme that emerged was that engaging in statistics requires the use of real (and messy) data, in many cases datasets that include bigger data (more attributes and cases). One participant shared in a follow-up survey,

the data emphasis was what I really took away from the course. There were little tidbits here and there I have "borrowed" to polish what I do, but by far I am most proud of creating more concrete data sets for my students to actually experience (say, the left/negative skew effect) rather than just showing a picture.
Teachers recognized the need to use data that include a large number of cases and multiple attributes (numerical and categorical) and that may require some cleaning (e.g., "getting real/messy data that needs to be cleaned is an important exercise in itself"). Using real data was also an idea to which experienced teachers contributed much in the community discussions, reaffirming their pedagogies and sharing their practices for others to learn from. Consider how one classroom teacher gave glimpses into her practice in Unit 2, as part of one of the longest discussion threads (48 posts, under the subject "Classroom experiments").

I think that by having these meaningful discussions about the real world implications of statistics is what makes it real for them. Using real data sets and showing them how it relates to the world around them is not only meaningful, but is what statistics truly is. Use contexts that are real for your students. I had a class last year that was made up mostly of students who played sport. I used lots of sports datasets which are easily accessible and full of stats. This year I had a lot of students passionate about government and politics so I used a lot of governmental datasets.

This extended discussion is a strong example of how the online community allowed teachers to learn from one another by discussing issues that emerged when they did classroom experiments, with some sharing types of experiments they had tried and others reflecting on their newfound bravery to try these types of experiments in their own classrooms.
The final theme is that teachers began to realize that statistical thinking and understanding develop across a continuum and that they could use this understanding to inform instructional decisions, use of tasks, and assessment of students. For example, one teacher indicated that, "The idea of the 4-process cycle and the different levels for different ages of each process, has helped me a lot. I understand more and feel I am a better teacher to my students." Considering statistics as developing across levels was a cornerstone of the SASI framework and seemed to take hold for many teachers. After commenting on students' work in a video in Unit 3 and describing what levels she thought students might be working at on a task, another teacher noted,

… with the SASI framework, I like how it never mentions age or grade level. I feel it's a continuum that students, depending on the context, can move back and forth between. If they get to a harder problem, they may not know how to exactly collect the data without bias and ensuring randomness. But with an easier experiment, that may be more obvious to them.
Some teachers indicated they would use specific tasks from the course with their own students, suggesting they would implement tasks that included more student engagement with the four phases of a statistical investigation. For example, one teacher said, "I have done a lot of labs with my students but I really loved this one [coke vs. pepsi] to try. I can't wait to see how they react with this one." Other teachers showed evidence of applying more general pedagogical knowledge about implementing tasks that involve the investigation cycle and can develop statistical habits of mind. Some indicated they would utilize the task design resource in selecting, adapting, and implementing tasks in their classrooms that could support students at different levels.

Discussion and Conclusion
Researchers have yet to agree on the most appropriate ways to measure participants' progress and outcomes as they engage in MOOCs (Perna et al., 2014). Despite these inconsistencies, a common way to evaluate the impact of MOOCs has been to report completion or retention rates. Koller, Ng, and Chen (2013) define retention rate, or completion rate, as the fraction of enrolled participants who successfully complete the course using criteria established by the instructor. Perna et al. (2014) define retention rate as the number of people who accessed the last module of the MOOC divided by the number of participants who accessed the first module. While definitions of both vary throughout the literature, completion rates typically range between 5% and 19% of registrants (Ho et al., 2014; Koller et al., 2013; Perna et al., 2014). Recall that Jacobsen (2019) found that educators who had accessed only a few resources in the first two modules of an online PD reported having meaningful interactions with those resources and described how their engagement impacted their practices. Loizzo et al. (2017) found that one measure of a MOOC's success was that participants gained new resources. The major findings from our study are discussed below to provide broader implications for research and design in online PD.
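Under the Perna et al. (2014) definition, the retention rate for the course described here follows directly from the counts reported in our results (293 classroom teachers engaged in Unit 1; 92 were still engaging by Unit 5). A minimal sketch in Python, included only to make the arithmetic explicit:

```python
def retention_rate(first_module_count: int, last_module_count: int) -> float:
    """Retention rate in the sense of Perna et al. (2014): participants who
    accessed the last module divided by those who accessed the first module."""
    if first_module_count <= 0:
        raise ValueError("first_module_count must be positive")
    return last_module_count / first_module_count

# Counts reported in this study: 293 teachers in Unit 1, 92 still in Unit 5.
rate = retention_rate(first_module_count=293, last_module_count=92)
print(f"{rate:.1%}")  # prints "31.4%"
```

Note that the Koller, Ng, and Chen (2013) definition would instead require the instructor's completion criteria, not just module-access counts, so the two definitions can yield different rates for the same course.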
In just this one course, offered over a 15-week period, almost 300 classroom teachers engaged in at least the first unit, with 31% of those teachers completing the course through Unit 5. Thus, the MOOC-Ed succeeded in reaching and engaging K-12 teachers, with evidence of high engagement by many with different resources and active participation in discussion forums. This completion rate is higher than those reported for most other MOOCs (e.g., Perna et al., 2014). We know that not everyone intended to complete the course, but some teachers who only participated in Unit 1 engaged in discussions, responded to follow-up surveys, and showed evidence of reflections based on triggers such as expert video discussions about how statistics differs from mathematics and seeing students in a video using the Gapminder tool (all introduced in Unit 1). By using data from discussion forums, end-of-unit surveys, and follow-up surveys that included anyone who enrolled in the course, we were able to include the perspectives of teachers who may have engaged with only a few resources. Thus, our approach to data sources expands how Jacobsen (2019) examined ways online PD can impact educators' beliefs, perspectives, and practices.
One challenge in designing online PD for teachers is identifying how to leverage stimuli that have the potential to act as triggers to impact teachers' beliefs about teaching. For those facing this challenge, our identification of triggers can provide guidance as they design and implement online PD efforts for teachers. While we have no evidence (yet) that teachers' experiences in a brief online PD in teaching statistics have impacted actual teaching practices and students' learning, our research indicates that the purposeful design elements of the course were successful in prompting critical reflection through certain triggers. Having a framework that can guide teachers' planning of tasks and assessment of students can help teachers understand a bigger picture of teaching the content beyond what is in their particular grade-level curriculum. Active learning opportunities to experience new technology tools and engaging tasks were a critical trigger. PD for teachers should include opportunities to engage more deeply, and perhaps in a different way, with the content teachers are expected to teach. Designers of online PD need to continue to find ways to engage teachers in such active learning opportunities.
The use of the two types of videos that appeared as triggers is important to consider in future designs. For those who work in teacher education, it is not surprising to hear that teachers can learn much from watching and reflecting on videos depicting students' thinking on tasks and teachers' pedagogical moves (e.g., Chazan, 2018). It may be surprising, though, that teachers learn a lot from videos that are conversational in nature between expert educators in a domain. In a typical face-to-face PD, there are generally one or two leaders who engage teachers in activities and present material. Current practices in online PD may tend to feature a single instructor presenting critical information in lecture-style videos. Rarely do teachers get an opportunity to hear a discussion about critical issues related to teaching and learning. While each unit in the TSDI course had a brief video of the instructor introducing key ideas in the unit, these were rarely brought up in discussions. The exception was a video in Unit 3 where the instructor illustrated the SASI framework with examples from students' work. Quite simply, hearing from the instructor alone in videos did not seem impactful; but hearing from the instructor engaged in discussions with experts in the field (see sample expert video linked in Appendix) served as a trigger for educators to experience cognitive dissonance about their own ideas, which they in turn seemed willing to discuss in forums.
The classroom teachers not only learned from expert opinions, but also from the voices and experiences of other teachers and participants with whom they interacted in the course. This is similar to findings from Loizzo et al. (2017), where some MOOC participants expanded their world views by engaging in forums where they shared their personal experiences. In other research on the posting behaviors of participants in this course, Bonafini (2018) found one classroom teacher and three non-classroom teachers who served as super-posters and contributed greatly to conversations by starting threads and replying to many posts by others. Peer voices, along with the voices of the instructional staff in the forums, acted as additional resources to support collegiality and the practical exchange of ideas outside of teachers' physical school environment (Borko, 2004; Mackey & Evans, 2011). Well-designed discussion prompts focused on pedagogical issues and an open forum for sharing indeed provided opportunities for teachers to express their critical reflections and share in the development of new classroom practices.
Many teachers reported increased confidence to teach statistics and appeared to move towards beliefs that we should engage students through investigations, not merely teach them mathematical tools to apply to numbers devoid of context. Thus, our results in this online context align with those of others who have done PD about teaching STEM content in face-to-face contexts (De Vries et al., 2013; Eichler, 2011; Seung et al., 2011). Like the MOOC participants in Loizzo et al.'s (2017) study, for whom one measure of success was being able to apply things they had learned, our teachers were attracted to and made sense of how to apply a framework to their practice. Teachers learned a lot about what it means to engage in statistics by doing it themselves, as well as from examining students' thinking in videos. Is any of this a big surprise? Perhaps not to experienced teacher educators. However, the key is to include these types of learning opportunities in online PD, whether it is offered to a local group or is massive and open to teachers around the world. To help answer the call from Marrongelle et al. (2013), our research also supports the idea that online courses that emphasize (a) self-directed learning, (b) peer-supported learning, (c) job-connected learning, and (d) learning from multiple voices can be effective for designing online PD in teaching STEM content (e.g., teaching statistics) where wide-scale efforts are needed to impact the perspectives and practices of classroom teachers.
Of course, our research is limited by the fact that we did not include interviews, collect artifacts of practice (e.g., lesson plans or tasks), or conduct classroom observations of a subset of teachers. Such methods should be included in future studies and would provide more nuanced and direct evidence of whether teachers' espoused changes in perspectives and beliefs, and intentions for changes in their practices, were actually realized in classrooms.

Appendix

Working with a dynamic simulation tool (to explore the Schoolopoly task): video with an animated depiction of students working on the task, with a human reading the task, real student voices, and images of computer work (4:24 min). https://youtu.be/VuFjTaGgsCw

Multiple levels of sophistication (with the Schoolopoly task): video with an animated depiction of a teacher introducing the task and three student pairs working on the task, with computer images or written work (voices automated) (5:09 min). https://youtu.be/tdLx7eMecB4

Sample Dive into Data experiences

Dive into Data About Vehicles Using CODAP: A random sample of 300 vehicles manufactured in 2015 is provided to explore questions about relationships between fuel economy in the city and highway, types of transmission, hybrid vehicles, annual fuel cost, and number of cylinders. https://codap.concord.org/releases/latest/static/dg/en/cert/index.html#shared=16202

Dive into Data about Fairness of Dice for the Schoolopoly Game with a GeoGebra Simulation: Given a simulation of dice produced by six companies, investigate whether or not the die made by each company is fair. Collect data through the simulation and support a decision as to whether to recommend that dice be purchased from each company.
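The fairness investigation in this last activity uses a GeoGebra simulation; the underlying data-collection logic can be sketched in a few lines of Python. The face weights below are invented for illustration (the course's actual company dice are not specified here), and the tolerance-based check is a deliberately crude stand-in for the informal comparisons to sampling variability students would make:

```python
import random
from collections import Counter

def roll_die(weights, n_rolls, rng):
    """Simulate n_rolls of a six-sided die with the given face weights."""
    faces = [1, 2, 3, 4, 5, 6]
    return Counter(rng.choices(faces, weights=weights, k=n_rolls))

def looks_fair(counts, n_rolls, tolerance=0.05):
    """Crude fairness check: every face's observed proportion must fall
    within `tolerance` of the expected 1/6."""
    return all(abs(counts[f] / n_rolls - 1 / 6) <= tolerance
               for f in range(1, 7))

rng = random.Random(2024)
n = 10_000
fair_die = [1, 1, 1, 1, 1, 1]    # hypothetical fair manufacturer
biased_die = [3, 1, 1, 1, 1, 1]  # hypothetical die that over-rolls 1s

print(looks_fair(roll_die(fair_die, n, rng), n))    # True: all faces near 1/6
print(looks_fair(roll_die(biased_die, n, rng), n))  # False: face 1 far too frequent
```

With 10,000 rolls, sampling variability in each face's proportion is far smaller than the 0.05 tolerance, so the fair die passes and the biased die (expected proportion 3/8 for face 1) fails decisively.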