Database of Case Studies

Atomic Learning

Project Title: Atomic Learning

Organization: Atomic Learning, Inc.

Contact: Dan Meyer, Chief Executive Officer

URL: www.atomiclearning.com

Date First Implemented: 2000

Audience: The original target audience was K–12 teachers in the upper Midwest (in and around Minnesota) but now includes educators from more than 47 countries, plus a component for higher education.

Need: In the late 1990s, a network of technology coordinators in Minnesota routinely got together to talk about issues they faced helping schools integrate technology, and training was a frequent topic of discussion. Before teachers could truly integrate the new digital technologies in teaching and learning, they required a foundation of basic technology skills. The group set out to use technology to answer common questions about basic technology skills in order to free more time for technology integration issues. The short tutorials that resulted were later launched as a new company, Atomic Learning.

Intended Outcomes: The original intent was to provide just-in-time training or to address a teachable moment related primarily to software that was finding its way into classrooms in the late 1990s. It was not originally intended to be courseware or to provide courses on a topic that took a substantial amount of time to complete. As the service has grown and the number of teachers who have developed foundational skills has increased, the focus has changed from “how do I learn . . . ?” to “how do I apply . . . ?” Atomic Learning now provides more resources related to integration practices as well as 21st century skills and assessments; however, it still provides basic skills training and supports more than 150 software applications.

Incentives: Participants receive certificates of completion.
Instructional Design Considerations: The original constraint was that each tutorial had to be presented within a 3-minute timeframe. Bandwidth was originally an issue, but Meyer says the deeper rationale was that if a tutorial took longer than 3 minutes, you were probably trying to answer a different question. The “atom” refers to the core idea: the kernel of the question you were trying to answer.

Designers include experts on the software as well as pedagogical experts, sometimes practicing teachers.

Lessons Learned: The 3-minute “atom” of learning that has been the guiding principle for the development of learning objects provided by Atomic Learning has become its strength. During the school day, teachers often only have small blocks of time to conduct research or confirm a process, and that 3-minute opportunity filled an important need.

Customers wanted to be able to measure progress, and a quick and easy forced-choice assessment was a common request, but the short segments did not lend themselves well to such an assessment. These assessments also conflicted with the philosophy of the Atomic Learning staff, who hold that skills are better assessed by reviewing their application in projects. Teachers requested, and now can receive, certificates of completion that show they have completed segments of training.

Meyer reports that starting out as educators helped because they felt they understood schools and their needs. But needs change over time, and you have to listen carefully to your customers to keep meeting them.

Technology can change quickly. Meyer notes that you can build interest and get the word out through early adopters, but to build for sustainability, sometimes you have to slow down and let people catch up to you.

Evaluation: Atomic Learning conducts a yearly customer survey of their large user base that helps them determine new features and enhancements. They also conduct focus groups in the product development process to provide formative feedback.

Comprehensive Literacy Program

Project Title: Comprehensive Literacy Program

Organization: Edvantia, Inc. for the Tennessee Department of Education

Contact: John Ross, former Senior R&D Specialist

URL: www.epd.edvantia.org

Date First Implemented: 2004

Audience: K–3 teachers and principals; K–12 teachers of special education

Need: The new Reading First program provided an influx of funds to states and increased opportunities for professional development that encouraged the use of research-based practices to improve literacy instruction for early readers. The Tennessee Department of Education identified online professional development as a strategy for providing access to the same high-quality professional development to faculty in all 56 (later 75) schools that received Reading First funds and contracted with Edvantia to design and deliver the professional development.

Intended Outcomes: The goal of the first year was to provide access to the same high-quality content to all 56 Reading First schools in Tennessee. There were not sufficient funds to hire external evaluators to visit all schools or a representative sample of the schools, so the literacy leader—funded by the grant—was included in the training in order to help determine whether faculty at Reading First schools were indeed incorporating skills and knowledge from the professional development in their instruction. A pretest and posttest with questions matched to the objectives of the instruction were developed to monitor changes in participant knowledge. The project continued for 5 years, and a second course was developed and launched in the fall of 2004 in response to demand from participants in the first course.

Incentives: Schools applied for Reading First funds. Faculty and administrators at schools that received these funds were required to participate in one of the two 13-week courses per year using funds from the grant and received 30 professional development hours for their participation.

Instructional Design Considerations: Building on the expertise of internal staff in face-to-face professional development as well as multimedia and online instructional design, the course was designed to provide an online experience that capitalized on the best research available for online professional development. Special attention was paid to the development of learning communities and to the needs of adult learners. Reading-specific content and skill acquisition were emphasized over technical skills. Guidelines for content development included internally developed instructional design guidelines as well as application of Keller’s (1987) ARCS model of motivational design.

Lessons Learned: Participants identified two significant barriers to success: time and technology. Teachers reported a lack of time to complete the course and conflicts with too many other school commitments. Strategies to address the time barrier, developed through consultation with participants, include ensuring buy-in for the program from the principal and ensuring that school leadership places as much value on the online program as on any other professional development program. Most teachers completed the course at school, so time should be made available during or immediately before or after the school day to allow teachers to work on the activities in the course.

The most common problem faced by participants was forgotten passwords, so several automated password reminder options were programmed into the system. Several technology-specific problems were noted, including older computers and low-bandwidth Internet connections. Because the course included videos, CDs containing all videos and handouts were mailed to each school so teachers could access these materials offline, if necessary. Other technology problems included not being able to install software, such as media players or Acrobat Reader, on school computers. Pop-up blockers, spam blockers, firewalls, and district network caching software also prevented some participants from accessing the content or receiving automated e-mails generated by the system. Program staff addressed these problems by posting answers to Frequently Asked Questions (FAQ) on the website and by covering these topics in the facilitator (literacy leader) training and supporting materials.

A “group self-paced” model was developed for the program, based on the premise of an online learning community that already has a designated leader with some content expertise. The course included supports for this leader, such as a facilitator’s notebook and companion CD, guidance at the beginning of each module on preparing for it, additional tips and hints flagged by a leader icon throughout the content, administrative reports to monitor group progress, and a separate online leader community in which leaders can ask questions and share advice.

Evaluation: The tests developed for the course consisted of a 68-question pretest and five matched posttests corresponding to the learning objectives for the five modules. Participants were also given a chance to complete an online evaluation of each training module. Web-based discussion boards were available to participants to help address questions and share advice and information on the programs. Additionally, upon completion of the professional development, literacy leaders from each school participated in focus group interviews and a postprogram satisfaction survey.

Pretest and posttest scores of the participants were compared during the first semester, and the participants as a whole made significant gains. Average posttest scores were above 80% for every module. Further analysis compared scores by group: literacy staff (literacy leaders and reading resource teachers) versus all others (classroom teachers, administrators, and unknown positions). The posttest scores for the classroom teacher/administrator group showed a significantly larger increase than those of the literacy experts.
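
For readers who want the mechanics, here is a minimal sketch of how matched pretest/posttest scores like these are commonly compared with a paired t-test. The scores below are fabricated placeholders, and the sketch is illustrative only; the case study does not describe the project’s actual statistical procedure.

from scipy import stats

pre = [55, 62, 48, 70, 66, 59, 61, 53]    # hypothetical pretest percentages
post = [78, 85, 74, 88, 83, 80, 82, 76]   # matched hypothetical posttest percentages

# Paired t-test: each participant's posttest is matched to his or her pretest.
t_stat, p_value = stats.ttest_rel(post, pre)
mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean gain = {mean_gain:.1f} points, t = {t_stat:.2f}, p = {p_value:.4f}")

A between-group comparison like the one described above would extend this with an independent-samples test on the gain scores of the two groups.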

When interviewed, a majority of the literacy leaders reported that the initial reaction to the program was not positive; however, by the end of the program, literacy leaders reported an overwhelmingly positive reaction to the program by their teachers, agreeing at a rate of 2:1 that their teachers ended up liking the program.

Keller, J. M. (1987). Strategies for stimulating the motivation to learn. Performance and Instruction, 26(8), 1–7.

Creating Powerful Online Learning

Project Title: Creating Powerful Online Learning

Organization: Created by EdLabGroup, now offered by Peer-Ed

Contact: Matt Huston, Director of Online Learning

URL: www.peer-ed.com

Date First Implemented: 2007

Audience: K–12 educators, most in the Northwest (Washington, Oregon, Montana, Idaho)

Need: Matt Huston became involved with the Concord Consortium and the then-new Virtual High School as a teacher in 1996–1997 and gained experience taking, and later developing and facilitating, online courses. Concord’s facilitation model emphasizes guide-on-the-side instruction, which puts the onus of learning on the learner. Huston reports that in the Northwest region, where he is currently located, there was a growing interest in providing online learning, whether completely online or in a hybrid format. Huston combined his current work with the Microsoft Peer Coaching Program with his knowledge of and experience in developing rigorous online learning to create this 6-week professional development course to meet that interest.

Intended Outcomes: Huston developed a 6-week online course to help educators understand how to develop online learning of their own. Huston’s vision includes the following four expectations:

1. Participants would engage in collaborative online learning; they would learn how to be online learners through participation.

2. Participants would engage in collaborative online course-building; they would learn how to be online designers using a hands-on approach and, in the process, gain comfort and skill in using an online learning management system.

3. Participants would use web affordances—video, fee-free texts, and other relevant resources found on the Web—as they designed their courses.

4. Participants would engage with one another in a course designed to promote deep understanding and would be encouraged to use the underlying understanding-based course design principles when they built their courses.

By Huston’s observation, approximately 90% or more of the participants achieved the first two expectations, and approximately 75% or more achieved the third. Adopting understanding-based course design principles garnered much interest but was harder to accomplish in a 6-week span; Huston estimates perhaps half of the participants successfully adopted the principles in courses they developed.

Incentives: Participants can receive credit in the form of clock hours or, for an additional fee, university credit. Certificates of completion are provided. Participants are also given a course in Moodle to practice with during the class that they can then take with them and load on a different Moodle server after completing the professional development.

Instructional Design Considerations: Huston used a backwards-design approach, Understanding by Design (UBD), to develop content. Using UBD, Huston determined core understandings he wanted the participants to achieve and worked backwards to choose and sequence activities. He also incorporated principles from Teaching for Understanding to help the participants come to a clearer understanding of their own learning. The use of these principles can help participants subsequently create activities that draw upon higher levels of cognition that go beyond recall and identification.

Because of his experience with the Concord Consortium, Huston incorporated facilitated collaboration and considers it a critical component of the course design. Facilitators prompt discussion with one or two probing questions designed to provoke critical reflection on the content and the users’ experiences. They summarize participant responses but seek deeper responses about the tensions raised during the conversation by directing probing questions to the group rather than to individuals. Each week of the 6-week course featured one to three significant collaborative activities. Initially, most of these were discussions; toward the end of the course, as participants spent more time building their own courses, collaboration focused more on design and content choices, with one or a few trusted colleagues offering feedback.

Lessons Learned: Based on his experience, Huston offered the following design advice: “Be sure to budget time and resources, and do additional learning on your own part as needed” for each of the following key elements:

–“Assessing your participants’ needs carefully. What attitudes, skills, knowledge, and understanding should participants have by the end of the course? Which of those do they already have? The difference is the gap you seek to fill.”

–“Working with clients (or potential participants if there are no formal clients) on the course goals and objectives. If your OPD is reading- or science-focused, the content, goals, and objectives will be very different than if your OPD is focused on preparing teachers from a variety of disciplines to become online facilitators.”

–“Writing, testing, revising, and arranging compelling activities that are just-in-time for your participants. You may want to think about weaving themes throughout your course. For a particular type of online professional development, one theme might be ‘Web 2.0 tools.’ In Week 1, you might introduce the theme and a single tool, plus a couple of academic applications for it; in Week 2, you might ask participants to research and describe potential in-classroom uses of a single, useful-seeming Web 2.0 tool; in Week 4, you might ask participants to form a small group and design part of an activity employing one Web 2.0 tool—with the proviso that the activity must address understanding, not lower level tasks.”

–“Designing a simple, consistent, pleasing interface in the learning management system you are using. Participants will appreciate your use of consistent week or section syntax; a logical arrangement of activities within each week or section; and easily accessible key documents, such as a course syllabus, list of resources, and final evaluation.”

–“Assessing your participants’ progress continually. Embedded formative assessment is a powerful instructional strategy that is more accessible as the proportion of hands-on, constructivist activities rises. An end-of-course evaluation matters because participants will, one hopes, be honest about those activities that did and did not work well for them; this feedback will help you revise the course (or know that revision is not needed) for future cohorts. Longer term feedback regarding whether and how participants have applied the understandings they gained from your OPD matters as well.”

Evaluation: Early on, the course was offered online and later in a hybrid setting with a school district in Washington. Feedback from the participants in the hybrid setting was used to revise and refine the course, including the fully online version.

Each 6-week course concludes with a participant evaluation that asks for qualitative data on their experience in the course. These evaluations have been quite positive, with satisfaction ratings above 80%.

In the winter of 2009–2010, Huston surveyed 30 of the participants who took the course between 2008 and 2009 and asked longer term evaluative questions such as “Did the course impact your teaching? If no, why not? If yes, how did it impact your teaching, and how do you know the course had that impact?” The results were widely varied but revealed several important findings that both reassured Huston that participants were using material and practices from the course and provided some suggestions for further revising the course to best meet teachers’ needs.

EdTech Leaders Online

Project Title: EdTech Leaders Online

Organization: Education Development Center, Inc. (EDC)

Contact: Barbara Treacy, Director

URL: http://edtechleaders.org

Date First Implemented: 2000

Audience: K–12 teachers and school administrators

Need: EDC, a nonprofit educational institution, wanted to help districts, states, institutions of higher education, and others provide high-quality professional development. This grant-funded program grew out of earlier research experimenting with online learning in a collaborative effort between EDC and the Harvard Graduate School of Education, originally funded by the AT&T Foundation. AT&T wanted to scale up the opportunity, so in the fall of 2000, EDC took on the challenge of building the EdTech Leaders Online (ETLO) program to scale high-quality professional development online. ETLO then became the core online professional development partner of the 10-state e-Learning for Educators Ready to Teach grant awarded in 2005 by the U.S. Department of Education to Alabama Public Television.

Intended Outcomes: EDC was intent on finding an effective way to provide professional development to educators. They experimented with technology as the medium, but the goal was high-quality professional development.

Incentives: Participants receive a certificate for the number of hours in the course. Graduate credit is available for a fee from Antioch University. State programs may offer graduate credit or continuing education units according to state guidelines.

Instructional Design Considerations: EDC uses a learning community model with discussion at its core. The program has required strong facilitated online discussion to support that model. Some synchronous webconferencing has been used, but sparingly, because EDC still wants participants to have the flexibility to meet their weekly learning goals on their own schedules.

Courses are based on week-long sessions that include readings, online and offline activities, and a focused discussion prompt. The design encourages carefully crafted prompts that engage participants and encourage them to be reflective in their responses. Goals for each week are aligned to goals for the overall course. The discussions become one of the ways to determine what, and whether, the teachers are learning.

Project-based learning is another strategy EDC incorporates. Besides the weekly projects, there is often a culminating final project that is built over the course of the program. Local courses include projects with outcomes that teachers or administrators can implement in their schools, districts, or organizations. These projects are another way learning can be assessed.

According to Treacy, “Less is more.” Limited funding teaches you to design tightly. Content must be aligned to the learning goals. She discourages throwing multimedia in just to have it: “Be judicious. Just because two readings are good, 25 aren’t better.” Stay focused on making the learning better.

Lessons Learned: EDC uses a capacity-building approach in which participants are not just signing up to take a course but joining a program in which they learn how to teach online as well as create content across a range of grades and subject areas. Learning to teach is the first part; participants then deliver online courses in the year following the training. This has allowed EDC to develop a community of trained online designers and facilitators.

EDC has tried to build in addressing the “Why” question (the focus of Chapter 1) as a core aspect of the program. Common questions, reports Treacy, are “Why are you interested in online learning? What goals are you trying to meet? What’s the connection with other initiatives?” EDC really wants the leadership to understand what they’re getting into and why they’re getting into it, so they can really become engaged in the process.

The earliest programs were presented using HTML webpages that were fairly quickly moved to a learning management system (LMS). EDC was trying to build a scalable program, and the LMS offered opportunities for scaling the program. Recently, with the explosion of Web 2.0 and virtual meeting tools, the program has tried to investigate how these tools might support the delivery of high-quality professional development. Not every tool was appropriate. Different tools serve different learning goals and different learners. According to Treacy, “It’s important not to throw in tools for the tool’s sake. Participants will push back if the tool is not making it easier or better. It has to increase learning or make it more efficient to be useful.”

Treacy’s excitement over the impact of the program is palpable. As she notes, “It’s exciting to see the many different ways the core training and courses the program provides have impacted local programs in unique ways. When we’ve aligned our program to local goals, that’s when we’ve done our job well.” For example, in one state in the program, the online courses offered by locally trained facilitators enabled teachers on emergency certificates to obtain the professional development they needed to stay in the classroom.

Evaluation: EDC uses pre- and post-surveys and conducts annual evaluation reports. Boston College recently completed a multiyear evaluation report, e-Learning for Educators: Effects of Online Professional Development on Teachers and Their Students (O’Dwyer et al., 2010), which is available from the project website. Results show that teacher participants learned and liked learning online. They report using what they learned online with their students, and there is evidence of retention of what they learned, even after 6 months. The results are consistent across settings and groups: rural and urban, novice teachers, individual states, and different grade levels.

Reference: O’Dwyer, L. M., Masters, J., Dash, S., De Kramer, R. M., Humez, A., & Russell, M. (2010). E-learning for educators: Effects of online professional development on teachers and their students. Chestnut Hill, MA: Boston College.

e-Learning for Educators: Missouri

Project Title: e-Learning for Educators: Missouri

Organization: eMINTS (enhancing Missouri’s Instructional Networked Teaching Strategies) National Center

Contact: Christie Terry, Missouri Program Director

URL: http://www.elearningmo.org/support.html

Date First Implemented: 2003

Audience: K–12 teachers

Need: eMINTS is a successful professional development program that helps K–12 teachers use technology to transform learning in their classrooms. Five years ago, the eMINTS National Center was offered an opportunity to participate in a Ready-to-Teach grant application to help build the capacity of the participating states in the development of online professional development. With the eMINTS staff’s expertise in technology and professional development, it was a natural move to add the online professional development piece of e-Learning for Educators to the eMINTS National Center’s repertoire.

Intended Outcomes: The Ready-to-Teach grant outcomes closely mirrored goals that eMINTS had already established: to build a cadre of educators who could both develop and facilitate online professional development and to create a presence in the state for online professional development and its potential benefits. e-Learning for Educators used the Ready-to-Teach grant resources, including course development processes offered by the Education Development Center (EDC), a partner on the grant. Over the past 5 years, however, Terry and her team have modified that process, drawing especially on the Backwards Design (Wiggins & McTighe, 2005) and 5 Es (Trowbridge & Bybee, 1990) instructional design models. These models were chosen because they are the ones eMINTS and e-Learning staff want teachers in their programs to use; both eMINTS and e-Learning try to model what they want their teachers to do in their professional development sessions and courses.

One of the driving philosophies of e-Learning for Educators is that the program is not going to succeed in isolation. Says Terry, “We definitely believe that, as a program, we want to be sustainable. Partnerships are the key to that. We need to work with as many programs as possible and the territorialism that can crop up as an artifact of funding is counterproductive.”

Incentives: Professional development contact hours that can be used to meet professional classification requirements for state certification and optional graduate credit through local universities.

Instructional Design Considerations: The e-Learning for Educators team takes steps to educate its content development teams about online education: what’s possible, how to support collaboration, how to write good questions, and more. The content developers do not actually build the course online, however. If a content development team thinks, “we really need this video . . . or interaction . . . or a discussion board,” the team members themselves don’t have to create the technical capacity to add those elements to the course; project staff with expertise in the technologies take care of that. Terry says they do not want teams to get bogged down in the technical aspects, but such suggestions don’t automatically spark action either: just adding “techie” pieces to a course for the sake of having them is not encouraged. The content development team focuses on developing good content. Terry comments that outside agencies often ask, “Where are all the shiny pieces?” Her response? “We’re working on the content now. We can worry about the shiny pieces later.”

Lessons Learned: Individuals who apply to become course facilitators for the e-Learning for Educators program have to take a 10-week course on online facilitation. e-Learning has developed its own version of the course specific to the needs and expertise of educators in the state. Facilitators also have requirements they must meet as they teach throughout the year to maintain active status, and facilitator contracts have to be renewed each year. Requirements may include taking professional development courses offered by e-Learning, posting support materials that are shared with other facilitators in the group, contributing to the e-Learning blog, or writing a short article. Terry believes it’s important that facilitators remain connected to and contribute to the community of facilitators.

eMINTS provides technical staff, who in turn support the e-Learning staff in handling all technical and administrative tasks. This allows facilitators to focus on teaching and learning. e-Learning staff set up courses, load the content, prepare gradebooks, enroll students, and follow up with students who have participation issues. They also report grades and serve as the interface with the universities that offer credit.

There are many other organizations that provide professional development in Missouri, and Terry reports they tend to be in competition with each other. Both the eMINTS and e-Learning for Educators programs take a different approach: they try to give districts the capacity to deliver high-quality professional development, using either content provided by eMINTS and e-Learning or content developed by the district itself. According to Terry, “This is about collaboration, not competition. It’s okay if we empower a district to not need our services any more. There’s another district coming on down the road.” Districts are taught how to write their own courses and then encouraged to share with others. As Terry says, “It changes the relationship. We don’t have the capacity to write all of the content we need to write.” The districts are valuable collaborators.

As a part of modeling practices, both eMINTS and e-Learning opted to use Moodle as their learning management system (LMS) because it is a tool that many schools have access to. They also incorporate freely available Web 2.0 tools and consider Google Docs a standard tool for their courses. Terry says that you “want to model things that teachers can use in their classrooms. You’ve got to look at it from the perspective of your users. YouTube is blocked in most of our districts. Voki has advertisements and won’t be allowed by most of our districts. You have to go beyond ‘how cool is it?’”

Because LMSs evolve and many programs eventually have to change their LMS, which can be a complicated and intense process, Terry recommends that you not tie your system’s URL to your LMS (e.g., www.MyOPD.serviceprovider.com). You don’t want your URL to be tied to a piece of software, because it could change!
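
One way to follow this advice, sketched below with hypothetical hosts (the case study does not describe the program’s actual setup), is to publish a stable, vendor-neutral address and redirect it to whatever LMS is current. In practice, the same effect is usually achieved with a DNS record or a web-server rewrite rule rather than custom code.

from http.server import BaseHTTPRequestHandler, HTTPServer

# The only value that changes when the program migrates to a new LMS.
CURRENT_LMS = "https://lms-vendor.example.com"

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(302)  # temporary redirect to the current LMS
        self.send_header("Location", CURRENT_LMS + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), RedirectHandler).serve_forever()

Teachers bookmark the stable address; switching vendors then means changing one constant, not retraining every user on a new URL.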

Evaluation: eMINTS has a long history of program evaluation research. Since its inception in 1999, eMINTS has had multiple external evaluations of the efficacy of its professional development programs, so it’s not surprising that program evaluation and research are important elements of the e-Learning program as well. e-Learning recently participated in a twofold evaluation. The first portion was part of the overall project evaluation for the e-Learning for Educators grant program, which included pre/post-surveys of participants. e-Learning also has its own evaluation survey, which has undergone approval by the University of Missouri Institutional Review Board (IRB), and the two instruments differ somewhat. Program staff try to be careful about not overdoing it, not “surveying people to death.” The current model is pre/post, with a follow-up 6 months later. Common questions include, “Did you apply it in your classroom? Did you find it useful? Would you take another course?”

The second portion of the program evaluation involved a randomized controlled trial (RCT) of the effects of online professional development on both teacher content knowledge and student performance. Collectively, the four trials (Grade 4 English Language Arts [ELA], Grade 7 ELA, Grade 5 Mathematics, Grade 8 Mathematics) provide strong evidence that participation in the three e-Learning for Educators courses used in the study had a positive effect on teachers’ instructional practices and content knowledge. The trials also provide evidence that teachers’ participation in the courses can have a positive impact on their students. Although these effects are smaller and occur less consistently, a statistically significant impact was found for at least one student measure in each trial. A full description of the study and results is available at www.bc.edu/research/intasc/researchprojects/eLearning/efe.shtml. While e-Learning for Educators: Missouri has not had sufficient resources to conduct this type of rigorous program evaluation at the state level, funding for this type of evaluation will be sought.

References: Marzano, R. J., Pickering, D. J., & Pollock, J. E. (2001). Classroom instruction that works: Research-based strategies for increasing student achievement. Alexandria, VA: Association for Supervision and Curriculum Development.

Trowbridge, L. W., & Bybee, R. W. (1990). Becoming a secondary school science teacher (5th ed.). Columbus, OH: Merrill.

Wiggins, G., & McTighe, J. (2005). Understanding by design (Expanded 2nd ed.) [Electronic resource]. Alexandria, VA: Association for Supervision and Curriculum Development.

Florida School Leaders

Project Title: Florida School Leaders

Organization: Florida Department of Education

Contact: Henry Pollock, former Director of Education Retention Programs

URL: https://www.floridaschoolleaders.org/

Date First Implemented: 2005

Audience: Current and preservice K–12 educational leaders in the state of Florida

Need: Florida School Leaders built upon the success of a previous program, FloridaLeaders.net, funded by the Bill & Melinda Gates Foundation, which helped to establish a statewide network of coaches and mentors. The earlier model of using coaches and mentors, drawn primarily from retired high-performing school leaders, became the foundation for the new Florida School Leaders program.

Intended Outcomes: The old standards for principal certification became obsolete in 2000, so the Florida Department of Education convened a group of stakeholders, including faculty from institutions of higher education, district administrators, and other practitioners, who spent about 3 years developing a new set of state standards for what a high-performing school principal should be expected to do. To implement these standards, the state had to retrain all existing principals as well as work with the institutions of higher education that prepare the bulk of aspiring school leaders in the state. Florida School Leaders was developed to meet those needs. It was a hybrid model, with both face-to-face and online components.

Incentives: Participants in university educational leadership programs received appropriate graduate credit, and district school principal preparation programs received professional development hours for participation in approved William Cecil Golden School Leadership Development online modules and workshops.

Instructional Design Considerations: To improve collaboration between developers and provide a more consistent user experience, the Florida Department of Education developed its own website and required each of the partner organizations in the program to import their content into the state website. The state partnered with one entity that set up a consistent navigation and interface design to improve the user experience. This also gave participants a single sign-on to all content. All participants from across the state could access all of the components, regardless of where they were developed, and could pick and choose the most relevant components from the learning library, because not everyone needs the same materials or training. The management system allowed district and state monitoring of participants’ demographic data and program participation to better meet participants’ training needs.

Lessons Learned: In Florida, when money is available, the state usually posts a request for proposals (RFP), and different regional entities submit proposals to develop programs that meet the requirements of that RFP. What you end up with, according to Pollock, is “five flavors of the same ice cream.” Each region wants to develop and use its own training materials, not those from other regions. Florida School Leaders forced regional entities to collaborate: 50% of each award had to go to subawards to other partners, so no individual entity could win the entire pot of money. The result was that seven regional entities each ended up being the lead on one of seven components and a subcontractor on each of the other six. Each regional entity then developed products that were delivered statewide. The state had to provide some oversight and facilitate the relationships between organizations that were not used to collaborating. Each regional organization had areas of expertise, so the state allowed each to leverage its expertise while pulling in the complementary expertise of others.

Pollock notes that sometimes, “there can be a disconnect between higher education and K–12 practitioners” in terms of what each feels is important for educators to know and be able to do. Higher education faculty tend to want to create materials based on a faculty member’s area of expertise (e.g., law, finance, etc.), because that is often the reason they are hired. A school or district wants to have holistic materials in which different areas of expertise are embedded in a larger framework (e.g., legal requirements in special education, financial requirements across school programs, etc.) because that’s how their job duties are structured. Florida School Leaders took the input from university faculty but organized it in a holistic framework to meet the needs of the district target audience members.

The state tries to provide a lot of just-in-time training opportunities as well as longer course-based materials. The site also has a community model in which participants can post information from their school or district to highlight promising practices.

Pollock suggests that no matter what type of professional development effort you pursue, don’t get too hung up on one type of technology. Technology changes too fast. Try to be as “platform-neutral” as possible. You don’t know what’s going to be out there even 2 years from now.

LEARN NC

Project Title: LEARN NC

Organization: University of North Carolina at Chapel Hill School of Education

Contact: Ross White, Associate Director

URL: www.learnnc.org

Date First Implemented: 1997; first online courses launched in 2000

Audience: K–12 educators in North Carolina

Need: LEARN NC was founded in response to a state-based report that highlighted the disconnected nature of teaching practices in many of the state’s schools, where teachers routinely teach in isolation and have little opportunity to share their professional knowledge. Through work with six pilot districts, LEARN NC began by providing lesson plans, annotated descriptions of web resources, and discussion forums for teachers in the state.

Based on the success of their initial work, LEARN NC collaborated with the NC Department of Public Instruction, institutions of higher education, and other entities to develop online professional development that is then made freely available to school districts in the state.

Intended Outcomes: LEARN NC originally intended to provide access to vetted teaching resources as well as support for online learning to all K–12 teachers in the state. Rather than being an online learning provider, LEARN NC set out to license or purchase a learning management system (LMS) for every district in the state to use.

In 2001, LEARN NC worked with the NC Department of Public Instruction to develop content in a train-the-trainer model that was made available for free to teachers across the state. The following year, at the request of the Department, LEARN NC began developing courses for K–12 students and taught faculty to deliver them until the North Carolina Virtual Public High School was chartered in 2007. LEARN NC leveraged lessons learned from its K–12 experience to then develop online professional development.

LEARN NC initiates and oversees the development of content by partner organizations. More than 70 professional development courses are available free for school districts in North Carolina, with additional courses available directly from LEARN NC.

Incentives:

Instructional Design Considerations: The backwards design process is central to the instructional design philosophy at LEARN NC. White acknowledges that any time you work with a content expert with deep knowledge of and passion for their content, you can struggle with instructional design in terms of determining reasonable objectives for the learners and chunking the content appropriately. LEARN NC develops objectives first and then develops assessments to guide content development. Backwards design also gives instructional designers the opportunity at every step to ask for research or evidence to support proposed content, and many content experts are surprised to have to consider those questions.

LEARN NC has codified a content development process built around a team that includes a subject-matter expert, an instructional designer, and a sponsor. The sponsor, often the funder or someone with critical oversight of course development, initiates the project and makes sure the resulting product is not only good instruction but also meets the initial needs of the intended audience. The sponsor can also be the “tiebreaker” on decisions that split the content expert and instructional designer 50-50. White notes it’s best when the sponsor is available to review the project periodically, but if the team has to compromise on one aspect, it is often that review, because the sponsor may not always be available.

Lessons Learned: While the larger school systems in the state had greater capacity to create and deliver online learning than the smaller rural districts, they didn’t always do so. Of those that did, many were reluctant to adopt LEARN NC’s model of sharing content for free, perhaps, White surmises, because “teaching has traditionally been isolated and collegial sharing is undervalued.” The model did become popular in the smaller districts, though. More than 80 school districts in North Carolina are classified as rural, and much of the early use of LEARN NC came from those districts. White has noted some change recently: districts that develop content in the online program are now more likely to offer it up for statewide use.

LEARN NC first provided discussion forums to give teachers an opportunity for professional dialog. The forums were turned off in 2001: LEARN NC discovered that the open format of the forums wasn’t focused enough, so educators didn’t find enough purpose in participating. Based on that experience, LEARN NC is now much more purposeful about the use of technologies to support learning communities. Its use of discussion forums is now much more focused, usually limited to a cohort of individuals and moderated by a content expert. Most are also limited in duration, which may not be a traditional learning community approach, but it has proven a much more successful model, especially in terms of the value added for participants.

The notion that if we just give teachers a place to share, they will use it has not proven to be true for LEARN NC, according to White. He notes that in his experience, because of the many factors that vie for teachers’ time and attention, teachers need to feel they are getting back much more for every minute they contribute. One can take steps over time to demonstrate the benefits are there, but it takes a significant amount of effort and someone to champion that effort in order to be successful.

LEARN NC’s model of sharing content with local districts has been a hallmark of the program’s success. The program operates courses on a cost-recovery basis. It took time to convince funders that giving the content away was an effective use of state tax dollars, whereas a revenue-driven model would have limited the program’s reach, resulting in a less effective use of that money. According to White, “Revenue-generating models aren’t bad, they’re certainly necessary, but we just knew that we could move the needle further if we could offer systems a free option.” Once LEARN NC takes on a project, it makes an effort to be sure the project is not orphaned just because some new reform effort comes along. White is also proud that the model allows LEARN NC to provide content to local school systems that can then customize its use to meet their particular needs. He would like to see other states use LEARN NC’s content for noncommercial purposes as well as share their own, and he recommends noncommercial providers look at Creative Commons licensing as a way to share content while maintaining intellectual property rights.

LEARN NC worked with professional development coordinators in the state to determine current spending on professional development in order to gauge the market value of its courses and calculate an appropriate cost. They found that school districts often underestimate the full cost of professional development. If a district brings in a speaker for $5,000, it tends to forget the additional costs, such as substitutes, stipends for time spent outside of school hours, food, facilities rental, maintenance, and other hidden costs that can make the total much higher than the perceived $5,000.
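
The arithmetic is worth making explicit. The line items and rates below are hypothetical, for illustration only:

costs = {
    "speaker_fee": 5000,
    "substitutes": 30 * 100,   # e.g., 30 substitutes at $100/day (assumed rate)
    "stipends": 30 * 50,       # after-hours stipends (assumed rate)
    "food": 400,
    "facilities_rental": 600,
}
total = sum(costs.values())
print(f"Perceived cost: $5,000 | Full cost: ${total:,}")  # Full cost: $10,500

Even with these modest assumed rates, the full cost is more than double the perceived speaker fee.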

LEARN NC tries to develop content for a 5-year cycle, but 3 years is probably a more realistic time frame. Hidden costs for the program include updates or revisions to the content, at least annually. Incorporating new and emerging technologies is also a cost consideration over the life of a course.

Having had to change LMS products more than once, White notes that portability of content is an important part of being sustainable. If you tie yourself to a system with no portability, you’ll end up “hemorrhaging money.”

Evaluation: For several years, all participants in LEARN NC courses have had to complete an end-of-course survey in order to receive their certificates. A 6-month follow-up, which may actually occur anywhere from 3 to 9 months out depending on scheduling, asks questions similar to the end-of-course survey as well as whether the participant has put skills or knowledge from the course into practice.

Based on its involvement with the e-Learning for Educators program, LEARN NC recently implemented a precourse survey that asks the same questions about content and pedagogy as the end-of-course survey. The survey focuses not on skills and abilities but on dispositions, which allows it to be used across courses regardless of the content addressed. White notes that although such a measure is desirable, “few funders are willing to invest in the time and effort it takes to create a rigorous objective-based test” to measure changes in participant knowledge.

The NSTA Learning Center

Project Title: The NSTA Learning Center

Organization: National Science Teachers Association

Contact: Al Byers, Assistant Executive Director, e-Learning and Government Partnerships

URL: http://learningcenter.nsta.org/default.aspx

Date First Implemented: 2008

Audience: K–20 science teachers

Need: Research shows that many educators, especially those emerging from K–8 preservice degree programs, have minimal training in science content or science pedagogy, and modern teachers face so many constraints that they have little time during the school day to focus on professional learning. The original conception was to address the need for greater science content knowledge proficiency, an often-reported national need, with a “just enough, just in time, just for me” model. The effort was designed to fill the niche between online summer institutes and the formal moderated online short courses NSTA offers four times a year.

Intended Outcomes: The goal of the NSTA Learning Center is to provide access to numerous on-demand, self-directed, high-impact professional development resources and opportunities that cater to educators’ individual needs and learning preferences while providing a level of accountability for district administrators, in order to support teachers’ long-term professional growth.

The duration of the online professional development experiences ranges from 90 minutes for synchronous webinars; to between 2 and 10 hours for self-directed, web-based modules with unlimited e-mentor content support; to between 8 and 12 weeks for online, moderated, cohort-based short courses or extended graduate courses.

Overarching the Learning Center’s goals is the charge to establish a sustainable and scalable model of professional learning, accessible to all of the nation’s 3 million teachers of science, that includes content resources and tools to help teachers effectively diagnose, plan, track, and document their professional growth over time.

Incentives: Incentives depend on the type of resource or opportunity in which one participates. Incentives may include:
–Continuing education units
–Graduate credit hours
–Certificates of completion
–Pass/fail certificate after completing a final assessment at the conclusion of an online self-directed web module (SciPack)
–Release time for participation
–Stipends for some experiences, such as completing self-directed modules or generating reports using NSTA Learning Center tools, such as the professional development (PD) Plan and Portfolio tool

Instructional Design Considerations: NSTA employs current research and proven best practices in all the e-learning resources and opportunities it develops in partnership with its sponsors. For example, each Science Object is structured around a learning cycle modified for adult learners in the online environment. These objects are designed to challenge teachers to struggle with questions, observations, and simulations/representations of scientific phenomena and to apply their ideas in the inquiry-based approach espoused by the 1996 National Science Education Standards. The design of simulations is guided by research on the effective use of multimedia for instruction and incorporates emerging research on cognitive load. The objects also include embedded and final assessments to provide teachers with feedback and a means to track their progress, incorporate research about how people learn, and address preconceptions or misconceptions known to be common in understanding certain science concepts.

Lessons Learned: Correlating all the assets available in the Learning Center to the standards of all 50 states is a challenge. There are technological means (e.g., crawlers) for tagging resources and opportunities, but the alignments they produce are not foolproof. At some point, human reviewers must tag the content to ensure the most accurate alignment to the standards.
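
Automated tagging of this kind can be as simple as keyword matching, as in the illustrative sketch below (the standard codes and keyword lists are hypothetical). Its brittleness, matching words rather than meaning, is exactly why human reviewers remain necessary.

STANDARD_KEYWORDS = {
    "SCI.5.PS.1": {"force", "motion", "newton"},         # hypothetical codes
    "SCI.5.LS.2": {"ecosystem", "habitat", "organism"},  # and trigger terms
}

def tag_resource(description: str) -> list[str]:
    # Tag a resource with every standard whose keywords appear in its description.
    words = set(description.lower().split())
    return [code for code, keywords in STANDARD_KEYWORDS.items() if words & keywords]

print(tag_resource("Simulation exploring force and motion with carts"))
# -> ['SCI.5.PS.1']
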
To be compatible with the widest number of teachers, whose school technology may not be as current as home technology or technology used in other industries, NSTA makes sure its materials operate on different browsers, up to two versions behind the most current.

Byers notes, “You do need a critical mass of learners to support rich and worthwhile online discussions, especially mailing lists, discussion boards, and the like. Having trained moderators to facilitate discourse and increase engagement is paramount to help foster a vibrant professional online learning community.”

Scheduling and aligning the online professional development as part of the district’s larger systematic professional development is critical, which has implications for integrating online professional development into school calendars. Unfortunately, one 4-week pilot study back in 2008 overlapped with district student testing, and the participating teachers found it difficult to spend the time they would have liked because of their testing obligations.

An interesting finding from the pilot study was the low use of the science experts NSTA made available via e-mail. All participants reported this feature was helpful, but only 15.6% of the participants actually e-mailed one of the experts. While the low usage did not overburden the experts, Byers notes, “Going to the trouble of employing content experts to quickly reply to such messages represents an expense that might only be worth the trouble if more learners would take advantage of it.” Because participants noted the feature was helpful, NSTA has taken steps to make it more evident throughout the content and will investigate the use of live chat to better capitalize on the available experts. Perhaps the immediacy of feedback to support teachers’ professional needs at the moment they occur is an attribute of the support needed for self-directed learning. Usage of these mentors is now trending positively and in proportion to overall SciPack usage.

The success and impact of the professional development opportunities vary with how well the materials are presented and supported in the states and districts that use them. NSTA has observed that teacher attitudes and learning are better when the materials are truly incorporated into professional development plans with incentives and measurable milestones and are backed by technical support and administrator buy-in. According to Byers, “If the online professional development portal is presented simply as a URL that is forwarded via e-mail with a password to a large number of teachers, the impact is less significant.” With over 70,000 active users and 60 district and state departments of education using the Learning Center as part of their strategic professional development efforts, the heuristic for successfully combining face-to-face and online professional development is becoming clear.

Evaluation: NSTA has conducted one quasi-experimental research project, with the results published in the February 2008 Journal of Science Education and Technology (Sherman, Byers, & Rapp, 2008). In a three-district pilot of self-directed electronic professional development (n = 45), teachers were able to access on-demand content that incorporated a high level of interactivity via embedded simulations, questions, and hands-on learning opportunities in a self-contained 10-hour web module on force and motion.

Using an independent pre- and post-assessment developed and administered online by Horizon Research, NSTA found significant gains in teacher content knowledge across all three districts and all participants. The mean pretest score over the eight force and motion word problems for all participants was 70.6%, while the mean posttest score for the same items was 80.5%.

An online survey of 41 teachers using the SciPack professional development program indicated significant improvement in confidence levels regarding their ability to teach concepts related to Newtonian force and motion after completing the self-directed web module. In addition, 98% of the teachers found the content relevant to their needs and the embedded simulations worthwhile to their learning. Additionally, 96% said they would recommend the modules to their colleagues.

Another recent third-party evaluation used a pretest-posttest delayed-treatment control group design with random assignment, involving 56 teachers across Grades 5 through 8 from a large midwestern urban school district who explored two web modules. The study found significant gains in teacher learning, self-efficacy, and preparedness to teach the subject matter via repeated-measures analysis of variance for the treatment group versus the control group. Students taught by educators in both the treatment and control groups showed significant gains in learning, as one would expect after experiencing an educational unit. However, students taught by teachers in the treatment group showed significantly higher gain scores than those in the control group (i.e., students in the teacher treatment group started with lower preassessment scores and achieved higher overall postassessment scores than those in the control group). NSTA is encouraged by these outcomes.

An additional third-party study is underway with an outside evaluator to determine the impact of the NSTA Learning Center and the self-directed web modules in 15 school districts across the country. In addition to these formal studies, hundreds of deployments of the Learning Center are occurring, with extensive qualitative and anecdotal feedback captured and available for review at http://learningcenter.nsta.org/Testimonials.aspx.

Reference: Sherman, G., Byers, A., & Rapp, S. (2008). Evaluation of online, on-demand science professional development material involving two different implementation models. Journal of Science Education and Technology, 17(1), 19–31.

OPEN NH

Project Title: OPEN NH

Organization: New Hampshire e-Learning for Educators

Contact: Stan Freeda, Project Coordinator

URL: http://www.opennh.org

Date First Implemented: 2005

Audience: K–12 teachers in New Hampshire, especially those in schools with high-poverty, low-achieving populations

Need: OPEN NH came about through participation in the e-Learning for Educators project, funded by the federal Ready to Teach Program and organized by Education Development Center, Inc. (EDC) and Alabama Public Television. Its main partners are the New Hampshire Department of Education and New Hampshire Public Television.

Intended Outcomes: New Hampshire has a large rural population and many small schools, so participating in even local professional development can involve substantial travel. OPEN NH is designed to help districts provide high-quality professional development at reduced cost, since no travel is required, and with greater flexibility in when educators participate.
OPEN NH is especially focused on helping educators who work with students in high-poverty, low-achieving schools.

Incentives: Professional development hours (e.g., 35 hours for a 7-week course), which some districts define as continuing education units. Graduate credit from Plymouth State University in New Hampshire is also available.

Instructional Design Considerations: Courses run in 7-week sessions in the spring, summer, and winter. Because courses are discussion-based, each has a minimum enrollment; Freeda notes that a group of at least eight participants allows for richer discussion and better opportunities for facilitating learning.

OPEN NH follows the e-Learning for Educators model for developing courses and training facilitators. It prefers that participants who have taken or facilitated courses become course developers, and it has created its own developer and facilitator courses so it can continue to develop and offer courses if the grant funding goes away. Participants in these courses produce an additional course that can then be offered.

OPEN NH has developed an online orientation used in all courses that involves activities relevant to learning about online learning, such as reading a white paper or research study and then commenting on it in the discussion area. Because it focuses on principles of online learning, the orientation is more relevant to participants.

Lessons Learned: While other states in the e-Learning for Educators project offer courses for free, OPEN NH charges a registration fee of $130 per person per course. This helps build commitment and appears to reduce course attrition.

The program offers at least one course in each of the four core content areas each year. It also offers courses on popular topics, such as Web 2.0 tools, differentiated instruction, project-based learning, or working with English language learners, but it embeds them within the context of a content area. Courses are then facilitated by a content expert (e.g., a course on differentiating instruction in science would be taught by a science teacher). These pedagogy-specific courses seem to be more popular than content-area courses that focus on foundational knowledge, research, and theory in a domain.

OPEN NH has developed a course template in their LMS to make it easier for course developers as well as to provide some consistency across courses.

It has been difficult to promote the use of Web 2.0 tools that are not supported by the LMS. Participants don’t want to have to leave the LMS or to have multiple accounts for applications just to participate in a single course. Says Freeda, “There’s a fine line between how many log-ins you can support.”

Crafting an engaging discussion question is important. Freeda concurs with other providers that it can be difficult to engage people. OPEN NH wants participants to use the information from the course to support their own ideas, to make the content personally relevant. Discussions may include ways content may have to be modified to meet specific settings, such as for different grade levels or students with different needs.
OPEN NH has been trying to seed districts with people who have taken the courses or who know the program. Successful participants tend to encourage others to take their courses and can also give face-to-face support for those who might need it. They now budget and pay for a 2-hour, face-to-face orientation in four sites throughout the state.

Evaluation: OPEN NH participated in the evaluations conducted by the grant evaluator and has adapted similar pre/post instruments into an online survey for its own use.

PBS TeacherLine

Project Title: PBS TeacherLine

Organization: PBS

Contact: Melinda George, Senior Director, and Elizabeth Wolzak, former Senior Manager, Instructional Design

URL: http://www.pbs.org/teacherline

Date First Implemented: 2000

Audience: K–12 educators

Need: PBS launched MathLine in 1995 with funding from the U.S. Department of Education. In 2000, PBS built on this work through a Ready to Teach grant also from the U.S. Department of Education that was the impetus for the development of TeacherLine, which expanded into other content areas. TeacherLine now has more than 100 courses that are fully online and fully facilitated, as well as a new Peer Connection component that makes the learning objects in those courses searchable and usable in a more standalone fashion for coaches and mentors.

Intended Outcomes: Initial courses were designed to address the needs of K–12 teachers in core content areas.

Incentives: Continuing education units and graduate credit

Instructional Design Considerations: Drawing on its long experience, TeacherLine has done a great deal of work to determine what an effective online course looks like and how it should be structured. Wolzak developed a course outline for TeacherLine called a Performance, Objective, Assessment, and Activity Chart (POAAC) that guides the development process and facilitates conversations with partners or content experts. TeacherLine courses share similar strands, so they are familiar to those who take more than one course, and 33% of learners are repeat learners who take multiple courses.

TeacherLine provides a project manager and an instructional designer, and the institutions it partners with to develop content provide a project manager and a writer. These two to four people, depending on their expertise and commitments, use the POAAC to guide development, even when modifying existing content for online delivery.

Wolzak notes that it is important when designing online instruction to know the needs of your learners. The audience for professional development is adults, and adults have different needs; you don’t teach adults the way you teach children. She suggests modeling what you want them to do in their classrooms. Because they come with prior knowledge, she encourages reflection and incorporates a good deal of peer review and sharing of experiences.

All TeacherLine courses are project-based and have direct, real-world application to the classroom. A TeacherLine online course often has fewer activities than more traditional face-to-face instruction, and courses result in something tangible and highly relevant. Often the activities build toward a comprehensive whole rather than remaining isolated exercises. Language arts courses often incorporate case studies, and science courses model inquiry-based learning strategies.

All courses have performance objectives, and every objective is assessed through the use of a rubric. Multiple-choice assessments are not used.

Lessons Learned: One of the myths about online learning is that it’s always the same, but in reality, there are so many flavors of professional development. Review what other people are offering and make sure you know what your audience wants.

Self-paced learning is a tough way to get buy-in from teachers online. TeacherLine created some self-paced learning opportunities in the past, but they were not very popular, so everything it does now is facilitated. TeacherLine has a 94% completion rate, and George feels that facilitators make the biggest difference in whether a person completes successfully.

Every facilitator has a master’s degree or higher, and every facilitator must complete a 6-week online course about facilitation. TeacherLine has taken steps to make facilitation more of a profession by instituting periodic faculty meetings and peer observations. They don’t try to recruit as many facilitators as possible; instead, they want their best facilitators to return, and they offer a second course for veteran facilitators to strengthen their skills.

Much of the substance of a course occurs in the discussion area, so they think hard about discussion questions so that discussions remain manageable for both the learner and the facilitator. The discussion area can strengthen the development of community by connecting participants to peers and enabling them to learn together. The facilitator is the “guide on the side” and does not necessarily direct the discussion.

George wonders whether graduate credit for completion will continue to be acceptable for recertification as the nation moves toward the idea of highly effective teachers. She surmises that professional development programs will have to collect data demonstrating that they have helped teachers become more effective, not merely that teachers have finished a course. Graduate credit and continuing education units may not be enough incentive if that shift occurs.

TeacherLine uses metatagging. They built the original database but engage an outside organization to actually tag the learning objects as they are put into the database. Schools or districts can also license TeacherLine content in a SCORM-compliant form (see Chapter 5 for information about SCORM).
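
As a rough illustration of what a “SCORM-compliant form” implies, the sketch below generates the skeleton of a SCORM 1.2-style manifest for a single learning object. All identifiers, titles, and file names here are hypothetical, this is not TeacherLine’s actual packaging, and a production manifest would also declare the full set of XML namespaces and validate against the IMS Content Packaging schema.

    # A minimal sketch of a SCORM 1.2-style manifest skeleton for one
    # learning object. Identifiers, titles, and file names are invented;
    # a real manifest also declares XML namespaces and schema locations.
    import xml.etree.ElementTree as ET

    manifest = ET.Element("manifest", identifier="SAMPLE-LO-001", version="1.1")

    # SCORM version metadata read by the importing LMS.
    metadata = ET.SubElement(manifest, "metadata")
    ET.SubElement(metadata, "schema").text = "ADL SCORM"
    ET.SubElement(metadata, "schemaversion").text = "1.2"

    # The course structure: one organization containing one item.
    orgs = ET.SubElement(manifest, "organizations", default="ORG-1")
    org = ET.SubElement(orgs, "organization", identifier="ORG-1")
    ET.SubElement(org, "title").text = "Sample Learning Object"
    item = ET.SubElement(org, "item", identifier="ITEM-1", identifierref="RES-1")
    ET.SubElement(item, "title").text = "Inquiry Discussion Starter"

    # The physical files the item points to.
    resources = ET.SubElement(manifest, "resources")
    res = ET.SubElement(resources, "resource", identifier="RES-1",
                        type="webcontent", href="index.html")
    ET.SubElement(res, "file", href="index.html")

    # Write the package manifest that an LMS import routine would read.
    ET.ElementTree(manifest).write("imsmanifest.xml", encoding="utf-8",
                                   xml_declaration=True)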

Evaluation: TeacherLine has contracted with an outside evaluation organization to review the pre/post survey data, and early data have been used to monitor what is going well and to shape new offerings.
Through evaluation of the first Ready to Teach grant, TeacherLine determined what teachers wanted in terms of content and activities. The 100 courses were deconstructed and put into a searchable database embedded in a collaborative environment called Peer Connection, whose purpose is to provide information that supports coaches and mentors.

Pedagogical Principles of Distance Learning

Project Title: Pedagogical Principles of Distance Learning

Organization: World Health Organization

Contact: Steve Baxendale, Project Coordinator, Pacific Open Learning Health Net

URL: http://www.polhn.org

Date First Implemented: 2009

Audience: College professors learning how to modify existing face-to-face courses for online delivery

Need: As late as 2004, there were few continuing education opportunities for health workers (e.g., doctors and nurses) in the 14 Pacific Island countries and territories covered by the World Health Organization’s Representative Office in the South Pacific, and Steve Baxendale was hired to develop them. Medical officials in these countries do not specialize and often have to perform tasks that would be divided among several people in more developed countries. There is also a great shortage of health personnel: in some places there may be only one nurse or doctor on an island, and if that person leaves for training, the island’s people have no access to health care. It became imperative to provide ongoing training that health workers could complete without leaving their communities for long periods.

Intended Outcomes: This particular project targeted college professors who were currently offering courses in a face-to-face setting. The intended outcome was that these professors would learn to design and deliver effective online versions of courses in their area of expertise.

Incentives: Graduate credit

Instructional Design Considerations: The preferred approach is problem-based learning, sometimes incorporating case studies, often modeled after real events. The designers try to make the material relevant, including making sure that people represented in graphics and animations look like the target audience, in this case people from the Pacific Islands.

“Your audience members have to be able to relate to your content,” says Baxendale, “whether you are using scenarios; case studies; or the images, animations, and videos you include. If you have video staged in a resource-rich, pristine lab setting that is not similar to the experiences of your audience, you lose credibility and can completely turn off your audience.”

Lessons Learned: Conducting a needs assessment and including all of the relevant stakeholders has been a successful means of getting buy-in. WHO contacts health-related governing boards and institutions, health facilities, content experts, and practitioners (members of the target audience) as much as possible to determine what professional development should be provided.

The existing administrative system (academic services) was not set up for online courses, so the project has had to provide student support specific to an online setting. Online students do not follow the same procedures as other students because, in many cases, they cannot: they are often isolated, and their work obligations do not allow them to attend classes on a regular schedule.

Evaluation: Course evaluations are conducted at the end of the professional development, and most questions are qualitative. Baxendale corroborates other providers’ observation that there is a tension between funders wanting data that demonstrate impact and their not providing sufficient funding to actually observe participants in practice.