Nursing Students as Epidemiologists: A Simulation Approach

Copyright © 2016 International Nursing Association for Clinical Simulation and Learning. Published by Elsevier Inc. All rights reserved.



Abstract

Simulation is commonly used in nursing education to teach clinical skills. Here, we describe the development processes, implementation, and evaluation of an epidemiology simulation used in a community and public health nursing undergraduate clinical course at the University of Pennsylvania. The simulation was designed to teach students the principles and concepts of outbreak investigation and was based on the 2003 Severe Acute Respiratory Syndrome outbreak in Toronto, Canada. The simulation places students in the role of a public health nurse in the outbreak investigation team, working in groups of five to seven students to complete analyses and make recommendations under time and information constraints. Since piloting in spring 2014, we have run the simulation three times (summer and fall 2014 and summer 2015). Student evaluations show high levels of engagement and interest and substantial increase in the skills and expertise required in an outbreak investigation. We share key lessons learned, including resources required for simulation development and delivery, revisions to the simulation format and content in response to student feedback, and transferability and sustainability of the simulation. Overall, simulation was a feasible and effective modality to teach epidemiology and should be considered in community and public health nursing courses.

Keywords: outbreak investigation, nursing students, education, simulation, epidemiology

Highlights

We developed an outbreak investigation simulation to teach nursing students principles of epidemiology.

Students reported increases in skills and expertise related to outbreak investigation.

We report lessons for future simulations including transferability, sustainability, physical settings, use of technology, expertise, and resources.


Simulation of various clinical situations is one teaching strategy used in nursing curricula. Simulation provides an opportunity for students not only to become competent in nursing-related skills required for the provision of care in acute settings (e.g., urinary catheter insertion) but also to gain additional skills from other academic disciplines and practice professions. Epidemiology is one such discipline; in this article, we present the development and implementation of a simulation to teach epidemiological concepts and skills to undergraduate nurses. Simulation can “replace or amplify real experiences with guided experiences that evoke or replicate substantial aspects of the real world in a fully interactive manner” (Gaba, 2007, p. 126). Simulations require students to actively participate, apply existing knowledge to problem-solving, and interact with peers and facilitators in high-fidelity settings. Simulation is a hallmark of the new undergraduate curriculum at the University of Pennsylvania (D'Antonio, Walsh Brennan, & Curley, 2013) and is used to some extent in every clinical course.

Another hallmark of our new undergraduate curriculum is the emphasis on public health and community nursing, reflecting the growing importance to the nursing field of population health and care provided outside a clinical setting. In the revised undergraduate curriculum, Nursing in the Community is a two-course-unit clinical course (i.e., twice the size of a typical semester-long course) taken at the beginning of the senior year. The goals of the revised Nursing in the Community course are to prepare students for public health nursing, community-based practice, and community-engaged practice. The syllabus and clinical placements for the course were informed by the core competencies established by the Quad Council, a coalition of four nursing organizations (The Public Health Foundation, 2014). Mastery of these competencies, including public health skills such as epidemiology, is critical to improve the quality and rigor of public health nursing.

As with all clinical courses in the undergraduate curriculum, Nursing in the Community includes a robust simulation component, comprising 36 hours across the semester. Although most of the simulation hours in the course focus on typical clinical scenarios in community or public health nursing (e.g., a home visit for a newborn and mother; a school-based clinic; a recently discharged patient with heart failure), we saw an additional opportunity to build competencies in epidemiology and public health through simulation. Notably, 10 of the 36 hours of simulation in the revised Nursing in the Community course are focused on two epidemiologic (vs. clinical) scenarios in which nursing students take on the role of a nurse epidemiologist. In the first, students serve on an outbreak investigation team for a rapidly emerging infectious disease in a fictional city; in the second, they assess a national cervical cancer screening program in a Sub-Saharan African country. In this article, we share the development, implementation, and outcomes of the outbreak investigation simulation for undergraduate nursing students in the community and public health nursing clinical course at the University of Pennsylvania School of Nursing.

Simulation Development

Resources and Roles

Development and implementation of the simulation was resource intensive (see Table 1 for personnel involved and their roles in both the course and the simulation). The effort was overseen by a course instructor (A.B.) with expertise in public health and epidemiology, who initially developed the concept of adding epidemiology simulations to the course. To support simulation development, we hired a current Master of Public Health (MPH) student (H.O.) to be the lead author and lead facilitator of the simulation. The Course Directors (L.P., C.B.) and Associate Course Director (M.H.) reviewed draft and final simulation content and assisted with facilitation during piloting and full implementation; all had participated in a simulation facilitator training course previously. Teaching assistants (T.J.S., H.T.) helped develop and validate grading rubrics and assisted with facilitation. The authors also consulted simulation specialists within and outside the nursing school, including a nationally recognized nursing simulation expert.

Table 1

Personnel Involved in the Simulation Development and Course Instruction

Course Staff | Qualifications | Simulation Training | Role in Course | Role in Simulation Development | Role in Simulation Implementation
H.O. | PhD, MPH | | Simulation Instructor | Lead author | Protagonist, lead facilitator, assessment
A.B. | PhD, MBA | | Course Instructor | Co-author | Facilitator, assessment
L.P. | EdD, MPH, RN | | Course Director (S2014) | Content review, pilot facilitator | Facilitator
C.B. | MSW, MSN, RN | | Course Director (F2014, S2015) | | Facilitator
M.H. | MSN, MPH, RN | | Associate Course Director | Content review, pilot facilitator | Facilitator, assessment
T.J.S. | MSN, RN, NP-C, PhD(c) | | Teaching Assistant | Rubric development | Facilitator, assessment
H.T. | MSN, CRNP, PhD(c) | | Teaching Assistant | | Facilitator

Development Process

In the initial planning phases in fall 2013, four epidemiologic topics relevant to community and public health nursing were identified as possible simulation scenarios: outbreak investigation, public health surveillance, screening and prevention, and population-based interventions. Of these, outbreak investigation was selected as the first epidemiology simulation to be developed given its high relevance to public health nursing and its suitability for simulation-based instruction.

Beginning in January 2014, the lead author and course instructor first developed a series of eight learning objectives for the simulation (Box 1), based on outbreak investigation content covered in popular epidemiology textbooks (Gordis, 2008; Nelson et al., 2006) and the Quad Council competencies identified as important for nurses working in outbreak investigations (Sistrom & Hale, 2006). The lead author then researched several historic outbreak investigations on which to base the simulation. The 2003 outbreak of Severe Acute Respiratory Syndrome (SARS) in Toronto was chosen given its available data, outbreak features conducive to meeting the learning objectives (e.g., disease transmission in the health care setting), and the controversial role of quarantine in containing the outbreak.

Box 1

Learning objectives

LO1. Identify and determine epidemiologic features of an outbreak.

LO2. Identify the determinants and risk factors of the disease outbreak.

LO3. Design appropriate questions for a case interview.

LO4. Calculate the attack rate and fatality rate of an outbreak.

LO5. Prepare an epidemic curve and know what information can be obtained from it.

LO6. Develop preventive plans at primary, secondary, and tertiary levels.

LO7. Prepare descriptive and analytic epidemiologic written reports for an outbreak investigation.

LO8. Communicate findings to authorities and to communities.
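Learning objective 4 reduces to two simple proportions. As an illustrative sketch (all counts below are hypothetical, not data from the simulation), the calculations can be expressed as:

```python
# Sketch of the LO4 calculations; all counts are hypothetical.

def attack_rate(cases: int, population_at_risk: int) -> float:
    """Proportion of the at-risk population that became ill."""
    return cases / population_at_risk

def case_fatality_rate(deaths: int, cases: int) -> float:
    """Proportion of identified cases that died."""
    return deaths / cases

# Illustrative values only
cases, deaths, at_risk = 247, 43, 12_500
print(f"Attack rate: {attack_rate(cases, at_risk):.2%}")
print(f"Case fatality rate: {case_fatality_rate(deaths, cases):.1%}")
```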

The simulation content was then developed using the framework outlined by Jeffries (2005). The framework informed the educational approaches and the interaction between facilitators and students. The framework also guided the design of the learning objectives, assessment of simulation fidelity, and cues used during the simulation. We planned a 5- to 6-hour experience during which groups of five to seven students would work together on real-time deliverables related to the investigation of the outbreak. With our learning objectives as an end point, we worked backward to identify assignments and deliverables that would demonstrate content mastery. From there, we determined the information and data sets provided to students and the analytic, problem-solving, and decision-making tasks they would be asked to undertake.

Simultaneously, the lead author conducted a comprehensive literature review of the 2003 SARS epidemic in Canada using the Scopus database and the key words Canada, SARS, and 2003. From the large body of published studies, three key articles were selected to form the core of the simulation (May & Schabas, 2012; Ontario Ministry of Health and Long-Term Care, 2006; Varia et al., 2003). Using these articles and other background data, a six-part simulation was developed addressing the essential components of outbreak investigations as captured in the eight learning objectives (Table 2). Each part focused on developing key skills and had one or more deliverables. Simulation materials included background reading to be completed before the simulation (Arizona Department of Health Services, 2013); the memos, assignments, data sets, and other material provided during the simulation; assessment rubrics and answer keys for the deliverables; and a detailed instructor's manual.

Table 2

Summary of the Simulation Components, Objectives Achieved, and the Skills Developed

Part | Time | Objectives∗ (1-8) | Materials Provided | Student Deliverables | Skills Developed | Quad Competencies Addressed† (1, 3, 6, 8)
Part I | 60 minutes | | Medical records, spot map | Containment measures, risk factors associated with outbreak, inferences from spot map | Extract relevant information from medical records, identify risk factors, leadership, decision-making |
Part II | 45 minutes | | Interview script of first admitted patient | Interview questions, results and insights from practice interview, revised interview questions | Interviewing, communication, leadership |
Part III | 40 minutes | | Line listing of cases | Epidemic curve, case definition | Generate epidemic curve, calculate attack rate and case fatality rate |
Part IV | 35 minutes | | Press release and scenarios | Public service announcement, recommendations for preventive measures | Prepare public service announcement, educate the public, leadership, communication, evidence-based decision-making |
Part V | 45 minutes | | Line listing of cases, press release | Epidemic curve, incubation period | Calculate incubation period, evidence-based decision-making |
Part VI | 95 minutes | | Line listing, population statistics, press release | Incidence rates, case fatality rates | Preparation of information to be included in analytic reports, leadership skills |

∗ Objectives: (1) identify and determine epidemiologic features of an outbreak, (2) identify the determinants and risk factors of the disease outbreak, (3) design appropriate questions for a case interview, (4) calculate the attack rate and fatality rate of an outbreak, (5) prepare an epidemic curve and know what information can be obtained from it, (6) develop preventive plans at primary, secondary, and tertiary levels, (7) prepare descriptive and analytic epidemiologic written reports for an outbreak investigation, and (8) communicate findings to authorities and to communities.

† Quad core competencies: (1) analytic/assessment skills, (2) policy development/program planning skills, (3) communication skills, (4) cultural competency skills, (5) community dimensions of practice skills, (6) public health sciences skills, (7) financial planning and management skills, and (8) leadership and systems thinking skills.
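The epidemic-curve deliverable in Parts III and V amounts to binning cases by symptom-onset date. A minimal, standard-library sketch with hypothetical onset dates (not the simulation's line listing):

```python
from collections import Counter
from datetime import date

# Hypothetical symptom-onset dates, one per case (illustrative only).
onset_dates = [
    date(2003, 3, 5), date(2003, 3, 5), date(2003, 3, 6),
    date(2003, 3, 6), date(2003, 3, 6), date(2003, 3, 8),
]

# An epidemic curve is a histogram of case counts per onset date.
epi_curve = Counter(onset_dates)

for day in sorted(epi_curve):
    print(f"{day}: {'#' * epi_curve[day]}")  # text-mode bars
```

In the simulation itself, students built the equivalent chart in Excel from the provided line listing.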

The University of Pennsylvania institutional review board deemed the simulation quality improvement rather than human subjects research and determined it to be exempt.

The Scenario

Early in the development process, we decided to fictionalize the outbreak and the setting to have more flexibility in developing simulation content and deliverables and to place the students in the same situation as the Toronto health officials at the time (i.e., unaware of the disease they were confronting). We renamed Toronto, Ontario Province, the relevant hospitals, and national and international organizations involved in the investigation; we also changed the disease from SARS to acute pneumonia–like syndrome (APLS, pronounced “apples”).

The simulation is delivered in person and through the university's online courseware platform (Instructure, 2011). On the day of the simulation, the course instructor (not in character) briefs the class on simulation logistics including break-out room assignments, schedule, use of courseware discussion boards, and assessment. To launch the simulation, the lead author addresses the students in the role of the simulation protagonist—an MPH-trained public health nurse. She informs the students (now public health nurse interns in the fictional city's Department of Public Health) about the first reports of an outbreak of an unknown disease.

After the briefing, students are dispatched in preassigned groups to “situation rooms” (designated break-out rooms in the same classroom building) to begin assisting in the investigation. The simulation facilitators and the protagonist remain at “Command Central” and also circulate among the breakout rooms to engage students (in character) as they progress through the simulation. Assignments are released and submitted through the courseware platform. Each assignment includes a memo (see example memo, Figure 1) from the protagonist to the nurse interns requesting an analysis, report, or other deliverable. To simulate the real-time, rapid pace of an outbreak investigation, memos and assignments are released only as teams submit prior assignments. Memos alert teams to recent developments in the investigation (e.g., new cases, laboratory confirmation, etc.), and the assignments require students to respond to the outbreak as it unfolds. Assignments are all team based; there are no individual assignments in the simulation. At the end of the simulation, the lead author/protagonist delivers a brief lecture on the SARS outbreak in Canada, and the students are given the opportunity to debrief and reflect on the simulation experience and complete an evaluation. The debriefing is facilitated by the course instructor who asks the students to share their experiences and to discuss aspects of the simulation they found challenging or surprising. Debriefing sessions last approximately 15-20 minutes. Comprehensive feedback is also provided to student teams later through assessment and grading of deliverables.

Simulation Implementation and Evaluation

The simulation was piloted in the spring 2014 semester with six student volunteers, including MPH and accelerated Bachelor of Science in Nursing (BSN) students. The pilot was conducted via e-mail as the simulation had not yet been built into the courseware platform (Instructure, 2011). After the pilot, the simulation was run in the summer 2014, fall 2014, and summer 2015 sections of Nursing in the Community (Table 3). The summer section is primarily composed of accelerated BSN students (those with an undergraduate degree in another field), and the fall section, of traditional students. In summer 2014, simulation materials were provided to students in hard copy and via Canvas based on student feedback from the pilot. Beginning in fall 2014, the simulation was delivered entirely via Canvas.

Table 3

Summary of Simulation Implementation, Personnel, Evaluation, and Assessment

Section | Number of Students | Delivery and Submission Mode | Schedule | Staffing | Course Evaluation | Feedback | Group Functioning | Submitted Assignments
Pilot—spring 2014 | 6 | E-mail | Two 3-hour sessions over 2 days | Lead facilitator/protagonist, two cofacilitators | Debrief with students | Make more paper based; enjoyed not worrying about a grade; too many calculations | Not evaluated | Not graded
Summer 2014 | 96 | Paper and courseware | Two 3-hour sessions with 1-hour lunch break on the same day | Lead facilitator/protagonist, one or two cofacilitators, one teaching assistant | Debrief with students | Reduce paper and go 100% electronic; simulation too long | Assessed by instructors via rubric | Each assignment graded pass/fail
Fall 2014 | 74 | Courseware | Two 3-hour sessions with 1-hour lunch break on the same day | Lead facilitator/protagonist, one or two cofacilitators, one teaching assistant | Evaluation form, debrief with students | Consider use of Google doc to allow all students to enter and view responses in real time | Assessed by instructors via rubric | Each assignment graded numerically
Summer 2015 | 100 | Courseware | One 5-hour session with a 15-minute break at teams' discretion | Lead facilitator/protagonist, one to two cofacilitators, one teaching assistant | Evaluation form, debrief with students | Consider allowing teams to move through simulation at own pace | Assessed by students | Each assignment graded numerically

The simulation was initially designed as a 6-hour experience. In summer 2014 and fall 2014, the simulation was run for the full class (74-96 students) from 9 a.m. until 4 p.m., with a 1-hour break. Beginning in summer 2015, the simulation was shortened by 1 hour and run in two sections on the same day (i.e., 7 a.m.-12 p.m. and 1 p.m.-6 p.m.), with a total of 100 students (50 in each section). This schedule aligned with the clinical simulation schedule for the course and accommodated the larger number of students in this section. The simulation requires, at a minimum, the lead facilitator/protagonist and two additional facilitators; a lecture hall that can accommodate the full class; and a breakout room for each small group. Ideally, each student has access to a laptop or tablet; in addition, computer-equipped breakout rooms enhance group collaboration.

Standardization

The simulation experience is standardized in several ways: the same lead protagonist is involved in both the morning and afternoon simulation, to ensure consistency in the information delivered. All facilitators review the instructor manual before the simulation and can access the lead facilitator to address any questions. A virtual discussion board on Canvas allows dissemination of information for clarification to all the active simulation teams. All simulation teams are provided with the same presimulation reference material to review, receive the same simulation materials, and produce the same deliverables. All students participate in the debriefing and are asked to complete the simulation evaluation following debrief.

Evaluation Process

The simulation has undergone continuous evaluation and revision. After the pilot and the first full implementation, students were qualitatively debriefed on the experience, and many student suggestions were incorporated into subsequent versions of the simulation. Facilitators took detailed notes during the simulation on content and process, which were compiled into action items for revisions. Beginning in fall 2014, students completed a qualitative and quantitative evaluation form at the end of the simulation assessing achievement of learning objectives, the quality of small-group interactions, suggestions for future simulations, and key takeaways, lessons learned, and skills acquired.

Facilitator Observations

Facilitators made detailed observations during the pilot that informed crucial revisions, including placing a time limit on case interviews, providing masks for interviewers, fictionalizing the location of the outbreak, encouraging all team members to participate in the construction of the epidemic curve, accommodating heterogeneity in Excel proficiency, and shifting the grading of group processes from instructors to students. Observations of the time required to complete each deliverable confirmed that planned times were sufficient.

Student Feedback

At the end of each simulation, students complete an evaluation consisting of seven items assessing learning and three items assessing team function, all scored on a five-point Likert scale. Open-ended questions address key takeaways or insights gained, the team dynamic, and suggestions for future simulations. Generally, students reported enjoying the simulation but felt that it was too long. Students suggested different modes of receiving memos and assigned tasks: the students from the pilot indicated a preference for a paper-based simulation, whereas the students in the summer 2014 session preferred a paperless simulation. Students from fall 2014 suggested running the simulation through Google forms, and students from summer 2015 suggested allowing individual groups to move through the simulation at their own pace.

Students welcomed the opportunity to interact with the protagonist/lead facilitator in character; this interaction was extensive for the pilot, but by necessity, was more limited in the full simulation. Breaking the simulation into two sections in summer 2015 increased interactions with, and feedback from, the protagonist. Based on feedback from the pilot and summer 2014 sessions, calculations at the end of the simulation (when students were fatigued) were moved earlier.

Evaluation Results

In fall 2014, student evaluations (N = 74) were conducted. Using Stata version 13 (StataCorp, 2013), responses were analyzed to determine mean scores and response frequencies for each of the eight learning objectives. Students reported substantial increases in skills and expertise for most learning objectives (Figure 2). For learning objectives 4 and 5, >80% of students indicated that their skills and expertise had increased "a lot" or "to a great extent." The poorest-performing learning objective was "Develop preventive plans at primary, secondary, and tertiary levels," with fewer than 50% of students feeling that their skills had increased in this area. This result led us to critically review the simulation content focused on preventive plans; we determined that the topic was not explicitly addressed in any of the assignments, implicitly validating the student feedback. Students also assessed their group effort on five dimensions (engagement, contribution, staying on task, social interaction, communication), and 93% of the students rated the team effort as either very good or outstanding.
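The published analysis used Stata; an equivalent tabulation of mean scores, response frequencies, and the top-two-box share can be sketched in Python with hypothetical five-point Likert responses (illustrative values, not the fall 2014 data):

```python
from collections import Counter
from statistics import mean

# Hypothetical five-point Likert responses (1 = not at all,
# 5 = to a great extent) for one learning objective.
responses = [5, 4, 4, 5, 3, 5, 4, 2, 5, 4]

score_mean = mean(responses)
frequencies = Counter(responses)

print(f"Mean score: {score_mean:.2f}")
for score in range(1, 6):
    n = frequencies.get(score, 0)
    print(f"{score}: {n} ({n / len(responses):.0%})")

# Share of students reporting "a lot" (4) or "to a great extent" (5)
top_two = sum(1 for r in responses if r >= 4) / len(responses)
print(f"Top-two-box share: {top_two:.0%}")
```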

Figure 2. Students' assessment of the extent of increase in skills and expertise with respect to each learning objective at the end of the simulation, fall 2014. LO1: identify and determine epidemiologic features of an outbreak. LO2: identify the determinants and risk factors of the disease outbreak. LO3: design appropriate questions for a case interview. LO4: calculate the attack rate and fatality rate of an outbreak. LO5: prepare an epidemic curve and know what information can be obtained from it. LO6: develop preventive plans at primary, secondary, and tertiary levels. LO7: prepare descriptive and analytic epidemiologic written reports for an outbreak investigation. LO8: communicate findings to authorities and to communities.

The open-ended responses were reviewed several times, and frequencies of the most common comments were calculated. The comments were then organized into themes (see Table 4 for key themes and representative quotes). Based on responses to the open-ended questions, students were engaged and interested in the simulation and learned about the key challenges of outbreak investigation. Some students felt pressured by the time constraints, and although most were satisfied with working as a team, others felt that group members did not contribute equally to the assignments. This feedback led to the addition of a peer assessment component to the simulation grade beginning in summer 2015.

Table 4

Quotes Relating to Emergent Themes on Simulation Learning Process and Group Work

Emerging Themes and Representative Quotes

Student interest and engagement
"It was interesting to be able to think in the shoes of a public health nurse and made lecture concepts more realistic and easy to understand."
"Great benefit derived from acting in role of public health officials investigating outbreaks and completing associated tasks --> think[ing] about how to respond in leadership position"

Achievement of learning objectives
"Creating the interview questions and having the role-playing mock interview was really helpful. I learned more about what to look for and how specific questions need to be."
"I learned how to create and analyze an epidemic curve-that was cool!"
"Helpful to learn how to calculate the statistics in a real-world scenario rather than in a lecture or homework assignment"
"There has to be a lot of critical thinking involved—you cannot consider just one patient and their disease. You have to think about who they exposed, what kind of symptoms they have, how they are affecting other people. Containment and treatment measures affect everyone."

Pace and schedule
"This was great but way too long. Hard to stay engaged."
"Shorter! But the simulation was very helpful"
"I liked the simulation overall but it was way longer than necessary (in my opinion!). I think it could have been done in ∼4 hours rather than 6 (some of the tasks were repetitive, things we covered a lot already in class)"

Group work
"It was sometimes hard to formulate answers to questions with the input of so many people"
"Group worked well together but it was stressful when faculty came to observe knowing that part of our grade was based on that brief interaction, i.e., one group member was quiet then even though they were an important contributor to the group"
"My team rocks! Everyone contributes based on their own strengths (technology, creativity, communication, etc.)"
"Having a large group was really helpful for not only breaking up tasks but also for bringing new ideas into our deliverables. I think this kind of simulation definitely functions best within teams."

Surprises
"The impact nurses have in spreading disease in a healthcare setting."
"How difficult it was to determine all of the potential ways to control and spread the disease and how to address it publicly"
"I was surprised by how difficult it was to come up with interview questions."

Assessment of Student Performance

Simulation deliverables were qualitatively reviewed in summer 2014 to ensure that students demonstrated an understanding of epidemiology content and skills. Samples of student deliverables from summer 2014 (assignments from Part III to Part VI) were graded by two facilitators (H.O. and T.J.S.) to validate the grading key. The grades awarded were consistent: facilitators gave full marks to the same assignments and partial marks to the same assignments, and remaining differences did not significantly affect the overall grade. Beginning in fall 2014, all group submissions were graded by the lead author. In most of the assignments (e.g., generation of an epidemic curve, attack rate and case fatality rate calculations, reading a spot map, interviewing cases, reviewing medical records, and developing public service announcements and press releases), the students demonstrated excellent mastery of epidemiology skills and content and earned average grades above 90%. Particularly challenging assignments included generating hypotheses about the outbreak from preliminary data and calculating the incubation period from an epidemic curve.
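For the incubation-period assignment that students found challenging, the underlying arithmetic is simply differencing exposure and symptom-onset dates for each case. A sketch with hypothetical dates (not the simulation's data):

```python
from datetime import date
from statistics import median

# Hypothetical (exposure, symptom onset) date pairs for three cases.
cases = [
    (date(2003, 3, 1), date(2003, 3, 5)),
    (date(2003, 3, 2), date(2003, 3, 8)),
    (date(2003, 3, 3), date(2003, 3, 7)),
]

# Incubation period: time from exposure to symptom onset.
incubation_days = [(onset - exposure).days for exposure, onset in cases]
print(f"Median incubation period: {median(incubation_days)} days")
```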

We developed an assessment rubric for the quality of small-group collaboration consistent with the principles outlined by Wolf and Stevens (2007). We used the rubric to evaluate group work in summer 2014 and refined it for fall 2014. Two instructors (the course instructor and a teaching assistant [TA]) assessed the group work; they met before the simulation to discuss the rubric and describe scenarios consistent with each score. To further minimize bias, groups were assessed multiple times throughout the simulation and received the average score from all observations. Based on student feedback, the role conflict created when facilitators were also asked to assess group functioning, and constraints on the amount of time facilitators could spend with each simulation group, we transitioned this assessment component from faculty to students beginning in summer 2015, using a peer evaluation point allocation system developed at the University at Buffalo National Center for Case Study Teaching in Science (Evaluating Student Case Work, 2015). The average peer participation grade was 9.9/10 (range, 8.0-10.6). The outbreak investigation simulation grade constituted 5% of the final course grade, with the peer participation score (which included participation in a second epidemiology simulation and in-class case groups) weighted an additional 5%.

Discussion: Lessons Learned and Next Steps

In this article, we have described the development, implementation, and evaluation of an outbreak investigation simulation taught in the epidemiology unit of a required BSN community and public health nursing clinical course at the University of Pennsylvania. The fictionalized scenario was based on the 2003 SARS outbreak in Toronto, Canada. The simulation has been piloted and run three times as of summer 2015. An ongoing evaluation and revision process incorporates instructor observations, student feedback, and student evaluation.

Wolf (2008) found that simulation-based learning in the health professions has proven to be an effective way to integrate didactic and clinical material, particularly in settings and for situations that students are unlikely to encounter during clinical placements. Ensuring that nursing students are prepared to assist in public health emergencies, such as rapid-onset infectious disease outbreaks, is a high priority. This was made very clear to us in fall 2014 when we ran the simulation at the height of the West Africa Ebola outbreak. Indeed, one of the major challenges to containing the Ebola outbreak was lack of sufficient training in outbreak investigation, including contact tracing and public health communication (Summers, Nyenswah, Montgomery, Neatherlin, & Tappero, 2014), two skills that are central to this simulation. Providing an opportunity for nursing students to learn these skills in a simulated environment is an appropriate and effective way to build the public health nursing workforce and enhance the capacity of all nurses to provide care to communities in crisis. The experience developing and implementing the simulation offers some important lessons for instructors with similar pedagogical goals for BSN courses in community and public health nursing.

Lessons Learned

What Expertise Is Needed for Simulation Development and Implementation?

The simulation's emphasis on epidemiologic content required an interdisciplinary development team that included instructors trained in public health and epidemiology. Public health nursing and community nursing expertise was also critical to ensure that the nursing perspective was maintained. Finally, expertise in simulation-based and case-based learning guaranteed that learning objectives could be translated into high-fidelity situations and assignments. This interdisciplinary approach enabled us to equip students with epidemiologic skills in the context of nursing practice.

Real or Fictionalized Scenario?

Our decision to fictionalize the 2003 SARS outbreak in Toronto was based on feedback and observations during the pilot. When the students were aware that the disease was SARS, they searched for details about the outbreak on the Internet during the simulation to find additional information or “answers” to assignments. We determined that a fictionalized scenario gave us more flexibility to alter details and timelines and required the students to operate under the same information conditions as the outbreak team in Toronto.

How Resource Intensive Is the Simulation?

Simulation development required 180 hours over 16 weeks. Piloting took another 10-12 hours. In our case, an MPH student was hired as the lead author of the simulation, with substantial involvement from course instructors. Implementation of the simulation requires, at a minimum, two facilitators; additional facilitation is warranted if the class size is large. Ideally, facilitators are teaching assistants or course staff who know both the simulation content and the students well. Some background in epidemiology and public health is helpful for answering questions about, for example, calculating attack rates and case fatality rates or interpreting an epidemic curve. If the simulation is staffed at a minimal level, a detailed instructor manual and use of a courseware discussion board during the simulation can help. Grading the simulation is also time consuming given the number of assignments and, in a large class, the number of teams. We shifted part of the grading burden to students by having them assess peer participation.

Are Physical Setting and Technology Important?

We designed the simulation to be carried out by small groups working in breakout rooms. We have been fortunate to be able to schedule sufficient breakout rooms for all student groups in the same classroom building as the lecture hall. Most of our breakout rooms are equipped with computers, and we recommend that students bring a laptop or tablet. We observed that available technology did affect collaboration style. For example, in some groups, each student had a laptop and worked independently. In groups where not every student had a laptop, subgroups were formed with two to three students working together on one laptop with more discussion and verbal communication. In breakout rooms with a large projection screen, students often projected work in progress from a shared Google document so that the full group could contribute. These groups demonstrated the most engagement and spent the least amount of time collating and submitting assignments.

Can the simulation run without breakout rooms? Could student groups work efficiently at tables in a student lounge or cafeteria, in fixed chairs in a lecture hall, or even sitting in hallways or outdoors? In a distance learning setting, could groups function over Skype, FaceTime, GoToMeeting, or Google Chat? Although we think that these are all feasible options (and indeed may best simulate the true conditions of an outbreak team), our experience suggests that computer-equipped breakout rooms are ideal.

Paper-Based or Paperless? Courseware or Freestanding?

We have trialed three different modes of disseminating materials to student groups during the simulation: e-mail, paper plus Web, and Web only. Students have heterogeneous preferences, with some strongly preferring paper hard copies of data sets and other information, and others horrified by the "waste" of paper. We have determined that a paperless simulation delivered through our existing courseware is feasible, acceptable, and effective. We remain intrigued by the student suggestion to migrate the simulation from courseware to Google Forms. With additional resources, we are confident that a stand-alone (i.e., not within courseware) solution could be developed that would facilitate broader dissemination and adoption.

Should the Simulation Content Change to Reflect Current Events?

Our simulation is based on an actual outbreak that occurred >10 years ago. One advantage of this is that data about the outbreak are widely available and several published studies review the investigation in detail. However, as we prepared to run the simulation in fall 2014, the Ebola outbreak in West Africa was at its height, garnering considerable media attention and student interest. We debated rewriting the simulation to focus on the Ebola outbreak to capitalize on its timeliness and salience. However, we quickly deemed this infeasible given the resources needed to develop a new, high-fidelity simulation. We did mention the Ebola outbreak during the debrief and linked the outbreak investigation skills needed in West Africa to the tasks and assignments the students had just completed in the simulation. We noted that student interest and engagement in the simulation were highest in fall 2014 compared with other semesters, with more students mentioning the relevance of the materials in evaluation forms.

Is the Simulation Sustainable and Transferable?

We originally intended for the lead author (H.O.) to develop the simulation and then transfer it to the permanent course faculty for facilitation after the pilot. We ran the simulation in summer 2014 without the lead author present, as she was traveling to a conference at the time. We quickly realized that her deep familiarity with the simulation as lead author was difficult to transfer, even with a detailed Instructor Manual, and with three instructors having participated in the pilot. The lead author then served as a lead facilitator in fall 2014 and summer 2015, but we have set an explicit goal for the Course Director, Associate Course Director, and Course Instructor to be comfortable running the simulation going forward. To ensure longevity and sustainability of the simulation without compromising its validity or pedagogical value, we plan to recruit epidemiology and nursing doctoral students to join the simulation team as facilitators, expand the existing instructor manual with detailed step-by-step instructions for simulation delivery to facilitate course faculty's ability to run the simulation with confidence and ease, and transition the lead facilitator role to permanent course faculty over the next year.

Next Steps

We continue to run the simulation twice each year, in the summer and fall terms. Going forward, the simulation will be revised in consultation with the International Nursing Association for Clinical Simulation and Learning Standards of Best Practice: SimulationSM to ensure that the standards are upheld (International Nursing Association for Clinical Simulation and Learning, 2013). We will continue to view simulation development as an iterative process and revise the simulation to address student feedback and the realities of class size and available time. In response to overwhelming student feedback, the next major revision we will undertake is to change the courseware settings so that teams can move through the simulation at their own pace. We are sensitive to the fact that this may diminish the "real time" feel of the simulation, which may alter the students' overall experience of an outbreak investigation. At the same time, we believe a self-paced version will facilitate implementation. We are investigating opportunities and available resources to package the simulation for wider dissemination and hope that it will be adopted by other nursing programs.

Simulation is a valuable mode of learning for nursing students, permitting immersion in a high-fidelity but low-risk scenario they might not otherwise have an opportunity to experience. Priming students with simulation exercises lets them build skills, make mistakes, and explore new career opportunities. With this synthesis and evaluation of our experience with an epidemiology simulation for undergraduate nursing students, we hope to highlight the importance of public and population health education and the potential for simulation to reinforce content learned in a community health nursing course. We look forward to continued dialogue about optimal pedagogical approaches to incorporating epidemiology and public health content into the nursing curriculum.

Acknowledgments

The authors thank Patricia D’Antonio, Kathleen McCauley, Ann Marie Walsh Brennan, Beth Mancini, and Antonia Villarruel for valuable advice and generous intellectual and financial support for the development and implementation of the simulation. The authors are also grateful to Deborah Becker, Ann Marie Hoyt-Brennan and the staff of the Helen Fuld Pavilion for Innovative Learning and Simulation for sharing their considerable expertise in simulation-based learning and for their logistical assistance with the simulation.