Public Health Rep. 2010; 125(Suppl 5): 24–32.
PMCID: PMC2966642

A Longitudinal Study of the Impact of an Emergency Preparedness Curriculum

Mary M. Hoeppner, EdD, MS, RN,a Debra K. Olson, DNP, MPH, FAAOHN,a,b and Susan C. Larson, MPH, RNa

SYNOPSIS

Objective

We conducted a longitudinal study to evaluate the impact of a curriculum designed to develop competency in emergency preparedness among public health professionals.

Methods

At six and 12 months following completion of one or more courses in the areas of emergency preparedness, response, and recovery, or in food protection, course participants were contacted and asked whether their participation had allowed them to develop targeted competencies, to identify important knowledge or skills they had acquired, to provide examples of how they applied their learning, and to describe the impact of the resulting changes. Over five years, 36 sets of data were collected.

Results

The response rate, counting participants who responded at six months, at 12 months, or at both time points, was 63%. At both six and 12 months, those who responded agreed that the learning activity helped them develop the competency associated with it in the curriculum plan. Respondents described multiple applications of learning and reported the development of reflective and systems-thinking abilities.

Conclusions

The results provide compelling evidence that learners do develop competencies that impact their work activities as a result of competency-based educational programming and are able to apply these competencies in their work and organizational activities.

The body of knowledge required for professional practice evolves as new challenges are presented and as specific competency requirements of a profession flex and change. The increasing incidence of and potential for terrorist events, ongoing natural disasters impacting larger and more diverse communities, and the advent of a novel pandemic influenza represent only a few of the challenges confronting public health professionals. Accordingly, new competencies are identified and existing competency sets are updated to better represent the knowledge, skills, and attributes (KSAs) needed to keep communities healthy and safe. This article describes a study conducted during a five-year period to evaluate a curriculum designed to develop the competency of public health professionals responsible for responding to these challenges.

In 2002, in response to the aforementioned issues, the University of Minnesota School of Public Health (UMNSPH) initiated conversations with key partners to identify how best to respond to emerging learning needs. Working through the University of Minnesota Center for Public Health Preparedness (UMNCPHP), this group conducted a comprehensive learning needs assessment and developed a standardized curriculum focusing on preparedness, response, and recovery competencies.1

UMNSPH decided to create and implement two academic certificate programs organized around the Bioterrorism and Emergency Readiness (BT/ER) Competencies2 to meet the needs of the large numbers of public health workers who were without formal academic preparation. These competencies represented basic knowledge and skills needed by public health practitioners to respond to a wide range of emergencies (Figure 1). The first certificate, the Public Health Certificate in Preparedness, Response, and Recovery,3 was designed to support acquisition of knowledge and skills necessary to effectively participate in planning, executing, and evaluating community public health preparedness activities. The second certificate, the Public Health Certificate in Food Safety4 (later retitled Food Protection), was designed to support those engaged in protecting the health and safety of the food supply from “farm to table.” Each of the curriculum plans shared course offerings targeted to the same BT/ER Competencies and also offered a larger selection of elective courses based on KSAs specific to the focus of the certificate. Generally, more than one course contributed to the development of a given competency.

Figure 1.
Bioterrorism/Emergency Readiness Competencies for public health professionalsa

The curriculum was grounded in an educational model proposed by Benner5 that framed learning as a developmental process. Benner believed that as the depth and breadth of KSAs develop and combine with experiential practice, each novice learner becomes increasingly proficient in the specific competencies under development. UMNSPH created a model for Lifelong Learning in Public Health Education1 based on this premise to guide the creation of learning activities targeted to various levels of competency. The Lifelong Learning Model was used to shape educational activities in the certificate programs.

To shape evaluation of the curriculum, we used the Kirkpatrick Model.6 Like the Lifelong Learning Model, Kirkpatrick's Model is hierarchical, moving from basic evaluation of learning to evaluation of the application of learning and its subsequent outcomes. Level 1 evaluation, "Reaction" (sometimes referred to as "the happiness index"), provided activity designers and instructors with baseline information regarding the degree to which learners were satisfied with the activity, but did not provide information on learning, application of learning, or outcomes. This level of evaluation was routinely employed for UMNSPH courses using a standardized evaluation tool. Level 2 evaluation strategies were designed to determine the degree to which KSAs were learned. A variety of evaluation strategies were used to measure student learning, including course presentations, group work, written papers, and pre- and posttest measures. Level 3 evaluation, characterized by a change in behavior, required contact with those who had completed courses to determine what, if any, behavior changes had occurred as a result of the newly acquired competencies. Level 4 evaluation addressed outcomes resulting from changes in behavior; an evaluation yielding data on the achievement of desired outcomes is an example of this level.

Competency reflects the ability of the learner to appropriately apply what has been learned in professional practice.7 It was, therefore, important to incorporate Level 3 and, to the degree possible, Level 4 measures into the evaluation plan. With this in mind, we decided that one strategy for evaluating the curriculum would be a longitudinal study to gather data about changes in behavior based on what had been learned in the courses comprising these two certificate programs, and to determine if and how new learning was applied as part of everyday practice. Because changes in behavior signaling integration of new KSAs occur over time, we decided that feedback would be solicited at six and 12 months following completion of coursework.

METHODS

Study design and recruitment

Learners accessed required and elective courses supporting development of the BT/ER Competencies through multiple mechanisms. One option was attendance at the University of Minnesota Public Health Institute (the Institute), which ran for one to three weeks and offered several concentrations of courses to students interested in gaining academic credit and to community professionals completing courses for professional development purposes. Because of the number of course options offered during these weeks (Figure 2) and the availability of courses supporting development of BT/ER Competencies, students enrolled in the Institute were selected as the target population for this study.

Figure 2.
Bioterrorism/emergency readiness concentration areas and courses, University of Minnesota Public Health Institute, 2003–2008

Between 2003 and 2008, each new cohort of certificate enrollees was offered support to attend required or elective courses, available through the Institute, that were designed to develop targeted competencies. The number of courses for which each student could receive support depended on financial resources that could be allocated to this activity each year. Typically, support was offered for attendance at one to three courses a year.

Students enrolled in the Public Health Certificate in Preparedness, Response, and Recovery and in the Certificate in Food Protection programs represented the first two groups to which participation in the study was offered. We also offered support to students enrolled in UMNSPH graduate degree programs who were known to be interested in developing competency in preparedness, response, and recovery, but who were not interested in obtaining a formal academic credential in emergency preparedness. The courses with BT/ER content represented electives for their academic program requirements. These students were offered a tuition waiver to attend courses incorporating BT/ER Competencies, and represented a third group of learners participating in the study. Students enrolled in the Public Health Certificate in Occupational Health and Safety were also offered support for course participation and included in the study, as several courses were offered at the Institute with an emphasis on preparedness targeted to this group. These students represented a fourth group of learners completing BT/ER-focused coursework as part of their program of studies. Also attending the Institute were public health practitioners who were not enrolled in formal academic programs, but who were participating to obtain discipline-specific continuing education credit. The Institute represented an important avenue for them to gain new competencies needed for their work in a focused period of time. Based on the specific course they attended, these learners were incorporated into one of the four groups of learners previously described (Figure 3).

Figure 3.
Potential study participants, Bioterrorism/Emergency Readiness Competency Study, University of Minnesota Center for Public Health Preparedness, School of Public Health, 2003–2008

We developed application materials to advertise the availability of support for course attendance. In the application, we asked people to identify current or previous work experience in the area of public health preparedness and response (or food safety, biosecurity, or occupational health and safety), how they felt courses would apply to their current or future work situation, and why financial assistance was needed. The application noted that those who accepted support would be contacted at six and 12 months to solicit feedback on the usefulness of the content to their practice and to gather feedback regarding coursework. A survey was selected as the mechanism for data collection, and a tool for each cluster of courses was constructed. Each survey included the same eight questions, with the exception of the listing of course options. We updated the survey tool annually to reflect changes in course offerings.

The first question asked participants to select from a list which course(s) they attended and, for each course, to identify which of the BT/ER Competencies they felt they had developed through their participation in the course. This question provided an indication of the degree to which there was a match between the competency intended to be developed by that course and the perception of the learner as to which competency they actually developed. Next, they were asked to identify the most important thing learned in each course attended (Question 2); if their learning had influenced how they thought about issues or their work (Question 3); how the learning influenced their work (Question 4); if they had applied what they learned in any manner and with what outcomes (Questions 5 and 6); and if the application of their new learning had affected colleagues or people in the community and the number of individuals impacted by the change (Questions 6 and 7). Finally, they were asked to identify what barriers they encountered in trying to apply their new learning (Question 8).

Because participants could choose from multiple course offerings, a large number of potential key content areas were contained in the courses. This large selection made it less feasible to create a comprehensive list of these content areas in a paper survey. We also anticipated that participants might want to identify important learning that emerged not from content, but from interactions with other participants or from other elements related to a learning experience. The range of potential applications of their new knowledge and skills was also anticipated to be wide due to variations in work and experience. For these reasons, the majority of the questions were framed as open-ended questions.

We drafted a recruitment message and secured approval from the University of Minnesota Institutional Review Board. An announcement of the availability of this support was developed and widely distributed with the assistance of UMNCPHP Advisory Cooperative members in Minnesota, North Dakota, and Wisconsin. Electronic messages were also inserted into a variety of UMNSPH venues. The academic program coordinator and UMNCPHP program staff reviewed applications and determined awards.

Data collection

At approximately six and 12 months following an Institute session, those students who received financial support through the awards were contacted and offered an opportunity to participate in the study. Originally, the study design called for the survey to be mailed. However, after the first iteration, we determined that e-mail represented a more efficient and effective way to contact potential participants, and e-mail was used for the remainder of the study. The option to complete a paper survey was offered, but with few exceptions, participants elected to complete the survey and return it as an attachment to an e-mail message. A follow-up reminder was sent two weeks after the initial contact to those who had not responded; data collection for each collection period lasted one month. Electronic submissions were printed and, along with any mailed surveys received, labeled with the appropriate code number; the e-mail messages were then permanently deleted. Data were entered into a Microsoft® Excel spreadsheet, with information organized by course offering.

Between 2003 and 2008, 387 people received tuition waivers or scholarships, and this group comprised the potential survey respondents. Multiple rounds of data collection were conducted during this time period (Figure 4). Twelve rounds of surveys were sent soliciting feedback from those who attended courses required as part of the Food Safety/Food Protection curriculum; 13 rounds were sent to gather data regarding the Preparedness, Response, and Recovery curriculum; six sets of surveys were sent to those who received tuition waivers; and five rounds of surveys were sent to those who received support under the umbrella of Occupational Health and Safety. In all, we collected 36 sets of data.

Figure 4.
Schedule of data collection per cluster of courses, Bioterrorism/Emergency Readiness Competency Study, University of Minnesota Center for Public Health Preparedness, School of Public Health, 2003–2008

RESULTS

For one set of data, the incorrect survey was sent, making data received from seven participants unusable. A total of 244 usable responses from individual participants were received at either six or 12 months; many participants responded at both data collection points.

As previously noted, the curriculum was designed so that multiple courses contributed to the development of a given competency. During the time this study was conducted, the 29 courses that comprised the certificate programs were offered a total of 53 times, with each course offered from one to 11 times. The data for all courses reflecting a specific competency were combined, yielding one set of data for each of the competencies. The number of datasets relating to a specific competency ranged from eight to 37 (Table 1).

Table 1.
Datasets per competency, Bioterrorism/Emergency Readiness Competency Study, University of Minnesota Center for Public Health Preparedness, School of Public Health, 2003–2008

The number of respondents who provided feedback for a specific course varied from course to course because those who received support enrolled in one or more courses. The number of respondents from the six- and the 12-month time periods also varied, given that data were not collected for each time period in all years and for all types of scholarships.

To determine the degree to which participants agreed that the curriculum reflected targeted competencies, we completed a number of calculations. First, we grouped all courses reflecting a specific competency. Next, the number of study participants who completed each of the courses reflecting a given competency was tallied, generating the total number of possible responses for each competency. This number was then compared with the actual number of responses received regarding that set of courses (Table 2). The possible number of responses for the entire curriculum was 1,719 at six months and 1,609 at 12 months for all learners across all courses and for all years. Next, we compared the total number of people who submitted competency data with those who indicated that the courses reflected the competency as delineated in the curriculum plan (Table 3). At six months, 64.1% of those who responded identified that the course(s) they attended helped them develop the competency linked to the course in the curriculum plan; at 12 months, 63.5% of respondents indicated the same.
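The grouping and tallying just described reduce to a simple per-competency aggregation. The sketch below is a minimal illustration in Python, not the spreadsheet procedure the study actually used; the course numbers, record fields, and course-to-competency mapping are all hypothetical.

```python
# Hypothetical sketch of the per-competency match-rate calculation.
# Course names, record fields, and the course-to-competency mapping
# are illustrative, not the study's actual data.
from collections import defaultdict

# Curriculum plan: each course targets one BT/ER Competency (illustrative).
course_competency = {
    "PubH 6100": 1,
    "PubH 6101": 1,
    "PubH 6102": 8,
}

# One record per (participant, course) response; `affirmed` is True when the
# respondent selected the competency linked to the course in the curriculum plan.
responses = [
    {"participant": "P01", "course": "PubH 6100", "affirmed": True},
    {"participant": "P02", "course": "PubH 6101", "affirmed": False},
    {"participant": "P03", "course": "PubH 6102", "affirmed": True},
]

# Step 1: group responses by the competency their course targets.
by_competency = defaultdict(list)
for r in responses:
    by_competency[course_competency[r["course"]]].append(r)

# Step 2: for each competency, compare affirmations against responses received.
for comp, recs in sorted(by_competency.items()):
    n_received = len(recs)
    n_affirmed = sum(r["affirmed"] for r in recs)
    print(f"Competency {comp}: {n_affirmed}/{n_received} "
          f"({100 * n_affirmed / n_received:.1f}%) affirmed")
```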

Table 2.
Number of responses per competency, Bioterrorism/Emergency Readiness Competency Study, University of Minnesota Center for Public Health Preparedness, School of Public Health, 2003–2008
Table 3.
Number of responses affirming competency development, Bioterrorism/Emergency Readiness Competency Study, University of Minnesota Center for Public Health Preparedness, School of Public Health, 2003–2008

The survey's open-ended questions were designed to elicit information about important elements learned and applicability of the content to the learner. The 813 comments submitted at six or 12 months identifying important elements learned varied greatly depending on the participant's specific work setting, the course(s) completed, and the certificate program in which the participant was enrolled. To aid in analysis, we clustered the comments (Table 4).

Table 4.
Clustered responses of important content, Bioterrorism/Emergency Readiness Competency Study, University of Minnesota Center for Public Health Preparedness, School of Public Health, 2003–2008

While many comments identified course content elements felt to be particularly helpful to the participant, many others reflected development of critical thinking and systems-thinking abilities. Participants noted that courses challenged their analytic abilities and challenged them to think systematically about how public health prepares for, recognizes, and responds to a variety of health issues. They identified a deeper appreciation for the complexity of planning and response, and for the need for collaboration across all agencies and levels of government. They recognized the need to use the National Incident Management System8 as a mechanism to organize elements of a plan and response. In considering the emergence of pathogens that cross borders through person-to-person contact, from animals, or through contamination of food, participants recognized the need for improved surveillance, for the use of technology to organize data, and for shared data management across agencies. They noted the importance of participating in coursework that stressed interdisciplinary practice and identified the value of having an opportunity to practice skills they were being taught in a classroom setting before being asked to apply them in an actual situation.

The survey asked for a description of how the participants had applied what they had learned, and the number and nature of the impacts of any efforts they had made to apply their new knowledge and skills. These questions garnered 274 comments. Multiple participants noted that they were better able to gauge the resources needed within their agencies and to advocate for these resources more effectively. Others offered examples of applications related to work with specific community groups or individuals, with a focus on improving food handling and safety, assessing risk, and revising agency surveys so that better data were collected from and about the community. Many noted that they had used what they had learned to inform and educate county and other elected officials, others in their own agencies or departments, and students enrolled in courses they were responsible for teaching. Some reported that they had conducted trainings that spurred citizens to engage in personal preparedness planning activities. One student who attended the Institute in multiple years described several deployments as part of military service. During these deployments, this person applied concepts from coursework in the development of training materials for use by the national military in the country of deployment and in response to "specific threats of intentional contamination of the army's water supply." Risk assessment and management skills were also used to protect the food supply in that country. Another participant reported the application of behavioral health skills during a Medical Reserve Corps deployment in response to Hurricane Katrina, while another reported on wild bird testing and "avian influenza" surveillance in Kenya. Reports included expansion of professional practice through participation in a national task force and through plans to enter doctoral programs or the U.S. Public Health Service. By far, the largest number of comments described ways that learners applied their new KSAs to develop a wider range of exercises, revise agency and/or department emergency plans, enhance and expand cross-agency collaboration, and improve risk communication and media relationships. Across all years and courses, 105 participants reported that changes they implemented impacted 364,216 individuals, with the potential to impact up to 884,053 people in their communities, agencies, or organizations.

A total of 131 participants identified barriers to implementation of learning. The most common barrier, noted by 43 (33%) of the participants who submitted comments to this question, was current enrollment as a student. Those in the workforce who responded to this question identified ongoing challenges with financial and human resources, and a lack of time to develop and implement changes in practice. In smaller agencies, staff "wearing multiple hats" struggled to find time to integrate new learning into community education materials or to create or update agency procedures. While courses taught students to use specific risk assessment/management software, this software was not always available in the work setting, limiting the ability to apply what had been learned. The challenge of working from a shared knowledge base was also noted, along with the challenge of trying to educate or convince those who had less knowledge but wider spans of control of the merit of new ideas: "Old habits die hard," as one participant reported. The impact of funding changes was also identified as a barrier, with layoffs and shifting agency priorities influencing the degree to which new knowledge could be applied. Participants also reported underutilization of expertise due to changing job responsibilities or the inability of an organization to link new skills to situations in which those skills could have been helpful.

DISCUSSION

Overall, participants' responses supported the conclusion that the curriculum delivered what it promised: development of the BT/ER Competencies. In all but one of the collection periods, more than half of the participants viewed all courses as supporting development of these competencies. Only the 12-month data for competency #8 showed less than 50.0% support for acquisition of the competency. The ratings were also fairly stable over one year, with 64.1% of respondents indicating they had met the competency requirements at six months and 63.5% at 12 months (Table 3). Level 2 of the evaluation plan specified that courses would use a variety of evaluation methods (e.g., final examinations, papers, and presentations) as evidence that learning had occurred. During this time period, 99.2% of all participants who attended courses offered during the Institute received a passing grade (defined as a grade of C– or higher). Those receiving scholarships or tuition support were a subset of that group. Given the high passing rate, it is likely that the majority of these students successfully completed course requirements. Additionally, the grade point average (GPA) for students enrolled in or graduated from the Public Health Certificate in Preparedness, Response, and Recovery and the Public Health Certificate in Food Safety/Food Protection programs was 3.8 (out of 4.0). These two groups of students represented 74.4% of the students participating in the study. This high rate of successful course completion and high GPA are evidence that learners successfully acquired new KSAs. Comments that the participants submitted revealed that they had applied, and were continuing to apply, what they had learned across a wide range of situations.

The language of the BT/ER Competencies includes higher-level skills such as "apply creative problem-solving and flexible thinking … and evaluate [the] effectiveness of all actions taken."2 Participants' submitted comments reflecting changes in analytic abilities and systems thinking provide an indication that reflective and critical thinking skills were developed through this curriculum.

Lessons learned

Conducting this research also provided important learning for UMNCPHP. The data collection timeline was ambitious, and competing priorities impacted the Center's ability to maintain the original data collection plan. What was envisioned as a simple project took far more time than anticipated, and some planned data collection points had to be eliminated. Difficulty in contacting study participants also impacted data collection. Changes in participants' personal situations or work organizations were common, and the degree to which this occurred was unanticipated. Many solicitations for participation were returned as undeliverable at both the six- and 12-month data collection periods. Some opportunities to solicit participation were lost as students graduated and discontinued use of their university e-mail accounts. For participants in the workforce, it is possible to suggest several reasons why this rapid turnover occurred, but impossible to identify the cause without engaging in speculation.

As previously described, those who planned the curriculum grounded it in the BT/ER Competencies. The competencies designated by the planners were compared with those identified in survey responses. Data analysis demonstrated that the majority of those who responded to the survey agreed that the courses reflected targeted competencies (Table 3).

In addition to selection of competencies by the curriculum planners and the students, we used a third process to gather feedback about the competency developed from a given course. While not a formal part of the study, UMNSPH continuing education (CE) specialists with advanced degrees in education or public health reviewed all of the Institute's courses to identify competencies. Their review incorporated participation in course planning meetings, analysis of course syllabi, and, at times, direct classroom evaluation. The competency selection of the CE specialist was reviewed as part of data analysis. The majority of the time, the competency selection by the CE specialist matched that of the curriculum planners, but there were instances when, based on their observation, analysis, and evaluation, the course as implemented focused on competencies not identified in the curriculum plan. This finding illustrates the difficulty of maintaining fidelity to a competency-based curriculum. That said, interim study reports generated each year and disseminated to course designers and faculty to inform planning for the next year were extremely valuable in refining courses offered as part of the curriculum.

CONCLUSIONS

While there are challenges associated with conducting longitudinal studies in educational settings, the results provide compelling evidence that learners do develop competencies impacting their work as a result of competency-based educational programming. Learners can identify what competencies they develop and articulate the ways in which they apply knowledge. They are motivated to use new learning to improve the preparedness of organizations and communities, but at times struggle with organizational readiness for change and with the reality of working during a time in which their abilities are being challenged in many ways. As educators, we have the tools to help them address these challenges through development and implementation of competency-based programming.

Footnotes

This project was supported in part through the Centers for Disease Control and Prevention (CDC) Grant/Cooperative Agreement #U90/CCU524264. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of CDC.

REFERENCES

1. Olson D, Hoeppner M, Larson S, Ehrenberg A, Leitheiser A. Lifelong learning for public health practice education: a model curriculum for bioterrorism and emergency readiness. Public Health Rep. 2008;123(Suppl 2):53–64. [PMC free article] [PubMed]
2. Columbia University School of Nursing, Center for Health Policy. Bioterrorism and emergency readiness: competencies for all public health workers. 2002. [cited 2009 Oct 26]. Available from: URL: https://www.train.org/Competencies/btcomps.pdf.
3. University of Minnesota School of Public Health. Public health certificate in preparedness, response, and recovery. [cited 2009 Oct 26]. Available from: URL: http://www.sph.umn.edu/programs/certificate/index.asp.
4. University of Minnesota School of Public Health. Public health certificate in food protection. [cited 2009 Oct 26]. Available from: URL: http://www.sph.umn.edu/programs/certificate/index.asp.
5. Benner P. From novice to expert: excellence and power in clinical nursing practice. Menlo Park (CA): Addison-Wesley Publishing Co.; 1984.
6. Kirkpatrick D. Evaluating training programs: the four levels. 2nd ed. San Francisco: Berrett-Koehler Publishers, Inc.; 1998.
7. Lucia A, Lepsinger R. The art and science of competency models: pinpointing critical success factors in organizations. San Francisco: Jossey-Bass; 1999.
8. Department of Homeland Security (US). NIMS resource center: about the National Incident Management System. [cited 2010 Jun 23]. Available from: URL: http://www.fema.gov/emergency/nims/AboutNIMS.shtm.
