Since the events of September 11, 2001 (9/11), health-care institutions have been encouraged to enhance their readiness for disasters. The Joint Commission (previously the Joint Commission on Accreditation of Healthcare Organizations) has, since 2001, required member hospitals to complete an annual hazard vulnerability analysis (HVA), which is expected to provide a foundation for emergency planning efforts. A literature search revealed that little has been published on HVA since that requirement came into effect, and no known investigation of current HVA procedures has been completed.
To begin to address this gap, researchers from the Harvard School of Public Health and the Southern Maine Regional Resource Center for Public Health Emergency Preparedness (SMRRC) interviewed staff members at eight hospitals in Maine to document current HVA processes and develop recommendations for improvement. SMRRC is one of three regional nonprofit hospital-based centers in Maine guiding health systems and public health preparedness activities.
Hospitals and other health-care organizations have always had to prepare for and respond to a wide array of routine emergency and catastrophic disaster events. Since the terrorist attacks of 9/11 and subsequent attention and funding from the U.S. Department of Health and Human Services and Department of Homeland Security, hospitals have been urged to substantially expand their response plans and overall readiness for disasters. Hospitals are now expected to develop, implement, train, and exercise comprehensive all-hazards emergency management and operations plans. These planning efforts need to be inclusive of all four phases of emergency management: mitigation, preparedness, response, and recovery.
Emergency management programs and their associated emergency operations plans are only as good as the assumptions upon which they are based. This is especially true at the local level, where planning must take into account specific risks unique to the immediate environment. Local priorities need to be considered, in addition to those required by federal and state authorities and detailed in the goals, objectives, and deliverables tied to all funding streams. However, local priorities based on opinion alone, rather than on objective data, provide a weak foundation for planning. Reliance on expert clinical or administrative staff opinion by itself can result in waste, duplication, missed opportunities, siloing, and confusion over the true priorities in terms of threat, vulnerability, and risk.
In the 2001 edition of its Comprehensive Accreditation Manual for Hospitals, the Joint Commission significantly revised the existing standard for emergency management.1 For the first time, the Joint Commission was guiding hospital emergency preparedness efforts “into the same arena as emergency management in the community as a whole.”2 Hospitals were now expected to function as an “integrated entity within the scope of the broader community.”
The 2001 standard required that hospital response plans be “based on a hazard vulnerability analysis (HVA) performed by the hospital.” Although HVA was a relatively new term for hospital staff, the concept itself was not.2 The Joint Commission defined HVA as “the identification of hazards and the direct and indirect effects these hazards may have on the hospital.” The actual or anticipated hazards are analyzed in the context of the population at risk to determine the vulnerability to each specific hazard.
Hospital emergency managers have long performed HVAs in their heads, as “much of the process is highly intuitive.” For example, hospitals in the Midwest do not need to plan for hurricanes, while those along the Atlantic Coast must. Even the way risk has been defined both qualitatively and quantitatively for hospitals is wide-ranging in its scope and use. As a result, “risk may be one of the most elusive concepts in health emergency management.”3
While mandating that hospitals perform HVA, the 2001 Joint Commission standard did not formalize the process for doing so. Additionally, the Joint Commission did not offer a specific tool to normalize the process in hospitals. While the American Society for Healthcare Engineering (ASHE) of the American Hospital Association offered the first standard methodology in 2001 for performing a hospital HVA,2 a wide array of other tools and methods also became available for hospitals to utilize for risk and vulnerability assessment.3
Later in 2001, Kaiser Permanente developed its own modified instrument, the Medical Center Hazard and Vulnerability Analysis tool.4 This tool expanded both the guidance and the scope of hazard “events” that hospitals should consider. Specifically, it expanded the risk measures to include human impact, property impact, and business impact. Each measure was rated separately for each event and weighted in the final vulnerability score. Likewise, the mitigation measure was expanded from the ASHE tool, which simply rated preparedness as “poor,” “fair,” or “good.” The new tool broke mitigation down into preparedness (preplanning), internal response (time, effectiveness, and resources), and external response (community/mutual aid staff and supplies). This final measure reflected the intended outcome of the new Joint Commission standard by assessing hospitals as community organizations rather than stand-alone institutions.
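The scoring logic described above can be illustrated with a short sketch. This is a simplified illustration in the spirit of a Kaiser-style weighted vulnerability score, not the exact published formula; the 0–3 rating scale, equal weighting of the six severity/mitigation measures, and the percentage normalization are all assumptions made for the example.

```python
# Illustrative sketch of a Kaiser-style HVA relative-risk score.
# Assumptions (not from the published tool): each measure is rated
# 0 (low/none) to 3 (high), the six severity/mitigation measures are
# weighted equally, and the result is normalized to a 0-100% scale.

def relative_risk(probability, human, property_, business,
                  preparedness, internal, external):
    """Combine a 0-3 probability rating with six 0-3 impact/mitigation
    ratings into a single 0-100 relative-risk percentage."""
    severity = (human + property_ + business +
                preparedness + internal + external) / 6  # mean, 0-3
    return round((probability / 3) * (severity / 3) * 100, 1)

# Hypothetical event: moderate probability, high property impact,
# heavy reliance on external (mutual aid) response.
score = relative_risk(probability=2, human=2, property_=3, business=2,
                      preparedness=1, internal=2, external=3)
print(score)  # prints 48.1
```

Separating probability from severity in this way lets a planning committee see, for instance, that a low-probability event with catastrophic impact can still outrank a frequent but minor one.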
The following year, HCPro, Inc., a private health-care regulation and compliance product and service provider, published its own HVA Toolkit for hospitals.5 Similar to the Kaiser tool, this toolkit is meant to facilitate the evaluation of every potential event in each of the three categories: probability, risk, and preparedness. Like the others, the kit allows the user to add events as necessary. To determine probability, users are encouraged to consider known risk, historical data, and manufacturer/vendor statistics. The Joint Commission does not provide this level of detail or guidance; rather, it is individual private publishers that offer HVA tools with this level of specificity. While helpful, these modifications make it difficult to draw comparisons among hospitals, or across jurisdictions or states.
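The toolkits ask users to translate known risk and historical data into a discrete probability rating, but leave the translation itself to the hospital. One hypothetical way to do so is sketched below; the frequency thresholds are assumptions for illustration and do not come from any published toolkit.

```python
# Hypothetical mapping from historical event frequency to a discrete
# 0-3 probability rating. The thresholds are illustrative assumptions,
# not drawn from the HCPro or Kaiser materials.

def probability_rating(event_count, years_of_history):
    """Map observed events-per-year onto a 0-3 probability rating."""
    rate = event_count / years_of_history
    if rate == 0:
        return 0  # never observed locally
    if rate < 0.2:
        return 1  # rarer than once in five years
    if rate < 1:
        return 2  # less often than annually
    return 3      # at least once a year

print(probability_rating(3, 10))  # three ice storms in ten years -> prints 2
```

Making the thresholds explicit, whatever values a facility chooses, is one way to address the comparability problem noted above: two hospitals using the same mapping can meaningfully compare their ratings.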
While the Joint Commission continues to refine and expand its emergency management standards, neither it nor any of these tools offers a standardized method for collecting or using HVA data at the hospital or community level. Hospitals are left on their own to determine how to collect information on probability and severity, how to process that information within the institution, and what to do with the results.
The primary objective of this study was to investigate how institutions at the local level, in particular hospitals in Maine, currently implement HVA, in an effort to encourage future research on this topic to ultimately improve HVA efficacy.
In 2005 and 2007, the SMRRC invited eight hospitals in the Southern Maine region to participate in a regional HVA process. The Southern Maine region includes acute care and mental health hospitals within York, Cumberland, Sagadahoc, and Lincoln counties, most of which are Joint Commission accredited. An electronic copy of the Medical Center HVA template and instructions was provided to each hospital's emergency preparedness contact. These individuals participate regularly in SMRRC activities and preparedness efforts. They represent a variety of departments within their institutions, including hospital administration, planning, safety, infection control, and facilities management.
Administration of the HVA tool was customized to best meet the needs and available resources of each facility. If a facility had recently completed an HVA, its staff members were encouraged to use those data to aid in the completion of the SMRRC version. Other facilities distributed the HVA forms to individual members of their internal Environment of Care or Emergency Preparedness Committees and then convened as a group to reach consensus for the organization. The HVA tool used in this study was based on the model developed by Kaiser Permanente and modified for use by the SMRRC.
During April 2008, we conducted a series of face-to-face, semi-structured, in-depth interviews with staff from each of the participating hospitals who had been identified as having a key role in the HVA process at their facility. Two interviewers attended each discussion and subsequently compared notes to ensure objectivity. The questions were largely drawn from a paper entitled “Risk and Risk Assessment in Health Emergency Management.”3 Beyond the issues suggested by this paper, the interviewers discussed the HVA results produced in each hospital and changes in results from year to year.
The lack of standardization in the HVA process from hospital to hospital became apparent as the interviews progressed. Specifically, the researchers found the following:
We believe the efforts presented in this article are among the first exploratory investigations into this important issue. We encourage other public health professionals to pursue investigations covering more health-care institutions and employing more rigorous research methods. In addition, we offer the following recommendations:
The contents of this article are solely those of the authors and do not necessarily represent the views of CDC, the U.S. Department of Health and Human Services, or any partner organizations, nor does mention of trade names, commercial practices, or organizations imply endorsement by the U.S. government.
This article was supported by funding awarded to the Harvard School of Public Health (HSPH) Center for Public Health Preparedness under Grant/Cooperative Agreement #3U90TP124242-05 from the Centers for Disease Control and Prevention (CDC).