Notifiable disease surveillance systems are critical for communicable disease control, and accurate and timely reporting of hospitalized patients, who represent the most severe cases, is important. A local health department in metropolitan Denver used inpatient hospital discharge (IHD) data to evaluate the sensitivity, timeliness, and data quality of reporting eight notifiable diseases to the Colorado Electronic Disease Reporting System (CEDRS).
Using IHD data, we detected hospitalized patients admitted from 2003 through 2005 with a discharge diagnosis associated with one of eight notifiable diseases. Initially, we compared all cases identified through IHD diagnoses fields with cases reported to CEDRS. Second, we chose four diseases and conducted medical record review to confirm the IHD diagnoses before comparison with CEDRS cases.
Relying on IHD diagnoses only, shigellosis, salmonellosis, and Neisseria meningitidis invasive disease had high sensitivity (≥90%) and timeliness (≥75%); legionellosis, pertussis, and West Nile virus infection were intermediate; and hepatitis A and Haemophilus influenzae (H. influenzae) invasive disease had low sensitivity (≤25%). After medical record review, sensitivity improved to ≥90% and timeliness to ≥80% for H. influenzae invasive disease, legionellosis, and pertussis; however, hepatitis A retained suboptimal sensitivity (67%) and timeliness (25%).
Hospital discharge data are useful for evaluating notifiable disease surveillance systems. Limitations encountered by using discharge diagnoses alone can be overcome by conducting medical record review. Public health agencies should conduct periodic surveillance system evaluations among hospitalized patients and reinforce notifiable disease reporting among the people responsible for this activity.
Notifiable disease surveillance systems are critical for communicable disease control. In the United States, the Council of State and Territorial Epidemiologists (CSTE), in conjunction with the Centers for Disease Control and Prevention (CDC), has established a list of recommended notifiable diseases and provided surveillance case definitions.1 All 50 states have statutes that require health-care providers and laboratories to report notifiable diseases to state or local health departments.2 In Colorado, 60 laboratory findings and 66 diseases or conditions were notifiable to state or local health departments as of May 2008.3–5 Colorado also participates in Active Bacterial Core surveillance (ABCs) for six invasive bacterial agents as part of CDC's Emerging Infections Program.6 Notifiable diseases in Colorado are categorized as 24-hour, seven-day, or 30-day notifiable events, depending on the need for timely public health investigation. The list of notifiable events changes, on the basis of approval from the state board of health, with the emergence of new pathogens (e.g., West Nile virus infection was added in 2002), appreciation of previously recognized pathogens, or recognition that a notifiable condition is no longer a public health priority.
In 1999, the Colorado Department of Public Health and Environment (CDPHE) established one of the first online notifiable disease reporting systems in the United States. The Colorado Electronic Disease Reporting System (CEDRS) includes the majority of the state notifiable communicable diseases, with the exception of human immunodeficiency virus/acquired immunodeficiency syndrome and sexually transmitted diseases, which are maintained in separate databases. CEDRS can be used to report conditions, view case reports, create line lists or tables of case data, and export data for more complex analyses. Reports of notifiable disease are primarily obtained from hospitals through infection-control practitioners (ICPs) and outpatient commercial laboratories. Additionally, reports can be entered directly into CEDRS by disease-control staff at state and local health departments. Health-care providers and ICPs can also submit reports indirectly through telephone or fax to state or local health departments, which enter the reports into CEDRS the same day they are received.
Because confidential patient information (e.g., name, address, and telephone number) is tracked in CEDRS, access to the system is restricted. To obtain access, eligible people must apply and be approved by CDPHE to obtain a login identifier and password. Tri-County Health Department (TCHD) is the largest local public health agency in Colorado and serves 1.2 million residents of Adams, Arapahoe, and Douglas counties in metropolitan Denver.
Accurate and timely reporting of notifiable diseases is important, especially for hospitalized patients, who represent the most severe cases. Thus, notifiable disease surveillance systems should be evaluated regularly to improve quality, efficiency, and usefulness and to ensure accurate interpretation of surveillance data.7,8 Three key evaluation components are sensitivity (i.e., proportion of true cases that are detected by the surveillance system), timeliness (i.e., interval between recognition of a notifiable disease by the physician or laboratory and receipt of the corresponding notifiable disease report by a public health agency), and data quality (i.e., completeness and validity of data items in a surveillance system).8 TCHD previously evaluated data completeness and timeliness of reporting for notifiable disease cases reported to CEDRS in 2004;9 however, it did not assess sensitivity because the total number of cases was unknown. Hospital discharge data are a population-based data source that has been used in prior surveillance system assessments to identify the true number of notifiable disease cases for diseases with high hospitalization rates.10,11 In our study, we explored the use of hospital discharge data as a tool to evaluate the sensitivity, timeliness, and data quality of reporting notifiable diseases to CEDRS from 2003 through 2005 among a subset of hospitalized patients.
We used inpatient hospital discharge (IHD) data to identify cases of notifiable diseases among hospitalized patients (hereafter, hospitalized cases). We then compared IHD data with cases reported in CEDRS to evaluate sensitivity, timeliness, and data quality.
We downloaded data for all cases of notifiable disease reported to CEDRS from January 1, 2003, through December 31, 2005. The extracted CEDRS dataset included all people whose residential street address was located within Adams, Arapahoe, or Douglas county. Identifying variables included first and last name, complete date of birth, sex, street address, ZIP code, and telephone number.
TCHD purchased IHD data from the Colorado Hospital Association for all inpatient admissions occurring from January 1, 2003, through December 31, 2005. The IHD dataset includes demographic, diagnostic, procedural, payment, and length-of-stay variables. The IHD data obtained by TCHD were de-identified and only included birth month and year, sex, and ZIP code variables. Because street addresses were unavailable, we requested data for all people whose residential ZIP code had >10% of area within Adams, Arapahoe, and Douglas counties combined. IHD data purchased separately by CDPHE included medical record numbers, which we used to request medical records for chart review and, thus, obtain additional patient identifiers (e.g., first and last name, complete date of birth, and street address).
We used two methods to identify the population, or denominator, of hospitalized cases using the IHD data: (1) discharge diagnosis only and (2) discharge diagnosis followed by medical record review. Hospitalized cases of notifiable diseases were identified within the IHD database if an applicable International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code12 was listed as one of the first three discharge diagnoses (Table 1). We evaluated 24-hour and seven-day notifiable diseases that are routinely investigated by local or state public health agencies. The discharge diagnosis method was completed for eight diseases that had at least 10 records in the IHD database (i.e., Haemophilus influenzae [H. influenzae] invasive disease, hepatitis A, legionellosis, Neisseria meningitidis [N. meningitidis] invasive disease, pertussis, shigellosis, salmonellosis, and West Nile virus infection). We completed the medical record review method for the four diseases that had the lowest sensitivity according to the discharge diagnosis method.
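The case-detection rule above can be sketched in a few lines of Python. This is a minimal illustration, not the study's actual code: the disease-to-code map and the `is_hospitalized_case` helper are hypothetical, and the code prefixes shown are placeholders; Table 1 lists the actual ICD-9-CM codes used.

```python
# Hypothetical sketch of the discharge diagnosis method: flag a record as a
# hospitalized case if any of the FIRST THREE discharge diagnoses carries an
# applicable ICD-9-CM code. Code prefixes below are placeholders only.
DISEASE_CODES = {
    "salmonellosis": ("003",),  # placeholder prefix for illustration
    "shigellosis": ("004",),    # placeholder prefix for illustration
}

def is_hospitalized_case(diagnoses: list[str], disease: str) -> bool:
    """True if one of the first three discharge diagnosis codes starts
    with an ICD-9-CM prefix mapped to the disease."""
    prefixes = DISEASE_CODES[disease]
    return any(dx.startswith(prefixes) for dx in diagnoses[:3])
```

A diagnosis listed fourth or later would not qualify, matching the study's restriction to the first three discharge diagnosis fields.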
For the discharge diagnosis method, we limited the IHD database to patients who resided in a ZIP code assigned by the U.S. Postal Service to Adams, Arapahoe, or Douglas county (i.e., ZIP code with >50% of area within one of the three counties) to avoid inflating the denominator for sensitivity calculations. We created separate datasets for each of the eight diseases before merging IHD with CEDRS data. We created a 12-digit identifier for each record in the IHD and CEDRS databases by combining sex (1 = male, 2 = female), birth month, birth year, and ZIP code into a single field; a seven-digit identifier without ZIP code was also created for each record. For example, a male born in March 1970 and residing in ZIP code 80111 would have the following 12-digit and seven-digit identifiers: 103197080111 and 1031970, respectively.
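The identifier construction can be expressed directly from the description above; the record structure and function names here are assumptions for illustration, but the digit layout and the worked example (a male born in March 1970 residing in ZIP code 80111) come from the text.

```python
from dataclasses import dataclass

@dataclass
class PatientRecord:
    sex: int          # 1 = male, 2 = female
    birth_month: int  # 1-12
    birth_year: int   # four-digit year
    zip_code: str     # five-digit ZIP code

def seven_digit_id(rec: PatientRecord) -> str:
    # sex (1 digit) + birth month (2 digits, zero-padded) + birth year (4 digits)
    return f"{rec.sex}{rec.birth_month:02d}{rec.birth_year}"

def twelve_digit_id(rec: PatientRecord) -> str:
    # seven-digit identifier followed by the five-digit ZIP code
    return seven_digit_id(rec) + rec.zip_code
```

For the example in the text, `twelve_digit_id` yields 103197080111 and `seven_digit_id` yields 1031970.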
We examined IHD records with identical 12-digit identifiers manually to identify and remove duplicate records for the same patient. If one of the duplicate records was listed as a transfer in the IHD database, and if the admission and discharge dates of the duplicate records corresponded, only the record for the first admission was retained; otherwise, all records were retained as potentially unique patients.
The first round of data linkage used an exact match on the 12-digit identifier to merge CEDRS and IHD records. For records that did not match in the first round, we performed a second round of data linkage by using an exact match on the seven-digit identifier. We reviewed matched records manually to identify and remove duplicates and records with incompatible IHD admission and CEDRS report dates.
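The two-round linkage can be sketched as follows. This is an illustrative reconstruction under stated assumptions (dict-based inputs keyed by record ID, each value holding the 12-digit and 7-digit identifiers); the study's actual implementation is not described, and the manual review step that follows is not shown.

```python
def link_records(ihd: dict, cedrs: dict) -> list:
    """Two-round exact-match linkage between IHD and CEDRS records.

    ihd, cedrs: record_id -> (twelve_digit_id, seven_digit_id)
    Returns candidate (ihd_id, cedrs_id) pairs; in the study these
    were then reviewed manually for duplicates and date conflicts.
    """
    matches = []
    matched_ihd = set()

    # Round 1: exact match on the 12-digit identifier.
    by_twelve = {}
    for cid, (tw, _) in cedrs.items():
        by_twelve.setdefault(tw, []).append(cid)
    for iid, (tw, _) in ihd.items():
        for cid in by_twelve.get(tw, []):
            matches.append((iid, cid))
            matched_ihd.add(iid)

    # Round 2: for still-unmatched IHD records, exact match on the
    # 7-digit identifier (identifier without ZIP code).
    by_seven = {}
    for cid, (_, sv) in cedrs.items():
        by_seven.setdefault(sv, []).append(cid)
    for iid, (tw, sv) in ihd.items():
        if iid in matched_ihd:
            continue
        for cid in by_seven.get(sv, []):
            matches.append((iid, cid))
    return matches
```

Dropping the ZIP code in round 2 recovers patients who moved or whose ZIP code was recorded differently in the two databases, at the cost of more false matches, which is why manual review followed.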
We calculated sensitivity as the proportion of IHD hospitalized cases residing in a ZIP code assigned to a TCHD county that matched a CEDRS case record. Because neither the diagnosis date nor the laboratory test result date was available in the IHD data, report time was calculated as the number of days from IHD admission date to CEDRS entry date. We classified timeliness as adequate if the median report time for all cases was less than or equal to the required reporting interval designated by the state board of health (i.e., 24-hour or seven-day); otherwise, timeliness was classified as inadequate.
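The sensitivity and timeliness calculations above reduce to a few lines; the function names are assumptions for illustration, but the definitions (matched cases over hospitalized cases; median of entry date minus admission date compared against the required interval) follow the text.

```python
from datetime import date
from statistics import median

def sensitivity(n_matched: int, n_hospitalized: int) -> float:
    """Proportion of IHD hospitalized cases that matched a CEDRS record."""
    return n_matched / n_hospitalized

def timeliness(admit_dates: list, entry_dates: list, required_days: int) -> str:
    """Adequate if the median report time (CEDRS entry date minus IHD
    admission date, in days) is within the required reporting interval
    (1 for 24-hour diseases, 7 for seven-day diseases)."""
    report_times = [(e - a).days for a, e in zip(admit_dates, entry_dates)]
    return "adequate" if median(report_times) <= required_days else "inadequate"
```

Because the median is used, a disease can be classified as adequate even when a sizable minority of its cases are reported late, as seen with West Nile virus infection in the Results.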
We selected the four diseases with the lowest sensitivity using the discharge diagnosis method for further evaluation. To capture all potential hospitalized cases, we expanded the IHD database to include all patients who resided in a ZIP code with >10% of area within Adams, Arapahoe, and Douglas counties combined (10 additional ZIP codes were included). We obtained medical record numbers for the IHD cases from CDPHE, and one member of the study team abstracted clinical and laboratory information from the medical records. For each of the four diseases, we classified cases as confirmed if they met the clinical description and confirmatory laboratory criteria defined in the CSTE/CDC surveillance case definition in place during the year the cases were reported.1 Probable, suspect, and epidemiologically linked cases were not included in the analysis. We mapped street addresses obtained from medical charts to remove hospitalized cases who did not reside in TCHD jurisdiction.
We calculated sensitivity as the proportion of confirmed IHD hospitalized cases residing within the TCHD jurisdiction that matched to a CEDRS case. We calculated report time as the number of days from medical record laboratory result date to CEDRS entry date. If the laboratory result date was missing, we added one day to the specimen collection date for use as a proxy. We used the same criteria to classify timeliness as adequate or inadequate.
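The report-time rule for the medical record review method, including the one-day proxy for a missing laboratory result date, can be sketched as below; the function signature is an assumption for illustration.

```python
from datetime import date, timedelta
from typing import Optional

def report_time_days(lab_result_date: Optional[date],
                     collection_date: date,
                     cedrs_entry_date: date) -> int:
    """Days from laboratory result date to CEDRS entry date. If the
    result date is missing, use specimen collection date + 1 day as a
    proxy, per the study's rule."""
    if lab_result_date is None:
        lab_result_date = collection_date + timedelta(days=1)
    return (cedrs_entry_date - lab_result_date).days
```

The Discussion later notes a post-hoc check in which the proxy was widened to collection date plus two days; that variant would simply change the `timedelta` argument.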
We assessed completeness and validity for three CEDRS data items associated with hospitalization (i.e., hospitalized [yes/no], admission date, and discharge date) among the CEDRS cases that matched to an IHD case using the discharge diagnosis method. The hospitalization data item is typically included as part of the initial case report, whereas the admission and discharge dates are typically completed during follow-up of the case report by public health agencies. We excluded West Nile virus infection from this analysis because the hospitalization data field was not available for the entire study period. We calculated completeness as the proportion of non-missing values and validity as the proportion of correct values. We considered the IHD database the standard for admission and discharge dates, and all matched CEDRS cases were expected to have “yes” in the hospitalization field.
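The two data-quality measures are simple proportions; this sketch assumes missing values are represented as `None` and that the IHD value serves as the reference, as the text specifies for admission and discharge dates.

```python
def completeness(values: list) -> float:
    """Proportion of non-missing values (None represents missing)."""
    return sum(v is not None for v in values) / len(values)

def validity(values: list, reference: list) -> float:
    """Proportion of values that agree with the reference (IHD) value."""
    return sum(v == r for v, r in zip(values, reference)) / len(values)
```

For the hospitalization field, the reference list would simply be `"yes"` for every matched CEDRS case.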
Using the discharge diagnosis method alone to identify the population, or denominator, of hospitalized cases of notifiable diseases, we found the sensitivity of reporting to CEDRS was highest (≥90%) for shigellosis, salmonellosis, and N. meningitidis invasive disease and lowest (≤25%) for hepatitis A and H. influenzae invasive disease (Table 2). Among the 24-hour reportable diseases, hepatitis A and H. influenzae invasive disease had an inadequate median report time, and only one-third of cases were reported within the required time interval. In contrast, N. meningitidis invasive disease and pertussis had adequate timeliness. All four seven-day reportable diseases had an adequate median report time; however, about half (49%) of the West Nile virus infection cases were not reported within seven days.
Compared with the discharge diagnosis method alone, the use of medical record review to limit the population to confirmed hospitalized cases resulted in improved sensitivity and timeliness for each of the four diseases examined (Table 3). Because there were 112 H. influenzae invasive disease cases in the IHD database (Table 2), only 39 cases admitted during 2005 underwent medical record review. H. influenzae invasive disease, pertussis, and legionellosis all obtained high sensitivity (≥90%) and timeliness (≥80%). Hepatitis A retained the lowest sensitivity (67%) and longest median report time (three days) of the diseases examined. The number of confirmed cases was small (n=6) for both hepatitis A and H. influenzae invasive disease.
We evaluated data quality among the 103 CEDRS cases that matched against the IHD cases in the discharge diagnosis method (excluding West Nile virus infection). The CEDRS hospitalization field was 96% complete and 86% valid. Hospital admission and discharge date fields were available in CEDRS for four diseases (H. influenzae invasive disease, N. meningitidis invasive disease, salmonellosis, and shigellosis) representing 68 CEDRS cases. The CEDRS admission and discharge date fields were 88% and 85% complete, respectively, and both fields were 74% valid, compared with dates provided in the IHD database.
The sensitivity and timeliness of reporting hospitalized cases of notifiable diseases to CEDRS differed across the eight diseases examined and according to the methodology used to identify true hospitalized cases. Reliance on ICD-9-CM discharge diagnoses from the IHD database worked better for certain notifiable diseases than for others. Review of medical records to confirm discharge diagnoses resulted in improved sensitivity and timeliness of reporting. Hence, medical record abstraction should be considered a necessary component of evaluating notifiable disease surveillance systems when using hospital discharge databases.
Our study demonstrated that the utility of discharge diagnoses depends on the complexity of the notifiable disease case definition and on whether a clinician can make the diagnosis from clinical signs and symptoms alone. Using discharge diagnoses to identify true cases of notifiable diseases was most useful for diseases (i.e., salmonellosis, shigellosis, and legionellosis) that had simple case definitions and nonspecific clinical signs (e.g., diarrhea, vomiting, or chest findings suggestive of pneumonia) that would require laboratory confirmation of an etiologic agent before diagnosis. In contrast, discharge diagnoses were least useful for diseases (i.e., H. influenzae invasive disease and pertussis) that had complex case definitions or unique clinical signs (e.g., cough paroxysms) that might lead to a presumptive diagnosis in the absence of laboratory confirmation.
Both N. meningitidis invasive disease and H. influenzae invasive disease are included in Colorado's active laboratory-based surveillance (i.e., ABCs) and were therefore expected to have high sensitivity.6 By using the discharge diagnosis method only, we determined that the sensitivity of N. meningitidis invasive disease was 90% and H. influenzae invasive disease was 17%. To meet the CSTE/CDC definition for a confirmed case, both conditions require isolation of the bacterium from a normally sterile site (e.g., blood or cerebrospinal fluid). For H. influenzae invasive disease, only 15% of the medical records reviewed were for confirmed cases; the majority of unconfirmed cases had H. influenzae isolated from a nonsterile site (i.e., sputum) and a diagnosis of pneumonia. The removal of these noninvasive cases during the medical record review substantially improved the reporting sensitivity of H. influenzae invasive disease from 17% to 100%.
For pertussis, one-third of IHD cases that underwent medical record review were not classified as confirmed based upon the available information. The majority of unconfirmed cases met the clinical description in the CSTE/CDC case definition but had a missing or negative laboratory result in the medical record (i.e., they were probable or epidemiologically linked cases). This finding indicates that clinicians might be more likely to diagnose pertussis without laboratory confirmation either because of confidence in recognition of clinical signs or the low predictive value of the diagnostic tests. Limiting the population to confirmed cases through medical record review improved reporting sensitivity of pertussis from 59% to 100%. We did not assess the sensitivity of reporting probable or epidemiologically linked cases in CEDRS.
Medical record review had less of an impact on the reporting of legionellosis cases. The identification of one additional case in CEDRS by using information available from the medical record increased the reporting sensitivity of legionellosis from 80% to 90%.
Hepatitis A had suboptimal sensitivity and timeliness according to both the discharge diagnosis and medical record review methods. A limited number of cases of hepatitis A were confirmed during medical record review; unconfirmed cases usually had a prior history of hepatitis A infection indicated in the patient's medical record. After medical record review, the reporting sensitivity of hepatitis A improved from low (21%) to moderate (67%), but timeliness remained inadequate. These findings might reflect unstable estimates resulting from the limited sample size (n=6) or the need for improved hepatitis A reporting within the TCHD jurisdiction.
Challenges to using the discharge diagnosis method alone for surveillance evaluations can result in an underestimation of sensitivity and timeliness. First, IHD databases are designed for administrative and billing purposes, and inconsistencies exist in assigning and ordering ICD-9-CM codes in the discharge diagnoses fields across hospitals and physicians. Second, the denominator of hospitalized cases obtained from the IHD database might be overestimated for multiple reasons: (1) a notifiable disease discharge diagnosis might not meet the criteria for a confirmed case according to the CSTE/CDC surveillance case definitions; (2) a notifiable disease discharge diagnosis listed as the second or third diagnosis can indicate past rather than current illness; (3) the IHD database might contain multiple records per patient, and identifying and removing duplicate records (i.e., hospital transfers or readmissions) is difficult without personally identifying information; and (4) the IHD database might contain records of patients who reside outside TCHD's jurisdiction, because records were selected according to residential ZIP code rather than street addresses.
Third, the lack of personally identifying information in the IHD database limited the ability to merge records with the CEDRS database because: (1) multiple patients with the same 12-digit identifier might be included in one or both databases, (2) the same patient with incorrect information in one or both databases is unlikely to match, and (3) lack of patient name and street address makes verifying the accuracy of the matches difficult. Lastly, use of admission date as a proxy for diagnosis date in the discharge diagnosis method likely increased the calculated report time. This is especially true for diseases that require more time for laboratory confirmation (e.g., culture results for H. influenzae).
Review of medical records provided the clinical and demographic information necessary to overcome the challenges of the discharge diagnosis method described previously, resulting in improved sensitivity and timeliness of reporting for each of the four diseases examined. Specifically, we limited the denominator of hospitalized cases to confirmed cases, according to the national CSTE/CDC case definition. We then identified and removed data regarding patients who had been transferred or readmitted, or who resided outside TCHD's jurisdiction. Matches between CEDRS and IHD databases were verified by using patients' names, and laboratory result dates were available for the majority of cases. However, the medical record review method has limitations that deserve mentioning.
First, TCHD did not obtain identifiers for medical record review and had to partner with the state health department to obtain medical record numbers from the IHD database. Second, requesting medical records is time and resource intensive, as is traveling to different hospitals to conduct chart abstraction. This process can be simplified if electronic medical records are available to the state health department, as was the case with one hospital in this study. Third, we suspected that using specimen collection date plus one day as a proxy for laboratory result date increased the calculated report time. However, a post-hoc analysis determined that using collection date plus two days made no difference for three of the four diseases but improved the timeliness of hepatitis A reporting from 25% to 50%. Lastly, limited sample size can affect the calculation of sensitivity and timeliness for notifiable diseases that have a low incidence or do not often result in hospitalization, as was the case for hepatitis A and H. influenzae invasive disease. For such diseases, the medical record review method should only be considered for health departments that serve a substantial population or are able to combine multiple years of surveillance data.
To provide the best communicable disease control, local and state health departments should strive to obtain 100% sensitivity and timeliness in reporting cases of notifiable diseases. Through this evaluation of notifiable disease reporting among hospitalized patients, TCHD identified the following areas for improvement. First, more information is needed to understand the inadequate reporting of hepatitis A. Also, medical record reviews and more recent years of data are needed to determine whether sensitivity of West Nile virus infection reporting has increased with time. Lastly, TCHD staff should work to increase the completeness and validity of data items related to hospitalization, especially obtaining the correct admission and discharge dates from ICPs when conducting follow-up on hospitalized patients.
Hospital discharge data can be a useful tool for evaluating reporting of severe cases of notifiable diseases that result in hospitalization. Using discharge diagnoses alone is practical only for diseases that have simple case definitions and nonspecific symptoms that require laboratory confirmation before making a diagnosis. In most instances, however, medical record review should be considered a necessary adjunct to using discharge diagnoses alone, as it results in more accurate calculations of sensitivity and timeliness. We recommend that local and state health departments use IHD data and medical record review during routine evaluations of notifiable disease surveillance systems. To improve the utility of using IHD data for disease control and public health needs, state hospital associations might consider providing local public health agencies with personal identifiers or medical record numbers. Lastly, local and state health departments should regularly provide education, training, and feedback on notifiable disease surveillance to people responsible for reporting (e.g., ICPs, laboratorians, and physicians).
The findings and conclusions in this article are those of the authors and do not necessarily represent the views of the Centers for Disease Control and Prevention.