This hospital-level intervention is the first to demonstrate immediate and durable changes in COD reporting with a reduction in heart disease overreporting and an increase in the average number of conditions reported. The reporting changes at the 8 intervention hospitals were so pronounced that citywide outcome measures were notably different before and after the intervention.
Fourteen previous reports have measured the effectiveness of COD educational efforts (33); only 1 study was conducted in the United States (21). Some studies asked trainees to complete death certificates for fabricated cases immediately pre- and postintervention, an approach that limits generalizability to other causes of death and does not evaluate sustained intervention effects (33). Other studies compared COD documented on the death certificate with the medical record before and after intervention, a resource-intensive approach. In those reports, hospital-affiliated physicians provided training (33). We add to the literature by showing that public health staff vested in vital statistics data can also effect change and that interactive training can have a sustained effect on COD reporting.
Our study has some limitations. The average number of conditions reported was correlated with data accuracy preintervention, but this may not be generalizable to other settings. Our intervention hospitals had significant heart disease overreporting; thus, our findings may not be generalizable to hospitals with a lesser degree of overreporting or different types of quality issues. Another limitation is that we did not query physicians postintervention to learn whether the in-service training met their COD reporting needs.
Given the volume of deaths occurring at intervention hospitals (375 per month on average), resources did not permit direct comparison of death certificates with the medical record except in our preintervention sample. The inability to definitively conclude postintervention accuracy is a study limitation. We did establish that hospitals overreporting heart disease preintervention reported fewer conditions on average and that inaccurate death certificates of overreporting hospitals reported fewer conditions than their accurate death certificates reported. Thus, the observed increase in the average number of conditions reported at intervention hospitals suggests improved COD reporting. Additionally, the number of deaths from other leading causes postintervention increased proportionately as expected, suggesting improved reporting rather than a shift from heart disease to a single erroneous cause or a few random erroneously reported causes of death (34). An alternate explanation is that deaths inaccurately reported preintervention as heart disease continued to be reported inaccurately, but in proportion to NYC’s leading causes of death, which seems less likely.
Our analyses of 10-year trends in these outcome measures and our comparison of outcome measures pre- and postintervention at nonintervention hospitals establish that the decrease in heart disease deaths and the increase in the average number of conditions postintervention are restricted to intervention hospitals and are not likely due to a secular or historical trend. Although the decrease in heart disease proportions was statistically significant in the nonintervention group, because approximately 14,000 deaths occurred during each observation period, the decrease is an order of magnitude less than at intervention hospitals. The slight decrease in heart disease deaths and increase in the average number of conditions reported at nonintervention hospitals between the postintervention and extended postintervention periods may reflect DOHMH’s continued efforts to improve COD reporting citywide.
Although bias or confounding might explain results in any nonrandomized study, neither can fully explain our results, unless the true heart disease death rates decreased to this degree in populations served by intervention hospitals only during our study period, and we coincidentally embarked on a campaign to improve COD reporting during this period. Another possible but unlikely explanation is regression to the mean; that is, because we selected hospitals based on their high percentages of deaths from heart disease, by chance this percentage was closer to the average upon second measurement. Previous years’ data do not support this explanation because intervention hospitals historically had reported high proportions of heart disease and nonintervention hospitals had reported low proportions.
Decedents at intervention hospitals were older and more likely to be non-Hispanic white than those at nonintervention hospitals. While heart disease risk also varies by these factors and COD reporting quality varies by age, this variation cannot explain the intervention’s positive results. We compared the recent outcome measures over time within each intervention hospital so that differences in decedents’ characteristics by hospital do not confound our primary comparison. Furthermore, an analysis of changes in all reported causes of death following the intervention and controlling for decedents’ characteristics did not alter the intervention’s observed effect (34).
This intervention incorporated hospital-specific policy, practices, and educational components to achieve the support of staff and administration and was completed at little cost. The primary expenditure, outside of the staff time of DOHMH personnel, consisted of minimal travel expenses. One full-time epidemiologist devoted approximately 50% of her time to developing content over the course of 18 months with input from subject matter experts. The director of the Office of Vital Statistics conducted the conference calls. The director and a DOHMH physician conducted the in-service trainings. Health departments with fewer resources or those that cover a larger geographic area may be able to improve COD reporting quality without extensive travel by using a conference call alone.
One key benefit of on-site, in-service training was qualitative feedback from a larger audience. Most doctors reported no prior training in death certification. Additionally, physicians and staff expressed frustration over the past DOHMH practice of rejecting certificates based on COD and suggested that funeral directors, affected by death registration delays, may proactively request certain “safe” causes of death, such as atherosclerotic heart disease, to avoid DOHMH rejections. Physicians also perceived validation checks in the Electronic Death Registration System (EDRS) as obstacles to death registration. DOHMH has reduced these barriers citywide. As of March 2010, the death registration protocol requires rejection only if the cause of death does not appear natural or the only COD reported is a mechanism (eg, cardiopulmonary arrest, asystole, respiratory arrest). As of October 2009, physicians can override many COD-related EDRS validation checks. As of January 2010, the NYC Health Code requires all EDRS users to complete an online training on COD documentation, which may explain some postintervention changes in outcome measures among nonintervention and intervention hospitals. However, compliance with this training requirement has been poor, and further efforts are planned to improve enforcement. DOHMH has also discussed the importance of accurate cause of death reporting and physician autonomy at meetings of NYC funeral director associations. These and other citywide efforts may explain some COD reporting changes at both intervention and nonintervention hospitals.
In NYC, medical residents complete many death certificates. Sustained improvement in COD reporting will depend on hospital and residency administrations’ support of continuous quality improvement. At some hospitals, improvements waned in the extended postintervention period, indicating that ongoing training is needed to ensure that new staff and medical residents understand COD documentation. On the basis of the success of this intervention, DOHMH conducted conference calls and in-service trainings in 12 additional hospitals in 2011. As resources permit, DOHMH will reach all NYC hospitals. Other completed COD improvement initiatives include issuance of COD data quality reports; dissemination of educational materials such as physician pocket cards and COD posters; telephone assistance during weekdays; and monitoring of DOHMH death registration rejections.
National efforts to improve COD reporting quality are ongoing. In partnership with the National Association for Public Health Statistics and Information Systems (NAPHSIS) and the National Center for Health Statistics, DOHMH developed an e-learning course on COD completion for national use, which is available to NAPHSIS members. Other jurisdictions can customize the national module.
Inaccurate COD reporting occurs at local, state, and national levels. Many researchers use mortality data; therefore, poor quality COD reporting, including heart disease overreporting, affects the usefulness of public health policies, spending, and programs informed by the data. We have demonstrated that a health department can reduce heart disease overreporting with a training intervention. DOHMH continues to expand COD training in NYC. Other US jurisdictions should consider similar interventions to address this critical national problem.