Volume 50 | Number 3 | June 2015

Abstract List

L. Elizabeth Goldman M.D., M.C.R., Philip W. Chu M.S., Peter Bacchetti Ph.D., Jenna Kruger M.S., Andrew Bindman M.D.


Objective

To evaluate how the accuracy of present‐on‐admission (POA) reporting affects hospital 30‐day acute myocardial infarction (AMI) mortality assessments.


Data Sources

2005 California patient discharge data (PDD) and vital statistics death files.


Study Design

We compared hospital performance rankings using an established model assessing hospital performance for AMI with (1) a model incorporating POA indicators of whether a secondary condition was a comorbidity or a complication of care, and (2) a simulation analysis that factored POA indicator accuracy into the hospital performance assessment. For each simulation, we changed POA indicators for six major acute risk factors of mortality. The probability of an indicator being changed depended on patient and hospital characteristics.
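The simulation step above can be illustrated with a minimal sketch. Everything here is a hypothetical reconstruction, not the study's actual method: the risk-factor names, the misreporting probabilities, and the patient and hospital characteristics are all assumptions chosen only to show the mechanism of stochastically flipping POA indicators.

```python
import random

random.seed(0)

# Illustrative stand-ins for the six acute risk factors; the study's
# actual risk factors are not named in the abstract.
RISK_FACTORS = ["shock", "heart_failure", "arrhythmia",
                "renal_failure", "pneumonia", "sepsis"]

def misreport_prob(patient, hospital):
    """Hypothetical probability that a POA indicator is misreported.

    The abstract states only that this probability depends on patient
    and hospital characteristics; the values below are invented.
    """
    base = 0.10
    if hospital["large"]:
        base -= 0.03  # assume larger hospitals report more accurately
    if patient["transferred"]:
        base += 0.05  # assume transfers are harder to code accurately
    return base

def simulate_poa(records, hospitals):
    """Return copies of records with POA flags stochastically flipped."""
    out = []
    for rec in records:
        hosp = hospitals[rec["hospital_id"]]
        p = misreport_prob(rec, hosp)
        poa = {factor: (not flag if random.random() < p else flag)
               for factor, flag in rec["poa"].items()}
        out.append({**rec, "poa": poa})
    return out

# Toy data: two patients at two hospitals.
records = [
    {"hospital_id": "A", "transferred": False,
     "poa": {f: True for f in RISK_FACTORS}},
    {"hospital_id": "B", "transferred": True,
     "poa": {f: False for f in RISK_FACTORS}},
]
hospitals = {"A": {"large": True}, "B": {"large": False}}

simulated = simulate_poa(records, hospitals)
```

In the study, each such simulated dataset would then be re-run through the risk-adjustment model to see how reporting inaccuracy shifts hospital rankings; that downstream modeling step is not sketched here.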


Principal Findings

Comparing the performance rankings of 268 hospitals using the established model with those using the POA indicator model, 67 hospitals' (25 percent) rank differed by ≥10 percent. POA reporting inaccuracy due to overreporting and underreporting had little additional impact; overreporting contributed to 4 percent of hospitals' difference in rank compared to the POA model, and underreporting contributed to <1 percent difference.


Conclusion

Incorporating POA indicators into risk‐adjusted models of care has a substantial impact on hospital performance rankings that is not primarily attributable to inaccuracy in hospitals' POA reporting.