Public Reporting of CLABSI: Is it a valid measure for hospital comparison?
Overview of the computerized CLABSI algorithm
In today's JAMA, Michael Lin and coauthors from four CDC Epicenter academic hospitals (2 in Chicago, 1 in Columbus, OH, and 1 in St. Louis) compared annual IP-determined CLABSI rates in 20 ICUs during 2004-2007 with a computer-generated reference standard. The median CLABSI rate was 3.3/1000 central-line days. Overall correlation with the computer algorithm was poor at 0.34, ranging from 0.83 at one center down to 0.10 at another. Interestingly, the center with the lowest IP-reported CLABSI rate had the highest computer-generated rate (2.4/1000 CL-days vs. 12.6/1000 CL-days).
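For readers unfamiliar with the convention, the rates quoted above are expressed per 1,000 central-line days. A minimal sketch of that calculation (this is just the standard rate formula, not any part of the Epicenter algorithm; the numbers are illustrative):

```python
def clabsi_rate(infections: int, central_line_days: int) -> float:
    """CLABSI rate expressed per 1,000 central-line days."""
    return infections / central_line_days * 1000

# Illustrative example: 10 CLABSIs over 3,000 central-line days
print(round(clabsi_rate(10, 3000), 1))  # 3.3 per 1,000 CL-days
```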
I have posted the schematic of the computer algorithm and also the link to the code (below). My only methods question (at this moment) is why they limited the analysis to yearly comparisons rather than quarterly or monthly ones. I would have liked to see that level of data analyzed, even though it would be noisier. It was interesting how the IP-reported rates were narrowly clustered while the computer-generated rates were widely distributed. These findings should give us pause when we consider public reporting of these rates. If so much emphasis is being placed on CLABSI rates at the state and national level for comparison and reimbursement, there should be funded validation of the reported rates and also consideration of other measures (outcome or process) that might be more reliable.
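To make the correlation finding concrete: a rank correlation of 0.34 across centers means the ordering of hospitals by IP-reported rate bears little resemblance to their ordering by computer-generated rate. A toy Spearman rank correlation (assuming no tied ranks; this is a generic textbook formula, not necessarily the exact statistic the authors used) shows how such a comparison works:

```python
def spearman(x: list[float], y: list[float]) -> float:
    """Spearman rank correlation for paired rates, assuming no ties."""
    def ranks(values: list[float]) -> list[int]:
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, idx in enumerate(order):
            r[idx] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_squared / (n * (n**2 - 1))

# Hypothetical rates per 1,000 CL-days (made up for illustration):
ip_reported = [2.4, 3.1, 3.3, 4.0]
computer    = [12.6, 3.0, 2.8, 3.5]  # lowest IP rate pairs with highest computer rate
print(round(spearman(ip_reported, computer), 2))
```

A value near 1 would mean the two methods rank hospitals similarly; values near 0 or below mean the rankings diverge, which is the core problem the paper raises for inter-hospital comparison.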
Lin et al JAMA November 2010
Link to computer algorithm code (looks like you might need to apply for a password)