Great news from the CDC! Help me understand it...
My major takeaway: investments in laboratory detection of AR are beginning to pay off. By improving their diagnostic capabilities, more regional and state labs can now help health care facilities confirm, characterize, and respond to pathogens that display unusual resistance phenotypes. This is unquestionably a good thing, though it still requires local facilities to detect the phenotypes of concern in the first place. For example, many hospitals lack ready access to MALDI-TOF or sequencing, and many do not routinely identify Candida to the species level from non-sterile sources. How long might C. auris spread in such a facility before it is detected? So although the AR Lab Network (ARLN) is a major step forward, we still need to emphasize that individual hospitals must invest in improved and timely AR diagnostics, and we ought to ensure they receive updated guidance on how best to detect new or emerging pathogens of concern.
Now for the data itself: the report presents NHSN CAUTI and CLABSI data from 2006 to 2015 on the percentage of all E. coli and K. pneumoniae isolates with ESBL and CRE phenotypes over time. The ESBL phenotype was consistently detected in 16-19% of isolates, with no major change over the period. The percentage with a CRE phenotype, however, declined steadily, from a peak of 10.6% in 2007 to 3.1% in 2015 (see Figure below). This is despite the fact that CLSI breakpoints for carbapenem susceptibility were lowered during the surveillance period, which should have caused more isolates to meet phenotypic criteria for resistance. And although the CDC published CRE-specific control guidance in 2009, the decline in percent resistant preceded that guidance (which would not have been implemented immediately across the country, likely delaying any impact by 1-2 years). The decline also coincides with the spread of carbapenemase-producing Enterobacteriaceae (CPE) across the US. So I'm left wondering: why is the percentage of CRE among CAUTI and CLABSI isolates in acute care hospitals declining?
Is it related to the general improvements in infection prevention practices that have accompanied public reporting and pay-for-performance? Would such improvements differentially impact CRE over other pathogens (keep in mind this isn't a rate, it's a proportional decrease)? It's clear that the epicenter of CRE/CPE is not acute care but rather "post-acute care," as is also noted in the other part of the report (the results of CPE screening by ARLN labs for 9 months of 2017). But I'd still expect increases in post-acute care or other health care settings to eventually be reflected in the NHSN CAUTI/CLABSI data. Granted, the NHSN data represent a relatively small number of organisms (for example, Clare Rock and colleagues demonstrated that CLABSIs represent only 6% of all hospital-onset bacteremias).
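To make the rate-versus-proportion caveat concrete, here is a toy calculation. The isolate counts below are entirely hypothetical (only the 10.6% and 3.1% endpoints come from the report); the point is simply that the *proportion* of CRE among reported isolates can fall even if the absolute number of CRE isolates holds steady, so long as total isolate reporting grows:

```python
def cre_percent(cre_isolates: int, total_isolates: int) -> float:
    """Percent of reported isolates displaying a CRE phenotype."""
    return 100 * cre_isolates / total_isolates

# Hypothetical scenario: the absolute number of CRE isolates is unchanged,
# but the denominator of reported isolates grows over the surveillance period.
early = cre_percent(106, 1000)   # denominator of 1,000 isolates -> 10.6%
late = cre_percent(106, 3400)    # denominator of 3,400 isolates -> ~3.1%

print(f"early: {early:.1f}%, late: {late:.1f}%")  # prints "early: 10.6%, late: 3.1%"
```

This is not what the NHSN data necessarily show, of course, but it illustrates why a proportional decline alone can't tell us whether CRE infections themselves are actually becoming less common.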
I remain perplexed. And happy to hear any thoughts on this!