Shouldn't evidence guide our selection of hand-hygiene surveillance systems?

It's amazing how little evidence is required before infection prevention interventions are adopted. A current example is the installation of automated hand-hygiene surveillance systems that track healthcare worker room entry and hand hygiene compliance. Hospitals are committing significant resources, both financial and person-time, to implement these systems with minimal evidence that they sustainably improve compliance or that they are accurate and cost-effective. To channel Jerry Maguire, before hospitals "show them the money," shouldn't they ask companies to "show us the meta-analysis"?

With that in mind, I really enjoyed reading the study in February's ICHE by Luke Chen and colleagues at Duke describing the implementation of an electronically assisted, directly observed hand hygiene surveillance system. The investigators recognized that directly observed compliance remains the gold standard, but also realized that economic and time costs, along with potential biases such as the Hawthorne effect, limit its utility. Thus, they set out to improve on the gold standard by modifying it to address those biases and reduce costs.

Beginning in 2009, overt auditors (they wore ID badges) began monitoring compliance in 40 wards/clinics. They observed two moments of hand hygiene: before and after room entry. All data were entered into wireless PDAs linked to a centralized server, allowing instantly updated 30-day tracking reports.

Overall, the reported compliance rate was 88% over 100,000 observations. That's pretty good. What's more, they reported compliance by the order of observation. For example, they calculated the average compliance for the first opportunity observed, the second opportunity, and so on. They did this because they hypothesized that the Hawthorne effect wouldn't kick in until the observer was seen by the healthcare worker, and that this would become more likely the longer the direct observer remained on the ward. What did they find? Look for yourself:
What do you see? Compliance for the first five observations was less than 86%, while the average over the 35th to 43rd observations was 95%. We see an actual Hawthorne effect. Excellent. After seeing this, and for other reasons as well, they created standard operating procedures for observers. The most important change was limiting observations to 10 minutes or 10 total opportunities per ward before moving to the next one. Following these changes, compliance was reported to be 86%.
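The order-of-observation analysis is straightforward to reproduce in principle: group each compliance opportunity by the order in which it was observed during an auditing session, then average within each group. A minimal sketch, using made-up illustrative records rather than the study's data, might look like:

```python
from collections import defaultdict

# Hypothetical records: (within-session observation order, complied?).
# These values are illustrative only, not the Duke study's data.
observations = [
    (1, True), (1, False), (2, True), (2, True),
    (3, False), (3, True), (4, True), (4, True),
]

# Group compliance outcomes by the order in which each opportunity
# was observed during an auditing session.
by_order = defaultdict(list)
for order, complied in observations:
    by_order[order].append(complied)

# Average compliance per observation order; a rising trend with order
# would be consistent with a Hawthorne effect kicking in once the
# observer has been noticed.
compliance_by_order = {
    order: sum(vals) / len(vals) for order, vals in sorted(by_order.items())
}
print(compliance_by_order)  # → {1: 0.5, 2: 1.0, 3: 0.5, 4: 1.0}
```

With real data, a rising curve across observation order (like the study's drift from under 86% early to 95% late in a session) is the signal that motivated capping each session at 10 opportunities.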

Of course, there are limitations to this study: it was a single-center study and lacked economic data to help guide broader adoption. The authors acknowledged these and are already addressing them in future work. My sense is that this method will be more effective and cost-effective than fully automated systems, and that it will also keep infection preventionists visible on the wards, where they can identify and address other important issues that they can't see from behind their computer screens. But unlike many pushing for a new hand hygiene monitoring system, I'm going to start collecting the data and let the evidence guide my decision making.

Image: Luke Chen, MBBS MPH

