iPad as a metaphor for the future of infection prevention

I've been doing a lot of thinking lately. This is perhaps because I'm between jobs and have some occasional downtime, but it's more likely the result of not actually owning a computer these past two weeks, unless you count my phone. What I've been thinking about is why I ended up in hospital epidemiology in the first place. I mean, who in their right mind would choose to go into a profession that is underappreciated, underfunded, and lacking the scientific data needed to make rational clinical decisions? I could go on, and I'd be happy to if you meet me at the local pub...

What keeps sticking in my head is how little we've advanced since Semmelweis described the benefits of hand disinfection. For one, we're still doing quasi-experimental studies, although Semmelweis's design was at least controlled (the midwives' clinic). It also seems that most of our contentious debates, including those around active detection and isolation, would be unnecessary if we'd followed his advice and achieved 100% hand hygiene. As for recent advances, the CLABSI checklist popularized by Peter Pronovost is probably the most important. Why was this advance made by someone who wasn't an ID physician or hospital epidemiologist? I think it's because as a profession we're stuck. Perhaps more clinical trials will advance things, but I have this suspicion we'll just end up proving what we already know. So where will the advancement come from?

So, what does this have to do with the iPad? I just came across this article by Daniel Eran Dilger on how Steve Jobs utilizes creative destruction to change the world. The article is a little tech-heavy and perhaps Apple-biased, but I don't think that detracts from its main points, which I believe have relevance for a potential 'new' future for infection prevention. Dilger's main points are: (1) "Jobs understands death as a creative force better than most people. For society, culture, and technology to progress, old thinking has to die off to make way for fresh new ideas. People who don’t die are dragged kicking and screaming in the future..." and, more importantly, (2) "When something works, you don’t need to kill it. But in some cases you should."

So what does this mean for infection prevention, and what should be "killed off"? I don't know. It's only been two weeks! However, I think we need to discuss what old ideas we are unnecessarily holding on to, and we shouldn't back away from killing off even things that work. Yes, this is easier said than done, and perhaps it's a useless exercise. I'm just one guy, I don't even own a computer, and I'm not even sure I know what a metaphor is.

Comments

  1. Nice post, Eli!

    A couple of things occur to me…first, I believe the fact that folks who aren’t in the traditional hospital epi/ID mold are taking leadership roles is related to the tension between academic infection control and process improvement—between establishing the foundation of knowledge about “what works” and then going ahead and applying it widely. For example, Pronovost didn’t invent the infection prevention checklist (I always attribute this to our friends at Wash U—Vicky Fraser and Dave Warren—but there were probably others using it even earlier). Peter’s genius was in popularizing it, and demonstrating its effectiveness when applied across a wide swath of different hospital types (in the Keystone Project).

    Many of us who have been involved in infection prevention research have strong epidemiology backgrounds, and we think a lot about study design, validity, bias, confounding, biological plausibility, etc., etc. So it isn’t surprising that we’d be somewhat allergic to the “just do it” philosophy that prevails in the PI-QI-implementation community. Before my PI colleagues object, I should clarify: of course implementation can be seen as a science in its own right--it just isn’t one that many ID-trained hospital epidemiologists happen to understand very well. And a focus on implementation is urgently needed when the process being implemented is known to be effective (and not harmful)…the problem as I see it is that we are busy implementing processes that have very weak scientific foundations. Why? Because years of under-funding of hospital infection prevention research leaves us with a foundation made almost exclusively of flawed, quasi-experimental studies. The challenge, as you point out, is in strengthening this foundation—determining what parts are strong enough, and what parts need to be discarded or fortified.

    Why make this effort? What’s the harm in moving forward with interventions that seem biologically plausible and that have some data (albeit flawed) in support? One fairly simple example: the consequences of antibiotic overuse (including fatal cases of C. difficile) that resulted from the ill-advised “4-hour” rule for antibiotic treatment of community-acquired pneumonia.

  2. I believe the funding is often there...it's the lack of champions and leaders, i.e., those who are prepared to put the time (out of hours) into the endless proposals to research and ethics committees to get the simplest of studies approved...

  3. Nice post to read while getting ready for bed! Good luck in your new job.

  4. After researching quite a few waterproof cases, I tested a one-gallon Ziploc bag.
    The iPad worked fine...sanitize the outside as needed between patients, then you can use it tomorrow to bag your lunch.
    JJC
