Making Sense of Initial Specimen Diversion Study Data
Three factors influence the reductions in blood culture contamination (BCC) rates reported in studies of diversion technology:
First, the percent reduction in BCCs depends on how the study sample is defined.
- When n = blood cultures drawn with the diversion device, the resulting “perfect-world rate” reflects how well the device works when it is actually used.
- When n = ALL blood cultures collected, with and without the diversion device, the resulting “real-world rate” reflects how introducing the device affects clinical care, because it factors in the level of caregiver compliance achieved (a worked sketch follows this list).
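To make the two denominators concrete, here is a minimal sketch in Python (the language is an arbitrary choice for illustration). Every count below is hypothetical, invented only to show how the same study data can yield two very different reduction figures depending on which n is used.

```python
# Minimal sketch: how the choice of denominator changes the reported
# BCC reduction. All counts below are hypothetical.

baseline_rate = 0.030        # assumed pre-intervention BCC rate (3.0%)

device_cultures = 600        # cultures drawn WITH the diversion device
device_contaminated = 6      # contaminated cultures among those
other_cultures = 400         # cultures drawn WITHOUT the device
other_contaminated = 12      # contaminated cultures among those

# "Perfect-world" rate: n = device-drawn cultures only
perfect_world = device_contaminated / device_cultures              # 1.0%

# "Real-world" rate: n = ALL cultures, with or without the device,
# so caregiver compliance (600 of 1,000 draws here) is baked in
real_world = (device_contaminated + other_contaminated) / (
    device_cultures + other_cultures
)                                                                  # 1.8%

for label, rate in [("perfect-world", perfect_world), ("real-world", real_world)]:
    reduction = (baseline_rate - rate) / baseline_rate
    print(f"{label}: BCC rate {rate:.1%}, reduction vs. baseline {reduction:.0%}")
```

With these invented numbers, the perfect-world figure is a 67% reduction while the real-world figure is 40%; the device did not change, only the denominator did.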
Second, reduction rates are affected by which blood sample collection methods a study includes: direct venipuncture, syringe draws, and draws from a peripheral IV catheter. A real-world rate includes cultures taken by all methods, while a perfect-world rate may include only samples taken by direct venipuncture, as the sketch below illustrates. The gap between these two rates may signal an opportunity to enforce compliance with the hospital’s existing blood culture collection policy. And although perfect-world reductions trend higher than real-world reductions, they may not accurately reflect the clinical or financial impact of diversion technology. Accurate measurement requires including all blood culture collection methods.
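The same arithmetic shows why restricting the sample to one collection method flatters the result. Below is a minimal sketch, again with hypothetical counts and an assumed breakdown by method.

```python
# Minimal sketch: how limiting the sample to direct venipuncture
# changes the reported rate. All counts are hypothetical.

# (contaminated, total) device-drawn cultures, broken down by method
by_method = {
    "direct venipuncture": (3, 400),
    "syringe draw": (2, 100),
    "peripheral IV catheter": (4, 100),
}

# Perfect-world framing: direct venipuncture only
c, n = by_method["direct venipuncture"]
venipuncture_only = c / n                                  # 0.75%

# All methods included, as a real-world measurement requires
total_c = sum(c for c, _ in by_method.values())
total_n = sum(n for _, n in by_method.values())
all_methods = total_c / total_n                            # 1.50%

print(f"venipuncture only: {venipuncture_only:.2%}")
print(f"all methods:       {all_methods:.2%}")
```

Under these assumed counts, reporting only venipuncture draws cuts the apparent contamination rate roughly in half, even though the device performed identically in every draw.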
Third, when diversion compliance is heavily policed, BCC rates decline further than when clinicians choose autonomously whether to use the technology. To be effective in clinical practice, a specimen diversion device must perform consistently and present no operational challenges for the clinician, fitting seamlessly into the standard blood culture collection procedure.
With each new study, ask yourself the following questions:
- Does the study include all collections or only collections using the device?
- Does the data include collections by all methods or a subset of methods?
- Was caregiver practice monitored/guided beyond routine levels?