Attending Physician Adherence to a 29-Component Central Venous Catheter Bundle Checklist During Simulated Procedures. (Chaudhary)

Barsuk JH, et al. Attending Physician Adherence to a 29-Component Central Venous Catheter Bundle Checklist During Simulated Procedures. Crit Care Med. 2016 Oct;44(10):1871-81.

OBJECTIVES: Central venous catheter insertions may lead to preventable adverse events. Attending physicians’ central venous catheter insertion skills are not assessed routinely. We aimed to compare attending physicians’ simulated central venous catheter insertion performance to published competency standards.

DESIGN: Prospective cohort study of attending physicians’ simulated internal jugular and subclavian central venous catheter insertion skills versus a historical comparison group of residents who participated in simulation training.

SETTING: Fifty-eight Veterans Affairs Medical Centers (February 2014 to December 2014, during a 2-day simulation-based education curriculum) and two academic medical centers in Chicago.

SUBJECTS: A total of 108 experienced attending physicians and 143 internal medicine and emergency medicine residents.

INTERVENTIONS: None.

MEASUREMENTS AND MAIN RESULTS: Using a previously published central venous catheter insertion skills checklist, we compared Veterans Affairs Medical Centers attending physicians’ simulated central venous catheter insertion performance to the same simulated performance by internal medicine and emergency medicine residents from two academic centers. Attending physician performance was compared to residents’ baseline and posttest (after simulation training) performance. Minimum passing scores were set previously by an expert panel. Attending physicians performed higher on the internal jugular (median, 75.86% items correct; interquartile range, 68.97-86.21) and subclavian (median, 83.00%; interquartile range, 59.00-86.21) assessments compared to residents’ internal jugular (median, 37.04% items correct; interquartile range, 22.22-68.97) and subclavian (median, 33.33%; interquartile range, 0.00-70.37; both p < 0.001) baseline assessments. Overall simulated performance was poor because only 12 of 67 attending physicians (17.9%) met or exceeded the minimum passing score for internal jugular central venous catheter insertion and only 11 of 47 (23.4%) met or exceeded the minimum passing score for subclavian central venous catheter insertion. Resident posttest performance after simulation training was significantly higher than attending physician performance (internal jugular: median, 96%; interquartile range, 93.10-100.00; subclavian: median, 100%; interquartile range, 96.00-100.00; both p < 0.001).

CONCLUSIONS: This study demonstrates highly variable simulated central venous catheter insertion performance among a national cohort of experienced attending physicians. Hospitals, healthcare systems, and governing bodies should recognize that even experienced physicians require periodic clinical skill assessment and retraining.

Does Simulation Improve Recognition and Management of Pediatric Septic Shock, and If One Simulation Is Good, Is More Simulation Better? (Williams)

Dugan MC, McCracken CE, Hebbar KB. Does Simulation Improve Recognition and Management of Pediatric Septic Shock, and If One Simulation Is Good, Is More Simulation Better? Pediatr Crit Care Med. 2016 Jul;17(7):605-14.

OBJECTIVES: To determine whether serial simulation training sessions improve resident recognition and initial management of a critically ill simulated septic shock patient, and whether serial simulations further improve resident task performance compared with a single simulation session.

DESIGN: Prospective observational cohort study with live expert review of trainee simulation performance. Expert reviewers were blinded to prior trainee performance.

SETTING: A PICU room in a quaternary-care children’s hospital, featuring a high-fidelity pediatric patient simulator.

SUBJECTS: Postgraduate year-2 and postgraduate year-3 pediatric residents who rotate through the PICU.

INTERVENTIONS: Postgraduate year-3 residents served as the control cohort, completing one simulation near the start of their third residency year. Postgraduate year-2 residents served as the intervention cohort, completing two simulations during their second residency year and one near the start of their third residency year.

MEASUREMENTS AND MAIN RESULTS: Resident objective performance was measured using a validated 27-item checklist (graded 0/1) related to monitoring, data gathering, and interventions in the diagnosis and management of pediatric septic shock. The intervention cohort had a higher mean performance percentage score during their third simulation than the control cohort completing their single simulation (87% vs 77%; p < 0.001). Septic shock was correctly diagnosed more often in the intervention cohort at the time of their third simulation (100% vs 78%; p < 0.001). Appropriate broad-spectrum antibiotics were administered correctly more often in the intervention cohort (83% vs 50%; p < 0.001).

CONCLUSIONS: Simulation significantly improved resident performance scores in the management of septic shock, with repetitive simulation showing significant ongoing improvement. Further studies are needed to determine the long-term impact on knowledge and skill retention and whether results attained in a simulation environment translate into improved bedside care in clinical practice.