Simulcast Journal Club November 2018


Introduction :  

Simulcast Journal Club is a monthly series that aims to encourage simulation educators to explore and learn from publications on Healthcare Simulation Education.  Inspired by the ALiEM MEdIC Series, each month we publish a case and link a paper with associated questions for discussion.  We moderate and summarise the discussion at the end of the month in PDF and podcast format, including opinions of experts from the field. 

For the journal club to thrive we need your comments!  Some participants report feeling nervous about their initial posts, but we work hard at ensuring this is a safe online space where your thoughts are valued and appreciated.  To ensure this, all posts are reviewed prior to posting.  We look forward to learning from you. 

Journal Club (2)

Title :  “Happy Endings” 

“This is it!” grinned Nitin.  In his hand, he held a copy of ‘Resuscitation’.  “An ecological study, across 26 hospitals.  Improved survival!  Statistical significance!  This is the paper we’ve needed!” 

He grasped Nimali’s hand as they walked towards their office.  Nimali’s phone rang; she glanced down, saw that Catherine was calling, and quietly cancelled the call. 

“Nimali, between this data and the AHA Statement, there’s no way the hospital can shut us down.” 

Her heart skipped a beat as she considered the implications.  “All this time we’ve been looking at RCTs that never get enough power.  But with this paper we can justify rolling out Brad’s new rapid cycle program!  Funding might get easier…  This is a huge moment!”  She jumped briefly as her phone buzzed again.  She put it on silent. 

Nitin stared into her eyes.  “I told you not to doubt yourself.  Everything we do here, it actually means something.  And now we have proof!  What a way to finish the year.” 

Nitin’s infectious passion dampened Nimali’s inhibitions.  She leaned forward and pecked him on the cheek.  “Everyone else is off campus at mandatory training.  I think it’s time we invested in some Spaced Repetition.” 

“I’d prefer some Rapid Cycle, but as long as there’s some contextual learning,” grinned Nitin. 

For the third time Nimali’s phone buzzed.  She sighed.  “Hold that thought.”  

“Hey, what’s up?” she asked, as Nitin kissed her neck. “What? Oh God. Catherine, I’ll be right there.” 

She looked up from the phone in shock and pushed Nitin away.   

“The staff meeting, we need to get there… now.” 

Nitin could see the fear in her eyes.  “Nimali, what is it?” 

“Professor Snythe has been murdered.” 

The Article :  

Josey, K., Smith, M., Kayani, A., Young, G., Kasperski, M., Farrer, P., Gerkin, R., Theodorou, A. and Raschke, R. (2018). Hospitals with more-active participation in conducting standardized in-situ mock codes have improved survival after in-hospital cardiopulmonary arrest. Resuscitation, 133, pp. 47-52. 

 

Discussion :  

 

For several years on journal club we’ve mentioned the great white whale of simulation research: the ability to correlate our educational efforts with improved patient outcomes.  While some papers have achieved this on some levels, Josey et al.’s paper finds an association between in situ sim programs and decreased patient mortality after in-hospital arrest. 

 

For our journal clubbers this month, does this paper seem as exciting and validating as Nimali and Nitin seem to think?  Or are they seeing what they wish to see from this data? 

 

This is our last journal club for 2018, so jump in now! 

 

References : 

Josey, K., Smith, M., Kayani, A., Young, G., Kasperski, M., Farrer, P., Gerkin, R., Theodorou, A. and Raschke, R. (2018). Hospitals with more-active participation in conducting standardized in-situ mock codes have improved survival after in-hospital cardiopulmonary arrest. Resuscitation, 133, pp. 47-52. https://www.ncbi.nlm.nih.gov/pubmed/30261220


About Ben Symon

Ben is a Paediatric Emergency Physician at The Prince Charles Hospital in Brisbane and a Simulation Educator at Lady Cilento Children's Hospital. He currently teaches on a variety of simulation-based courses covering paediatric resuscitation, trauma and CRM principles. Ben has a growing interest in encouraging clinical educators to be more familiar with simulation research.



11 thoughts on “Simulcast Journal Club November 2018”

  • Glenn Posner

    Like Nitin & Nimali, I was overjoyed when this paper was published and it created a Twitter flurry. I sat and read it cover to cover with a fine-tooth comb (which is admittedly rare for me) because this study, if true, might be the evidence I am always searching for to justify our in-situ simulation program. My first question was related to what the definition of “more active” vs “less active” hospitals was, because (prior to reading this paper) I would have considered my hospital as one of the “most active” hospitals in Canada with respect to running mock codes. In the study, the hospitals that were more active were running 17 sessions per 100 beds vs 3/100 beds in the less active group. This is disheartening, because my hospital has nearly 1200 beds – I would need to run 200 sessions per year to match the median of the “most active” group. Alas, the largest hospital in the study had 708 beds, and was a “less active” hospital, so I’m not sure my experience is comparable. I would need to run 4 mock codes per week rather than the two per month that are currently happening (and that I was proud of). This is in the context of other in-situ events that aren’t code-blue related. Beyond the coordination and debriefing instructor-hours required for these events, I also have lots of guilt about constantly taxing the code team residents and nurses with more frequent events. To this end, I contacted Dr. Raschke and asked for the median number of sessions at the >200 bed hospitals, and his preliminary answer has been that “any reasonable increase in training is likely to benefit patient outcomes.” He is implying a dose-response relationship between training and outcomes, and I can certainly live with that. This is the kind of paper I am going to share with the CEO of our hospital as evidence for the value of in-situ sim.

    • Ben Symon Post author

      Hi Glenn, Thanks so much for stopping by and sharing some really useful points about practical application of this data.
      I think my own enthusiasm overshadowed any thought about how to implement this data, and you raise a really important point about how to actually achieve the production line of in-situ sims that would be required to replicate this in a large service. My question for you, though: if there’s evidence for this, is it the responsibility of the sim team alone to provide this service?

  • Jennifer Dale-Tam

    Hi Ben & Glenn.

    To answer your question Ben, the sim team needs to be involved, but does not need to have sole responsibility for this. It could be shared between all educational departments of an institution, and organizational and financial support for in-situ programs could be shared across all stakeholders. The in-situ sim does need to be led by a trained simulation instructor. A recent AHA statement released by Adam Cheng et al. (2018), focusing on education strategies and structures around cardiac resuscitation, recommended debriefing focused on tasks and teamwork by trained faculty to improve outcomes. It also spoke to spaced training as improving outcomes, which is in alignment with the dose-response point made by Glenn and Dr. Raschke. Personally, I have seen improvements in tasks and team function around resuscitation in the teams I work with during the in-situ simulations I run and debrief through Glenn’s program, such as time to first defibrillation of less than 3 minutes, and a breakdown of hierarchies in the interprofessional team, where the nurses were debriefing the physicians on what they needed from them. There was also a successful resuscitation in our ambulatory care setting where the front-line nurses attributed part of the success to the regular simulations we have run for them. Though staff were initially resistant to participating, consistent feedback indicates they appreciate the simulations and request more of them. Glenn, comments like this could help alleviate your guilt about running in-situ simulations. The hard evidence from the Josey article, the support from the AHA statement, and the softer evidence of participant feedback and instructor anecdote all provide sound support for in-situ simulation programs and for engaging all relevant stakeholders.

    Cheng, A., Nadkarni, V. M., Mancini, M. B., Hunt, E. A., Sinz, E. H., Merchant, R. M., … Bhanji, F. (2018). Resuscitation Education Science: Educational Strategies to Improve Outcomes From Cardiac Arrest: A Scientific Statement From the American Heart Association. Circulation. https://doi.org/10.1161/CIR.0000000000000583

    • Ben Symon Post author

      Hi Jennifer,
      Thanks so much for your considered and thorough response, I really enjoyed your perspective.
      I think this paper, along with your and Glenn’s comments, has made me reflect on our own simulation service and the limits of what we can do as a crew of 10 people in a large tertiary hospital service.
      I think as a simulation micro-community we have a fair amount of work to do in upskilling other educational staff to feel comfortable running in-situ sims in their areas. I also think this signals an important transfer of ownership of peer feedback conversations. If wards take ownership of their responsibility to practise, I would imagine this would help with morale and professional pride as well as patient outcomes.
      Interesting stuff!
      Ben

      • Jennifer Dale-Tam

        Hi Ben,

        Thanks for the feedback. I am currently working on a project “upskilling” my educator colleagues in the hope of increasing their confidence in facilitating theatre-based sim, with future sessions on in-situ sim. In-situ sims usually bring an interprofessional component, which can be daunting to new simulation instructors. Sim services are “the guardians of expertise” who need to ensure the safety of the learning environment and the quality of the debriefing are maintained, but with support and mentorship other educators can facilitate in-situ sim, which takes time. Yes, it would help with morale and professional pride; it certainly does mine 😊!

        Jenn

    • Glenn Posner

      Thanks, Jen!

      Indeed, I have started quoting this article at the end of the debriefing of the code team to assuage my guilt.

      Ben, it is not the responsibility of the sim team to provide this service, but it is our responsibility to coordinate this service because we are the guardians of the expertise regarding best educational practices (curriculum development, instructional design, objectives, etc…) and best practices regarding debriefing.

  • Janine Kane

    Hi Ben, Whilst I am not involved in any in-situ simulations, I know through many years of sim training for EN students that, anecdotally, repetition builds skill! Repetition also builds speed and confidence. The more immersive SBE we can run, the bigger the bonus for everyone involved. However, we are so busy trying to improve patient outcomes as well as staff confidence, expertise and, let’s face it, resilience, that the task of constantly having to prove SBE/in-situ sim works is exhausting and frustrating! Loved the story, my favourite storybook characters 😀

    • Ben Symon Post author

      Hi Janine,
      Great to have you back! It’s been so nice to have you coming along so regularly.
      I agree that the work done by Josey et al. will hopefully help us all justify our jobs, and also inform how we can best achieve optimal patient outcomes.
      Ben

  • Susan Eller

    Hello Ben,

    Thank you for updating the Nitin & Nimali story – I had to chuckle at your use of simulation terms in their particular connotation 😉

    As an educator, I agree with the article, Glenn & Jennifer that an increase in training leads to a dose-related improvement in performance. It is a little daunting in a 900- or 1200-bed hospital to achieve the “more active” rate of in-situ simulations. As many of you mentioned, there is a need for good debriefing of the training, and that means training the ACLS instructors and code team trainers in good facilitation techniques. The Josey article and the Cheng et al. article provide a rationale to present to the financial authority of your institution.

    As a PhD student, there is still some part of my brain that gets stuck on the fact that correlation ≠ causation. The authors described the work as an ecological study, and again that challenges the rules of hard science. YET – I don’t know that I think “hard science” is necessary. When I read Jennifer’s description of the interprofessional simulations with nurses debriefing physicians, I think that kind of training changes not only performance, but culture. I believe those changes in culture contribute greatly to the improved outcomes.

    • Ben Symon Post author

      Susan, thanks so much for coming along and mentioning some concerns regarding the ‘hard science’ aspect of the paper. I was wondering if you could unpack a little bit about what your concerns are about correlation vs causation?
      P.S. I have no idea what you mean about any connotations in my case study, it’s all very serious and above board I assure you :p

      • Susan

        Hello Ben,

        Hmm, sending the ice cream link, which is how I first understood correlation versus causation 😉 https://www.youtube.com/watch?v=BaETnBzM7yU.

        I think a better example I can give is that carrying a lighter is correlated with an increase in lung cancer. Of course the lighter doesn’t cause cancer; the cigarettes, which have a high correlation with both the lighter and the cancer, are the cause. I think that is part of my question with this article: does the actual training cause the differences, or is it the culture of the institutions, one that encourages and funds in-situ training, that really creates the difference in outcomes?