Simulcast Journal Club November 2019


Introduction :  

Simulcast Journal Club is a monthly series that aims to encourage simulation educators to explore and learn from publications on Healthcare Simulation Education.  Each month we publish a case and link a paper with associated questions for discussion.  Inspired by the ALiEM MEdIC series, we moderate and summarise the discussion at the end of the month, including exploring the opinions of experts from the field. 

The journal club relies heavily on your participation and comments and while it can be confronting to post your opinions on an article online, we hope we can generate a sense of “online psychological safety” enough to empower you to post!  Your thoughts are highly valued and appreciated, however in depth or whatever your level of experience.  We look forward to hearing from you. 

Title :  “Undirected Observations” 

The Case :  

Elliot was on the way to lunch after a busy morning teaching.  He only had a few minutes spare on the program, but despite the rush he couldn’t help pausing as he walked past the debriefing room.  Inside the morning’s candidates were watching the Sim being broadcast. 

At least, they were supposed to be watching the Sim.  Unless everybody was frantically looking up clinical guidelines on their phone the sad truth appeared to be that they were scrolling through Facebook or Instagram, depending on personal preference.  Any performance anxiety their colleagues in the lab were feeling appeared wholly unnecessary. 

He sighed a little as his enthusiasm dipped internally.  He’d read papers on observers learning much from sim, and in fact he’d been one of the faculty who’d pushed hard to allow larger course sizes and to not sweat that each participant would get fewer sims.  But if the candidates were just going to tweet instead of reflect, then seriously…. What was the point? 

The Article : 

Delisle, M., Ward, M., Pradarelli, J., Panda, N., Howard, J. and Hannenberg, A. (2019). Comparing the Learning Effectiveness of Healthcare Simulation in the Observer Versus Active Role. Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare, 14(5), pp.318-332. 

Discussion :  

Our understanding of the learning benefits of simulation has evolved over time, and one recurrent question concerns the difference in learning outcomes between active participants and simulation observers. 

In this month’s article, Delisle et al conduct a meta-analysis to understand any differences more clearly. 

For the journal clubbers this month, how do you approach observers in simulation?  And does this paper change your practice? 

References : 

Delisle, M., Ward, M., Pradarelli, J., Panda, N., Howard, J. and Hannenberg, A. (2019). Comparing the Learning Effectiveness of Healthcare Simulation in the Observer Versus Active Role. Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare, 14(5), pp.318-332. 


About Ben Symon

Ben is a Paediatric Emergency Physician at The Prince Charles Hospital in Brisbane and a Simulation Educator at Lady Cilento Children's Hospital. He currently teaches on a variety of paediatric simulation based courses on paediatric resuscitation, trauma and CRM principles. Ben has a growing interest in encouraging clinical educators to be more familiar with simulation research.


19 thoughts on “Simulcast Journal Club November 2019”

  • Boon Nee Tang

    Sim observers generally do not feel as involved. One method is to ask them to write down their observations as they watch. Also appoint a leader: the leader leads the observations and provides a running commentary. When the Sim completes, have a joint discussion and then have the observation group, with its leader, run the Sim again.

    • Ben Symon Post author

      Hi Boon Nee, thanks for getting this started! Is that a technique you use regularly? I love the idea of specifically structuring the observer group with tasks. Sometimes I find observers get pulled into the role of critiquing their colleagues and are uncomfortable. Does this happen in your service?

  • Jenn Dale-Tam

    Hi Ben,

    Thank you for bringing this article forward. The review was very interesting. I appreciated that Delisle et al defined the types of external observers as directed versus non-directed. Elliot’s experience of seeing observing learners not engaged with what is going on in a separate room is not uncommon in any type of education. There is no accountability for these individuals to participate. I’ve seen similar responses when we video conference a lecture from one room to another. We are human; intrinsically, we will take the path that is easiest, and personal devices are a quick and easy distraction as well. In my simulation world of primarily nursing for on-boarding and CPD, we have many observers due to the sheer volume of learners at any one given time. The difference is our learners are in the simulation environment on the periphery and are given verbal instructions to observe and prepare to actively participate in the debrief before the simulation starts. Sometimes our observers participate more actively in the debrief because they can see the “whole picture” of the simulation and have a decreased cognitive load related to the decreased stress of being “on stage”.

    It’s interesting that in a subgroup analysis of the article it was found that for Kirkpatrick level 2 (learning), directed observers and active participants had the same level of learning. I did a very small pilot (n=4) of observers using a checklist in a theater-based simulation, based on the O’Regan et al 2016 article. The learners found it beneficial and said it helped them participate in the debrief, which they did. The checklist listed interventions that were expected to occur in the simulation, which is in alignment with the worked-example suggestion to decrease cognitive load (Fraser, Ayres, & Sweller, 2015). From a learning perspective it does provide some direction that we are doing the right thing by instructing our observers and having them participate in the debrief. Maybe we should be using a checklist more frequently, but that raises some other logistical challenges.

    This article raises some interesting research questions, some of which have piqued my interest.

    Thank you.

    Fraser, K. L., Ayres, P., & Sweller, J. (2015). Cognitive load theory for the design of medical simulations. Simulation in Healthcare, 10(5), 295–307. https://doi.org/10.1097/SIH.0000000000000097

    O’Regan, S., Molloy, E., Watterson, L., & Nestel, D. (2016). Observer roles that optimise learning in healthcare simulation education: a systematic review. Advances in Simulation, 1(1). https://doi.org/10.1186/s41077-015-0004-8

    • Ben Symon Post author

      Hi Jennifer, thanks so much for your thoughtful response. Apologies for the late reply, I’m currently on holidays with my family. I get the impression from conversations with a few nursing leaders in Sim, like Susie Edgren and Patrea Andersen, that in some ways nursing education has had to develop some unique techniques to provide simulation to their larger cohorts of learners.

      So I was interested to hear about your use of checklists for the observers, and I again wonder whether there’s any unanticipated consequence of moving observers into a role of appraisal? Or is this a good thing, and better coming from their colleagues?

      I agree that this phenomenon can be seen in every unsupervised training session of any sort, and that I’m as guilty as anyone else of the occasional phone check.

      • Jenn Dale-Tam

        Hi Ben,

        It really depends on the purpose of the simulation: for learning vs of learning, and/or formative vs summative. Also, how the use of the checklist by the observers is communicated to the entire group. When the concept of the checklist is introduced in the pre-brief to the active participants and observers, framed as a tool to help the observers be active in the debrief, the sense I get from the entire group is that they are okay with it, but this could be explored more. The checklists also provide a form of peer coaching that can help the learners become more reflective in their practice. There are more avenues to explore regarding observers in simulations and how to engage them for their learning while maintaining the psychological safety of everyone involved.

        No worries about timing, we all deserve to take vacations :).

  • Scott Seadon

    I like to give observers pre-made cards with specific aspects of crisis resource management to watch for. This often leads to very robust and meaningful observations being brought up by the observers in the debrief. I find (at least it seems to me) it improves the learning experience for both observers and participants.

  • Ann Mullen

    Thanks Ben for bringing up a great topic. I feel fortunate to have worked with Megan and Alex, and enjoyed many interesting conversations with them about observers in sim, so I was excited to see their work featured this month!
    I’d like to share how we used observers and pose a question. In my previous workplace, a community teaching hospital, we had weekly sims for Medical, PA and nursing students. Each week, the students managed an urgent problem such as chest pain, hypotension etc. The goal: practice deciding when to call for help, what to do while help is en route, and what to say to the person who arrives to help. Three students would be in the sim, while the rest of the group watched via live stream. During the prebrief, we talked about the role of the observer. We asked the observers to try to think through the case and write down their differential dx, keeping in mind that it is a lot easier to do that when you are not standing in the room trying to manage an acute situation!
    Here was the dilemma: how much information to provide to the observers? We would give the whole group the brief summary, similar to what you would hear if a nurse were summoning the team to a rapid response. If the in-room team ordered an EKG, we would simulate hooking up the EKG machine and hand them the EKG. We gave a copy to the observers as well. Unfortunately, the observers sometimes focused so much on reading the EKG that they stopped paying attention to the rest of the case.
    We were always trying to strike the right balance between providing the observers the right amount of data to help them think through the case, while avoiding having them miss the point of the case while focusing on a task.
    I’d be interested to hear how others have managed this.
    I also have some thoughts about how to create and maintain the safe container for those being observed, but I think I’ll ponder that a bit more and post it separately.

    • Ben Symon Post author

      Hi Ann! Thanks so much for coming again! I didn’t realise you knew the authors and I’m glad we get to share some of your/their practice as we discuss the paper!

      It sounds like a lot of thought has gone into incorporating observers and shifting them into the space of being ‘active meaning makers’ but that this process is not without tension and a significant faculty energy expenditure. In some ways it sounded like they were in their own mental rehearsal sim!

      Do you have any further observations on the impact on the observers?

  • NEMAT ALSABA

    Thanks Ben for choosing this topic and article!

    As you and the other participants in this journal club discussion pointed out, there is a lot we need to uncover to fully understand this important concept and how to make it work for the learner “observer”.

    So, to answer your question “How do I approach observers in simulation?”: my approach has been inspired by the O’Regan et al 2016 article, not necessarily the checklist but the concept of the observer role and how/when to engage them.
    I divide that into 3 opportunities/phases (pre, intra, and post sim):
    1- Pre-brief phase / Role clarity:
    It is our responsibility to explicitly explain the role of the observers, the expectations, and how they contribute to the sim/learning activity.
    2- During the simulation / Engagement and role modelling:
    The facilitator or co-facilitator is encouraged to sit with the observers, to observe and role model interest in the actions of the participants (they will not be checking their phones unless we as facilitators start checking ours!). Sometimes you can share with or prompt the observers that a particular action by the participants, or a particular sim phase, is interesting and likely to be discussed during the debrief.
    3- Debrief phase / Contribution:
    This is where we should actively seek the observers’ contributions, opinions and thoughts on an issue or topic, but not necessarily ask them to critique their colleagues, as they find this hard to do. Again, I make it explicit what I am seeking their feedback and thoughts on.

    These 3 phases help shift the observers’ perception of “observing” from a boring, passive activity to an exciting role that complements the simulation with fruitful discussion and enriches the learning activity for the entire team.

    Things that this paper brought to my attention:
    Great paper with an excellent systematic review of the issue. I love how the paper groups observers into
    A. in-scenario observers, who are given a passive role in the actual scenario, such as a family member,
    versus
    B. external observers, who are watching but not participating in the simulation.
    This makes me wonder: when we work with our simulated participants, we never explore their learning experience from the simulation activity, but I think we should, as they are part of the simulation team. Some might argue they are not our learners in this sim activity, but I might disagree. This might be a future research idea 😊.
    I totally agree with the authors of this paper that we need more research in this fascinating area of Sim activity and learning.

    • Benjamin Symon

      Hi Nemat,
      Thank you so much for these wonderful, granular tips on incorporating observers effectively.
      I particularly love how you’ve broken down the ways we can activate learners throughout each simulation phase.
      Many thanks!
      ben

  • Alex Hannenberg

    We’re enormously grateful for all of the comments on this paper that have helped us understand the experience and perspectives of others.
    Our main interest in examining the potential of observed simulation has been in our quest to overcome barriers to scaling simulation-based education, especially on teamwork and communication. For this purpose, we’ve “married” observed video simulation with within-group debriefing in a model designed to fit into the 60-minute continuing education time available to most clinicians. The prototype module makes a real effort to guide both observation and debriefing to focus on very specific learning objectives, especially considering the limited time available. It makes very little demand of the on-site facilitator, recognizing that individuals trained for this are so scarce in the real world.
    Early evaluation of the feasibility, acceptability and usability of this model is very encouraging. We outlined this approach in a BMJ Q&S editorial a few months ago: http://orcid.org/0000-0001-5890-0456. We are evaluating approaches to testing its impact on skills, behavior and outcomes. As we learn more, we look forward to sharing with this community.

  • Mizzi Clifford

    Hi Ben,

    Thank you for bringing this article to our attention.

    Many of the points discussed in the article are very interesting.

    However, I am wondering if we really should compare observers to active participants of the simulation. Isn’t this a bit like comparing apples to pears?

    To achieve maximal impact on long-term learning, I think a simulation session has to consist of three essentials: preparation on the subject, the actual simulation, and a debriefing.

    At our institution, pre-reading – consisting of a mix of papers, online links and podcasts – is sent out to the group of learners several days in advance. The target audience is usually ACEM trainees and emergency nursing staff.
    Of course, you have to trust that the learners are motivated and take the time to do some preparation, but that is part of self-regulated learning and another issue.

    Debriefing is arguably the most important part of a simulation, as it encourages the learners to reflect on their practice and can ideally facilitate double loop learning, ultimately resulting in clinical improvement.
    Debriefing also gives the observers an opportunity to provide formative feedback. This can not only help the participants in their reflection and self-evaluation, but also serves as a chance for the observers to work on their feedback literacy.

    While active participants and simulation observers share the preparation/ pre-reading and the debriefing, the experience of the actual simulation scenario is clearly very different.
    As the article describes, observers reported significantly better reactions to nontechnical skills training, likely due to reduced stress and anxiety. On the other hand, active participants learned skills significantly better than observers.

    To take this into a clinical context, I’ll use the example of a surgical airway:
    Both participants and observers receive the same education material and will hopefully practice some mental rehearsals as in how to perform a surgical airway. This is the first step of the learning process.
    During the simulation, the participants performing the surgical airway will increase their technical skill level/manual handling, whilst the observers will ideally take away improved non-technical skills for this situation, like clinical reasoning, situational awareness, communication and teamwork skills.
    Finally, both participants and observers come together for a debriefing, where they collectively reassess performance (both technical and non-technical), consolidate the learning and, lastly, attempt to transfer that shared learning into clinical practice.

    I think we can all agree that both the participants as well as the observers learned something from the case, if somewhat different things.
    The learning outcomes are varying, but one is not necessarily superior to another.

    The aim of the study was to ask the question “Compared to active simulation, is observed simulation as effective in healthcare training for improving patient outcomes and participant behaviour, learning and reactions?”.
    As we all know, words matter. Maybe, instead of calling it “active simulation” vs “observed simulation”, we should introduce the term of “active observer”.
    And maybe, instead of comparing simulation participant vs simulation observer role, we should optimise educational approaches by transforming a passive observer into an active observer.

    While I think the authors did a good job, maybe we should ask ourselves if they did the right job…

    • Benjamin Symon

      Hi Mizzi,
      Thank you so much for joining us, I found your comments very insightful and they shook up my perspective a bit.
      I think the concept of ‘separate but equal’ learning experiences is a useful one, and I found your contextualisation with the example of an airway skills simulation helpful in understanding it better.
      I guess my counter-question to this would be (while I agree with your hypothesis): “Why are we doing a Sim if the learning outcomes for participant and observer are so varied?” For example, we are currently writing a sim about port device access in febrile neutropenia. One question we’ve been mulling over has been: “Why are we doing a sim on this, rather than an in-service?” Our solution was to pretty much do what you’d prescribed: 1) an initial in-service and skills rehearsal, then 2) a simulation contextualising the skill within a clinical, in-situ context to allow systems and equipment checking in real departments.
      It’s an interesting problem.

    • Ann Mullen

      I truly enjoyed reading your comment, Mizzi, and the airway sim example helped me to understand your point. If we are purely focusing on the surgical technique, then the observers would be engaging in mental rehearsal, and that learning outcome is quite different from the person who had the opportunity to practice the hands-on skill.
      My sim program ran a surgical airway sim for staff anesthesiologists. All had been previously trained to do this procedure. We ran it in two parts. First we did a simulation of the airway emergency; the sim ended just prior to the incision. The debriefing focused on the decision to do the surgical airway, team communication etc, and explored the perspectives of the people in the room vs those who were observing the sim.
      Many experienced MDs rarely do this procedure, so they were eager to engage in the second half of the class, where each person practiced on a task trainer.

    • Ann Mullen

      My second comment on Mizzi’s post: I agree that the observers must be active! When we plan a sim with observers, we need to provide the observers with a meaningful task. We want them to focus on the learning objective rather than be overwhelmed or focus on other aspects of the case. We also need to create and maintain the safe container for learning. (This reminds me of the September Journal Club discussions.)
      Another example: Our hospital was implementing an emergency manual/ crisis checklist in the OR. The manual was introduced at dept meetings; many people had helped to create, edit, and approve the manual. We decided to do a large group exercise just prior to placing the manuals in the OR.
      We had a combined grand rounds with all members of the peri-operative team: surgeons, anesthesiologists, nurses, technicians, students and support staff. It was standing room only in our 200 seat auditorium!!
      Prior to the grand rounds, we filmed a simulation of a surgical emergency (our volunteers knew in advance that their video would be shown in this forum). We handed out a shortened version of the manual (6 pages instead of 20). The hour started with a short intro to the manual and an explanation of the exercise. We told the audience that the clinicians in the video had agreed to be observed, but that we would not be critiquing their performance. Instead, we asked that during the video they mentally place themselves in the situation as it unfolded. We asked them to consider several questions, e.g. “At what point would I find the manual useful?” “Would I be able to suggest the manual to a senior team member?” We also asked them to turn to the page in their handout that fit the scenario and follow along, not to critique or grade the participants, but to mentally rehearse using the manual as a cognitive aid. After the video ended, the audience was debriefed, and we used those questions to guide the discussion.
      We were very pleased with the engagement of the audience and with the interesting comments during the debriefing. We also anticipated the possibility that someone in the audience would criticize the performance of the video team, and we had a plan to address that. Fortunately we did not have any negative comments. The positive buzz continued as the OR teams returned to work.
      This large scale exercise introduced the manual to the OR team; we conducted drills over the next year for additional practice and reinforcement.