Introduction :
Simulcast Journal Club is a monthly series that aims to encourage simulation educators to explore and learn from publications on Healthcare Simulation Education. Inspired by the ALiEM MEdIC Series, each month we publish a case and link a paper with associated questions for discussion. We moderate and summarise the discussion at the end of the month in PDF and podcast format, including the opinions of experts from the field.
In order for the journal club to thrive we need your comments! Some participants report feeling nervous about their initial posts, but we work hard at ensuring this is a safe online space where your thoughts are valued and appreciated. To ensure this, all posts are reviewed prior to posting. We look forward to learning from you.
Title : “PEARLS before Snythe”
It had been easier, Brad thought, when Dr Snythe had been focused on destroying his simulation program. But with the publication of Brad’s improved CPR stats post implementation of the PICU Simulation Program, he’d been both confused and delighted to find the rival intensivist suddenly supportive of his work. There had been, however, an unexpected catch.
Snythe was suddenly excited about the educational benefits of simulation. And he wanted in.
While Brad had tried to maintain the fundamental premise as Snythe tried to negotiate both learning to debrief and learning to use his frontal lobe, he was ashamed to admit that his archenemy’s combination of enthusiasm mixed with concrete thinking was making him frustrated, and in some ways, downright snarky.
“You’re getting there, Snythe,” he said after their latest debrief. “But remember your debriefing molecule. I felt like today you lacked a decent 3 phase structure, and it led to a very instructor centred debrief. It came across as a bit all over the place.” He paused and muttered under his breath. “Kind of like your resuscitations.”
“I heard that!” snapped Snythe. “And I wasn’t going for your traditional model. I’ve been reading a lot about PEARLS and I wanted to give it a try today. I assume you haven’t seen it, but they’ve just released a debriefing tool.”
He pulled out his tablet and showed Brad a crisp, blue and white cognitive aid. “Gotta get with the times, old friend,” he grinned. “Wouldn’t want to come across as outdated and irrelevant.” He paused and grinned wickedly. “Again.”
Brad scowled. The truth was he’d heard a lot about PEARLS in conversations with sim educators, but he’d never really ‘got it’. Sounded like he’d better jump on the bandwagon though. It had been a while since he’d felt motivated to read much sim literature, but nothing got his inner bookworm going like a good case of career rivalry.
It was time to head to debrief2learn. For knowledge… and, more importantly, for revenge.
The Article :
Eppich, W. and Cheng, A. (2015). Promoting Excellence and Reflective Learning in Simulation (PEARLS). Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare, 10(2), pp.106-115.
https://debrief2learn.org/pearls-debriefing-tool/
Discussion :
In 2015 Eppich and Cheng released a new structure for debriefing that is practical, pragmatic and more flexible than some more traditional approaches. In the years since, they have released a number of papers to assist in translating their original landmark paper into practice.
For our journal club discussants this month, what has PEARLS meant for you?
How have you found using the new debriefing tool? Or if you haven’t used it, check it out and let us know what you think. The team behind it are keen for your input!
References :
Eppich, W. and Cheng, A. (2015). Promoting Excellence and Reflective Learning in Simulation (PEARLS). Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare, 10(2), pp.106-115.
https://debrief2learn.org/pearls-debriefing-tool/
Hi there! I am an Emergency Medicine trainee in Australia who has just started in an education role that includes simulation-based training for medical students, nurses and ED trainees. I am incredibly fortunate to have Victoria Brazil as a mentor in this role and have been able to attend learning sessions introducing debriefing to the absolute novice.
My first impression of debriefing is just how hard it is! What appears effortless for Victoria I find a minefield of poorly phrased questions, variable psychological safety and instructor-centric dialogue that misses key learning opportunities.
Having now had the opportunity to lead a few debriefs, I have found the PEARLS tool to be incredibly helpful, primarily as a cognitive scaffold with which to structure the session. I feel that as I gain more experience I’ll be able to layer more complexity from the analysis section of the tool (i.e. Figure 2 of the article) into my approach.
The other tool I have found really helpful as a novice is the Debriefing Assessment for Simulation in Healthcare (DASH) instructor version. Using this for immediate self-assessment post-debrief has been great for identifying areas to improve.
Having now dipped my toe into the field of simulation-based education, I certainly have a newfound respect for how much effort and complexity is involved!
Hi Warwick,
Thanks so much for dipping your toe into journal club as well, it’s great to have you along.
I really appreciated you sharing your perspective on PEARLS as a new debriefer, as it sounds like it’s an extremely useful tool for you when starting out.
How do you tend to use it in your debriefs? Do you have a physical copy or do you just sort of mentally hold onto the principles?
I have to start by confessing I LOVE this paper. We have debated the value of debriefing scripts quite a lot within our group, and the (grossly generalized) theme seems to be that novice debriefers find them helpful and more experienced debriefers find them overly restrictive. PEARLS really addresses that divide by providing broad structure with specific examples which the authors explicitly state are not intended to be used verbatim by all.

As a clinician who really started doing education when I got to consultant level, I find that when judging a lot of educational theory I am overly influenced by face validity, as shaped by my own educational journey, and this paper has tons of it – it just all sounds so sensible! Consistent with what I read, I find that when debriefing novice learners there is often quite a strong need for, and learner interest in, more time spent on didactic teaching. When debriefing experts it feels more like steering a conversation while specifically trying to avoid using the position of pseudo-power you get from facilitating the session to give undue weight to my own opinions about how things should be done. Most scripts I have come across do not allow for this variation when using the same scenario with different levels of learner, but PEARLS is great for this.

I think there is an analogy between debriefing scripts and clinical guidelines, in that the less expertise you have in a domain, the more likely you are to follow the guideline to the letter. The best guidelines allow for that and act as a scaffold rather than a cage, and I think PEARLS is a great example of that. I guess I am a bit cautious that this is still not very well proven to be the optimal approach and really relies on “expert opinion”, albeit with a very well described rationale, rather than level 1 evidence.
I am also undecided on the effects of using the same debriefing structure during a day-long course with multiple scenarios. I’m not sure whether it would be better to use a variety of debriefing structures/techniques (which you can, to a degree, within the PEARLS framework) or whether there is actually additional value in participants getting used to the pattern of the debrief, as that reinforces the expectation-setting piece described in the paper.

The other analogy that seems relevant is the structure we learn for taking a medical history at med school. That basically serves as a frame in which we learn to map out and recognize illness narratives, which we label as diagnoses and learn how to confirm/manage. The more expertise we develop in a particular area, the more shortcuts we start to take, or adaptations we start to make, to suit our own practice and experience. Consistently using PEARLS is really helping me learn to recognize patterns in the groups I am debriefing, making the decision of which route I want to take through the analysis phase a bit easier. Another great choice of paper – thank you!!
Hey B1, Thanks again for your comments and ongoing presence.
I think your thoughts have helped me reframe my thinking around PEARLS, particularly your comments about clinical guidelines becoming easier to diverge from as we grow more experienced. I have to confess it hasn’t been a paper that has hugely changed how I approach debriefing, and interestingly this was the first case study I wrote where I really struggled to find the right frame to present it from (and I’m dissatisfied with my results). From my conversations with friends and colleagues, PEARLS is repeatedly praised as a particularly useful tool or guideline for the beginner debriefer. I was chatting with Bruce Lister about it and he said he feels like there’s “just so much bad debriefing going on out there that we don’t realise”, and how useful PEARLS is at providing a structure to people who are unfamiliar with debriefing and need support.
Warwick’s thoughts on how useful he’s found it starting out demonstrate that really nicely.
For me, while it hasn’t changed my practice hugely, I think in many ways it is an important acknowledgement that there’s more than one way to do this stuff, and that time availability, learning needs and patient safety all play important roles in how we approach a particular debrief. I wonder sometimes if the traditional 3 phase debrief (which is my favourite to participate in and to learn from) has come to be perceived as the only way to do things in some circles, and I think that limits us from innovating. Rapid Cycle Deliberate Practice, for example, hasn’t translated as swiftly, while traditional debriefing models descended from the original principles continue to be promoted around the world.
Interesting stuff!
In my current position I am not able to actively debrief, and I only get access to clinician educators to discuss facilitation of debriefing at the time of the simulation. As the program puts adjunct clinicians in the facilitator role, many times without any simulation experience and without development, I have found that the tool assists in quickly imparting some type of standard for facilitation of debriefing.
I have also directed many healthcare educators who are new to simulation, and those I have observed to have limitations in simulation debriefing, to access PEARLS and further their personal development.
I find the tool very helpful, and am grateful to have access to the site, which can easily be shared with the many healthcare educators who are utilizing simulation with limited introduction to the methodology.
Thanks for your thoughts Melissa, have you been using the paper or have you been using the new debriefing tool?
Out of all the debriefing scaffolds I have tried (and believe me, I have tried a lot) this has been by far the most useful, not only for me, a fairly experienced debriefer, but also for some of my colleagues who are less experienced. I like the fact that you can use it for both experienced and novice learners too.
It may also have uses outside medical simulation: I have been looking at it with simulation educators in the police to see if it’s adaptable for their use, as they lacked a framework.
In short – nothing but praise and for me personally a game changer.
Thanks for your thoughts Nick. How have you found the police have responded?
Very positively – it’s the first scaffold they’ve had to use (they’re currently using Pendleton’s) – working through some of the differences and translating from med to cop 🙂
Many people consider me an “expert” debriefer… whatever that means… but having been learning, practicing and perfecting Debriefing with Good Judgement for almost a decade, I feel that I can speak a bit from this perspective. First, I want to agree with others who have stated that the paper is brilliant. I think it is well written and very helpful in clarifying various sub-mechanisms within the phases of a structured debriefing.
While the authors propose that the approach is helpful for novices, knowing the authors personally, I read the manuscript as a description of various moves that very experienced debriefers can use in a debriefing situation. Jenny Rudolph taught me that this is called a contingency model in the field of systems dynamics. So I recognize the moves described in the paper as options for the debriefer that can be activated in various situations. The tables do a good job of clarifying this. In summary, I don’t recommend handing this paper to a novice and saying: “go debrief”. Though it has some scripts, this paper is not the place to start.
Now, for some critique. I think my biggest area of concern is the move towards “Learner Self-Assessment” or the “Plus-Delta” move. My concern rests on 3 pillars: 1) the Dunning–Kruger effect (https://en.wikipedia.org/wiki/Dunning–Kruger_effect); 2) the critical role instructors play in fostering reflection via feedback and asking questions; and 3) a practical issue that remains unsolved: what happens when the learners list a Plus that actually describes a substandard performance, e.g. “we had very high quality CPR, we really nailed the rate of 60 compressions per minute!” Suddenly the debriefer has 2 tasks: to point out and correct the substandard performance in the sim, and also the poor self-assessment based on incorrect knowledge.
In summary, I think the paper is very useful. I think it follows from the tradition of and is well aligned with Debriefing with Good Judgment and offers good ideas and practical recommendations for debriefers wishing to think through and codify debriefing moves.
Hi Demian, thanks so much for coming along, I always learn a lot from you when you jump in to drop some knowledge bombs.
I really appreciate that you took some time to critique the paper as well, as I think it’s an important part of journal club that people are a bit nervous about when we are in such a public format.
While I get where you’re coming from with respect to the Dunning-Kruger effect and the pre-contemplative learner, I wasn’t sure why you find this of particular concern. In the paper they acknowledge that this could lead to a need to switch tactics, but that it’s great for finding learner-focused learning objectives to discuss. So I was wondering if I could ask what concerns you about getting into that situation? Do you think it’s inefficient, or that the learner might feel tricked?
Thank you for your comments, Demian and Ben!
@Demian: Like Ben, I will be keen to learn more about your criticism of the learner self-assessment part of the PEARLS framework. Although learner self-assessment can certainly be flawed, I wonder how you reconcile the notion of the basic assumption with your comments. More specifically, if I believe that my learners are intelligent, capable, do their best, and want to improve—and I am curious—then I find it only appropriate to invite participants early in the debrief to share their perspective on events in terms of what was working, what they should continue doing, and what they would improve next time. That we might have different perspectives is certainly par for the course during a debriefing—the participants’ take on events can be a great springboard for discussion. I guess what I’m struggling with is your message that we should avoid learner self-assessment methods. Perhaps I misunderstood the comment. In any event, keen for your thoughts…
Stimulating discussion here on this thread re: learner self-assessment. Thanks Demian for getting the discussion going – it’s very important to throw around ideas, suggestions, and points of view. Only through scholarly debate can we advance the field!! A few thoughts…
– The Dunning-Kruger effect – such an intriguing and interesting phenomenon! I think we can all agree that it can be harmful when our learners over-estimate the quality of their performance. Enter debriefing – a tool to help counteract this effect.
– I like to think of the learner self-assessment as a diagnostic tool. As a debriefer, I want to know where the inconsistencies (between my assessment of performance, and the learner’s self-assessment of their own performance) lie. Once uncovered, these inconsistencies offer opportunity for more in depth discussion, often with advanced questioning techniques like advocacy inquiry. In this sense, the learner self-assessment is a particularly helpful tool for novice facilitators, who may struggle to identify these inconsistencies as they manage the burden of cognitive load building with each learner comment.
– When a learner self-assessment is not conducted during the debriefing, debriefers theoretically run the risk of not identifying these inconsistencies – which represent missed opportunities for learning. Oftentimes these issues come up in discussion organically… but sometimes they don’t, even with the most experienced debriefers. Do we really want our learners leaving thinking they did something really well when in reality they needed significant improvement?
– One last thought – sometimes diving in with a formal learner self-assessment is unnecessary – particularly with experienced, insightful learners – as these learners may naturally engage in high-level discussion, are already keenly aware of where things went wrong, OR are actively seeking suggestions for points of management that triggered uncertainty and/or discomfort. As with all debriefings – best to match method to need!
Hey Ben
Nice to see the great comments.
The paper deserves to be read in depth.
I have been using the PEARLS framework for my debriefing teams at both the hospital and medical school, where I find the consistency of structure helpful for co-debriefers and for learners – everyone has a road map of where the conversation will go (roughly) and hence knows where their responsibilities and opportunities for contribution are.
So I do lots of debriefing faculty development using this, and the recently published PEARLS debriefing tool has been further helpful.
Like Ben (B1 that is 🙂) I don’t find it a cage, and think it allows for ‘own words’ to be used with confidence, and for some of the ‘artistry and values’ that expert debriefers use – https://advancesinsimulation.biomedcentral.com/articles/10.1186/s41077-016-0011-4
A couple of adaptations/additions I have made (not suggesting ‘improvements’ per se – just what I do…):
1. Setting the scene
– some is repeated at the commencement of the debrief:
“We’re going to spend x minutes debriefing. First we’ll do a general reactions chat, then I’ll briefly outline the facts of the case, then we’ll pick 2 or 3 points to chat about. OK?”
2. Reactions phase
– I do it as described, but I think the description underestimates just how DIAGNOSTIC this phase is for debriefers – what is said and how it’s said. I would also reinforce that this phase needs intense active listening, and ‘triaging’ of points for discussion while listening.
3. Facts of the case – I do this myself. It keeps things short and objective; I haven’t ever had much luck with participants being able to remove their personal perceptions/narrative.
It then also becomes a ‘pause’ for the transition to analysis (which I flag explicitly).
4. Analysis
– I actually offer up the 2 or 3 points for discussion as an ‘agenda’ at the beginning of this phase and invite agreement or additions/substitutions.
Looking forward to the podcast !
vb
Thanks Vic, I enjoyed your list of adaptations and variations.
I agree that the way PEARLS is written achieves a nice balance in providing structure without constricting prescriptiveness, and I particularly like their line in the closing of the article that “We agree that educators should avoid formulaic speech and tokenisms as well as linguistic rituals by being curious and authentic; Educators need to find and speak with their voice.” It’s such a lovely sentiment and very true.
How do you find offering up 2 or 3 points for discussion as an agenda? One challenge I’ve found is that despite being transparent, once I have listed learning objective suggestions, unless the group is quite senior they tend to follow hierarchy and accept those as the talking points without disagreement. I now try to open with something like ‘in the time we’ve got available, is there anything in particular you’d like to explore?’, but that usually ends in awkward silence and I end up suggesting my own anyway! I keep persisting though!
Yes, like you – generally they agree with my points.
With one or two exceptions though – enough to think it’s not blanket agreement.
But even if it’s a general agreement – of course I don’t mind that much… we can only take this learner-centred thing so far… 🙂
Of course, I am soaking up sage words of wisdom from Vic! All sounds great to me!
Hi Ben. Long time lurker, first time poster in JC.
I came across the PEARLS framework at a point where I had been debriefing fairly frequently for about 4 years in different programs and seemed to still be having hit-and-miss moments (determined only by self-reflection).
PEARLS resonates with me as the closest and most adaptable fit: it helps me manage my own cognitive load and, metaphorically, hang and integrate learner comments into broader themes for discussion. I have also been extending structured debriefing more and more into the resus area after clinical events, and have found the Walter et al. (2016) ‘Let’s talk about it’ article very helpful in considering how to slide a technique or framework from simulation into the clinical arena: https://www.researchgate.net/publication/305628527_Let's_talk_about_it_translating_lessons_from_healthcare_simulation_to_clinical_event_debriefings_and_clinical_coaching_conversations
My thoughts in general terms are:
1. I agree with and try to emulate Vic’s approach, except I will often task a participant or co-debriefer with the descriptive facts (especially in situ / after a clinical event). This does a couple of things for me: it buys me some bandwidth to arrange the diagnosis from the reactions phase and consider ‘parent’ issues for a deep dive, and it also shifts the power of owning the perceived reality to within the group. But mainly, I just have fewer working memory slots than the more expert debriefers, and having a forced partition between reactions and analysis saves me from chasing the last or most obvious rabbit down the rabbit-hole into a premature analysis.
2. Consistency in framework is helpful not only for debriefing faculty development, but also for teaching the unwritten curriculum to our in-house clinical teams – a model and structure that can be used in the 90% of clinical time when I am not around, with the real humans, not the plastic ones.
3. The reactions phase is seriously tricky, and when you get more comfortable harvesting it, it is a game changer – as VB put it, ‘diagnostic’. I learned a lot from teaching with Vic (carrying her bags) last week that is really useful for this phase. ‘Minimal encouragers’ are genuinely effective – nods, gestures, ‘ok, tell me more’, etc. Also the mental model of emotion-before-cognition. I am seriously guilty of this in communication at home and at work – trying to ‘fix’ or reassure emotional offerings from others with facts and alter the emotional tone of the group (or my wife) rather than meet them where they are: happy, sad or indifferent.
I’ve really enjoyed watching this unfold and think the classic papers months are a treasure trove of nostalgia and reflection.
Thanks everyone. Looking forward to the summary.
Jesse
Thanks for coming along Jesse! I look forward to learning from you more often! I agree the response to the classic papers has been strong, and I hope that in time our annual publications will build into a good curriculum for new educators to browse.
I’m an ED registrar working in Australia and I’ve had the pleasure of working alongside Vic and Warwick. I’d start by echoing what Warwick has said below: debriefing is something that looks easy when done well, but it really is a minefield when starting out.
Previously I felt debriefing was essentially the senior person in the room saying ‘so what did we do well and what can we improve on?’. The more exposure to debriefing I get, the more I realise it’s so much more than that.
Having done some simulation in a previous job without any formal training on techniques, versus now debriefing using this tool, I find it to be far superior. It allows me to be much more structured and less haphazard in my approach. I now actually address specific learning objectives rather than just picking a few things to discuss at random. Hopefully this means that student learning is more uniform, without losing the flexibility of sim, which is all part of the fun!
I would advocate for anyone who runs simulations to have a look at debriefing tools and this paper is an excellent example. There’s much more to it than just showing up.
Chris, it’s been great to have you with us this month. I think new simulationists are sometimes pretty intimidated by some of the comments, but it’s so important to hear from those who are relatively new at this stuff and see it with fresh eyes. Thanks for your comments, and hope to see you again soon!
It has been incredibly gratifying to read this discussion thread—thank you to all of those who have shared such supportive comments about our PEARLS paper! It took several years for the conceptual framework to mature; Adam Cheng and I had countless Skype meetings to flesh out our thinking—and multiple revisions of the manuscript! David Gaba, then editor of Simulation in Healthcare, and the other reviewers provided most helpful feedback. That these ideas would have such an impact is certainly beyond our wildest expectations.
In particular, I would love to thank Ben Symon for his enthusiastic and expert moderation of this discussion—I look forward to hanging out in Boston in May!
Thanks Walter, it’s an honour to have you guys even listening. May is coming with alarming rapidity, can’t wait to hang out with you properly this time!
Hi Ben and our sim community,
Like many who have discussed before, the PEARLS paper really provided me with a solid foundation for what can sometimes be a sea of varied debrief processes and educational theory, all of which can swirl in the moments of cognitive load while facilitating feedback. For a while I occasionally found engaging (particularly novice) learners in attempts at learning conversations difficult. Plus-Delta can often run into its own challenges when learners ‘don’t know yet what they don’t know’, or have the illusion of competency, as Demian points out. Having a structure to guide the feedback, I think counter-intuitively, allows for more dynamic learner-centric conversations, as it gives the facilitator time to process what is being offered and to actively listen to what is being prioritised for discussion during the session.

As much as it is gratifying for faculty to have a framework to hang a discussion on when giving feedback, especially on days when things seem to be misfiring, so too do learners feel satisfied when given a framework or expectation of how the discussion will progress, so that they can decide what they want to discuss in the allocated time. To this end I think the set-up phase is so important. More time spent in this phase allows for more productive analysis and really provides the learners the evidence that they can trust your guidance during the debrief. A confident and organised start can do wonders for the participants’ faith in your competence as a facilitator. I do use participants to provide the description of the case, as, like others, I need some decrease in CPU load to help plan the next phase, but it can also be a good tool to help participants practice information assimilation and communication, a complex task vital to daily clinical practice.
In the discussion, the authors address the need for a structured approach to debriefing in the absence of local mentors or experts and I think this is the real benefit of this paper. As simulation becomes almost universally incorporated into teaching programs, many faculty are developing their skills in the absence of mentors. We would not ask trainees to learn in isolation and yet debriefing is a high stakes, highly complex enterprise with the potential of unintentional or unrecognised harm. Beginning the development of a structure, not necessarily a script, is important for the continued success of sim and this paper starts us down that journey as a kind of version 2.0 for models of feedback. I think it is a great paper, and the comments have been equally illuminating. Thanks Ben for highlighting this classic.
Hey Matt
Thanks for the input, and loved that phrase: “A confident and organised start can do wonders for the participants’ faith in your competence as a facilitator.”
I find that when we do short (e.g. half-day) debriefing faculty development we seem to spend most of it practising the reactions phase and setting the agenda – hopefully a testament to your experience of how crucial that phase is.
Agreed, thanks Matt for commenting, excited to see you move from lurker status! See you on the floor!
Hi Folks
I just wanted to send a quick message of thanks to Ben for hosting the journal club and stimulating discussion around PEARLS.
It has been really fun following along and reading the reflections of those who have used PEARLS. Great comments ….. and very helpful for me as I continue to shape my understanding of debriefing methods, expertise, and faculty development!
As an aside – Ben – several of your Aussie colleagues have made Calgary a stop on the way home from CMS / Boston. If your schedule accommodates, please come pay us a visit. Email me if you are interested.
Thanks
Adam
Thanks so much Adam, I’m taking my family with me on this trip (including grandparents) and we’re heading straight to NYC after the course so I won’t be able to come visit you in what I assume in my head to be some kind of mystical Sim palace in the frozen north, but I would be keen to DM you some time about maybe visiting you on some long service leave!
Hi everyone, long time listener, first time poster. =) Thanks for creating and maintaining a great CoP!
At our health authority we heavily promote the use of the PEARLS framework for all novice simulation enthusiasts. We facilitate a two-day workshop for novice simulation facilitators, in which our only mandatory pre-reading article is the PEARLS article. This workshop has lots of debriefing practice built-in, and afterwards we continue to help coach educators in simulation-based education with just-in-time teaching and elbow support.
What we’ve found is that the majority of learners appreciate learning about PEARLS and enjoy being taught a blended debriefing model, but the ability of novices to apply the framework varies. Novice debriefers tend to rely more heavily on a scripted debrief, or to stick with the plus/delta branch when it comes to the analysis phase. Advocacy-Inquiry tends to be a challenge for novices, and I personally attribute this to many people being unable to actually identify their own point of view. We try not to emphasize directive feedback unless necessary to close a glaring knowledge gap, and find this approach to have good effect, as the majority of educators lean towards directive feedback to begin with. For many, emphasizing facilitation vs instruction is needed. (The “guide on the side vs sage on the stage,” as Adam Cheng and Vincent Grant would say.) When it comes to elbow / 1:1 support, we usually advise novice to intermediate debriefers to come up with a couple of personal goals for their simulation facilitation, one of which is to develop one or two AI statements in the analysis phase of the debrief. As support, we then debrief the debriefers, and I would say a common theme is that the majority find using AI challenging for quite some time. Almost everyone enjoys the flow of the phases.
The cognitive aids which were posted by the Debrief2Learn team last year (thanks!) seem to be most liked by those already quite familiar with the PEARLS format. Personally, I enjoy it for its design and the “Setting the scene” prompt. That being said, I rarely use it because at this point, I feel I have the framework already burned into my brain. In our workshop, we haven’t yet switched to using the new tools for two reasons. The first is that, when explaining how to use PEARLS – especially the “analysis” phase – I enjoy having the entire framework on display to speak to and reference. Sometimes learners forget that plus/delta, directive feedback, and AI are all in the analysis phase, or they forget where within the framework analysis falls, so using the chart from the original article helps keep everyone on the same page. The second reason we haven’t switched is because of the reliance of novices on a scripted debrief template, which includes multiple sample questions within each phase. The PEARLS aid doesn’t seem to have enough sample questions for novices to choose from.
What our learners ask for most is a pocket cognitive aid which lists phases and multiple sample questions within each phase on one side, and an AI “how-to,” with examples of how to start each part of the AI statement, on the other.
TLDR: Love PEARLS. Novices tend to appreciate it theoretically, have difficulty applying it, prefer scripted debrief templates, have challenges with advocacy-inquiry. Intermediate to advanced simulation facilitators have a better grasp of PEARLS but by then usually don’t need a cognitive aid. Personally haven’t been using the new cognitive aid much, although I do appreciate its use of instructional design principles.
Thanks Christina for such an in-depth response; I particularly appreciated your observations regarding some practical challenges in utilising the new debriefing tool itself. I know that Brent Thoma et al are still keen for practical feedback. Correct me if I’ve misinterpreted, but I understand from your comments that the debriefing tool would in some ways be most useful for novice debriefers, yet you feel the abbreviated format doesn’t contain enough detail to be as useful as the traditional diagrams? It’s an interesting observation.
I am a program manager at a hospital simulation center. I have been planning a journal club for a nurse educator group and I selected this article several weeks ago. Imagine my delight when this coincided with the Simulcast Journal Club!
In addition to inviting the group to read the paper, I also shared the Podcast: Blended Approaches to debriefing.
We had a great discussion today with 12 participants. Many of them are less experienced with simulation and debriefing, and they found the PEARLS structure a useful framework for their clinical work. Some were delighted to share examples of times that they had successfully used these strategies.
Some educators planned to share the article with the charge nurses, who are often involved in debriefing clinical events such as rapid responses. They thought it would be useful for preceptor/ orientee conversations, and they will consider how to include this in our preceptor development program.
We also appreciated that the title refers to a “Healthcare Debriefing Tool”. We felt this encourages broader application, guiding our conversations in clinical teaching and feedback, not just in simulation.
Hi Ann! It’s so lovely to hear from you!
Thanks so much for coming along and posting. I was wondering how your preceptors felt about whether PEARLS would fit within critical event debriefing?
Most people are accustomed to plus/delta debriefing, which definitely works for certain situations. They probably learned it by the “see one, do one, teach one” method; that’s how I learned it at the beginning!
AI is much more useful for situations that call for deeper exploration. An example might be a preceptor or charge nurse working with an experienced clinician who is new to the organization. They see the new person doing things that may be different, but not necessarily incorrect. Since they have not been trained in debriefing, they have trouble knowing how to have that conversation.
They found the suggested wording of the questions extremely helpful.
A well-structured debriefing sounds like a good conversation; the structure is not necessarily visible, but it is there. The PEARLS cognitive aid helps by making the structure clear and providing a roadmap, so the conversation doesn’t meander off in random directions.
Hi all. Some great dialogue about approaches to debriefing. There is such a range of options, and currently I’m contributing to some new INACSL online simulation education offerings for those relatively new to healthcare simulation. In the ‘scenario design’ module, I’m briefly giving an overview of debriefing and will be inviting participants to come to this site and read/post comments. I’ll be interested to see how this adds to the lines of discussion. This particular module will really be triggering the ‘big overview’ concepts, which also helps them to think about ‘arranging for learning’ – a concept that, with education colleagues from UTS, we’ve woven through our work and outputs. Keep up the great dialogue.
Hi Michelle,
Thanks so much for sending people along. Will be interesting to see if the conversation continues!
Ben
I was directed to this site via my Simulation Education Program from INACSL. Thank you for your great work on PEARLS, which (as a relative newcomer) I’m just learning about. It’s so clearly laid out, and I really like the prompts. Thank you for promoting its use. I see that Adam Cheng may be from my old stomping grounds, Calgary.
Do you know Catriona Booker? She’s a nurse researcher from Brisbane with whom I created a mentoring program in association with Sigma Theta Tau.
Thanks Ben, catch you on the flippity-flop.