Introduction :
Simulcast Journal Club is a monthly series heavily inspired by the ALiEM MEdIC Series. It aims to encourage simulation educators to explore and learn from publications on Healthcare Simulation Education. Each month we publish a case and link a paper with associated questions for discussion. We moderate and summarise the discussion at the end of the month, including exploring the opinions of experts from the field.
The journal club relies heavily on your participation and comments, and while it can be confronting to post your opinions on an article online, we hope we can generate a sense of “online psychological safety” enough to empower you to post! Your thoughts are highly valued and appreciated, however in depth or whatever your level of experience. We look forward to hearing from you!
Title : “Who watches the watchmen?”
The Case :
Nimali had been surprised by Nitin. For a new fellow, he was remarkably perceptive about what running the Simulation Centre involved.
“It’s true,” she said. “We could be a lot better, but it’s hard!”
“We get great feedback from the learners, sure, but we’re not going to improve if we just gauge ourselves based on the Likert scales of a bunch of interns who are so relieved to be taught in a non threatening environment that their dopamine levels are through the roof.”
Nitin shrugged philosophically. “It’s nice to be liked at least.”
Nimali agreed. “But I think we’re addicted to it! We’re so busy being non-threatening and nice and intellectually cuddly that we’re not growing as a unit! We talk about debriefing with good judgement all the time, but I watched Catherine debrief the debrief yesterday, and it was basically ‘You guys are tops! High five!’”
She gestured out the window towards the sim centre.
“We’ve been given this gift,” she said to Nitin, and it was clear that she meant it.
“I want us to be world class. But we’ve been too busy educating others to improve ourselves. I’ve got to get my staff on board, but first I need a plan.”
She gazed outside again and smiled.
“I want to make this place sing.”
The Article :
Peterson, Dawn Taylor PhD; Watts, Penni I. PhD, RN, CHSE-A; Epps, Chad A. MD; White, Marjorie Lee MD, MPPM, MA, CHSE (2017)
“Simulation Faculty Development: A Tiered Approach.”
Simulation in Healthcare : The Journal of the Society for Simulation in Healthcare.
Publish Ahead of Print, POST AUTHOR CORRECTIONS, 18 March 2017
Discussion :
Running a simulation program can be work enough on its own, let alone worrying about your own faculty’s development. But as Peterson et al. suggest in this month’s article, educating the educators can be a significant challenge, and one which many institutions ignore. Peterson et al. provide information on their certification process and explore lessons learned from its implementation.
To get the discussion started :
- What are your thoughts on the principles raised in this article?
- What’s your experience of faculty development in your program?
- Is the proposed framework within this article feasible for your institution? And if not, what lessons and principles are still translatable to your service?
We are privileged to have the authors as our expert commenters this month, so we look forward to your thoughts!
References :
Peterson, Dawn Taylor PhD; Watts, Penni I. PhD, RN, CHSE-A; Epps, Chad A. MD; White, Marjorie Lee MD, MPPM, MA, CHSE (2017)
“Simulation Faculty Development: A Tiered Approach.”
Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare.
Publish Ahead of Print, POST AUTHOR CORRECTIONS, 18 March 2017
Thanks again for another interesting article, Ben. I’ve been involved with faculty development in a few different contexts and so am very interested in their systematic, institutional approach.
The paper starts with a rationale that is easy to appreciate – simulation educators need specific skills and ‘evidence informed’ approaches are needed to make their professional development efficient and effective.
The authors also outline various (US-based) professional organisations that have supported/required faculty development in sim. They (rightly, I believe) suggest that an institutional faculty development structure can support consistency of simulation delivery. It sets up the context for their program.
The article then describes the ‘tiered approach’ they use and seeks to justify it based on a broad review of the literature.
I found it interesting that the ‘tiers’ denote increasing complexity across all domains of sim practice, rather than offerings within each domain, although I note they do offer specific courses in RCDP, in situ simulation, etc. I do think the increasing levels and labels they describe (apprentice, expert 1, 2, 3, etc.) will feel more culturally congruent for US-based practitioners. Our local approach is to offer different workshops with variable take-up according to simulation role: the idea is that everyone does ‘general’ workshops on design, delivery and debriefing, but then there are deep dives into each. I think UAB has a much bigger enterprise and hence a more comprehensive approach.
I like the multimodal nature of the progression, with some apprenticeship activities, some structured workshop and other development activities.
I am very interested in the ‘certification’ issue. Once again, this appears more common and accepted in the American context. My personal view is that it has some very good aspects (like the authors mention re consistency and upskilling), but some downsides (e.g. excluding those not ‘certified’ even if skilled, and sometimes leading to a ‘cookie cutter’ approach). Like all accreditation, strict adherence to standards can stifle innovation if applied inappropriately.
I found the terminology change to ‘facilitator’ a nice albeit brief item. I think many sim programs struggle with nomenclature in job descriptions for ‘sim people’.
There was no mention of ‘failure to progress’ and what that meant. There was a reference to the requirement to be certified for participation in the sim program, but it seems that for many staff this sits outside their mainstream performance management. This is a struggle: we want our sim folks to be embedded in universities and hospitals, and less ‘separate’ from core business, but it can also lead to disconnects.
So I think the paper gives us food for thought in how to have a systematic approach to our faculty development – both in the opportunities available and to give structure for individual staff.
Look forward to further discussions.
Thanks Ben for choosing an interesting topic for this month’s journal club.
Faculty development in simulation is not only “teaching the teacher” or “debriefing the debriefer”. The article highlights the elements of simulation faculty development eloquently, then goes on to explain the five tiers of their OIPS certification plan and its benefits to the facilitators, learners and the simulation service/centre.
I agree with Victoria Brazil that our local approach in Australia is that everyone does a “general” workshop and then follows that with more specialised workshops depending on individual interest and institutional needs. But I can see this changing in the near future as simulation centres progressively expand and there will be a need for a more comprehensive and standardised approach similar to what has been described in the paper.
I am pleased that Victoria has raised the point of the “cookie cutter approach” in the “certification issue”; this is often a topic people don’t explore when thinking about certification programmes.
The paper also identified a “continued struggle in encouraging faculty to participate when certification is not required”, but without exploring the factors that might influence it. It will be interesting to see what the factors are and how to overcome them.
In our sim program, we are doing the “tiered approach” without the nomenclature, and in contrast to having checklists requiring signatures and documentation of completion as suggested by the paper, we use Dropbox to track progress (I guess this is easy as the number of facilitators doing such training is small compared to UAB).
Finally, thank you again Ben for your contribution to the simulation world by hosting these monthly journal clubs, which give us an opportunity for ongoing faculty development through networking.
Regards
Nemat
Thanks Vic and Nemat for your comments so far.
I chose this paper because I think faculty development is a recurring issue with educators that I talk to and one that I don’t feel I’ve gotten a good handle on yet.
In the public system in Australia, I’d have to say that my experience is that many sim educators are often trying to do more with less: they have done a simulation course or two, and then get caught up in service delivery and trying to squeeze out maximal education for minimal budget. Their own educational needs appear frozen after an initial investment in a simulation course, completion of which automatically marks them as a ‘local expert’.
So it was exciting for me to see a facility that appears to have both the critical mass and the funding behind it to support a more structured faculty development program. And as someone who has had to drive and fund their own development on their downtime and personal income, I was drooling at the thought of a centre that not only provided a structured feedback program but also funded some courses as well! I wish I’d had such an opportunity.
I concede to Vic’s opinion that a one-size-fits-all approach may not fit certain cultural contexts, and to be honest it surprised me that a simulation centre the size of the OIPS didn’t have the volumes to require a more fine-tuned approach for Sim Cos vs clinical debriefers etc. That approach appears strongly egalitarian in many ways, despite the fairly hierarchical structure.
To me the strengths of the system appeared to be:
– The clear sense of progression, which for many of us can be a strong motivator in itself (maybe it’s the nerd in me, but I do love to ‘level up’)
– A clear set of resources for educators to participate in and grow from
– The built in structured clinical feedback and mentoring
What surprised and disappointed me then (and, reading between the lines, maybe disappointed the authors as well?) was the 19% take-up rate. And I get it: if you’re donating your non-clinical time to teaching, it might be confronting to then be asked to also meet a certain standard or supply evidence of qualification, because I think we sometimes internally correlate ‘medical education’ with ‘charity’ in our heads, and thus don’t want to put up barriers to clinicians participating in education. I remember my Dad once pushing the ‘assist’ button on a plane when the intercom asked for a doctor; the air hostesses then asked him to produce evidence, on the plane, of his medical degree, to which his response was ‘well, that’s fine if you don’t want my help’. I worry that a similar situation might arise if this setup were instituted in the hospitals where I have worked.
I think the other challenge, not specifically acknowledged in this paper, is just how expensive some of these courses are. And while I’m a big believer in paying for quality and paying for experts so they can actually be experts, the sad truth is that the level of financial investment required to become a formally trained expert in simulation education is high. As an emergency consultant, that’s not a huge deal for me because I am very well paid, but for our nursing staff in particular, the cost of many courses is an enormous proportion of their income; it’s a significant barrier to enabling expert nursing debriefers, and that investment is not necessarily linked to an increase in income or seniority.
So I agree with you, Nemat, that I would have loved some more exploration of the barriers to uptake within the OIPS program. I hope the authors, who have kindly agreed to provide our expert commentary this month, will illuminate us with more detail!
So in summary, I’m envious of this program, I enjoyed seeing an example of a structured, considered approach to faculty development and particularly appreciated that it wasn’t purely based around courses, but also around multiple different types of learning including peer feedback and networking. I can also see significant barriers to the program uptake in a variety of settings including my own, but I think if nothing else there’s some great role modelling of a faculty of sim facilitators who passionately care about improving their service.
Thanks again Ben for sharing yet another interesting article! It was interesting to see how well established some of the simulation centres in the world are, to be able to run a programme like this. The nomenclature is almost like a video game, and I would be very much like you in this regard, pushing to achieve the “simulation expert 3” level!
I wonder, though, if a programme like this would be limited to very large (>24,000 learner hours per year!!!) sim centres like this one. For the rest of us in smaller, local centres, we might have to resort to the peer coaching model highlighted in last month’s journal club. I don’t think we have enough simulation facilitators, or “experts”, to provide the supervision and performance reviews needed to sustain it.
Also agree with your point about creating further barriers to providing education. The requirement of more hoops to jump through may discourage people from engaging at all, particularly in this era where a requirement for “credentialing” to do things that people have already been doing for years (like procedural sedation or ultrasound, for example) may paint it in a negative light.
Thanks Ben for choosing such an interesting article. I also enjoyed reading all the comments.
I can see the benefits of having a very structured program for junior trainees. Predetermined goals coupled with a multimodal approach makes achieving objectives more engaging. Their curriculum gives the impression of creating a very strong basis for beginners, onto which more “expert” skills can be built. In a sense, they are trying to train very “well-rounded” facilitators. But as Vic said, I can also see that this type of structure might feel constraining, especially if you have specific goals in mind or if you already have acquired some of these skills.
I must admit, I find the levelling-up approach slightly alluring, although it might be due to my North American perspective or my love of Nintendo! I agree with Ben and Suneth: this design can serve as a stimulant since it gives such an explicit sense of progression and skill acquisition. But I do appreciate how Vic’s local approach allows for more individualization.
Suneth also made a very valid point when he mentioned the obstacles (expertise, funding, etc) of applying such a program in smaller centres. This article made me realize the obvious, some of us are not lucky enough to have access to such impressive resources to maintain or improve our sim skills. But it also made me recognize how beneficial having a network of sim enthusiasts can be. I appreciate the importance and effort the authors put into creating a sense of community (through networking and mentorship) in their program. I think we can apply this approach to our own faculty development endeavours no matter how big or small.
Lastly, although I understand the premise and need for faculty development, I also find the requirement for certification would deter many of my colleagues, who “donate” their time, from getting involved in the first place. And as Nemat said, it would be interesting to explore the reasons why they only have 19% uptake.
Thanks again Ben and the simulcast community for sharing your thoughts and knowledge!
Thanks for sharing your thoughts Shaghi, where are you based?
It seems that there’s enough nerdiness in our little sim educator cohort here that we could motivate an entire faculty development program simply by allocating experience points and extra lives :p
I believe creating a culture in which “facilitators” have common goals, share ideas, and invite constant feedback from colleagues is more important than rigorous commitment to an enforced curriculum.
Thanks Bruce! I feel like this struck a nerve for you, would you care to elaborate?
Excellent choice again Ben! This is really tough and I think the authors really have had a good go at both doing this sensibly and being realistic about some of the limitations. I particularly like the way they have made deliberate efforts to include networking amongst simulation facilitators across their institution and attendance at a national conference, both of which will help to minimise the silo effect. I feel really vulnerable to this silo effect in the context in which I work as a lot of us have been trained with the same model and the community is so small that it is in danger of acting like an echo chamber for our own ideas. Forums like national/international conferences and this journal club are an invaluable protective factor against that silo effect and therefore an important part of developing as a sim educator in our context.
There is also some acknowledgement of the different roles required in running a simulation-based education event. Appreciating that all the sim I do is within quite a narrow spectrum, I would agree that the roles of sim technology expert, debriefing expert and clinical content expert all need to be present to run a good-quality sim event, but they can be played by 1, 2 or 3 people depending on the skill sets available. Therefore, someone can be very capable of acting as a facilitator in the company of particular colleagues (with complementary skill sets) but not in the company of others.
I would disagree with the authors that debriefing is a complicated skill. I think the socio-cultural dimension really makes it a complex skill and this creates a problem in that it is really hard to define the “correct” way to perform any complex skill. An expert debriefer may choose to disregard the model against which they are being measured in order to optimise the encounter and this positive variance may count against them in any attempt to measure the quality of a debrief.
As with many things (clinical medicine included) it is probably an attainable goal to define standards at a fairly basic level but as facilitators progress to more advanced levels our knowledge as a professional group of what is the “right” or “wrong” thing to do in a particular circumstance gets less and less well defined. In this context I think the best we can do sometimes is to promote reflective practice in a broad range of domains and consider engagement with that reflective practice as evidence of commitment to a journey towards excellence. Otherwise we get sucked back into using behaviourism to re-enforce sometimes fairly arbitrary “work as prescribed” rather than supporting the real improvement of “work as done”.
I do find setting standards for qualification to act as faculty on our courses difficult, as there really isn’t a universal marker of competence in our context. There are a couple of well-known courses that we tend to take as a marker of competence, but attendance at any of those has to be backed by some experience to be useful, I suspect. They also carry real barriers to accessibility in terms of cost and time, and there are many excellent facilitators around who have not attended any of them. NHET-Sim has been an excellent thing for Australia as it is just so much more accessible than many other courses. Peer feedback is invaluable, but it has to be psychologically safe, authentic, and both provided and received respectfully, which is also harder than it should be to achieve.
I do think this paper has value as a starting point for a conversation about how we do this better in our own context. Thanks Ben
Hi, coming in on this late! Thanks Ben, and I agree with many of the above comments on what a useful article this is to read. I think it’s great that the sim community can share its processes in this way. I also echo the comments about the 19% (or the other 81%), and I wonder if there is any particular influence of professional background. The one thing I found puzzling was that the Simulation Apprentice levels 1 & 2 have no experiential learning component, except perhaps what might be in the workshop (is it just me, or is that a bit ironic?). Perhaps those 8 Sim Expert 3s are very busy certifying Sim Expert 1s and 2s? Great article and conversation – thanks Ben and others.
Thanks for joining us, Sarah. My impression is that the apprentice levels are just that: entry level, with a low barrier to entry. It will be interesting to hear from the authors soon in our expert commentary. Hope to hear from you again!