Validity Evidence for a Novel, Comprehensive Bag-Mask Ventilation Assessment Tool.

J Pediatr. 2022 Jun;245:165-171.e13. doi: 10.1016/j.jpeds.2022.02.017. Epub 2022 Feb 16.

Whalen AM(1), Merves MH(2), Kharayat P(3), Barry JS(4), Glass KM(5), Berg RA(6), Sawyer T(7), Nadkarni V(6), Boyer DL(6), Nishisaki A(6).

Author information: (1)Division of Pediatric Critical Care Medicine, Department of Pediatrics, Medical University of South Carolina, Charleston, SC. Electronic address: whalen@musc.edu. (2)Division of Neonatology, Department of Pediatrics, University of Arkansas for Medical Sciences and Arkansas Children’s Hospital, Little Rock, AR. (3)Department of Pediatrics, Albert Einstein Medical Center, Philadelphia, PA. (4)Section of Neonatology, Department of Pediatrics, University of Colorado School of Medicine, Aurora, CO. (5)Division of Neonatal-Perinatal Medicine, Department of Pediatrics, Penn State College of Medicine, Milton S. Hershey Medical Center, Hershey, PA. (6)Division of Critical Care Medicine, Children’s Hospital of Philadelphia, Philadelphia, PA; Department of Anesthesiology & Critical Care, Perelman School of Medicine at the University of Pennsylvania, Philadelphia, PA. (7)Division of Neonatology, Department of Pediatrics, University of Washington School of Medicine, Seattle Children’s Hospital, Seattle, WA.

OBJECTIVE: To develop a comprehensive competency assessment tool for pediatric bag-mask ventilation (pBMV) and demonstrate multidimensional validity evidence for this tool.

STUDY DESIGN: A novel pBMV assessment tool was developed consisting of 3 components: a 22-item checklist (trichotomous responses), a global rating scale (GRS, 5-point), and an entrustment assessment (4-point). Participants’ performance in a realistic simulation scenario was video-recorded and assessed by blinded raters. Multidimensional validity evidence for procedural assessment, including evidence for content, response process, internal structure, and relations to other variables, was assessed. The scores of each scale were compared with training level, and item-based checklist scores were correlated with GRS and entrustment scores.

RESULTS: Fifty-eight participants (9 medical students, 10 pediatric residents, 18 critical care/neonatology fellows, 21 critical care/neonatology attendings) were evaluated. The pBMV tool was supported by high internal consistency (Cronbach α = 0.867). Inter-rater reliability for the item-based checklist component was acceptable (r = 0.65, P < .0001). The item-based checklist scores differentiated medical students from other providers (P < .0001) but did not differentiate among the other training levels. GRS and entrustment scores significantly differentiated between training levels (P < .001). Correlation between the item-based checklist and GRS scores was r = 0.489 (P = .0001), and between the item-based checklist and entrustment scores was r = 0.52 (P < .001); this moderate correlation suggests that each component measures pBMV skills differently. The GRS and entrustment scores demonstrated moderate inter-rater reliability (0.42 and 0.46, respectively).
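As context for the reliability statistics reported above, the following minimal Python sketch (hypothetical data and score ranges; not the study's analysis code) illustrates how Cronbach's alpha for a 22-item checklist and a Pearson inter-rater correlation of checklist totals can be computed:

# Minimal sketch with hypothetical data; not the study's analysis code.
import numpy as np
from scipy.stats import pearsonr

def cronbach_alpha(items):
    """items: participants x items matrix of item scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

rng = np.random.default_rng(0)
scores = rng.integers(0, 3, size=(58, 22))        # 58 participants, 22 trichotomous items scored 0-2
print("Cronbach alpha:", round(cronbach_alpha(scores), 3))

rater_a = scores.sum(axis=1)                      # rater A checklist totals
rater_b = rater_a + rng.integers(-2, 3, size=58)  # rater B totals with simulated disagreement
r, p = pearsonr(rater_a, rater_b)
print(f"Inter-rater correlation r = {r:.2f}, p = {p:.4f}")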

CONCLUSIONS: We established evidence of multidimensional validity for a novel entrustment-based pBMV competence assessment tool, incorporating global and entrustment-based assessments. This comprehensive tool can provide learner feedback and aid in entrustment decisions as learners progress through training.

Copyright © 2022 Elsevier Inc. All rights reserved.

DOI: 10.1016/j.jpeds.2022.02.017 PMID: 35181294 [Indexed for MEDLINE]

Comparison of a dichotomous versus trichotomous checklist for neonatal intubation.

BMC Med Educ. 2022 Aug 26;22(1):645. doi: 10.1186/s12909-022-03700-4.

Johnston L(1), Sawyer T(2), Nishisaki A(3), Whitfill T(4), Ades A(5), French H(5), Glass K(6), Dadiz R(7), Bruno C(4), Levit O(4), Auerbach M(4).

Author information: (1)Department of Pediatrics, Yale University School of Medicine, 333 Cedar Street, New Haven, CT, 06510, USA. lindsay.johnston@yale.edu. (2)Department of Pediatrics, University of Washington School of Medicine, Seattle, USA. (3)Department of Anesthesiology and Critical Care Medicine, University of Pennsylvania Perelman School of Medicine, Philadelphia, USA. (4)Department of Pediatrics, Yale University School of Medicine, 333 Cedar Street, New Haven, CT, 06510, USA. (5)Department of Pediatrics, University of Pennsylvania Perelman School of Medicine, Philadelphia, USA. (6)Department of Pediatrics, Penn State College of Medicine, Hershey, USA. (7)School of Medicine and Dentistry, Department of Pediatrics, University of Rochester, Rochester, USA.

BACKGROUND: To compare validity evidence for dichotomous and trichotomous versions of a neonatal intubation (NI) procedural skills checklist.

METHODS: NI skills checklists were developed utilizing an existing framework. Experts were trained on scoring using dichotomous and trichotomous checklists, and rated recordings of 23 providers performing simulated NI. Videolaryngoscope recordings of glottic exposure were evaluated using Cormack-Lehane (CL) and Percent of Glottic Opening scales. Internal consistency and reliability of both checklists were analyzed, and correlations between checklist scores, airway visualization, entrustable professional activities (EPA), and global skills assessment (GSA) were calculated.

RESULTS: During rater training, raters assigned significantly higher scores to the better provider performances in the standardized videos (both p < 0.001). When used to evaluate study participants’ simulated NI attempts, both the dichotomous and trichotomous checklist scores demonstrated very good internal consistency (Cronbach’s alpha 0.868 and 0.840, respectively). Inter-rater reliability was higher for the dichotomous than the trichotomous checklist (Fleiss kappa 0.642 and 0.576, respectively; p < 0.001). Sum checklist scores differed significantly among providers in different disciplines (p < 0.001 for both dichotomous and trichotomous). Sum dichotomous checklist scores correlated more strongly than trichotomous scores with GSA and CL grades, whereas sum dichotomous and trichotomous checklist scores correlated similarly well with EPA.
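For reference, Fleiss kappa values such as those reported above are computed from a subjects-by-raters matrix of categorical ratings; a minimal Python sketch with hypothetical dichotomous ratings (not the study's data or analysis code):

# Minimal sketch with hypothetical data; not the study's analysis code.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

rng = np.random.default_rng(1)
# Rows: checklist items rated across recorded attempts; columns: raters;
# values: 0 = not done / incorrect, 1 = done correctly (dichotomous scoring).
ratings = rng.integers(0, 2, size=(100, 4))

table, _ = aggregate_raters(ratings)   # per-subject counts of each rating category
print(f"Fleiss kappa: {fleiss_kappa(table, method='fleiss'):.3f}")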

CONCLUSIONS: Neither the dichotomous nor the trichotomous checklist was superior in discriminating provider NI skill when compared with GSA, EPA, or airway visualization assessment. Sum scores from dichotomous checklists may provide sufficient information to assess procedural competence, whereas trichotomous checklists may permit more granular feedback to learners and educators. The checklist selected may vary with assessment needs.

© 2022. The Author(s).

DOI: 10.1186/s12909-022-03700-4 PMCID: PMC9419414 PMID: 36028871 [Indexed for MEDLINE]

Conflict of interest statement: The authors have no conflicts of interest to disclose.

Flight Teams’ Learning Needs Assessment on Ultrasound: A Mixed Methods Approach.

Air Med J. 2022 Mar-Apr;41(2):237-242. doi: 10.1016/j.amj.2021.11.001. Epub 2021 Nov 30.

Nordell RH 4th(1), Van Scoy LJ(2), Witt PD(3), Flamm A(4).

Author information: (1)The Pennsylvania State University College of Medicine, Hershey, PA. (2)Department of Medicine, Penn State College of Medicine, Hershey, PA; Department of Humanities, Penn State College of Medicine, Hershey, PA; Department of Public Health Sciences, Penn State College of Medicine, Hershey, PA. (3)Department of Medicine, Penn State College of Medicine, Hershey, PA. (4)Department of Public Health Sciences, Penn State College of Medicine, Hershey, PA; Department of Emergency Medicine, Penn State Health Milton S. Hershey Medical Center, Hershey, PA. Electronic address: aflamm1@pennstatehealth.psu.edu.

OBJECTIVE: The goal of this study was to understand flight clinicians’ learning needs and attitudes regarding a prehospital ultrasound curriculum.

METHODS: In this convergent mixed methods study, 21 prehospital clinicians from a single emergency medical service agency completed a questionnaire, and 20 attended a 1-hour focus group exploring attitudes toward learning ultrasound.

RESULTS: Five themes emerged from the focus group transcripts and were supported by the quantitative data: 1) hands-on training in ultrasound is a highly preferred modality; 2) emergency medical service providers desire learning integrated into shifts and real-life practice; 3) prehospital providers express concerns about training and maintenance of competency; 4) participants recognize the need for quality control both during and after the training phase; and 5) participants were enthusiastic about how ultrasound could help guide clinical decision making and potentially improve patient outcomes.

CONCLUSION: Participants in this evidence-based assessment of prehospital ultrasound needs and barriers were experienced flight clinicians who would use prehospital ultrasound if it were made available. These adult learners indicated that their preferred learning methods were standardized patients, simulators, and hands-on practice in the field with physicians, and they preferred follow-up courses and simulators to maintain competency.

Copyright © 2021 Air Medical Journal Associates. Published by Elsevier Inc. All rights reserved.

DOI: 10.1016/j.amj.2021.11.001 PMID: 35307150 [Indexed for MEDLINE]

A novel algorithm-driven hybrid simulation learning method to improve acquisition of endotracheal intubation skills: a randomized controlled study.

BMC Anesthesiol. 2022 Feb 8;22(1):42. doi: 10.1186/s12871-021-01557-6.

Mankute A(1), Juozapaviciene L(2), Stucinskas J(3), Dambrauskas Z(4), Dobozinskas P(5), Sinz E(6)(7), Rodgers DL(7), Giedraitis M(3), Vaitkaitis D(5).

Author information: (1)Department of Emergency Medicine, Lithuanian University of Health Sciences, Kaunas, Lithuania. Aida.Mankute@lsmuni.lt. (2)Department of Anaesthesiology, Lithuanian University of Health Sciences, Kaunas, Lithuania. (3)Department of Orthopaedics Traumatology, Lithuanian University of Health Sciences, Kaunas, Lithuania. (4)Department of Surgery, Lithuanian University of Health Sciences, Kaunas, Lithuania. (5)Department of Disaster Medicine, Lithuanian University of Health Sciences, Kaunas, Lithuania. (6)Department of Anesthesiology and Perioperative Medicine, Penn State Health Milton S. Hershey Medical Center, Hershey, USA. (7)Medical Simulation Center, Penn State Health Milton S. Hershey Medical Center, Hershey, PA, USA.

BACKGROUND: Simulation-based training is a clinical skill learning method that can replicate real-life situations in an interactive manner. In our study, we compared a novel hybrid learning method with conventional simulation learning in the teaching of endotracheal intubation.

METHODS: One hundred medical students and residents were randomly divided into two groups and taught endotracheal intubation. The first group (control group) studied in the conventional way, via lectures and classic simulation-based training sessions. The second group (experimental group) used the hybrid learning method, in which the teaching process consisted of distance learning and small-group peer-to-peer simulation training sessions with remote supervision by the instructors. After the teaching process, endotracheal intubation (ETI) procedures were performed on real patients under the supervision of an anesthesiologist in an operating theater. Each step of the procedure was evaluated for both groups with a standardized assessment form (checklist).

RESULTS: Thirty-four subjects constituted the control group and 43 the experimental group. The hybrid group showed significantly better ETI performance in the operating theater than the control group (88% vs 52%). Further, all hybrid group subjects (100%) followed the correct sequence of actions, whereas only 32% of the control group did so.

CONCLUSIONS: We conclude that our novel algorithm-driven hybrid simulation learning method improves acquisition of endotracheal intubation skills, with a high degree of acceptability and satisfaction by the learners, as compared with classic simulation-based training.

© 2022. The Author(s).

DOI: 10.1186/s12871-021-01557-6 PMCID: PMC8822842 PMID: 35135495 [Indexed for MEDLINE]

Conflict of interest statement: We declare that the author Aida Mankute works at the Crisis Research Center. Authors Dinas Vaitkaitis and Paulius Dobozinskas are the founders of the Crisis Research Center and HybridLab.

Impact of the COVID-19 pandemic on American College of Surgeons-Accredited Education Institutes & American Society of Anesthesiologists-Simulation Education Network: Opportunities for interdisciplinary collaboration.

Surgery. 2022 Jul 6:S0039-6060(22)00438-X. doi: 10.1016/j.surg.2022.06.012. Online ahead of print.

Wisbach GG(1), Johnson KA(2), Sormalis C(2), Johnson A(2), Ham J(3), Blair PG(2), Houg S(3), Burden AR(4), Sinz EH(5), Fortner SA(6), Steadman RH(7), Sachdeva AK(2), Rooney DM(8).

Author information: (1)General Surgery Department, Navy Medicine Readiness & Training Command, San Diego, CA. Electronic address: gwisbach@gmail.com. (2)American College of Surgeons, Division of Education, Chicago, IL. (3)American Society of Anesthesiologists, Department of Education, Schaumburg, IL. (4)Department of Anesthesiology, Cooper Medical School of Rowan University, Camden, NJ. (5)Department of Anesthesiology, Pennsylvania State University, Hershey, PA. (6)Department of Anesthesiology, University of New Mexico, Albuquerque, NM. (7)Department of Anesthesiology and Critical Care, Houston Methodist Hospital, Houston, TX. (8)Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI.

BACKGROUND: The COVID-19 pandemic presented challenges for simulation programs, including American College of Surgeons Accredited Education Institutes and the American Society of Anesthesiologists Simulation Education Network. American College of Surgeons Accredited Education Institutes and American Society of Anesthesiologists Simulation Education Network leadership were surveyed to identify opportunities to enhance patient safety through simulation.

METHODS: Between January and June 2021, surveys were distributed to 100 American College of Surgeons Accredited Education Institutes and 54 American Society of Anesthesiologists Simulation Education Network centers. The surveys covered 3 targeted domains: (I) Changing practice; (II) Contributions and recognition; and (III) Moving ahead. Responses were combined and percent frequencies reported.

RESULTS: Ninety-six respondents, representing 51 (51%) American College of Surgeons Accredited Education Institutes, 17 (31.5%) American Society of Anesthesiologists Simulation Education Network centers, and 28 dually accredited centers, completed the survey. Changing practice: although 20.3% of centers stayed fully operational at the onset of COVID-19, 82% of all centers closed: 32% were closed less than 3 months, 28% for 3 to 6 months, 8% for 7 to 9 months, and 32% remained closed as of June 6, 2021. The most affected activities were large-group instruction and team training, and 69% of programs converted in-person programs to virtual formats. Contributions and recognition: the top reported innovative contributions included policies (80%), curricula (80%), and scholarly work (74%). Moving ahead: the respondents’ top concerns were returning to high-quality training that best addresses learners’ deficiencies and re-engagement of redirected training programs. When asked how the American College of Surgeons/American Society of Anesthesiologists programs could best assist their simulation center goals, the top responses were "facilitate collaboration" and "publish best practices from this work."

CONCLUSION: The pandemic presented multiple challenges and opportunities for simulation centers. Opportunities included collaboration between American College of Surgeons Accredited Education Institutes and the American Society of Anesthesiologists Simulation Education Network to identify best practices and the resources needed to enhance patient safety through simulation.

Published by Elsevier Inc.

DOI: 10.1016/j.surg.2022.06.012 PMCID: PMC9257111 PMID: 36041927

Going the (social) distance: Comparing the effectiveness of online versus in-person Internal Jugular Central Venous Catheterization procedural training.

Am J Surg. 2022 Sep;224(3):903-907. doi: 10.1016/j.amjsurg.2021.12.006. Epub 2021 Dec 7.

Gonzalez-Vargas JM(1), Tzamaras HM(1), Martinez J(2), Brown DC(3), Moore JZ(3), Han DC(4), Sinz E(5), Ng P(6), Yang MX(6), Miller SR(7).

Author information: (1)Department of Industrial and Manufacturing Engineering, Penn State, University Park, PA, 16802, USA. (2)Department of Surgery, Penn State Health Milton S. Hershey Medical Center, Hershey, PA, 17033, USA. (3)Department of Mechanical and Nuclear Engineering, Penn State, University Park, PA, 16802, USA. (4)Department of Surgery, Penn State Health Milton S. Hershey Medical Center, Hershey, PA, 17033, USA; Penn State Heart and Vascular Institute, Penn State Health Milton S. Hershey Medical Center, Hershey, PA, 17033, USA. (5)Department of Anesthesiology and Perioperative Medicine, Penn State Health Milton S. Hershey Medical Center, Hershey, PA, 17033, USA; Department of Neurosurgery, Penn State Health Milton S. Hershey Medical Center, Hershey, PA, 17033, USA. (6)Department of Internal Medicine, Cedars Sinai Medical Center, Los Angeles, CA, 90048, USA. (7)Department of Industrial and Manufacturing Engineering, Penn State, University Park, PA, 16802, USA; School of Engineering Design, Technology, and Professional Programs, Penn State, University Park, PA, 16802, USA. Electronic address: scarlettmiller@psu.edu.

BACKGROUND: This study compares surgical residents’ knowledge acquisition of ultrasound-guided Internal Jugular Central Venous Catheterization (US-IJCVC) between in-person and online procedural training cohorts before receiving independent in-person Dynamic Haptic Robotic Simulation training.

METHODS: Three surgical residency procedural training cohorts, two in-person (N = 26) and one online (N = 14), were compared based on their performance on a 24-item US-IJCVC evaluation checklist completed by an expert physician after training. Pre- and post-training US-IJCVC knowledge was also compared for the online cohort.

RESULTS: Pass rates on the US-IJCVC checklist did not differ significantly between the in-person and online cohorts (p = 0.208). There were differences in Economy of Time and Motion between the in-person and online cohorts (p < 0.005). The online cohort had significant increases in US-IJCVC knowledge from pre- to post-training (p < 0.008).
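One common way to compare pass rates between two cohorts is a Fisher exact test on pass/fail counts; the study's own statistical test is not specified here. A minimal Python sketch with hypothetical counts (not the study's data):

# Minimal sketch with hypothetical counts; not the study's data or analysis.
from scipy.stats import fisher_exact

in_person = [22, 4]   # hypothetical [pass, fail] counts, in-person cohorts (N = 26)
online = [13, 1]      # hypothetical [pass, fail] counts, online cohort (N = 14)

odds_ratio, p_value = fisher_exact([in_person, online])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")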

CONCLUSION: Online training with independent simulation practice was as effective as in-person training for US-IJCVC.

Copyright © 2021. Published by Elsevier Inc.

DOI: 10.1016/j.amjsurg.2021.12.006 PMCID: PMC9170828 PMID: 34930583 [Indexed for MEDLINE]