JOURNAL ARTICLE
RESEARCH SUPPORT, NON-U.S. GOV'T
Evaluation of a Measurement System to Assess ICU Team Performance.
Critical Care Medicine, December 2018
OBJECTIVE: Measuring teamwork is essential in critical care, but limited observational measurement systems exist for this environment. The objective of this study was to evaluate the reliability and validity of a behavioral marker system for measuring teamwork in ICUs.
DESIGN: Instances of teamwork were observed by two raters for three tasks: multidisciplinary rounds, nurse-to-nurse handoffs, and retrospective videos of medical students and instructors performing simulated codes. Intraclass correlation coefficients were calculated to assess interrater reliability. Generalizability theory was applied to estimate systematic sources of variance for the three observed team tasks that were associated with instances of teamwork, rater effects, competency effects, and task effects.
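The abstract does not state which ICC form the raters' scores were analyzed with; as a hedged illustration only, assuming a two-way random-effects, single-rater model (ICC(2,1) in the Shrout–Fleiss taxonomy) and hypothetical rating data, the interrater-reliability computation can be sketched as:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random-effects, absolute agreement, single rater.

    ratings: (n_subjects, k_raters) array of scores; each rater scores
    every subject. Illustrative only; the study's exact ICC form and
    data layout are not given in the abstract.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    # Sums of squares from a two-way ANOVA without replication
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()   # subjects
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()   # raters
    ss_total = ((ratings - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    # Shrout & Fleiss ICC(2,1)
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical example: two raters scoring four teamwork instances
scores = [[1, 2], [2, 2], [3, 4], [4, 3]]
print(round(icc_2_1(scores), 2))
```

With perfectly agreeing raters the function returns 1.0; values in the 0.64–0.81 range reported below would indicate good-to-excellent agreement under common benchmarks. A two-way model is assumed here because both raters scored all instances, but other ICC variants would change the formula.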
SETTING: A 15-bed surgical ICU at a large academic hospital.
SUBJECTS: One hundred thirty-eight instances of teamwork were observed. Specifically, we observed 88 multidisciplinary rounds, 25 nurse-to-nurse handoffs, and 25 simulated code exercises.
INTERVENTIONS: No intervention was conducted for this study.
MEASUREMENTS AND MAIN RESULTS: Rater reliability for each overall task ranged from good to excellent (intraclass correlation coefficient, 0.64-0.81), although there were seven cases where reliability was fair and one case where it was poor for specific competencies. Findings from generalizability studies showed that the marker system dependably distinguished among teamwork competencies, supporting construct validity.
CONCLUSIONS: Teamwork in critical care is complex, which complicates the judgment of behaviors. The marker system exhibited great potential for differentiating competencies, but findings also revealed that more context-specific guidance may be needed to improve rater reliability.