
Real-Time, Automated Detection of Ventilator-Associated Events: Avoiding Missed Detections, Misclassifications, and False Detections Due to Human Error.

OBJECTIVE. To validate a system for autonomous, real-time detection of ventilator-associated events (VAEs).

DESIGN. Retrospective review of ventilated patients using a secure informatics platform to identify VAEs (automated surveillance), compared with surveillance by infection control (IC) staff (manual surveillance), in development and validation cohorts.

SETTING. The Massachusetts General Hospital, a tertiary-care academic health center, during January-March 2015 (development cohort) and January-March 2016 (validation cohort).

PATIENTS. Ventilated patients in 4 intensive care units.

METHODS. The automated process included (1) analysis of physiologic data to detect increases in positive end-expiratory pressure (PEEP) and fraction of inspired oxygen (FiO2); (2) querying the electronic health record (EHR) for leukopenia or leukocytosis and antibiotic initiation data; and (3) retrieval and interpretation of microbiology reports. Each cohort was evaluated as follows: (1) manual surveillance by IC staff with independent chart review; (2) automated surveillance detection of ventilator-associated condition (VAC), infection-related ventilator-associated complication (IVAC), and possible ventilator-associated pneumonia (PVAP); and (3) adjudication of manual-automated surveillance discordance by senior IC staff. Outcomes included sensitivity, specificity, positive predictive value (PPV), and manual surveillance detection errors. Errors detected during the development cohort led to algorithm updates that were applied to the validation cohort.

RESULTS. The development cohort included 1,325 admissions, 479 ventilated patients, 2,539 ventilator days, and 47 VAEs; the validation cohort included 1,234 admissions, 431 ventilated patients, 2,604 ventilator days, and 56 VAEs. Manual surveillance achieved 40% sensitivity, 98% specificity, and 70% PPV in the development cohort, and 71% sensitivity, 98% specificity, and 87% PPV in the validation cohort. Automated surveillance achieved 100% sensitivity, specificity, and PPV in the development cohort, and 85% sensitivity, 99% specificity, and 100% PPV in the validation cohort. Manual surveillance detection errors included missed detections, misclassifications, and false detections.

CONCLUSIONS. Manual surveillance is vulnerable to human error; automated surveillance is more accurate and more efficient for VAE surveillance.

Infect Control Hosp Epidemiol 2018;826-833.
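The first step of the automated process, detecting sustained increases in daily minimum PEEP or FiO2 from physiologic data, can be sketched as below. This is a minimal illustration of the general VAC screening criterion (a period of stability followed by a sustained rise of at least 3 cm H2O in daily minimum PEEP or at least 0.20 in daily minimum FiO2), not the authors' actual algorithm; the function name, the two-day windows, and the use of the prior two days' minimum as the baseline are simplifying assumptions.

```python
def detect_vac(daily_min_peep, daily_min_fio2):
    """Return the 1-based ventilator day on which a VAC-like worsening
    begins, or None if no qualifying worsening is found.

    daily_min_peep  : list of daily minimum PEEP values (cm H2O), one per day
    daily_min_fio2  : list of daily minimum FiO2 values (fraction), one per day

    Simplified rule (an assumption, not the paper's exact algorithm):
    after >= 2 baseline days, a worsening period of >= 2 consecutive days
    with daily-min PEEP >= baseline + 3 cm H2O, or daily-min FiO2
    >= baseline + 0.20, triggers a detection.
    """
    n = len(daily_min_peep)
    # Need at least 2 baseline days before and 2 worsening days from `start`.
    for start in range(2, n - 1):
        baseline_peep = min(daily_min_peep[start - 2:start])
        baseline_fio2 = min(daily_min_fio2[start - 2:start])
        peep_worse = all(p >= baseline_peep + 3
                         for p in daily_min_peep[start:start + 2])
        fio2_worse = all(f >= baseline_fio2 + 0.20
                         for f in daily_min_fio2[start:start + 2])
        if peep_worse or fio2_worse:
            return start + 1  # convert 0-based index to 1-based day
    return None
```

In the full pipeline described above, a positive result from this screen would then be combined with EHR queries (leukocyte counts, antibiotic starts) and microbiology results to escalate a VAC to IVAC or PVAP.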
