A collaborative computer aided diagnosis (C-CAD) system with eye-tracking, sparse attentional model, and deep learning.

Computer aided diagnosis (CAD) tools help radiologists reduce diagnostic errors such as missed tumors and misdiagnoses. Vision researchers have been analyzing the behavior of radiologists during screening to understand how and why they miss tumors or misdiagnose. In this regard, eye-trackers have been instrumental in understanding radiologists' visual search processes. However, most relevant studies in this area are not compatible with realistic radiology reading rooms. In this study, we aim to develop a paradigm-shifting CAD system, called collaborative CAD (C-CAD), that unifies CAD and eye-tracking systems in realistic radiology reading room settings. We first developed an eye-tracking interface that provides radiologists with a real radiology reading room experience. Second, we propose a novel algorithm that unifies eye-tracking data and a CAD system. Specifically, we present a new graph-based clustering and sparsification algorithm that transforms eye-tracking (gaze) data into a graph model so that gaze patterns can be interpreted quantitatively and qualitatively. The proposed C-CAD collaborates with radiologists via eye-tracking technology and helps them improve their diagnostic decisions. The C-CAD leverages radiologists' search efficiency by processing their gaze patterns. Furthermore, the C-CAD incorporates a deep learning algorithm in a newly designed multi-task learning platform to segment and diagnose suspicious areas simultaneously. The proposed C-CAD system has been tested in a lung cancer screening experiment with multiple radiologists reading low-dose chest CTs. Promising results support the efficiency, accuracy, and applicability of the proposed C-CAD system in a real radiology reading room setting. We have also shown that our framework generalizes to more complex applications such as prostate cancer screening with multi-parametric magnetic resonance imaging (mp-MRI).
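To make the gaze-to-graph idea above concrete, the sketch below shows one plausible way to turn recorded fixation points into a sparse attention graph. The abstract does not specify the exact construction, so the node/edge definitions, the k-nearest-neighbour sparsification rule, and the modularity-based clustering here are illustrative assumptions rather than the authors' algorithm.

```python
# Minimal sketch: gaze fixations -> sparse graph -> attention clusters.
# Assumptions (not from the paper): nodes are 3D fixation points, edges
# connect k nearest neighbours with inverse-distance weights, and clusters
# are found with greedy modularity maximization.
import numpy as np
import networkx as nx


def gaze_to_sparse_graph(fixations, k=5):
    """Build a graph whose nodes are gaze fixations (x, y, z in image
    coordinates) and whose edges link each fixation to its k nearest
    neighbours, weighted by inverse spatial distance."""
    fixations = np.asarray(fixations, dtype=float)
    n = len(fixations)
    # Pairwise Euclidean distances between fixations.
    dists = np.linalg.norm(fixations[:, None, :] - fixations[None, :, :], axis=-1)

    graph = nx.Graph()
    graph.add_nodes_from(range(n))
    for i in range(n):
        # Keep only the k closest neighbours of each fixation (sparsification).
        neighbours = np.argsort(dists[i])[1:k + 1]
        for j in neighbours:
            graph.add_edge(i, int(j), weight=1.0 / (dists[i, j] + 1e-6))
    return graph


if __name__ == "__main__":
    # Synthetic fixation points standing in for gaze data over a CT volume.
    rng = np.random.default_rng(0)
    points = rng.uniform(0, 512, size=(200, 3))
    g = gaze_to_sparse_graph(points, k=5)
    # Tightly connected groups of fixations approximate attended regions.
    clusters = nx.algorithms.community.greedy_modularity_communities(
        g, weight="weight")
    print(f"{g.number_of_nodes()} fixations, {g.number_of_edges()} edges, "
          f"{len(clusters)} attention clusters")
```

In a pipeline like the one the abstract describes, such attention clusters would mark candidate regions to pass on to the multi-task deep learning model for joint segmentation and diagnosis; the sparsification criterion (spatial kNN here) could equally be based on temporal ordering of fixations without changing the overall flow.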
