Disentangling Visual Exploration Differences in Cognitive Impairment.
IEEE Transactions on Biomedical Engineering, 2023 November 10
OBJECTIVE: Individuals with cognitive impairment (CI) exhibit different oculomotor functions and viewing behaviors. In this work we aimed to quantify how these functions differ with CI severity and to assess how general CI and specific cognitive functions relate to visual exploration behaviors.
METHODS: A validated passive viewing memory test with eye tracking was administered to 348 healthy controls and CI individuals. Spatiotemporal properties of the scanpath, the semantic category of the viewed regions, and other composite features were extracted from the estimated eye-gaze locations on the corresponding pictures displayed during the test. These features were then used to characterize viewing patterns, classify cognitive impairment, and estimate scores on various neuropsychological tests using machine learning.
RESULTS: Statistically significant differences in spatial, spatiotemporal, and semantic features were found between healthy controls and individuals with CI. The CI group spent more time gazing at the center of the image, looked at more regions of interest (ROIs), transitioned less often between ROIs yet in a more unpredictable manner, and exhibited different semantic preferences. A combination of these features achieved an area under the receiver operating characteristic curve of 0.78 in differentiating CI individuals from controls. Statistically significant correlations were identified between actual and estimated CI scores and other neuropsychological tests.
CONCLUSION: Evaluating visual exploration behaviors provided quantitative and systematic evidence of differences in CI individuals, leading to an improved approach for passive cognitive impairment screening.
SIGNIFICANCE: The proposed passive, accessible, and scalable approach could help with earlier detection and a better understanding of cognitive impairment.
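Two of the quantities the abstract relies on, the "unpredictability" of ROI-to-ROI transitions and the area under the receiver operating characteristic curve, can be illustrated with a minimal sketch. The abstract does not specify the paper's implementation; the Shannon transition entropy, the rank-based AUC estimator, and all data below are assumptions chosen for illustration only.

```python
import math
from collections import Counter

def transition_entropy(roi_sequence):
    """Shannon entropy (bits) of first-order ROI-to-ROI transitions.

    Higher values mean the next ROI is harder to predict from the
    current one, i.e. a less predictable scanpath.
    """
    # Keep only actual transitions between distinct ROIs.
    transitions = [(a, b) for a, b in zip(roi_sequence, roi_sequence[1:]) if a != b]
    counts = Counter(transitions)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic.

    Fraction of (positive, negative) pairs ranked correctly by the
    classifier score; ties count as half.
    """
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical scanpaths: one ROI label per fixation.
ci_path = ["A", "B", "A", "C", "A", "B"]   # frequent back-and-forth switching
hc_path = ["A", "A", "B", "B", "C", "C"]   # orderly sweep through ROIs

print(transition_entropy(ci_path))  # higher entropy: less predictable
print(transition_entropy(hc_path))  # lower entropy: more predictable
print(auc([0.9, 0.8, 0.7], [0.4, 0.6, 0.2]))  # toy classifier scores
```

With these toy scanpaths, the back-and-forth CI-like path yields a higher transition entropy than the orderly sweep, mirroring the "less often yet more unpredictable" pattern reported above; the AUC helper simply counts correctly ordered score pairs, the same statistic summarized by the 0.78 figure.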