Read by QxMD search results for: "eye movements", "visual search"
https://www.readbyqxmd.com/read/28659850/increased-complexities-in-visual-search-behavior-in-skilled-players-for-a-self-paced-aiming-task
#1
Jingyi S Chia, Stephen F Burns, Laura A Barrett, Jia Y Chow
The badminton serve is an important shot for winning a rally in a match. It combines good technique with the ability to accurately integrate visual information from the shuttle, racket, opponent, and intended landing point. Despite its importance and repercussive nature, to date no study has looked at the visual search behaviors during badminton service in the singles discipline. Unlike anticipatory tasks (e.g., shot returns), the serve presents an opportunity to explore the role of visual search behaviors in movement control for self-paced tasks...
2017: Frontiers in Psychology
https://www.readbyqxmd.com/read/28638974/categorical-templates-are-more-useful-when-features-are-consistent-evidence-from-eye-movements-during-search-for-societally-important-vehicles
#2
Michael C Hout, Arryn Robbins, Hayward J Godwin, Gemma Fitzsimmons, Collin Scarince
Unlike in laboratory visual search tasks (wherein participants are typically presented with a pictorial representation of the item they are asked to seek out), in real-world searches the observer rarely has veridical knowledge of the visual features that define their target. During categorical search, observers look for any instance of a categorically defined target (e.g., helping a family member look for their mobile phone). In these circumstances, people may not have information about noncritical features (e...
June 21, 2017: Attention, Perception & Psychophysics
https://www.readbyqxmd.com/read/28637052/subtle-eye-movement-metrics-reveal-task-relevant-representations-prior-to-visual-search
#3
Anouk M van Loon, Katya Olmos-Solis, Christian N L Olivers
Visual search is thought to be guided by an active visual working memory (VWM) representation of the task-relevant features, referred to as the search template. In three experiments using a probe technique, we investigated which eye movement metrics reveal which search template is activated prior to the search, and distinguish it from future relevant or no longer relevant VWM content. Participants memorized a target color for a subsequent search task, while being instructed to keep central fixation. Before the search display appeared, we briefly presented two task-irrelevant colored probe stimuli to the left and right from fixation, one of which could match the current target template...
June 1, 2017: Journal of Vision
https://www.readbyqxmd.com/read/28623108/acting-seeing-and-conscious-awareness
#4
R E Passingham, H C Lau
We argue that there is a relation between the judgements that 'I did it' and 'I saw it'. Both statements are about the individual, not just the world. We show that the dorsal prefrontal cortex is activated both when human subjects judge that they are the agents of their actions and when they judge that they are confident that they have seen a masked visual stimulus. Macaque monkeys have also been taught to report whether they have or have not seen visual stimuli, and cells can be found in the dorsal prefrontal cortex that distinguish between 'seen' and 'not seen'...
June 13, 2017: Neuropsychologia
https://www.readbyqxmd.com/read/28612679/influence-of-social-presence-on-eye-movements-in-visual-search-tasks
#5
Na Liu, Ruifeng Yu
This study employed an eye-tracking technique to investigate the influence of social presence on eye movements in visual search tasks. A total of 20 male subjects performed visual search tasks in a 2 (target presence: present vs. absent) × 2 (task complexity: complex vs. simple) × 2 (social presence: alone vs. a human audience) within-subject experiment. Results indicated that the presence of an audience could evoke a social facilitation effect on response time in visual search tasks. Compared with working alone, when working with an audience the participants made fewer and shorter fixations, larger saccades, and shorter scan paths in simple search tasks, and more and longer fixations, smaller saccades, and longer scan paths in complex search tasks...
June 14, 2017: Ergonomics
https://www.readbyqxmd.com/read/28541187/enhancement-of-group-perception-via-a-collaborative-brain-computer-interface
#6
Davide Valeriani, Riccardo Poli, Caterina Cinel
OBJECTIVE: We aimed to improve group performance in a challenging visual search task via a hybrid collaborative brain-computer interface (cBCI). METHODS: Ten participants individually undertook a visual search task in which a display was presented for 250 ms and they had to decide whether a target was present or not. Local temporal correlation common spatial pattern (LTCCSP) was used to extract neural features from response- and stimulus-locked EEG epochs. The resulting feature vectors were extended by including response times and features extracted from eye movements...
June 2017: IEEE Transactions on Bio-medical Engineering
https://www.readbyqxmd.com/read/28508116/comparing-visual-search-and-eye-movements-in-bilinguals-and-monolinguals
#7
Ileana Ratiu, Michael C Hout, Stephen C Walenchok, Tamiko Azuma, Stephen D Goldinger
Recent research has suggested that bilinguals show advantages over monolinguals in visual search tasks, although these findings have been derived from global behavioral measures of accuracy and response times. In the present study we sought to explore the bilingual advantage by using more sensitive eyetracking techniques across three visual search experiments. These spatially and temporally fine-grained measures allowed us to carefully investigate any nuanced attentional differences between bilinguals and monolinguals...
May 15, 2017: Attention, Perception & Psychophysics
https://www.readbyqxmd.com/read/28424915/optimal-eye-movement-strategies-a-comparison-of-neurosurgeons-gaze-patterns-when-using-a-surgical-microscope
#8
Shahram Eivazi, Ahmad Hafez, Wolfgang Fuhl, Hoorieh Afkari, Enkelejda Kasneci, Martin Lehecka, Roman Bednarik
BACKGROUND: Previous studies have consistently demonstrated gaze behaviour differences related to expertise during various surgical procedures. In micro-neurosurgery, however, there is a lack of evidence of empirically demonstrated individual differences associated with visual attention. It is unknown exactly how neurosurgeons see a stereoscopic magnified view in the context of micro-neurosurgery and what this implies for medical training. METHOD: We report on an investigation of the eye movement patterns in micro-neurosurgery using a state-of-the-art eye tracker...
June 2017: Acta Neurochirurgica
https://www.readbyqxmd.com/read/28383964/adding-depth-to-overlapping-displays-can-improve-visual-search-performance
#9
Hayward J Godwin, Tamaryn Menneer, Simon P Liversedge, Kyle R Cave, Nick S Holliman, Nick Donnelly
Standard models of visual search have focused upon asking participants to search for a single target in displays where the objects do not overlap one another, and where the objects are presented on a single depth plane. This stands in contrast to many everyday visual searches, wherein variations in overlap and depth are the norm rather than the exception. Here, we addressed whether presenting overlapping objects on different depth planes to one another can improve search performance. Across four experiments using different stimulus types (opaque polygons, transparent polygons, opaque real-world objects, and transparent X-ray images), we found that depth was primarily beneficial when the displays were transparent, and this benefit arose in terms of an increase in response accuracy...
April 6, 2017: Journal of Experimental Psychology. Human Perception and Performance
https://www.readbyqxmd.com/read/28375688/oculomotor-capture-is-influenced-by-expected-reward-value-but-maybe-not-predictiveness
#10
Mike E Le Pelley, Daniel Pearson, Alexis Porter, Hannah Yee, David Luque
A large body of research has shown that learning about relationships between neutral stimuli and events of significance (rewards or punishments) influences the extent to which people attend to those stimuli in future. However, different accounts of this influence differ in terms of the critical variable that is proposed to determine learned changes in attention. We describe two experiments using eye-tracking with a rewarded visual search procedure to investigate whether attentional capture is influenced by the predictiveness of stimuli (i...
April 4, 2017: Quarterly Journal of Experimental Psychology: QJEP
https://www.readbyqxmd.com/read/28371467/time-limits-in-testing-an-analysis-of-eye-movements-and-visual-attention-in-spatial-problem-solving
#11
Victoria A Roach, Graham M Fraser, James H Kryklywy, Derek G V Mitchell, Timothy D Wilson
Individuals with an aptitude for interpreting spatial information (high mental rotation ability: HMRA) typically master anatomy with more ease, and more quickly, than those with low mental rotation ability (LMRA). This article explores how visual attention differs with time limits on spatial reasoning tests. Participants were sorted into two groups based on their mental rotation ability scores, and their eye movements were collected during these tests. Analysis of salience during testing revealed similarities between MRA groups in the untimed condition but significant differences between the groups in the timed condition...
March 30, 2017: Anatomical Sciences Education
https://www.readbyqxmd.com/read/28368160/dual-target-cost-in-visual-search-for-multiple-unfamiliar-faces
#12
Natalie Mestry, Tamaryn Menneer, Kyle R Cave, Hayward J Godwin, Nick Donnelly
The efficiency of visual search for one (single-target) and either of two (dual-target) unfamiliar faces was explored to understand the manifestations of capacity and guidance limitations in face search. The visual similarity of distractor faces to target faces was manipulated using morphing (Experiments 1 and 2) and multidimensional scaling (Experiment 3). A dual-target cost was found in all experiments, evidenced by slower and less accurate search in dual- than single-target conditions. The dual-target cost was unequal across the targets, with performance being maintained on one target and reduced on the other, which we label "preferred" and "non-preferred" respectively...
April 3, 2017: Journal of Experimental Psychology. Human Perception and Performance
https://www.readbyqxmd.com/read/28358960/searching-for-objects-in-everyday-scenes-measuring-performance-in-people-with-dry-age-related-macular-degeneration
#13
COMPARATIVE STUDY
Deanna J Taylor, Nicholas D Smith, David P Crabb
Purpose: Treatment success in clinical trials for AMD would ideally be aligned to measurable performance in visual tasks rather than imperceptible changes on clinical charts. We test the hypothesis that patients with dry AMD perform worse than visually healthy peers on computer-based surrogates of "real-world" visual search tasks. Methods: A prospective case-control study was conducted in which patients with dry AMD performed a computer-based "real-world" visual search task...
March 1, 2017: Investigative Ophthalmology & Visual Science
https://www.readbyqxmd.com/read/28355625/predicting-rhesus-monkey-eye-movements-during-natural-image-search
#14
Mark A Segraves, Emory Kuo, Sara Caddigan, Emily A Berthiaume, Konrad P Kording
There are three prominent factors that can predict human visual-search behavior in natural scenes: the distinctiveness of a location (salience), similarity to the target (relevance), and features of the environment that predict where the object might be (context). We do not currently know how well these factors predict macaque visual search, which matters because the macaque is arguably the most popular model for asking how the brain controls eye movements. Here we trained monkeys to perform the pedestrian search task previously used for human subjects...
March 1, 2017: Journal of Vision
https://www.readbyqxmd.com/read/28293433/eye-tracking-as-a-tool-to-evaluate-functional-ability-in-everyday-tasks-in-glaucoma
#15
REVIEW
Enkelejda Kasneci, Alex A Black, Joanne M Wood
To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis...
2017: Journal of Ophthalmology
https://www.readbyqxmd.com/read/28287759/beyond-scene-gist-objects-guide-search-more-than-scene-background
#16
Kathryn Koehler, Miguel P Eckstein
Although the facilitation of visual search by contextual information is well established, there is little understanding of the independent contributions of different types of contextual cues in scenes. Here we manipulated 3 types of contextual information: object co-occurrence, multiple object configurations, and background category. We isolated the benefits of each contextual cue to target detectability, its impact on decision bias, confidence, and the guidance of eye movements. We find that object-based information guides eye movements and facilitates perceptual judgments more than scene background...
March 13, 2017: Journal of Experimental Psychology. Human Perception and Performance
https://www.readbyqxmd.com/read/28270868/preserved-search-asymmetry-in-the-detection-of-fearful-faces-among-neutral-faces-in-individuals-with-williams-syndrome-revealed-by-measurement-of-both-manual-responses-and-eye-tracking
#17
Masahiro Hirai, Yukako Muramatsu, Seiji Mizuno, Naoko Kurahashi, Hirokazu Kurahashi, Miho Nakamura
BACKGROUND: Individuals with Williams syndrome (WS) exhibit an atypical social phenotype termed hypersociability. One theory accounting for hypersociability presumes an atypical function of the amygdala, which processes fear-related information. However, evidence is lacking regarding the detection mechanisms of fearful faces in individuals with WS. Here, we introduce a visual search paradigm to elucidate the mechanisms for detecting fearful faces by evaluating search asymmetry; that is, whether reaction times were asymmetrical when the target and distractors were swapped...
2017: Journal of Neurodevelopmental Disorders
https://www.readbyqxmd.com/read/28265652/chess-players-eye-movements-reveal-rapid-recognition-of-complex-visual-patterns-evidence-from-a-chess-related-visual-search-task
#18
Heather Sheridan, Eyal M Reingold
To explore the perceptual component of chess expertise, we monitored the eye movements of expert and novice chess players during a chess-related visual search task that tested anecdotal reports that a key differentiator of chess skill is the ability to visualize the complex moves of the knight piece. Specifically, chess players viewed an array of four minimized chessboards, and they rapidly searched for the target board that allowed a knight piece to reach a target square in three moves. On each trial, there was only one target board (i...
March 1, 2017: Journal of Vision
https://www.readbyqxmd.com/read/28245502/temporal-and-peripheral-extraction-of-contextual-cues-from-scenes-during-visual-search
#19
Kathryn Koehler, Miguel P Eckstein
Scene context is known to facilitate object recognition and guide visual search, but little work has focused on isolating image-based cues and evaluating their contributions to eye movement guidance and search performance. Here, we explore three types of contextual cues (a co-occurring object, the configuration of other objects, and the superordinate category of background elements) and assess their joint contributions to search performance in the framework of cue-combination and the temporal unfolding of their extraction...
February 1, 2017: Journal of Vision
https://www.readbyqxmd.com/read/28202816/human-visual-search-behaviour-is-far-from-ideal
#20
Anna Nowakowska, Alasdair D F Clarke, Amelia R Hunt
Evolutionary pressures have made foraging behaviours highly efficient in many species. Eye movements during search present a useful instance of foraging behaviour in humans. We tested the efficiency of eye movements during search using homogeneous and heterogeneous arrays of line segments. The search target is visible in the periphery on the homogeneous array, but requires central vision to be detected on the heterogeneous array. For a compound search array that is heterogeneous on one side and homogeneous on the other, eye movements should be directed only to the heterogeneous side...
February 22, 2017: Proceedings. Biological Sciences

Search Tips

Use Boolean operators (AND/OR):

diabetic AND foot
diabetes OR diabetic

Exclude a word using the minus sign (-):

Virchow -triad

Use parentheses to group terms:

water AND (cup OR glass)

Add an asterisk (*) at the end of a word to include word stems:

Neuro* will search for Neurology, Neuroscientist, Neurological, and so on

Use quotes to search for an exact phrase:

"primary prevention of cancer"

These operators can be combined, for example:

(heart OR cardiac OR cardio*) AND arrest -"American Heart Association"
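As a rough sketch of how these operators compose, the hypothetical Python helpers below assemble the combined example above as a plain query string. The helper names (phrase, exclude, any_of, all_of) are invented for this illustration and are not part of Read by QxMD; the sketch only shows how quoting, exclusion, grouping, and AND/OR fit together.

    # Hypothetical sketch: composing a search query in the Boolean syntax above.
    # None of these helpers belong to Read by QxMD; they only build a string.

    def phrase(text):
        """Wrap a multi-word term in quotes for exact-phrase matching."""
        return f'"{text}"'

    def exclude(term):
        """Prefix a term with the minus sign to exclude it from results."""
        return f"-{term}"

    def any_of(*terms):
        """Group alternative terms with OR inside parentheses."""
        return "(" + " OR ".join(terms) + ")"

    def all_of(*terms):
        """Require every term by joining with AND."""
        return " AND ".join(terms)

    # Rebuilds the combined example from the tips:
    # (heart OR cardiac OR cardio*) AND arrest -"American Heart Association"
    query = all_of(any_of("heart", "cardiac", "cardio*"), "arrest")
    query += " " + exclude(phrase("American Heart Association"))
    print(query)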