"eye movements", "visual search"

https://www.readbyqxmd.com/read/28293433/eye-tracking-as-a-tool-to-evaluate-functional-ability-in-everyday-tasks-in-glaucoma
#1
REVIEW
Enkelejda Kasneci, Alex A Black, Joanne M Wood
To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis...
2017: Journal of Ophthalmology
https://www.readbyqxmd.com/read/28287759/beyond-scene-gist-objects-guide-search-more-than-scene-background
#2
Kathryn Koehler, Miguel P Eckstein
Although the facilitation of visual search by contextual information is well established, there is little understanding of the independent contributions of different types of contextual cues in scenes. Here we manipulated 3 types of contextual information: object co-occurrence, multiple object configurations, and background category. We isolated the benefits of each contextual cue to target detectability, its impact on decision bias, confidence, and the guidance of eye movements. We find that object-based information guides eye movements and facilitates perceptual judgments more than scene background...
March 13, 2017: Journal of Experimental Psychology. Human Perception and Performance
https://www.readbyqxmd.com/read/28270868/preserved-search-asymmetry-in-the-detection-of-fearful-faces-among-neutral-faces-in-individuals-with-williams-syndrome-revealed-by-measurement-of-both-manual-responses-and-eye-tracking
#3
Masahiro Hirai, Yukako Muramatsu, Seiji Mizuno, Naoko Kurahashi, Hirokazu Kurahashi, Miho Nakamura
BACKGROUND: Individuals with Williams syndrome (WS) exhibit an atypical social phenotype termed hypersociability. One theory accounting for hypersociability presumes an atypical function of the amygdala, which processes fear-related information. However, evidence is lacking regarding the detection mechanisms of fearful faces for individuals with WS. Here, we introduce a visual search paradigm to elucidate the mechanisms for detecting fearful faces by evaluating search asymmetry, that is, whether reaction times are asymmetrical when the roles of target and distractors are swapped...
2017: Journal of Neurodevelopmental Disorders
https://www.readbyqxmd.com/read/28265652/chess-players-eye-movements-reveal-rapid-recognition-of-complex-visual-patterns-evidence-from-a-chess-related-visual-search-task
#4
Heather Sheridan, Eyal M Reingold
To explore the perceptual component of chess expertise, we monitored the eye movements of expert and novice chess players during a chess-related visual search task that tested anecdotal reports that a key differentiator of chess skill is the ability to visualize the complex moves of the knight piece. Specifically, chess players viewed an array of four minimized chessboards, and they rapidly searched for the target board that allowed a knight piece to reach a target square in three moves. On each trial, there was only one target board (i...
March 1, 2017: Journal of Vision
https://www.readbyqxmd.com/read/28245502/temporal-and-peripheral-extraction-of-contextual-cues-from-scenes-during-visual-search
#5
Kathryn Koehler, Miguel P Eckstein
Scene context is known to facilitate object recognition and guide visual search, but little work has focused on isolating image-based cues and evaluating their contributions to eye movement guidance and search performance. Here, we explore three types of contextual cues (a co-occurring object, the configuration of other objects, and the superordinate category of background elements) and assess their joint contributions to search performance in the framework of cue-combination and the temporal unfolding of their extraction...
February 1, 2017: Journal of Vision
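
The cue-combination framework referenced in this abstract is commonly formalized as reliability-weighted averaging of independent cue estimates. The sketch below illustrates only that general idea; the cue values and variances are invented for illustration, and this is not the authors' model.

import numpy as np

def combine_cues(estimates, variances):
    # Weight each cue's estimate by its reliability (inverse variance),
    # then normalize the weights so they sum to one.
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    weights = weights / weights.sum()
    combined_estimate = float(np.sum(weights * estimates))
    combined_variance = float(1.0 / np.sum(1.0 / variances))
    return combined_estimate, combined_variance

# Hypothetical horizontal target-location estimates (in degrees) from three cues:
# a co-occurring object, the multi-object configuration, and the background category.
estimate, variance = combine_cues([2.0, 2.6, 1.4], [0.5, 1.0, 2.0])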
https://www.readbyqxmd.com/read/28202816/human-visual-search-behaviour-is-far-from-ideal
#6
Anna Nowakowska, Alasdair D F Clarke, Amelia R Hunt
Evolutionary pressures have made foraging behaviours highly efficient in many species. Eye movements during search present a useful instance of foraging behaviour in humans. We tested the efficiency of eye movements during search using homogeneous and heterogeneous arrays of line segments. The search target is visible in the periphery on the homogeneous array, but requires central vision to be detected on the heterogeneous array. For a compound search array that is heterogeneous on one side and homogeneous on the other, eye movements should be directed only to the heterogeneous side...
February 22, 2017: Proceedings. Biological Sciences
https://www.readbyqxmd.com/read/28124293/influence-of-simple-action-on-subsequent-manual-and-ocular-responses
#7
Fei Wang, Ji Sun, Pei Sun, Blaire J Weidler, Richard A Abrams
Recent investigations into how action affects perception have revealed an interesting "action effect": simply acting upon an object enhances its processing in subsequent tasks. Previous studies, however, relied only on manual responses, allowing an alternative stimulus-response binding account of the effect. The current study examined whether the action effect occurs in the presence of changes in response modalities. In Experiment 1, participants completed a modified action effect paradigm, in which they first produced an arbitrary manual response to a shape and then performed a visual search task in which the previous shape was either a valid or an invalid cue, responding with either a manual or a saccadic response...
January 25, 2017: Attention, Perception & Psychophysics
https://www.readbyqxmd.com/read/28098521/semantic-and-syntactic-associations-during-word-search-modulate-the-relationship-between-attention-and-subsequent-memory
#8
Wei Zhou, Fei Mo, Yunhong Zhang, Jinhong Ding
Two experiments were conducted to investigate how linguistic information influences attention allocation in visual search and memory for words. In Experiment 1, participants searched for the synonym of a cue word among five words. The distractors included one antonym and three unrelated words. In Experiment 2, participants were asked to judge whether the five words presented on the screen comprised a valid sentence. The relationships among words were sentential, semantically related, or unrelated. A memory recognition task followed...
January 2017: Journal of General Psychology
https://www.readbyqxmd.com/read/28087402/accuracy-is-in-the-eyes-of-the-pathologist-the-visual-interpretive-process-and-diagnostic-accuracy-with-digital-whole-slide-images
#9
Tad T Brunyé, Ezgi Mercan, Donald L Weaver, Joann G Elmore
Digital whole slide imaging is an increasingly common medium in pathology, with application to education, telemedicine, and rendering second opinions. It has also made it possible to use eye tracking devices to explore the dynamic visual inspection and interpretation of histopathological features of tissue while pathologists review cases. Using whole slide images, the present study examined how a pathologist's diagnosis is influenced by fixed case-level factors, their prior clinical experience, and their patterns of visual inspection...
February 2017: Journal of Biomedical Informatics
https://www.readbyqxmd.com/read/28044017/cat-and-mouse-search-the-influence-of-scene-and-object-analysis-on-eye-movements-when-targets-change-locations-during-search
#10
Anne P Hillstrom, Joice D Segabinazi, Hayward J Godwin, Simon P Liversedge, Valerie Benson
We explored the influence of early scene analysis and visible object characteristics on eye movements when searching for objects in photographs of scenes. On each trial, participants were shown, in sequence, either a scene preview or a uniform grey screen (250 ms), a visual mask, the name of the target, and the scene, which now included the target at a likely location. During the participant's first saccade of the search, the target location was changed to (i) a different likely location, (ii) an unlikely but possible location, or (iii) a very implausible location...
February 19, 2017: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences
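
Changing the display "during the participant's first saccade" is typically implemented as a gaze-contingent display change triggered by a saccade-onset criterion. The sketch below shows one generic way to do this with a velocity threshold; the tracker and display interfaces, sampling rate, and threshold are assumptions, not details reported by the authors.

import math

SAMPLE_RATE_HZ = 1000        # assumed eye-tracker sampling rate
SACCADE_VELOCITY_DEG_S = 30  # assumed saccade-onset velocity criterion

def gaze_velocity(prev_xy, curr_xy, deg_per_pixel):
    # Approximate instantaneous gaze velocity in degrees of visual angle per second.
    dx = (curr_xy[0] - prev_xy[0]) * deg_per_pixel
    dy = (curr_xy[1] - prev_xy[1]) * deg_per_pixel
    return math.hypot(dx, dy) * SAMPLE_RATE_HZ

def run_trial(tracker, display, deg_per_pixel=0.03):
    # tracker.sample() and the display methods are hypothetical interfaces.
    prev = tracker.sample()               # (x, y) gaze position in pixels
    changed = False
    while not display.trial_finished():
        curr = tracker.sample()
        if not changed and gaze_velocity(prev, curr, deg_per_pixel) > SACCADE_VELOCITY_DEG_S:
            display.show_changed_scene()  # swap in the scene with the relocated target
            changed = True
        prev = curr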
https://www.readbyqxmd.com/read/28027382/training-eye-movements-for-visual-search-in-individuals-with-macular-degeneration
#11
Christian P Janssen, Preeti Verghese
We report a method to train individuals with central field loss due to macular degeneration to improve the efficiency of visual search. Our method requires participants to make a same/different judgment on two simple silhouettes. One silhouette is presented in an area that falls within the binocular scotoma while they are fixating the center of the screen with their preferred retinal locus (PRL); the other silhouette is presented diametrically opposite within the intact visual field. Over the course of 480 trials (approximately 6 hr), we gradually reduced the amount of time that participants have to make a saccade and judge the similarity of stimuli...
December 1, 2016: Journal of Vision
https://www.readbyqxmd.com/read/28000252/dealing-with-ocular-artifacts-on-lateralized-erps-in-studies-of-visual-spatial-attention-and-memory-ica-correction-versus-epoch-rejection
#12
Brandi Lee Drisdelle, Sébrina Aubin, Pierre Jolicoeur
The objective of the present study was to assess the robustness and reliability of independent component analysis (ICA) as a method for ocular artifact correction in electrophysiological studies of visual-spatial attention and memory. The N2pc and sustained posterior contralateral negativity (SPCN), electrophysiological markers of visual-spatial attention and memory, respectively, are lateralized posterior ERPs typically observed following the presentation of lateral stimuli (targets and distractors) along with instructions to maintain fixation on the center of the visual search display for the entire trial...
January 2017: Psychophysiology
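
For context, ICA-based ocular artifact correction is widely available in open-source tools. The sketch below uses MNE-Python to illustrate the general approach of removing EOG-correlated components rather than rejecting contaminated epochs; the file name, channel name, and parameters are placeholders, and this is not the pipeline used in the study above.

import mne
from mne.preprocessing import ICA

# Placeholder file and channel names; not the dataset used in the study.
raw = mne.io.read_raw_fif("sub01_raw.fif", preload=True)
raw.filter(l_freq=1.0, h_freq=40.0)   # band-pass filtering helps the decomposition

ica = ICA(n_components=20, random_state=97)
ica.fit(raw)

# Find components whose time courses correlate with the EOG channel
# (blinks and eye movements), mark them for exclusion, and reconstruct
# the data without them instead of rejecting whole epochs.
eog_indices, eog_scores = ica.find_bads_eog(raw, ch_name="EOG061")
ica.exclude = eog_indices
raw_clean = ica.apply(raw.copy())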
https://www.readbyqxmd.com/read/27981521/the-influence-of-action-video-game-playing-on-eye-movement-behaviour-during-visual-search-in-abstract-in-game-and-natural-scenes
#13
Elham Azizi, Larry A Abel, Matthew J Stainer
Action game playing has been associated with several improvements in visual attention tasks. However, it is not clear how such changes might influence the way we overtly select information from our visual world (i.e. eye movements). We examined whether action-video-game training changed eye movement behaviour in a series of visual search tasks including conjunctive search (relatively abstracted from natural behaviour), game-related search, and more naturalistic scene search. Forty nongamers were trained in either an action first-person shooter game or a card game (control) for 10 hours...
December 15, 2016: Attention, Perception & Psychophysics
https://www.readbyqxmd.com/read/27973992/character-complexity-effects-in-chinese-reading-and-visual-search-a-comparison-and-theoretical-implications
#14
Lili Yu, Qiaoming Zhang, Caspian Priest, Erik D Reichle, Heather Sheridan
Three eye-movement experiments were conducted to examine how the complexity of characters in Chinese words (i.e., number of strokes per character) influences their processing and eye-movement behaviour. In Experiment 1, English speakers with no significant knowledge of Chinese searched for specific low-, medium-, and high-complexity target characters in a multi-page narrative containing characters of varying complexity (3-16 strokes). Fixation durations and skipping rates were influenced by the visual complexity of both the target characters and the characters being searched even though participants had no knowledge of Chinese...
January 13, 2017: Quarterly Journal of Experimental Psychology: QJEP
https://www.readbyqxmd.com/read/27933016/task-irrelevant-expectation-violations-in-sequential-manual-actions-evidence-for-a-check-after-surprise-mode-of-visual-attention-and-eye-hand-decoupling
#15
Rebecca M Foerster
When performing sequential manual actions (e.g., cooking), visual information is prioritized according to the task determining where and when to attend, look, and act. In well-practiced sequential actions, long-term memory (LTM)-based expectations specify which action targets might be found where and when. We have previously demonstrated (Foerster and Schneider, 2015b) that violations of such expectations that are task-relevant (e.g., target location change) cause a regression from a memory-based mode of attentional selection to visual search...
2016: Frontiers in Psychology
https://www.readbyqxmd.com/read/27923149/the-effect-of-cerebral-asymmetries-and-eye-scanning-on-pseudoneglect-for-a-visual-search-task
#16
Michael E R Nicholls, Amelia Hobson, Joanne Petty, Owen Churches, Nicole A Thomas
Pseudoneglect is the tendency for the general population to over-attend to the left. While pseudoneglect is classically demonstrated using line bisection, it also occurs for visual search. The current study explored the influence of eye movements and functional cerebral asymmetry on asymmetries for visual search. In Experiment 1, 24 participants carried out a conjunction search for a target within a rectangular array. A leftward advantage for detecting targets was observed when the eyes were free to move, but not when they were restricted by short exposure durations...
February 2017: Brain and Cognition
https://www.readbyqxmd.com/read/27922659/eye-movement-analysis-and-cognitive-assessment-the-use-of-comparative-visual-search-tasks-in-a-non-immersive-vr-application
#17
Pedro J Rosa, Pedro Gamito, Jorge Oliveira, Diogo Morais, Matthew Pavlovic, Olivia Smyth, Inês Maia, Tiago Gomes
BACKGROUND: An adequate behavioral response depends on attentional and mnesic processes. When these basic cognitive functions are impaired, the use of non-immersive Virtual Reality Applications (VRAs) can be a reliable technique for assessing the level of impairment. However, most non-immersive VRAs use indirect measures to make inferences about visual attention and mnesic processes (e.g., time to task completion, error rate). OBJECTIVES: To examine whether eye movement analysis through eye tracking (ET) can be a reliable method to probe more effectively where and how attention is deployed and how it is linked with visual working memory during comparative visual search tasks (CVSTs) in non-immersive VRAs...
December 6, 2016: Methods of Information in Medicine
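
Eye-tracking measures such as fixation count and dwell time require first segmenting raw gaze samples into fixations. The following minimal sketch of dispersion-based fixation detection (I-DT) shows one common way to do this; the thresholds are illustrative, and this is not necessarily the authors' analysis pipeline.

def detect_fixations(samples, max_dispersion_deg=1.0, min_duration_s=0.1):
    # samples: list of (time_s, x_deg, y_deg) gaze samples.
    # Returns a list of fixations as (start_s, end_s, centroid_x, centroid_y).
    fixations, i, n = [], 0, len(samples)
    while i < n:
        j = i
        # Grow the window while its spatial dispersion stays under threshold.
        while j + 1 < n:
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion_deg:
                break
            j += 1
        if samples[j][0] - samples[i][0] >= min_duration_s:
            xs = [s[1] for s in samples[i:j + 1]]
            ys = [s[2] for s in samples[i:j + 1]]
            fixations.append((samples[i][0], samples[j][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1
        else:
            i += 1
    return fixations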
https://www.readbyqxmd.com/read/27875158/an-evaluation-of-visual-search-support-in-maps
#18
Rudolf Netzel, Marcel Hlawatsch, Michael Burch, Sanjeev Balakrishnan, Hansjörg Schmauder, Daniel Weiskopf
Visual search can be time-consuming, especially if the scene contains a large number of possibly relevant objects. An instance of this problem is present when using geographic or schematic maps with many different elements representing cities, streets, sights, and the like. Unless the map is well-known to the reader, the full map or at least large parts of it must be scanned to find the elements of interest. In this paper, we present a controlled eye-tracking study (30 participants) to compare four variants of map annotation with labels: within-image annotations, grid reference annotation, directional annotation, and miniature annotation...
January 2017: IEEE Transactions on Visualization and Computer Graphics
https://www.readbyqxmd.com/read/27869764/drivers-visual-search-patterns-during-overtaking-maneuvers-on-freeway
#19
Wenhui Zhang, Jing Dai, Yulong Pei, Penghui Li, Ying Yan, Xinqiang Chen
Drivers gather traffic information primarily by means of their vision. Especially during complicated maneuvers, such as overtaking, they need to perceive a variety of characteristics, including the lateral and longitudinal distances to other vehicles, the speed of other vehicles, lane occupancy, and so on, to avoid crashes. The primary objective of this study is to examine appropriate visual search patterns during overtaking maneuvers on freeways. We designed a series of driving simulation experiments in which the type and speed of the leading vehicle were considered as two influential factors...
November 19, 2016: International Journal of Environmental Research and Public Health
https://www.readbyqxmd.com/read/27795743/typical-visual-search-performance-and-atypical-gaze-behaviors-in-response-to-faces-in-williams-syndrome
#20
Masahiro Hirai, Yukako Muramatsu, Seiji Mizuno, Naoko Kurahashi, Hirokazu Kurahashi, Miho Nakamura
BACKGROUND: Evidence indicates that individuals with Williams syndrome (WS) exhibit atypical attentional characteristics when viewing faces. However, the dynamics of visual attention captured by faces remain unclear, especially when explicit attentional forces are present. To clarify this, we introduced a visual search paradigm and assessed how the relative strength of visual attention captured by a face and explicit attentional control changes as search progresses. METHODS: Participants (WS and controls) searched for a target (butterfly) within an array of distractors, which sometimes contained an upright face...
2016: Journal of Neurodevelopmental Disorders

Search Tips

Use Boolean operators (AND/OR):
diabetic AND foot
diabetes OR diabetic

Exclude a word using the minus sign:
Virchow -triad

Use parentheses to group terms:
water AND (cup OR glass)

Add an asterisk (*) at the end of a word to include word stems:
Neuro* will search for Neurology, Neuroscientist, Neurological, and so on

Use quotes to search for an exact phrase:
"primary prevention of cancer"

These can be combined, for example:
(heart OR cardiac OR cardio*) AND arrest -"American Heart Association"
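
As an illustration of how these operators compose, the hypothetical helper below assembles a query string using the syntax above; it only builds the text of the query and does not call any Read by QxMD API.

def build_query(all_of=(), any_of=(), none_of=(), phrases=(), stems=()):
    # Assemble a query string from required terms (AND), alternatives (OR),
    # exact phrases (quotes), word stems (*), and exclusions (minus sign).
    parts = list(all_of)
    if any_of:
        parts.append("(" + " OR ".join(any_of) + ")")
    parts += ['"{}"'.format(p) for p in phrases]
    parts += [s + "*" for s in stems]
    query = " AND ".join(parts)
    for term in none_of:
        query += ' -"{}"'.format(term) if " " in term else " -" + term
    return query

# build_query(all_of=["arrest"], any_of=["heart", "cardiac", "cardio*"],
#             none_of=["American Heart Association"])
# returns: arrest AND (heart OR cardiac OR cardio*) -"American Heart Association"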