An automated behavioral apparatus to combine parameterized reaching and grasping movements in 3D space.
Journal of Neuroscience Methods 2018 November 29
BACKGROUND: The neural principles underlying reaching and grasping movements have been studied extensively in primates for decades. However, few experimental apparatuses have been developed to enable a flexible combination of reaching and grasping in one task in three-dimensional (3D) space.
NEW METHOD: By combining a custom turning table with a 3D translational device, we have developed a highly flexible apparatus that enables the subject to reach multiple positions in 3D space and to grasp differently shaped objects with multiple grip types at each position. Hand trajectories and grip types are recorded by optical motion-tracking cameras and touch sensors, respectively.
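The abstract does not include control code, so the following is only a minimal Python sketch of the trial logic such an apparatus implies: the turning table presents one of the objects, the translational device transports it to a 3D target, and the trial is scored by the grip the touch sensors report. All device interfaces, object names, positions, and timing values here are hypothetical placeholders, not details from the paper.

```python
# Hypothetical sketch of one reach-to-grasp trial; `table`, `translator`,
# `touch_sensors`, and `tracker` stand in for assumed device interfaces.
import random
import time

OBJECTS = ["sphere", "cube", "plate", "ring", "cylinder", "cone"]  # assumed shapes
POSITIONS = [(x, y, z) for x in (-5, 0, 5) for y in (0, 5) for z in (10, 15)]  # cm, assumed grid

def required_grip(obj):
    # Assumed mapping from object shape to the grip it affords.
    return "precision" if obj in ("plate", "ring") else "power"

def run_trial(table, translator, touch_sensors, tracker):
    """Run one trial; returns True if the correct grip was produced in time."""
    obj = random.choice(OBJECTS)
    pos = random.choice(POSITIONS)

    table.rotate_to(obj)          # present the cued object on the turning table
    translator.move_to(pos)       # transport it to a 3D target position
    tracker.start_recording()     # optical motion tracking of the hand trajectory

    deadline = time.time() + 3.0  # assumed response window
    while time.time() < deadline:
        grip = touch_sensors.read_grip()  # e.g. "precision" or "power", or None
        if grip is not None:
            tracker.stop_recording()
            return grip == required_grip(obj)
    tracker.stop_recording()
    return False                  # timeout: no grasp detected
```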
RESULTS: Using the apparatus, we successfully trained a macaque monkey to perform a visually guided reach-to-grasp task, in which six objects fixed on the turning table were grasped with the appropriate grip after being transported to multiple positions in 3D space. A preliminary analysis of neural signals recorded in the primary motor cortex shows that many neurons exhibit significant tuning to both target position and grip type.
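The abstract does not specify how tuning was assessed. One conventional approach, sketched below on synthetic data, is a per-neuron two-way ANOVA on trial firing rates with target position and grip type as factors; the use of statsmodels, the effect sizes, and the 0.05 threshold are assumptions of this sketch, not the authors' method.

```python
# Illustrative tuning test on fabricated data: a neuron is called "tuned"
# to a factor if that factor's ANOVA p-value falls below 0.05.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)

# Synthetic trials: 6 target positions x 2 grip types x 20 repeats each.
trials = pd.DataFrame({
    "position": np.repeat(np.arange(6), 40),
    "grip": np.tile(np.repeat(["power", "precision"], 20), 6),
})
# Fake firing rate with additive position and grip effects plus noise.
trials["rate"] = (
    2.0 * trials["position"]
    + 5.0 * (trials["grip"] == "precision")
    + rng.normal(0, 3, len(trials))
)

model = ols("rate ~ C(position) + C(grip)", data=trials).fit()
print(sm.stats.anova_lm(model, typ=2))  # p-values per factor
```

In a real dataset this test would be repeated for each recorded neuron, with the fraction of significant neurons summarizing how strongly the population encodes each task parameter.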
COMPARISON WITH EXISTING METHOD(S): Our apparatus permits arbitrary combinations of parameterized reaching and grasping movements within a single task, whereas existing systems typically study the two movements separately or hold their pairing fixed. The apparatus is also readily extensible in its dynamic range, object shapes, and applicable subjects.
CONCLUSIONS: The apparatus provides a valuable platform for behavioral and neurophysiological studies of upper-limb function, and may facilitate the simultaneous reconstruction of reaching and grasping movements in brain-machine interfaces (BMIs).