Unsupervised neural decoding for concurrent and continuous multi-finger force prediction.

Reliable prediction of multi-finger forces is crucial for neural-machine interfaces. Various neural decoding methods have made substantial progress toward accurate motor output prediction. However, most neural decoding methods are supervised, i.e., the finger forces are required for model training, which may not be feasible in certain contexts, especially for individuals with an arm amputation. To address this issue, we developed an unsupervised neural decoding approach that predicts multi-finger forces from spinal motoneuron firing information. We acquired high-density surface electromyogram (sEMG) signals from the finger extensor muscle while subjects performed single-finger and multi-finger isometric extension tasks. We first extracted motor units (MUs) from the sEMG signals of the single-finger tasks. Because of inevitable finger muscle co-activation, MUs controlling non-targeted fingers can also be recruited; to ensure accurate finger force prediction, these MUs need to be teased out. To this end, we clustered the decomposed MUs based on inter-MU distances measured with the dynamic time warping technique, and then labeled the MUs using the mean firing rate or the firing rate phase amplitude. We merged the clustered MUs associated with the same target finger and assigned weights based on the consistency of the retained MUs. Compared with a supervised neural decoding approach and the conventional sEMG amplitude approach, our new approach achieved a higher R2 (0.77 ± 0.036 vs. 0.71 ± 0.11 vs. 0.61 ± 0.09) and a lower root mean square error (5.16 ± 0.58 %MVC vs. 5.88 ± 1.34 %MVC vs. 7.56 ± 1.60 %MVC). Our findings pave the way for accurate and robust neural-machine interfaces, which can substantially enhance human-robotic hand interactions in diverse contexts.
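The pipeline described in the abstract (MU decomposition, DTW-based inter-MU distances, clustering, and weighted force prediction from the retained MUs) can be illustrated with a short sketch. The code below is not the authors' implementation: it assumes MU spike trains have already been obtained by sEMG decomposition, and all function names, parameters (e.g., the smoothing width sigma_s, the equal-weight fallback), and the choice of average-linkage hierarchical clustering are illustrative assumptions.

```python
# Minimal, hypothetical sketch of DTW-based MU clustering and
# weighted force prediction from retained MU firing rates.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform


def smoothed_firing_rate(spike_times, duration_s, fs=100, sigma_s=0.4):
    """Convert MU spike times (s) into a smoothed firing-rate curve (Hz)."""
    n = int(duration_s * fs)
    binned = np.zeros(n)
    idx = np.clip((np.asarray(spike_times) * fs).astype(int), 0, n - 1)
    np.add.at(binned, idx, 1.0)                 # spike counts per bin
    return gaussian_filter1d(binned * fs, sigma=sigma_s * fs)


def dtw_distance(x, y):
    """Plain dynamic-time-warping distance between two 1-D sequences."""
    nx, ny = len(x), len(y)
    cost = np.full((nx + 1, ny + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, nx + 1):
        for j in range(1, ny + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[nx, ny]


def cluster_motor_units(firing_rates, n_clusters=2):
    """Group MUs by similarity of their firing-rate profiles (DTW distances)."""
    n = len(firing_rates)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = dtw_distance(firing_rates[i], firing_rates[j])
    # Average-linkage hierarchical clustering on the precomputed distances.
    Z = linkage(squareform(dist, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")


def predict_force(firing_rates, labels, target_label, weights=None):
    """Weighted sum of firing rates of the MUs assigned to the target finger."""
    kept = [fr for fr, lab in zip(firing_rates, labels) if lab == target_label]
    if not kept:
        return None
    kept = np.vstack(kept)
    if weights is None:                          # equal weights as a placeholder
        weights = np.ones(len(kept)) / len(kept)
    return weights @ kept
```

In the approach described above, the clusters are additionally labeled using the mean firing rate or the firing-rate phase amplitude before being merged per target finger, and the weights reflect MU consistency; the equal-weight fallback in the sketch only stands in for that step.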
