
A feasibility study of depth image based intent recognition for lower limb prostheses.

This paper presents our preliminary work on a depth-camera-based intent recognition system intended for future use in robotic prosthetic legs. The approach infers the subject's activity mode (standing, walking, running, stair ascent, or stair descent) using only data from the depth camera. Depth difference images are also used to improve performance by discriminating between static and dynamic instances. After confidence-map-based filtering, simple features such as the mean, maximum, minimum, and standard deviation are extracted from rectangular regions of the frames. A support vector machine with a cubic kernel is used for classification. The classification results are post-processed by a voting filter to increase the robustness of activity mode recognition. Experiments conducted with a healthy subject wearing the depth camera on the lower leg demonstrated the efficacy of the approach. Specifically, the depth-camera-based recognition system identified 28 activity mode transitions successfully. The only incorrect mode switch occurred during an intended run-to-stand transition, where an intermediate transition from run to walk was recognized before the system switched to the intended standing mode.
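The sketch below illustrates the kind of pipeline the abstract describes: region-wise statistics from a depth frame and its difference image, a cubic-kernel SVM classifier, and a majority-vote filter over the per-frame predictions. The grid layout, voting window, class labels, and synthetic data are illustrative assumptions, not the authors' actual parameters or code.

```python
# Hypothetical sketch of the pipeline from the abstract (not the authors' code).
import numpy as np
from sklearn.svm import SVC

MODES = ["stand", "walk", "run", "stair_ascent", "stair_descent"]  # assumed labels

def region_features(depth_frame, rows=4, cols=4):
    """Split a (confidence-filtered) depth frame into a rows x cols grid and
    compute mean, max, min and standard deviation for each rectangular region."""
    h, w = depth_frame.shape
    feats = []
    for i in range(rows):
        for j in range(cols):
            region = depth_frame[i * h // rows:(i + 1) * h // rows,
                                 j * w // cols:(j + 1) * w // cols]
            feats.extend([region.mean(), region.max(),
                          region.min(), region.std()])
    return np.array(feats)

def frame_features(frame, prev_frame):
    """Concatenate features of the current depth frame and of the
    depth-difference image (current minus previous frame)."""
    diff = frame - prev_frame
    return np.concatenate([region_features(frame), region_features(diff)])

def majority_vote(labels, window=5):
    """Post-process per-frame predictions with a sliding majority vote."""
    smoothed = []
    for k in range(len(labels)):
        lo = max(0, k - window + 1)
        vals, counts = np.unique(labels[lo:k + 1], return_counts=True)
        smoothed.append(vals[np.argmax(counts)])
    return smoothed

# Toy data: random frames standing in for filtered depth images (metres).
rng = np.random.default_rng(0)
frames = rng.uniform(0.5, 4.0, size=(200, 60, 80))
modes = rng.integers(0, len(MODES), size=199)

X = np.stack([frame_features(frames[t + 1], frames[t]) for t in range(199)])
y = np.array([MODES[i] for i in modes])

clf = SVC(kernel="poly", degree=3)  # cubic-kernel SVM, as named in the abstract
clf.fit(X, y)
print(majority_vote(clf.predict(X), window=5)[:10])
```

The majority-vote window trades latency for robustness: a larger window suppresses spurious single-frame misclassifications but delays recognition of genuine mode transitions, which matters for prosthesis control.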
