Lower-Limb Motion Estimation Using Ultrasound Imaging: A Framework For Assistive Device Control.

OBJECTIVE: Powered assistive devices need improved control intuitiveness to enhance their clinical adoption. Therefore, the user's intent should be identified, and the device's movement should adhere to it. Skeletal muscles contract synergistically to produce defined lower-limb movements, so unique contraction patterns in lower-extremity musculature may provide a means of device joint control. Ultrasound (US) imaging enables direct measurement of the local deformation of muscle segments. Hence, the objective of this study was to assess the feasibility of using US to estimate human lower-limb movements.

METHODS: A novel algorithm was developed to calculate US features of the rectus femoris muscle during a non-weight-bearing knee flexion/extension experiment performed by nine able-bodied subjects. Five US features of the skeletal muscle tissue were studied, namely thickness, angle between aponeuroses, pennation angle, fascicle length, and echogenicity. A multiscale ridge filter was used to extract structures in the image, and a random sample consensus (RANSAC) model was used to segment muscle aponeuroses and fascicles. A localization scheme further guided RANSAC to enable tracking in an US image sequence. Gaussian process regression (GPR) models were trained on the segmented features to estimate both knee joint angle and angular velocity.
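The snippet below is a minimal, self-contained sketch of this segmentation-estimation idea built from off-the-shelf tools (scikit-image's Sato ridge filter and RANSAC line model, scikit-learn's GPR). It is not the authors' implementation: the ridge scales, RANSAC settings, the reduced two-value feature set, the GPR kernel, and the synthetic frames standing in for real B-mode data are all illustrative assumptions.

```python
# Sketch of a ridge-filter + RANSAC + GPR pipeline (illustrative assumptions only;
# not the published algorithm). Synthetic frames stand in for real B-mode US data.
import numpy as np
from skimage.filters import sato                     # multiscale ridge filter
from skimage.measure import ransac, LineModelND      # robust line segmentation
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel


def linear_structure_features(frame, sigmas=(2, 4, 6), rel_threshold=0.5):
    """Fit a line to the dominant bright, ridge-like structure in one frame
    (e.g. an aponeurosis) and return its orientation and depth as features."""
    ridges = sato(frame, sigmas=sigmas, black_ridges=False)
    ys, xs = np.nonzero(ridges > rel_threshold * ridges.max())
    points = np.column_stack([xs, ys]).astype(float)
    model, _ = ransac(points, LineModelND, min_samples=2,
                      residual_threshold=2.0, max_trials=500)
    (x0, y0), (dx, dy) = model.params               # point on line, unit direction
    angle_deg = np.degrees(np.arctan2(dy, dx))      # orientation of the structure
    return [angle_deg, y0]                          # [orientation, depth in pixels]


# Toy data: a bright band drifting deeper frame by frame, with a made-up knee angle.
rng = np.random.default_rng(0)
frames, knee_angles = [], []
for k in range(30):
    img = rng.normal(0.0, 0.05, (128, 128))
    img[40 + k:43 + k, :] += 1.0                    # synthetic "aponeurosis"
    frames.append(img)
    knee_angles.append(60.0 - 1.5 * k)              # hypothetical angle (degrees)

X = np.array([linear_structure_features(f) for f in frames])
y = np.array(knee_angles)

# GPR maps the US-derived features to knee joint angle, as in the abstract;
# angular velocity could be handled with a second, analogous model.
idx = rng.permutation(len(frames))
train, test = idx[:20], idx[20:]
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X[train], y[train])
pred, std = gpr.predict(X[test], return_std=True)
print(np.round(pred, 1))                            # estimated angles for held-out frames
```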

RESULTS: The proposed segmentation-estimation approach could estimate knee joint angle and angular velocity with average root mean square error values of 7.45 degrees and 0.262 rad/s, respectively. The average processing rate was 3-6 frames per second, which is promising for real-time implementation.
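For reference, the root mean square error quoted above follows its standard definition, comparing the estimated and measured joint quantity (angle or angular velocity) at each of the N samples:

\[
\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(\hat{\theta}_i - \theta_i\right)^2}
\]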

CONCLUSION: Experimental results demonstrate the feasibility of using US to estimate human lower-extremity motion. The ability of the algorithm to run in real time may enable the use of US as a neural interface for lower-limb applications.

SIGNIFICANCE: Intuitive intent recognition of human lower-extremity movements using wearable US imaging may enable volitional assistive device control and enhance locomotor outcomes for those with mobility impairments.
