
A facial expression controlled wheelchair for people with disabilities.

BACKGROUND AND OBJECTIVES: To improve assistive technologies for people with reduced mobility, this paper develops a new intelligent real-time emotion detection system to control equipment such as electric wheelchairs (EWC) or robotic assistance vehicles. Every year, degenerative diseases and traumas prevent thousands of people from easily controlling the joystick of their wheelchairs with their hands. Most current alternatives are invasive and uncomfortable, such as those requiring the user to wear body-mounted sensors to control the wheelchair.

METHODS: In this work, the proposed Human Machine Interface (HMI) provides an efficient hands-free option that does not require sensors or objects attached to the user's body. It allows users to drive the wheelchair with their facial expressions, which can be flexibly updated. This intelligent solution is based on a combination of neural networks (NN) and specific image preprocessing steps. First, the Viola-Jones algorithm is used to detect the user's face in the video stream. Subsequently, a neural network classifies the emotions displayed on the face. This solution, called "The Mathematics Behind Emotion", is capable of classifying many facial expressions in real time, such as smiles and raised eyebrows, which are translated into wheelchair control signals. On the hardware side, the solution requires only a smartphone and a Raspberry Pi board that can be easily mounted on the wheelchair.
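To make the described pipeline concrete, the following minimal sketch (Python with OpenCV and Keras) illustrates the general flow: Viola-Jones face detection followed by a neural-network expression classifier whose output is mapped to wheelchair commands. The model file, the expression label set and the expression-to-command mapping are illustrative assumptions, not the authors' implementation.

    # Sketch: Viola-Jones face detection -> expression classifier -> wheelchair command.
    # "expression_cnn.h5", LABELS and COMMANDS are hypothetical placeholders.
    import cv2
    import numpy as np
    from tensorflow.keras.models import load_model

    model = load_model("expression_cnn.h5")          # assumed 48x48 grayscale input
    LABELS = ["neutral", "smile", "raised_eyebrows", "frown"]        # assumed labels
    COMMANDS = {"neutral": "HOLD", "smile": "FORWARD",
                "raised_eyebrows": "STOP", "frown": "REVERSE"}       # assumed mapping

    # Viola-Jones detector shipped with OpenCV (Haar cascade).
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)                        # smartphone / USB camera stream
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces[:1]:               # keep the first detected face
            roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
            roi = roi.astype("float32")[None, ..., None] / 255.0
            label = LABELS[int(np.argmax(model.predict(roi, verbose=0)))]
            print(COMMANDS[label])                   # would be forwarded to the Raspberry Pi
        cv2.imshow("camera", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()

In practice the command string printed above would be sent from the smartphone to the Raspberry Pi mounted on the wheelchair, which drives the motor controller.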

RESULTS: Many experiments were conducted to evaluate the efficiency of the control acquisition process and the user experience of driving a wheelchair through facial expressions. The classification accuracy reaches 98.6%, with an average recall of 97.1%. These experiments show that the proposed system is able to accurately recognize user commands in real time. Indeed, the results indicate that the suggested system is more comfortable and better suited to severely disabled people in their daily lives than conventional methods. Among its advantages is its ability to identify facial emotions in real time from different viewing angles.

CONCLUSIONS: The proposed system takes the patient's pathology into account. It is intuitive, modern, requires no physical effort, and can be integrated into a smartphone or tablet. The results obtained highlight the efficiency and reliability of this system, which ensures safe navigation for the disabled patient.
