https://read.qxmd.com/read/37395094/facial-reactivity-to-emotional-stimuli-is-related-to-empathic-concern-empathic-distress-and-depressive-symptoms-in-social-work-students
#21
JOURNAL ARTICLE
Pierrich Plusquellec, Kaylee Smart, Vincent Denault
Helping professionals are exposed daily to the emotional burden of their vulnerable clients and are at risk of unconscious emotional contagion that may lead to stress and emotional distress. Being aware of their own susceptibility to emotional contagion, however, can improve their well-being. This study aimed to propose an objective measure of emotional contagion, complementary to the Emotional Contagion Scale, and to evaluate its construct and predictive validity. To do so, we turned to FACET, an automatic facial coding software using the Facial Action Coding System, to measure participants' facial expressions as they watched movie clips eliciting specific emotional responses...
July 3, 2023: Psychological Reports
https://read.qxmd.com/read/37361792/examining-the-effects-of-the-utility-value-intervention-on-learners-emotions-and-conceptual-understanding-in-online-video-based-learning
#22
JOURNAL ARTICLE
Seunghye Ha, Hyo-Jeong So
In asynchronous online video-based learning, learners experience various affective states that may disengage them and negatively influence learning outcomes. This study aimed to examine the effects of a utility value (UV) intervention designed to help learners engage emotionally and behaviorally in online learning. The UV intervention includes a pre-learning writing activity and UV feedback messages that help learners perceive the relevance of the lecture topic to their lives. In particular, we examined the effects of the UV intervention on learners' negative emotions (i...
April 26, 2023: Education and Information Technologies
https://read.qxmd.com/read/37349604/brain-mechanisms-associated-with-facial-encoding-of-affective-states
#23
JOURNAL ARTICLE
Miriam Kunz, Jen-I Chen, Stefan Lautenbacher, Pierre Rainville
Affective states are typically accompanied by facial expressions, but these behavioral manifestations are highly variable. Even highly arousing and negatively valent experiences, such as pain, show great instability in facial affect encoding. The present study investigated which neural mechanisms are associated with variations in facial affect encoding by focusing on facial encoding of sustained pain experiences. Facial expressions, pain ratings, and brain activity (BOLD-fMRI) during tonic heat pain were recorded in 27 healthy participants...
June 22, 2023: Cognitive, Affective & Behavioral Neuroscience
https://read.qxmd.com/read/37279041/enhancing-nonverbal-communication-through-virtual-human-technology-protocol-for-a-mixed-methods-study
#24
JOURNAL ARTICLE
Analay Perez, Michael D Fetters, John W Creswell, Mark Scerbo, Frederick W Kron, Richard Gonzalez, Lawrence An, Masahito Jimbo, Predrag Klasnja, Timothy C Guetterman
BACKGROUND: Communication is a critical component of the patient-provider relationship; however, limited research exists on the role of nonverbal communication. Virtual human training is an informatics-based educational strategy that offers various benefits in communication skill training directed at providers. Recent informatics-based interventions aimed at improving communication have mainly focused on verbal communication, yet research is needed to better understand how virtual humans can improve verbal and nonverbal communication and further elucidate the patient-provider dyad...
June 6, 2023: JMIR Research Protocols
https://read.qxmd.com/read/37246747/differences-between-high-and-low-self-critics-in-compassionate-facial-expression
#25
JOURNAL ARTICLE
Martina Baránková, Júlia Halamová, Bronislava Strnádelová, Martin Kanovský
The goal of this study was to identify differences between high and low self-critical participants in relation to compassionate facial expressions. Our convenience sample consisted of 151 participants aged 18-59 years old (M = 25.17; SD = 7.81). The highest- and lowest-scoring participants in self-criticism were selected for final analysis (N = 35). Participants, at home alone, watched a short video stimulus eliciting compassion while their facial expressions were recorded using webcams. Out of the sample we selected the highest 10% and the lowest 10% of self-critical participants according to the Slovak norms of The Forms of Self-Criticizing/Attacking and Self-Reassuring Scale...
May 29, 2023: Psychological Reports
https://read.qxmd.com/read/37244937/horses-equus-caballus-facial-micro-expressions-insight-into-discreet-social-information
#26
JOURNAL ARTICLE
Claude Tomberg, Maxime Petagna, Lucy-Anne de Selliers de Moranville
Facial micro-expressions are facial expressions produced briefly (less than 500 ms) and involuntarily. They had previously been described only in humans, so we investigated whether micro-expressions could also be expressed by non-human animal species. Using the Equine Facial Action Coding System (EquiFACS), an objective tool based on facial muscle actions, we demonstrated that a non-human species, Equus caballus, expresses facial micro-expressions in a social context. AU17, AD38 and AD1 were selectively modulated as micro-expressions, but not as standard facial expressions (all durations included), in the presence of a human experimenter...
May 27, 2023: Scientific Reports
https://read.qxmd.com/read/37179867/automated-facial-expression-analysis-of-participants-self-criticising-via-the-two-chair-technique-exploring-facial-behavioral-markers-of-self-criticism
#27
JOURNAL ARTICLE
Júlia Halamová, Martin Kanovský, Guilherme Brockington, Bronislava Strnádelová
INTRODUCTION: As self-rating scales are prone to many measurement distortions, there is a growing call for more objective measures based on physiological or behavioural indicators. Self-criticism is one of the major transdiagnostic factors across mental disorders; it is therefore important to be able to distinguish the characteristic facial features of self-criticism. To the best of our knowledge, there has been no automated facial emotion expression analysis of participants self-criticising via the two-chair technique...
2023: Frontiers in Psychology
https://read.qxmd.com/read/37166944/primate-socio-ecology-shapes-the-evolution-of-distinctive-facial-repertoires
#28
JOURNAL ARTICLE
Brittany N Florkiewicz, Linda S Oña, Leonardo Oña, Matthew W Campbell
Primate facial musculature enables a wide variety of movements during bouts of communication, but how these movements contribute to signal construction and repertoire size is unclear. The facial mobility hypothesis suggests that morphological constraints shape the evolution of facial repertoires: species with higher facial mobility will produce larger and more complex repertoires. In contrast, the socio-ecological complexity hypothesis suggests that social needs shape the evolution of facial repertoires: as social complexity increases, so does communicative repertoire size...
May 11, 2023: Journal of Comparative Psychology
https://read.qxmd.com/read/36842950/malocclusion-severity-and-smile-features-is-there-an-association
#29
JOURNAL ARTICLE
Hisham Mohammed, Reginald Kumar, Hamza Bennani, John Perry, Jamin B Halberstadt, Mauro Farella
INTRODUCTION: This observational study investigated the relationship between malocclusion and smiling. METHODS: Adolescents and young adults (n = 72; aged 16-25 years) were identified according to their Dental Aesthetic Index (DAI) and allocated to 3 groups: (1) a malocclusion group (n = 24; DAI ≥31); (2) a retention group (n = 24; pretreatment DAI ≥31) with a prior malocclusion that had been corrected by orthodontic treatment; and (3) a control group with no-to-minor malocclusion (n = 24; DAI ≤25)...
February 24, 2023: American Journal of Orthodontics and Dentofacial Orthopedics
https://read.qxmd.com/read/36730168/exploring-facial-expressions-and-action-unit-domains-for-parkinson-detection
#30
JOURNAL ARTICLE
Luis F Gomez, Aythami Morales, Julian Fierrez, Juan Rafael Orozco-Arroyave
BACKGROUND AND OBJECTIVE: Patients suffering from Parkinson's disease (PD) present a reduction in facial movements called hypomimia. In this work, we propose machine-learning-based facial expression analysis of face images, organized around action unit domains, to improve PD detection. We propose different domain adaptation techniques to exploit the latest advances in automatic face analysis and facial action unit detection. METHODS: Three different approaches are explored to model the facial expressions of PD patients: (i) face analysis using single-frame images and also sequences of images, (ii) transfer learning from face analysis to action unit recognition, and (iii) triplet-loss functions to improve the automatic classification of patients versus healthy subjects...
2023: PloS One
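The triplet-loss idea in approach (iii) of the abstract above can be illustrated with a minimal, framework-free sketch. This is a generic hinge-style triplet loss, not the authors' implementation; the margin value and the toy embedding vectors are hypothetical.

```python
def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss on embedding vectors: pull the anchor
    toward a same-class (positive) sample and push it at least `margin`
    farther from a different-class (negative) sample."""
    d_pos = sum((a - p) ** 2 for a, p in zip(anchor, positive))  # squared dist to positive
    d_neg = sum((a - n) ** 2 for a, n in zip(anchor, negative))  # squared dist to negative
    return max(d_pos - d_neg + margin, 0.0)
```

Training would minimize this quantity over many (patient, patient, control) and (control, control, patient) triplets, so that embeddings of the two classes separate by at least the margin.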
https://read.qxmd.com/read/36727808/-facial-expression-after-face-transplant-the-first-international-face-transplant-cohort-comparison
#31
JOURNAL ARTICLE
Miguel I Dorante, Alice T Wang, Branislav Kollar, Bridget J Perry, Mustafa G Ertosun, Andrew J Lindford, Emma-Lotta Kiukas, Ömer Özkan, Özlenen Özkan, Patrik Lassus, Bohdan Pomahac
BACKGROUND: Assessment of motor function restoration following face transplant (FT) is difficult as standardized, bilateral tests are lacking. This study aims to bolster support for software-based analysis through international collaboration. METHODS: FaceReader (Noldus, Wageningen, Netherlands), a facial expression analysis software, was used to analyze post-transplant videos of 8 FT patients from Boston, USA (range, 1-9 years), 2 FT patients from Helsinki, FIN (range, 3-4 years), and 3 FT patients from Antalya, TUR (range, 6...
January 24, 2023: Plastic and Reconstructive Surgery
https://read.qxmd.com/read/36633530/clinical-thresholds-in-pain-related-facial-activity-linked-to-differences-in-cortical-network-activation-in-neonates
#32
JOURNAL ARTICLE
Oana Bucsea, Mohammed Rupawala, Ilana Shiff, Xiaogang Wang, Judith Meek, Maria Fitzgerald, Lorenzo Fabrizi, Rebecca Pillai Riddell, Laura Jones
In neonates, a noxious stimulus elicits pain-related facial expression changes and distinct brain activity as measured by electroencephalography, but past research has revealed an inconsistent relationship between these responses. Facial activity is the most commonly used index of neonatal pain in clinical settings, with clinical thresholds determining if analgesia should be provided; however, we do not know if these thresholds are associated with differences in how the neonatal brain processes a noxious stimulus...
May 1, 2023: Pain
https://read.qxmd.com/read/36585439/explainable-automated-recognition-of-emotional-states-from-canine-facial-expressions-the-case-of-positive-anticipation-and-frustration
#33
JOURNAL ARTICLE
Tali Boneh-Shitrit, Marcelo Feighelstein, Annika Bremhorst, Shir Amir, Tomer Distelfeld, Yaniv Dassa, Sharon Yaroshetsky, Stefanie Riemer, Ilan Shimshoni, Daniel S Mills, Anna Zamansky
In animal research, automation of affective state recognition has so far mainly addressed pain, and only in a few species. Emotional states remain uncharted territory, especially in dogs, owing to the complexity of their facial morphology and expressions. This study contributes to filling this gap in two respects. First, it is the first to address dog emotional states using a dataset obtained in a controlled experimental setting, including videos from Labrador Retrievers (n = 29) assumed to be in two experimentally induced emotional states: negative (frustration) and positive (anticipation)...
December 30, 2022: Scientific Reports
https://read.qxmd.com/read/36277167/classification-of-elderly-pain-severity-from-automated-video-clip-facial-action-unit-analysis-a-study-from-a-thai-data-repository
#34
JOURNAL ARTICLE
Patama Gomutbutra, Adisak Kittisares, Atigorn Sanguansri, Noppon Choosri, Passakorn Sawaddiruk, Puriwat Fakfum, Peerasak Lerttrakarnnon, Sompob Saralamba
Data from 255 Thais with chronic pain were collected at Chiang Mai Medical School Hospital. After the patients self-rated their level of pain, a smartphone camera was used to capture their faces for 10 s at a one-meter distance. For those unable to self-rate, a video recording was made immediately after the movement that caused the pain. A trained assistant rated each video clip using the Pain Assessment in Advanced Dementia (PAINAD) scale. Pain was classified into three levels: mild, moderate, and severe. OpenFace© was used to convert the video clips into 18 facial action units (FAUs)...
2022: Frontiers in Artificial Intelligence
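The pipeline in the abstract above (18 FAU intensities per clip, mapped to a three-level pain label) can be sketched as follows. The study's actual classifier is not specified in the excerpt, so a simple nearest-centroid rule stands in for it; the feature vectors and labels are illustrative only.

```python
def fit_centroids(X, y):
    """Mean facial-action-unit (FAU) vector per pain class
    ('mild' / 'moderate' / 'severe'). X: list of FAU vectors."""
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    return {c: [sum(col) / len(rows) for col in zip(*rows)]
            for c, rows in by_class.items()}

def predict(centroids, x):
    """Assign the class whose centroid is closest in squared Euclidean distance."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, centroids[c]))
    return min(centroids, key=sq_dist)
```

In practice each element of `X` would be the per-clip mean of the 18 OpenFace AU intensities, with labels taken from the PAINAD ratings.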
https://read.qxmd.com/read/36253599/pain-e-motion-faces-database-pemf-pain-related-micro-clips-for-emotion-research
#35
JOURNAL ARTICLE
Roberto Fernandes-Magalhaes, Alberto Carpio, David Ferrera, Dimitri Van Ryckeghem, Irene Peláez, Paloma Barjola, María Eugenia De Lahoz, María Carmen Martín-Buro, José Antonio Hinojosa, Stefaan Van Damme, Luis Carretié, Francisco Mercado
A large number of publications have focused on the study of pain expressions. Despite this growing knowledge, pain-related face databases remain very scarce compared with those for other emotional facial expressions. The Pain E-Motion Faces Database (PEMF) is a new open-access database currently consisting of 272 micro-clips of 68 different identities. Each model displays one neutral expression and three pain-related facial expressions: posed, spontaneous-algometer, and spontaneous-CO2 laser. Normative ratings of pain intensity, valence and arousal were provided by students of three different European universities...
October 17, 2022: Behavior Research Methods
https://read.qxmd.com/read/36241674/automated-detection-of-pain-levels-using-deep-feature-extraction-from-shutter-blinds-based-dynamic-sized-horizontal-patches-with-facial-images
#36
JOURNAL ARTICLE
Prabal Datta Barua, Nursena Baygin, Sengul Dogan, Mehmet Baygin, N Arunkumar, Hamido Fujita, Turker Tuncer, Ru-San Tan, Elizabeth Palmer, Muhammad Mokhzaini Bin Azizan, Nahrizul Adib Kadri, U Rajendra Acharya
Pain intensity classification using facial images is a challenging problem in computer vision research. This work proposed a patch and transfer learning-based model to classify various pain intensities using facial images. The input facial images were segmented into dynamic-sized horizontal patches or "shutter blinds". A lightweight deep network DarkNet19 pre-trained on ImageNet1K was used to generate deep features from the shutter blinds and the undivided resized segmented input facial image. The most discriminative features were selected from these deep features using iterative neighborhood component analysis, which were then fed to a standard shallow fine k-nearest neighbor classifier for classification using tenfold cross-validation...
October 14, 2022: Scientific Reports
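The "shutter blinds" segmentation described above can be sketched as a simple horizontal split of a face image. This is a rough illustration only: the paper's patches are dynamic-sized, and the exact sizing rule is not given in the excerpt, so here strip heights are merely allowed to differ by at most one row; the patch count is hypothetical.

```python
def shutter_blind_patches(img, n_patches=4):
    """Split a face image (a list of pixel rows) into n horizontal strips
    ('shutter blinds'). When the height is not divisible by n_patches,
    strip heights differ by at most one row."""
    h = len(img)
    bounds = [round(i * h / n_patches) for i in range(n_patches + 1)]
    return [img[bounds[i]:bounds[i + 1]] for i in range(n_patches)]
```

Each strip, plus the full resized face image, would then be fed to the pretrained network to extract deep features.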
https://read.qxmd.com/read/36205621/automated-detection-of-smiles-as-discrete-episodes
#37
JOURNAL ARTICLE
Hisham Mohammed, Reginald Kumar, Hamza Bennani, Jamin B Halberstadt, Mauro Farella
BACKGROUND: Patients seeking restorative and orthodontic treatment expect an improvement in their smiles and oral health-related quality of life. Nonetheless, the qualitative and quantitative characteristics of dynamic smiles are yet to be understood. OBJECTIVE: To develop, validate, and introduce open-access software for automated analysis of smiles in terms of their frequency, genuineness, duration, and intensity. MATERIALS AND METHODS: A software script was developed using the Facial Action Coding System (FACS) and artificial intelligence to assess activations of (1) cheek raiser, a marker of smile genuineness; (2) lip corner puller, a marker of smile intensity; and (3) perioral lip muscles, a marker of lips apart...
October 7, 2022: Journal of Oral Rehabilitation
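Detecting smiles "as discrete episodes," as in the study above, amounts to segmenting an action-unit intensity time series into above-threshold runs. The sketch below is a hypothetical illustration, not the authors' software: it segments only a lip-corner-puller (AU12, the stated intensity marker) series, and the threshold value is arbitrary; the actual tool also uses the cheek raiser as a genuineness marker.

```python
def smile_episodes(au12, threshold=1.0):
    """Segment a lip-corner-puller (AU12) intensity time series into
    discrete smile episodes: maximal runs of frames whose intensity
    exceeds the threshold. Returns (start, end) frame indices, end exclusive."""
    episodes, start = [], None
    for i, v in enumerate(au12):
        if v > threshold and start is None:
            start = i                      # episode begins
        elif v <= threshold and start is not None:
            episodes.append((start, i))    # episode ends
            start = None
    if start is not None:                  # clip ended mid-smile
        episodes.append((start, len(au12)))
    return episodes
```

Episode frequency, duration, and peak intensity per episode then follow directly from the returned (start, end) spans.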
https://read.qxmd.com/read/36139191/can-ponies-equus-caballus-distinguish-human-facial-expressions
#38
JOURNAL ARTICLE
Katrina Merkies, Yuliia Sudarenko, Abigail J Hodder
Communication within a species is essential for access to resources, alerting to dangers, group facilitation and social bonding; human facial expressions are considered an important factor in one's ability to communicate with others. Evidence has shown that dogs and horses are able to distinguish positive and negative facial expressions when observing photographs of humans; however, there is currently no research on how facial expressions from a live human are perceived by horses. This study investigated how ponies distinguish facial expressions presented by live actors...
September 7, 2022: Animals: An Open Access Journal From MDPI
https://read.qxmd.com/read/36080983/predicting-perceived-exhaustion-in-rehabilitation-exercises-using-facial-action-units
#39
JOURNAL ARTICLE
Christopher Kreis, Andres Aguirre, Carlos A Cifuentes, Marcela Munera, Mario F Jiménez, Sebastian Schneider
Physical exercise has become an essential tool for treating various non-communicable (chronic) diseases, as it can counter a range of symptoms and reduce certain mortality risk factors without medication. One way to support people in doing exercises is to use artificial systems that monitor their exercise progress. While one crucial aspect is monitoring correct physical motion during rehabilitative exercise, another essential element is giving encouraging feedback during workouts...
August 30, 2022: Sensors
https://read.qxmd.com/read/36018766/unique-pain-responses-in-different-etiological-subgroups-of-intellectual-and-developmental-disabilities
#40
JOURNAL ARTICLE
Ruth Defrin, Tali Benromano, Chaim G Pick
We studied whether pain responses vary between different etiological subgroups of intellectual and developmental disability (IDD). Self-reports and facial expressions (Facial Action Coding System; FACS) were recorded during experimental pressure stimuli and compared among 31 individuals with IDD (13 with cerebral palsy (CP), nine with Down syndrome (DS), and nine of unspecified origin (UIDD)) and 15 typically developing controls (TDCs). The CP and DS groups had higher pain ratings and FACS scores than the UIDD and TDC groups, and steeper stimulus-response functions...
September 1, 2022: American Journal on Intellectual and Developmental Disabilities