Using person-specific neural networks to characterize heterogeneity in eating disorders: Illustrative links between emotional eating and ovarian hormones.
OBJECTIVE: Emotional eating has been linked to ovarian hormone functioning, but no studies to date have considered the role of brain function. This knowledge gap may stem from methodological challenges: Data are heterogeneous, violating assumptions of homogeneity made by between-subjects analyses. The primary aim of this paper is to describe an innovative within-subjects analysis that models heterogeneity and has potential for filling knowledge gaps in eating disorder research. We illustrate its utility in an application to pilot neuroimaging, hormone, and emotional eating data across the menstrual cycle.
METHOD: Group iterative multiple model estimation (GIMME) is a person-specific network approach for estimating sample-, subgroup-, and individual-level connections between brain regions. To illustrate its potential for eating disorder research, we apply it to pilot data from 10 female twins (N = 5 pairs) discordant for emotional eating and/or anxiety, who provided two resting state fMRI scans and hormone assays. We then demonstrate how the multimodal data can be linked in multilevel models.
RESULTS: GIMME generated person-specific neural networks that contained connections common across the sample, shared between co-twins, and unique to individuals. Illustrative analyses revealed positive relations between hormones and default mode connectivity strength for control twins, but no relations for their co-twins who engaged in emotional eating or had anxiety.
DISCUSSION: This paper showcases the value of person-specific neuroimaging network analysis and its multimodal associations in the study of heterogeneous biopsychosocial phenomena, such as eating behavior.
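The METHOD section describes linking GIMME-derived connectivity estimates to hormone assays in multilevel models. The sketch below illustrates that linkage step in Python using statsmodels' mixed-effects API; the data are simulated and all variable names (twin_id, estradiol, dmn_strength) are hypothetical, so it should be read as an illustration of the general approach under stated assumptions, not the authors' pipeline.

```python
# Hedged sketch only: NOT the authors' analysis code. It illustrates,
# with simulated data and hypothetical variable names, how person-specific
# connectivity estimates (e.g., GIMME-derived default mode connectivity
# strength) might be linked to hormone assays in a multilevel model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_twins, n_obs = 10, 30  # hypothetical: 10 twins, 30 assessments each
rows = []
for t in range(n_twins):
    estradiol = rng.normal(3.5, 0.8, n_obs)      # simulated hormone assays
    intercept = 0.4 + rng.normal(0, 0.05)        # twin-specific baseline
    slope = 0.08 if t % 2 == 0 else 0.0          # positive in controls only
    dmn = intercept + slope * (estradiol - 3.5) + rng.normal(0, 0.03, n_obs)
    rows.append(pd.DataFrame({
        "twin_id": t,
        "group": "control" if t % 2 == 0 else "case",
        "estradiol": estradiol,
        "dmn_strength": dmn,
    }))
data = pd.concat(rows, ignore_index=True)

# Random intercept per twin; the estradiol-by-group interaction asks
# whether the hormone-connectivity relation differs between control
# twins and their affected co-twins.
model = smf.mixedlm("dmn_strength ~ estradiol * group",
                    data, groups=data["twin_id"])
print(model.fit().summary())
```

In this toy setup, a positive estradiol coefficient for the control group alongside a near-zero slope for cases would mirror the pattern of results reported in the abstract; the random intercept absorbs stable between-twin differences in baseline connectivity.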