A miniature U-net for k-space-based parallel magnetic resonance imaging reconstruction with a mixed loss function.
Quantitative Imaging in Medicine and Surgery 2022 September
Background: Deep learning-based magnetic resonance imaging (MRI) reconstruction methods in most cases require a separate dataset of thousands of images for each anatomical site to train the network model. This paper proposes a miniature U-net method for k-space-based parallel MRI in which the network model is trained individually for each scan using scan-specific autocalibrating signal (ACS) data.
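The abstract does not publish the network architecture, so the following is a minimal sketch of what a "miniature" U-net for multi-coil k-space could look like, assuming a PyTorch implementation; the single downsampling stage, 16 base channels, and the stacking of real and imaginary parts along the channel axis are all illustrative assumptions, not the authors' exact design.

```python
# A minimal sketch of a miniature U-net for k-space interpolation,
# assuming PyTorch. Layer counts and channel widths are illustrative.
import torch
import torch.nn as nn

class MiniUNet(nn.Module):
    def __init__(self, n_coils: int, base_ch: int = 16):
        super().__init__()
        in_ch = 2 * n_coils  # real + imaginary parts per coil
        self.enc = nn.Sequential(
            nn.Conv2d(in_ch, base_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base_ch, base_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.down = nn.MaxPool2d(2)
        self.bottleneck = nn.Sequential(
            nn.Conv2d(base_ch, 2 * base_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.up = nn.ConvTranspose2d(2 * base_ch, base_ch, 2, stride=2)
        self.dec = nn.Sequential(
            nn.Conv2d(2 * base_ch, base_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(base_ch, in_ch, 3, padding=1),  # back to k-space channels
        )

    def forward(self, x):
        e = self.enc(x)                    # full-resolution features
        b = self.bottleneck(self.down(e))  # single downsampling stage
        d = torch.cat([self.up(b), e], dim=1)  # skip connection
        return self.dec(d)
```

Keeping the network this small is what makes scan-specific training plausible: the handful of ACS lines acquired in each scan can only support a model with few trainable parameters.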
Methods: The original U-net was tailored with fewer layers and channels, and the network was trained on the autocalibrating signal data with a mixed loss function combining a magnitude loss and a phase loss. The performance of the proposed method was evaluated on both phantom and in vivo datasets and compared with scan-specific robust artificial-neural-networks for k-space interpolation (RAKI) and generalized autocalibrating partially parallel acquisitions (GRAPPA).
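The abstract names a mixed loss with magnitude and phase terms but not its exact form, so the sketch below is one plausible formulation under assumptions: L1 norms for both terms, a weighting parameter `alpha` that the paper does not specify, and phases compared through unit phasors to sidestep 2π wrap-around.

```python
# A minimal sketch of a mixed magnitude/phase loss on complex k-space,
# assuming PyTorch. The norms and the weighting `alpha` are assumptions.
import torch

def mixed_loss(pred: torch.Tensor, target: torch.Tensor, alpha: float = 0.5):
    """pred, target: complex-valued k-space tensors of identical shape."""
    # Magnitude term: L1 distance between complex magnitudes.
    mag_loss = torch.mean(torch.abs(pred.abs() - target.abs()))
    # Phase term: compare unit phasors so that phases near +/-pi
    # do not incur a spurious 2*pi penalty.
    unit_pred = torch.polar(torch.ones_like(pred.real), pred.angle())
    unit_tgt = torch.polar(torch.ones_like(target.real), target.angle())
    phase_loss = torch.mean(torch.abs(unit_pred - unit_tgt))
    return alpha * mag_loss + (1.0 - alpha) * phase_loss
```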
Results: The proposed method alleviates aliasing artifacts and reduces noise at an acceleration factor of four for both phantom and in vivo data. Compared with RAKI and GRAPPA, the proposed method improves the structural similarity index measure (SSIM) by 0.02 to 0.05 and the peak signal-to-noise ratio (PSNR) by 0.1 to 3 dB.
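For reference, a minimal sketch of how such SSIM and PSNR comparisons are typically computed, assuming scikit-image and hypothetical magnitude images `ref` (fully sampled reference) and `recon` (reconstruction); PSNR is expressed in decibels by definition.

```python
# A minimal sketch of the SSIM/PSNR evaluation, assuming scikit-image.
# `ref` and `recon` are hypothetical 2D magnitude images.
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def evaluate(ref: np.ndarray, recon: np.ndarray):
    rng = float(ref.max() - ref.min())
    ssim = structural_similarity(ref, recon, data_range=rng)
    psnr = peak_signal_noise_ratio(ref, recon, data_range=rng)  # in dB
    return ssim, psnr
```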
Conclusions: The proposed method introduces a miniature U-net to reconstruct the missing k-space data, providing a favorable trade-off between network performance and the amount of training data required. Experimental results indicate that the proposed method improves image quality compared with the existing deep learning-based k-space parallel magnetic resonance imaging method.