Deep learning enables automatic quantitative assessment of the puborectalis muscle and the urogenital hiatus in the plane of minimal hiatal dimensions.
Ultrasound in Obstetrics & Gynecology 2018 November 22
OBJECTIVES: To measure, automatically and observer-independently, the length, width and area of the urogenital hiatus (UH), and the length and mean echo intensity (MEP) of the puborectalis muscle (PRM), in the plane of minimal hiatal dimensions on transperineal ultrasound (TPUS) images, by automatic segmentation of the UH and the PRM using deep learning.
METHODS: In 1318 3D/4D TPUS volume datasets, images of the plane of minimal hiatal dimensions were manually obtained and the UH and the PRM were manually segmented. These images came from 253 nulliparae at 12 and 36 weeks of pregnancy, with the PRM at rest, on contraction and on Valsalva. A total of 713 images were used to train a convolutional neural network (CNN) to automatically segment the UH and the PRM in the plane of minimal hiatal dimensions. On the remaining dataset (test set 1, TS1; 601 images, 4 images excluded), the performance of the CNN was evaluated against the manual segmentations. The CNN was also tested on 119 images from an independent dataset of 40 nulliparae at 12 weeks of pregnancy, acquired and manually segmented by a different observer (test set 2, TS2; 2 images excluded). For these segmentations, segmentation success was scored manually. From the CNN segmentations the following clinically relevant parameters were measured: the length, width and area of the UH, and the length and mean echo intensity of the PRM. The overlap (Dice similarity index, DSI) and surface distances (mean absolute distance, MAD; Hausdorff distance, HDD) between manual and CNN segmentations were measured to assess their similarity. For the measured clinically relevant parameters, intraclass correlation coefficients (ICCs) between manual and CNN results were determined.
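The similarity metrics named above (DSI, MAD, HDD) have standard definitions. As an illustration only, not the paper's implementation, a minimal NumPy sketch on toy binary masks could look like this (here MAD is simplified to use all foreground pixels rather than contour pixels only):

```python
import numpy as np

def dice_similarity(a, b):
    """Dice similarity index: 2*|A intersect B| / (|A| + |B|) for two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * (a & b).sum() / denom if denom else 1.0

def _nearest_distances(a, b):
    # Distance from each foreground pixel of a to its nearest foreground pixel of b.
    pa = np.argwhere(a).astype(float)
    pb = np.argwhere(b).astype(float)
    d = np.sqrt(((pa[:, None, :] - pb[None, :, :]) ** 2).sum(axis=-1))
    return d.min(axis=1)

def mean_absolute_distance(a, b):
    """Symmetric mean distance (simplified: over all foreground pixels, not contours)."""
    return 0.5 * (_nearest_distances(a, b).mean() + _nearest_distances(b, a).mean())

def hausdorff_distance(a, b):
    """Symmetric Hausdorff distance between the two foreground pixel sets."""
    return max(_nearest_distances(a, b).max(), _nearest_distances(b, a).max())

# Toy 10x10 masks standing in for a manual and a CNN segmentation
manual = np.zeros((10, 10), dtype=np.uint8)
manual[2:8, 2:8] = 1
cnn = np.zeros((10, 10), dtype=np.uint8)
cnn[3:8, 2:8] = 1

print(round(dice_similarity(manual, cnn), 3))  # → 0.909
print(hausdorff_distance(manual, cnn))         # → 1.0
```

Higher DSI (maximum 1) means better overlap; lower MAD and HDD mean the segmentation boundaries lie closer together.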
RESULTS: Fully automatic CNN segmentation was successful in 99.0% and 93.2% of images in TS1 and TS2, respectively. DSI, MAD and HDD showed good overlap and small distances between manual and CNN segmentations in both test sets. This was reflected in the ICC values for UH length (0.96 and 0.95), width (0.77 and 0.87) and area (0.96 and 0.91), PRM length (0.87 and 0.73) and MEP (0.95 and 0.97) in TS1 and TS2, respectively, which indicated good to very good agreement.
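The abstract does not state which ICC form was used, so purely as an illustration, a two-way random effects, absolute agreement, single-measures ICC(2,1) can be computed from an (n subjects × k raters) array, with the manual and CNN measurements acting as the two "raters":

```python
import numpy as np

def icc2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.
    x has shape (n subjects, k raters), e.g. manual vs CNN measurements."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-rater means
    ss_total = ((x - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)             # mean square, subjects
    msc = ss_cols / (k - 1)             # mean square, raters
    mse = ss_err / ((n - 1) * (k - 1))  # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Perfect agreement between the two "raters" gives ICC = 1; a constant offset
# lowers it, because ICC(2,1) penalizes absolute (not just relative) disagreement.
print(icc2_1([[1, 1], [2, 2], [3, 3]]))  # → 1.0
```

Values near 1 indicate that the CNN measurements can substitute for the manual ones; consistency-type ICCs such as ICC(3,1) would ignore a systematic offset between methods.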
CONCLUSION: Deep learning can be used to automatically and reliably segment the PRM and the UH in 2D, in the plane of minimal hiatal dimensions, of the nulliparous female pelvic floor. These segmentations can be used to reliably measure the clinically relevant parameters: hiatal dimensions, PRM length and MEP.