Deep learning-based osteochondritis dissecans detection in ultrasound images with humeral capitellum localization.
International Journal of Computer Assisted Radiology and Surgery 2024 January 18
PURPOSE: Osteochondritis dissecans (OCD) of the humeral capitellum is a common cause of elbow disorders, particularly among young throwing athletes. Conservative treatment is preferred for managing OCD, and early intervention significantly influences the likelihood of complete disease resolution. The purpose of this study is to develop a deep learning-based classification model for ultrasound images for computer-aided diagnosis.
METHODS: This paper proposes a deep learning-based OCD classification method for ultrasound images. The proposed method first detects the humeral capitellum using YOLO and then estimates the OCD probability of the detected region using VGG16. We hypothesize that performance will be improved by eliminating unnecessary regions. To validate the performance of the proposed method, it was applied to 158 subjects (OCD: 67, Normal: 91) using five-fold cross-validation.
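The two-stage pipeline described above (localize the capitellum, then classify only the cropped region) can be sketched as follows. This is a minimal illustration, not the authors' implementation: `detect_fn` and `classify_fn` are hypothetical stand-ins for the trained YOLO detector and VGG16 classifier, which are not available from the abstract.

```python
import numpy as np

def crop_to_box(image, box):
    """Crop a 2-D image array to a bounding box given as (x1, y1, x2, y2)."""
    x1, y1, x2, y2 = box
    return image[y1:y2, x1:x2]

def classify_ocd(image, detect_fn, classify_fn):
    """Two-stage OCD classification sketch.

    detect_fn:   stand-in for the YOLO capitellum detector;
                 returns a bounding box (x1, y1, x2, y2).
    classify_fn: stand-in for the VGG16 classifier;
                 returns an OCD probability for the cropped region.
    """
    box = detect_fn(image)           # stage 1: localize the humeral capitellum
    roi = crop_to_box(image, box)    # discard regions outside the capitellum
    return classify_fn(roi)          # stage 2: estimate OCD probability
```

The key design choice the paper tests is exactly this cropping step: restricting the classifier's input to the detected capitellum, rather than the full ultrasound frame, removes irrelevant anatomy from the decision.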
RESULTS: The study demonstrated that the humeral capitellum detection achieved a mean average precision (mAP) of over 0.95, while OCD probability estimation achieved an average accuracy of 0.890, precision of 0.888, recall of 0.927, F1 score of 0.894, and an area under the curve (AUC) of 0.962. In contrast, when the classification model was constructed for the entire image, accuracy, precision, recall, F1 score, and AUC were 0.806, 0.806, 0.932, 0.843, and 0.928, respectively. The findings suggest the high-performance potential of the proposed model for OCD classification in ultrasound images.
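The reported accuracy, precision, recall, and F1 score all derive from the standard confusion-matrix definitions. A short sketch of how such figures are computed (the counts below are illustrative only, not the paper's data):

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total               # fraction of all correct calls
    precision = tp / (tp + fp)                 # positive predictive value
    recall = tp / (tp + fn)                    # sensitivity
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return accuracy, precision, recall, f1
```

For example, 9 true positives, 1 false positive, 1 false negative, and 9 true negatives yield 0.9 for all four metrics. Note that recall was high (0.932) even for the whole-image model; the gain from capitellum localization shows up mainly in accuracy and precision.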
CONCLUSION: This paper introduces a deep learning-based OCD classification method. The experimental results emphasize the effectiveness of focusing on the humeral capitellum for OCD classification in ultrasound images. Future work should involve evaluating the effectiveness of employing the proposed method by physicians during medical check-ups for OCD.