Multi-atlas Based Segmentation Editing with Interaction-Guided Constraints.
We propose a novel multi-atlas based segmentation method for the editing scenario, in which an incomplete segmentation is given along with a set of training label images. Unlike previous multi-atlas based methods, which depend solely on appearance features, we incorporate interaction-guided constraints to find appropriate training labels and derive their voting weights. Specifically, we divide user interactions, provided on erroneous parts, into multiple local interaction combinations, and then locally search for the training label patches that match both each interaction combination and the previous segmentation. We then estimate the new segmentation through label fusion of the selected label patches, with weights defined by their respective distances to the interactions. Since the label patches are drawn from different interaction combinations, our method can capture a variety of shape changes even with limited training labels and few user interactions. Because it requires neither image information nor expensive learning steps, it can be conveniently applied to most editing problems. To demonstrate its performance, we apply our method to editing segmentations of three challenging data sets: prostate CT, brainstem CT, and hippocampus MR. The results show that our method outperforms existing editing methods on all three data sets.
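The distance-based voting weights described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the Gaussian weighting kernel, and the patch/interaction representation are all assumptions made for the example. It shows the core idea of weighting selected label patches by their proximity to user interactions before fusing them by weighted voting.

```python
import numpy as np

def fuse_label_patches(label_patches, patch_centers, interaction_points, sigma=5.0):
    """Hypothetical sketch of interaction-weighted label fusion.

    label_patches: (K, H, W) binary label patches selected from training labels
    patch_centers: (K, 2) coordinates of each patch in image space
    interaction_points: (M, 2) coordinates of user interactions on erroneous parts
    sigma: assumed Gaussian bandwidth controlling the distance falloff
    """
    # Distance from each patch to its nearest user interaction:
    # patches closer to an interaction get a larger say in the vote.
    dists = np.min(
        np.linalg.norm(
            patch_centers[:, None, :] - interaction_points[None, :, :], axis=-1
        ),
        axis=1,
    )  # shape (K,)
    weights = np.exp(-(dists ** 2) / (2 * sigma ** 2))
    weights /= weights.sum()

    # Weighted voting: blend the label patches and threshold at 0.5.
    fused = np.tensordot(weights, label_patches, axes=1)  # shape (H, W)
    return (fused >= 0.5).astype(np.uint8)
```

In this sketch, a patch sitting far from every interaction contributes almost nothing to the fused result, which mirrors the paper's idea that interactions guide which training labels dominate the edited segmentation.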