
Improved deep belief network for estimating mango quality indices and grading: A computer vision-based neutrosophic approach.

This research introduces a novel machine learning-based quality estimation and grading system for mangoes. The proposed work comprises four main parts: pre-processing, neutrosophic model transformation, feature extraction, and grading. The raw images are first pre-processed in five stages: reading, resizing, noise removal, contrast enhancement via CLAHE, and smoothing via filtering. The pre-processed images are then transformed into the neutrosophic domain, using a new geometric-mean-based neutrosophic approach, for more effective mango grading. Finally, TSS under the different chilling conditions is predicted by an Improved Deep Belief Network (IDBN), and based on this prediction the mangoes are graded automatically by the already-trained model. The prediction of TSS takes SSC, firmness, and TAC into account. A comparison between the proposed and traditional methods across various metrics confirms the efficacy of the proposed approach.
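The abstract does not give the exact formulation of the geometric-mean-based neutrosophic transform. As a minimal illustrative sketch, the code below follows the classic neutrosophic image decomposition into truth (T), indeterminacy (I), and falsity (F) components, but substitutes a local *geometric* mean for the usual arithmetic local mean; the window size, normalisation, and indeterminacy measure are assumptions, not the authors' method.

```python
import numpy as np

def neutrosophic_transform(gray, win=5):
    """Map a grayscale image into an assumed neutrosophic domain (T, I, F).

    Sketch only: T is the normalised local geometric mean, I is the
    normalised absolute deviation of each pixel from that mean, and
    F = 1 - T. The published method may differ in all three choices.
    """
    g = gray.astype(np.float64) + 1.0          # shift to avoid log(0)
    pad = win // 2
    logs = np.log(np.pad(g, pad, mode="edge")) # geometric mean = exp(mean of logs)
    h, w = g.shape
    gm = np.empty_like(g)
    for i in range(h):
        for j in range(w):
            gm[i, j] = np.exp(logs[i:i + win, j:j + win].mean())
    eps = 1e-12
    T = (gm - gm.min()) / (gm.max() - gm.min() + eps)   # truth membership
    d = np.abs(g - gm)                                  # deviation from local mean
    I = (d - d.min()) / (d.max() - d.min() + eps)       # indeterminacy membership
    F = 1.0 - T                                         # falsity membership
    return T, I, F
```

Each returned component has the same shape as the input and lies in [0, 1], so downstream feature extraction can treat the three maps as additional image channels.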

