
Task-Oriented GAN for PolSAR Image Classification and Clustering.

A novel variant of the generative adversarial network (GAN), named the Task-Oriented GAN, is proposed to tackle two difficulties in PolSAR image interpretation: PolSAR data analysis and the small-sample problem. Besides the two typical parts of a GAN, i.e., the generator (G-Net) and the discriminator (D-Net), the Task-Oriented GAN contains a third part, the TaskNet (T-Net), which is employed to accomplish a specific task. Two tasks, PolSAR image classification and clustering, are studied in this paper, where T-Net acts as a classifier and a clusterer, respectively. The learning procedure of the Task-Oriented GAN consists of two main stages. In the first stage, G-Net and D-Net compete with each other as in a general GAN; in the second stage, G-Net is adjusted and oriented by T-Net so that it generates more samples, called fake data, that benefit the task. As a result, the Task-Oriented GAN not only retains the advantage of a GAN (assumption-free data modeling) but also overcomes its disadvantage (being task-free). After learning, the fake data are used to enrich the training set and avoid overfitting, so the Task-Oriented GAN performs well even when few manually labeled samples are available. To verify the effectiveness of T-Net, a visual comparison is provided in which fake digits generated by the Task-Oriented GAN are shown alongside those generated by a plain GAN. Moreover, considering the great difference between PolSAR data and general data, PolSAR-specific information is inserted into the structure of the Task-Oriented GAN for the classification and clustering tasks. This enables researchers to mine the inherent information in PolSAR data without any data hypothesis while also addressing the small-sample problem. Experimental results on three PolSAR images show that the proposed method performs well on PolSAR image classification and clustering.
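The two-stage learning procedure described above can be outlined as a simple control-flow sketch. This is a minimal illustration, not the authors' implementation: the step functions (`g_step`, `d_step`, `t_step`) and parameter names are hypothetical placeholders for the actual network updates, which the abstract does not specify.

```python
def train_task_oriented_gan(g_step, d_step, t_step, stage1_iters, stage2_iters):
    """Hypothetical sketch of the two-stage Task-Oriented GAN schedule.

    g_step, d_step, t_step are assumed callables that perform one update
    of G-Net, D-Net, and T-Net respectively (placeholders, not real APIs).
    """
    # Stage 1: G-Net and D-Net compete as in a general GAN.
    for _ in range(stage1_iters):
        d_step()  # update the discriminator against current fakes
        g_step()  # update the generator against the discriminator

    # Stage 2: T-Net (classifier or clusterer) orients G-Net so that
    # the generated fake data benefit the downstream task.
    for _ in range(stage2_iters):
        t_step()  # update the TaskNet on real + fake samples
        g_step()  # adjust the generator using T-Net's feedback
```

After this schedule, the fake data produced by G-Net would be added to the training set of T-Net, which is how the paper reports mitigating the small-sample problem.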
