TY - JOUR
T1 - SAR-to-optical image translation using supervised cycle-consistent adversarial networks
AU - Wang, Lei
AU - Xu, Xin
AU - Yu, Yue
AU - Yang, Rui
AU - Gui, Rong
AU - Xu, Zhaozhuo
AU - Pu, Fangling
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2019
Y1 - 2019
N2 - Optical remote sensing (RS) data suffer from the limitations of bad weather and cloud contamination, whereas synthetic aperture radar (SAR) can work under all weather conditions and overcome this disadvantage of optical RS data. However, due to the imaging mechanism of SAR and the presence of speckle noise, it is difficult for untrained people to visually recognize land cover types from SAR images. Inspired by the excellent image-to-image translation performance of Generative Adversarial Networks (GANs), a supervised Cycle-Consistent Adversarial Network (S-CycleGAN) was proposed to generate large optical images from SAR images. When optical RS data are unavailable or only partly available, the generated optical images can serve as alternative data that aid untrained people in visual land cover recognition. The main steps of SAR-to-optical image translation were as follows. First, the large SAR image was split into small patches. Then, S-CycleGAN was used to translate the SAR patches into optical image patches. Finally, the optical image patches were stitched together to generate the large optical image. A paired SAR-optical image dataset covering 32 Chinese cities was published to evaluate the proposed method. The dataset was generated from Sentinel-1 (SEN-1) SAR images and Sentinel-2 (SEN-2) multi-spectral images. S-CycleGAN was applied in two experiments, SAR-to-optical image translation and cloud removal, and the results showed that S-CycleGAN preserved both land cover and structure information well, with performance superior to several well-known image-to-image translation models.
AB - Optical remote sensing (RS) data suffer from the limitations of bad weather and cloud contamination, whereas synthetic aperture radar (SAR) can work under all weather conditions and overcome this disadvantage of optical RS data. However, due to the imaging mechanism of SAR and the presence of speckle noise, it is difficult for untrained people to visually recognize land cover types from SAR images. Inspired by the excellent image-to-image translation performance of Generative Adversarial Networks (GANs), a supervised Cycle-Consistent Adversarial Network (S-CycleGAN) was proposed to generate large optical images from SAR images. When optical RS data are unavailable or only partly available, the generated optical images can serve as alternative data that aid untrained people in visual land cover recognition. The main steps of SAR-to-optical image translation were as follows. First, the large SAR image was split into small patches. Then, S-CycleGAN was used to translate the SAR patches into optical image patches. Finally, the optical image patches were stitched together to generate the large optical image. A paired SAR-optical image dataset covering 32 Chinese cities was published to evaluate the proposed method. The dataset was generated from Sentinel-1 (SEN-1) SAR images and Sentinel-2 (SEN-2) multi-spectral images. S-CycleGAN was applied in two experiments, SAR-to-optical image translation and cloud removal, and the results showed that S-CycleGAN preserved both land cover and structure information well, with performance superior to several well-known image-to-image translation models.
KW - cloud removal
KW - GAN
KW - SAR-to-optical image translation
KW - Sentinel
KW - visualization
UR - http://www.scopus.com/inward/record.url?scp=85078034428&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85078034428&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2019.2939649
DO - 10.1109/ACCESS.2019.2939649
M3 - Article
AN - SCOPUS:85078034428
VL - 7
SP - 129136
EP - 129149
JO - IEEE Access
JF - IEEE Access
M1 - 8825802
ER -