Article type: Research Article
Authors: Luo, Wei[a] | Feng, Tao[b, c, *] | Liang, Hong[a]
Affiliations: [a] University Key Laboratory of Internet of Things Technology and Application, Yunnan Province, College of Information, Yunnan University, Kunming, China | [b] College of Information, Yunnan University of Finance and Economics, Kunming, China | [c] Yunnan Erhai Lake Ecosystem National Field Scientific Observation and Research Station, Dali, China
Correspondence: [*] Corresponding author. Tao Feng, E-mail: [email protected].
Note: [1] This work was supported in part by the National Natural Science Foundation of China under Grant 62166045, the Yunnan Erhai Lake Ecosystem National Field Scientific Observation and Research Station Seed Project under Grant 2020ZZ05, and the Postgraduate Research and Innovation Foundation of Yunnan University under Grant 2021Y259.
Abstract: Change detection in synthetic aperture radar (SAR) images is an important part of remote sensing (RS) image analysis. Contemporary research has concentrated on spatial and deep-layer semantic information while giving little attention to the extraction of multidimensional and shallow-layer feature representations. Furthermore, change detection typically relies on patch-wise training and pixel-to-pixel prediction, so its accuracy is sensitive to the introduction of edge noise and to the availability of original position information. To address these challenges, we propose a new neural network structure that enables spatial-frequency-temporal feature extraction through end-to-end training for change detection between SAR images acquired at two different points in time. Our method feeds image patches into three parallel network structures: a densely connected convolutional neural network (CNN), a frequency-domain processing network based on the discrete cosine transform (DCT), and a recurrent neural network (RNN). The resulting multi-dimensional feature representations alleviate speckle noise and provide a comprehensive treatment of semantic information. We also propose an ensemble multi-region-channel module (MRCM) that emphasizes the central region of each feature map and employs the most critical information in each channel for binary classification. We validate the proposed method on four benchmark SAR datasets, and the experimental results demonstrate its competitive performance.
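To illustrate the three-branch spatial-frequency-temporal idea described in the abstract, the following is a minimal, hypothetical PyTorch sketch of a patch classifier with a densely connected convolutional branch, a DCT-based frequency branch, and an RNN temporal branch over the two acquisition dates. The patch size, layer widths, orthonormal DCT handling, and simple concatenation head are all assumptions for illustration; it does not reproduce the authors' SFTNet or the MRCM module.

import math
import torch
import torch.nn as nn


def dct_matrix(n: int) -> torch.Tensor:
    """Orthonormal DCT-II basis matrix of size n x n."""
    k = torch.arange(n).unsqueeze(1).float()   # frequency index
    i = torch.arange(n).unsqueeze(0).float()   # sample index
    d = torch.cos(math.pi / n * (i + 0.5) * k)
    d[0] *= 1.0 / math.sqrt(2.0)
    return d * math.sqrt(2.0 / n)


class SFTSketch(nn.Module):
    """Illustrative three-branch feature extractor for bi-temporal SAR patches."""

    def __init__(self, patch: int = 7, hidden: int = 32):
        super().__init__()
        # Spatial branch: two conv layers with a dense (concatenated) connection.
        self.conv1 = nn.Conv2d(2, hidden, 3, padding=1)
        self.conv2 = nn.Conv2d(2 + hidden, hidden, 3, padding=1)
        self.spatial_fc = nn.Linear(hidden * patch * patch, hidden)
        # Frequency branch: 2-D DCT of each date's patch, then a linear layer.
        self.register_buffer("dct", dct_matrix(patch))
        self.freq_fc = nn.Linear(2 * patch * patch, hidden)
        # Temporal branch: GRU over the two dates' flattened patches.
        self.rnn = nn.GRU(patch * patch, hidden, batch_first=True)
        # Fusion head for binary changed / unchanged classification.
        self.head = nn.Linear(3 * hidden, 2)

    def forward(self, x):                       # x: (B, 2, P, P), two dates stacked
        s = torch.relu(self.conv1(x))
        s = torch.relu(self.conv2(torch.cat([x, s], dim=1)))
        s = torch.relu(self.spatial_fc(s.flatten(1)))
        # 2-D DCT per date: D X D^T, then flatten the coefficients.
        f = self.dct @ x @ self.dct.T
        f = torch.relu(self.freq_fc(f.flatten(1)))
        # Sequence of length 2 (one step per date); keep the final hidden state.
        _, h = self.rnn(x.flatten(2))
        r = h.squeeze(0)
        return self.head(torch.cat([s, f, r], dim=1))


if __name__ == "__main__":
    model = SFTSketch()
    patches = torch.randn(4, 2, 7, 7)            # four toy bi-temporal 7x7 patches
    print(model(patches).shape)                  # torch.Size([4, 2])

In this sketch the fused features are classified directly by a linear head; in the paper this role is played by the proposed MRCM, which weights central regions and informative channels before the binary decision.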
Keywords: Change detection, SFTNet, feature extraction, synthetic aperture radar (SAR) images, deep learning, neural network
DOI: 10.3233/JIFS-220689
Journal: Journal of Intelligent & Fuzzy Systems, vol. 44, no. 1, pp. 783-800, 2023