Automatic weed monitoring and classification are critical for effective site-specific weed management. With the increasing availability of sensors, weed management can draw on a wide range of images captured from various remote sensing platforms. A deep convolutional neural network (CNN) can learn sophisticated spectral, spatial, and structural features to discriminate weed species, but training a separate CNN architecture for each dataset with limited training samples is challenging. In this study, we develop a partially transferable CNN to cope with a new dataset that differs in spatial resolution, number of bands, and signal-to-noise ratio, with the goal of making training for each new dataset less demanding. We conducted a series of experiments on simulated image datasets from two sensors. This study reveals that the dropout layers between the convolutional layers have a significant impact on the partially transferable CNN. Transferring the even-numbered subset of layers from the source CNN has a stronger impact when the source and target differ in spatial resolution. When the source and target datasets differ in the number of bands, all layers except the first convolutional layer are used for the analysis. Results show that network transfer is possible when the numbers of bands of the two datasets are not very different. For variation in the signal-to-noise ratio, the performance of transfer learning is acceptable when the noise level is not high. Based on these findings, experiments were conducted on two real datasets from two sensors, which include all of these variations. Comparison with several state-of-the-art models shows that partial CNN transfer with even-numbered layers provides better mapping accuracy for the target dataset when the number of training samples is limited.
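The layer-subset transfer described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the helper names (`make_cnn`, `partial_transfer`), the layer shapes, and the choice of 0-based indices 1, 3, 5 for the "even-numbered" layers are all illustrative assumptions. The first convolutional layer is deliberately excluded, mirroring the rule for source and target datasets with different band counts.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_cnn(num_layers=6, filters=8, kernel=3, bands=4):
    # Toy stand-in for a CNN: one random weight tensor per convolutional
    # layer, shaped (out_channels, in_channels, kernel, kernel).
    weights, in_ch = [], bands
    for _ in range(num_layers):
        weights.append(rng.normal(size=(filters, in_ch, kernel, kernel)))
        in_ch = filters
    return weights

def partial_transfer(source, target, layer_idx):
    # Copy the selected source layers into a fresh copy of the target;
    # the returned index set marks layers to freeze during fine-tuning.
    new_target = [w.copy() for w in target]
    for i in layer_idx:
        new_target[i] = source[i].copy()
    return new_target, set(layer_idx)

source_cnn = make_cnn()
target_cnn = make_cnn()

# "Even-numbered" layers counted from 1, i.e. 0-based indices 1, 3, 5.
# Index 0 (the first conv layer) is skipped so the target can accept a
# different number of input bands than the source.
even_layers = [1, 3, 5]
transferred, frozen = partial_transfer(source_cnn, target_cnn, even_layers)
```

Only the remaining (non-frozen) layers would then be trained on the target dataset's limited samples.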
Number of pages: 16
Journal: IEEE Transactions on Geoscience and Remote Sensing
Early online date: 22 Sep 2021
Publication status: Published - 2022