WANG Z P, TANG C Y, DENG T L, CUI H D, ZHANG Y Z, LIU L L. Integration of deep learning and change detection for flood-damaged greenhouse extraction using drone multispectral imagery[J]. Chinese Journal of Eco-Agriculture, 2026, 33(1): 1−11. DOI: 10.12357/cjea.20250247

Integration of deep learning and change detection for flood-damaged greenhouse extraction using drone multispectral imagery

In recent decades, flood disasters have occurred frequently worldwide, and greenhouses, as important agricultural production facilities, are highly vulnerable to flood damage. Rapid and accurate extraction of the spatial locations of damaged greenhouses is therefore of great significance for disaster loss assessment and post-disaster reconstruction. With advances in computer technology, deep learning has been widely applied to such tasks. To assess greenhouses damaged by flooding, we selected Yangying Village, Wuqing District, Tianjin City, which was inundated from late July to August 2023 by the catastrophic “23·7” basin-wide flood in the Haihe River Basin. Based on drone-borne multispectral remote sensing images, we efficiently detected damaged greenhouses using a deep learning approach. First, multidimensional spatial features were constructed, and the most suitable bands and indices for greenhouse detection were selected based on spectral reflectance profiles and a separability measure. Second, greenhouses were detected from images of different resolutions using three deep learning networks. Finally, damaged greenhouses were identified accurately and efficiently according to the status changes of greenhouses detected in images from different periods. The results showed that the blue band, green band, and NDVI were sensitive parameters with high spectral separability for greenhouse extraction; the blue band exhibited the highest separability, while the red, red-edge, and near-infrared bands showed lower separability. The three networks, Seg-UNet, Seg-UNet++, and DeepLab V3+, all achieved overall greenhouse recognition accuracies above 97% and Kappa coefficients above 0.8. The model trained with the Seg-UNet network performed best in greenhouse identification, with the highest classification accuracy and an optimal epoch number of 40.
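The band-selection step above ranks bands and indices by how well they separate greenhouse pixels from background. The abstract does not name the exact separability measure, so the sketch below uses a simple normalized separability index M = |μ₁ − μ₂| / (σ₁ + σ₂) on hypothetical reflectance samples; the band names are from the study, but all numeric values are illustrative assumptions:

```python
import numpy as np

def separability(gh: np.ndarray, bg: np.ndarray) -> float:
    """Normalized separability M = |mu1 - mu2| / (sigma1 + sigma2);
    a larger M means the band distinguishes the two classes better."""
    return abs(gh.mean() - bg.mean()) / (gh.std() + bg.std())

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index, (NIR - R) / (NIR + R)."""
    return (nir - red) / (nir + red + 1e-9)

# Hypothetical greenhouse (gh) vs. background (bg) reflectance samples;
# the means and deviations below are illustrative, not values from the study.
rng = np.random.default_rng(0)
bands = {
    "blue": (rng.normal(0.30, 0.02, 500), rng.normal(0.10, 0.03, 500)),
    "red":  (rng.normal(0.20, 0.05, 500), rng.normal(0.17, 0.05, 500)),
}
scores = {name: separability(gh, bg) for name, (gh, bg) in bands.items()}
best = max(scores, key=scores.get)  # band with the highest separability
```

With samples drawn this way, the widely separated blue-band distributions score far higher than the overlapping red-band ones, mirroring the ranking reported in the study.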
For image resolutions ranging from 0.1 m to 2 m, the model using 0.2 m imagery achieved the highest accuracy, with an overall accuracy of 99.02% and a Kappa coefficient of 0.93. For greenhouse images from different periods, the base model was fine-tuned through transfer learning, which improved both the overall accuracy and the Kappa coefficient. Using change detection to compare greenhouse detection results between the non-flood and flood periods, damaged greenhouses were identified with an overall accuracy of 98.87% and a Kappa coefficient of 0.80. This study provides valuable insights for applying drone-borne multispectral imagery to damaged greenhouse detection, disaster impact assessment, and science-based post-disaster reconstruction.
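The change-detection step compares per-pixel greenhouse classifications from the non-flood and flood periods. The abstract does not give the exact decision rule, so the following is a minimal sketch under one plausible assumption: a pixel classified as greenhouse before the flood but no longer recognized as one during it is flagged as damaged. The toy masks are hypothetical:

```python
import numpy as np

def damaged_greenhouses(pre_mask: np.ndarray, flood_mask: np.ndarray) -> np.ndarray:
    """Flag pixels classified as greenhouse in the non-flood period (pre_mask)
    that are no longer classified as greenhouse in the flood period."""
    return pre_mask & ~flood_mask

# Toy 2x3 binary masks (True = greenhouse) standing in for the two
# classification results; real masks would come from the trained network.
pre = np.array([[1, 1, 0],
                [1, 0, 0]], dtype=bool)
flood = np.array([[1, 0, 0],
                  [0, 0, 0]], dtype=bool)

dmg = damaged_greenhouses(pre, flood)  # True where a greenhouse disappeared
```

Because the rule is a per-pixel boolean difference of two existing masks, it adds essentially no cost on top of the segmentation itself, which is what makes the two-period comparison efficient.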