A Spatio-Temporal Fusion Framework of UAV and Satellite Imagery for Winter Wheat Growth Monitoring

  1. Li, Yan
  2. Yan, Wen
  3. An, Sai
  4. Gao, Wanlin
  5. Jia, Jingdun
  6. Tao, Sha
  7. Wang, Wei
  8. González Aguilera, Diego 1
  1. Universidad de Salamanca, Salamanca, Spain (ROR: https://ror.org/02f40zc51)

Journal:
Drones

ISSN: 2504-446X

Year of publication: 2022

Volume: 7

Issue: 1

Pages: 23

Type: Article

DOI: 10.3390/DRONES7010023 (open access at publisher)


Abstract

Accurate and continuous monitoring of crop growth is vital for the development of precision agriculture. Unmanned aerial vehicle (UAV) and satellite platforms are highly complementary: UAVs provide centimeter-scale spatial resolution, while satellites offer a fixed revisit cycle. Optimizing this cross-platform synergy is therefore valuable for agricultural applications. Considering the characteristics of the UAV and satellite platforms, a spatio-temporal fusion (STF) framework of UAV and satellite imagery is developed. It comprises registration, radiometric normalization, preliminary fusion, and reflectance reconstruction. The proposed STF framework significantly improves fusion accuracy, with both better quantitative metrics and better visualized results than four existing STF methods with different fusion strategies. Especially for the prediction of object boundaries and spatial texture, the absolute values of the Roberts edge (EDGE) and local binary pattern (LBP) metrics decreased by a maximum of more than 0.25 and 0.10, respectively, compared with the spatial and temporal adaptive reflectance fusion model (STARFM). Moreover, the STF framework enhances the temporal resolution to daily, even though the satellite imagery is discontinuous. Further, its application potential for winter wheat growth monitoring is explored. The daily synthetic imagery at UAV spatial resolution describes the seasonal dynamics of winter wheat well. The synthetic Normalized Difference Vegetation Index (NDVI) and Enhanced Vegetation Index 2 (EVI2) are consistent with the observations. However, the error in NDVI and EVI2 at boundary changes is relatively large, which needs further exploration. This research provides an STF framework to generate very dense, high-spatial-resolution remote sensing data at low cost. It not only contributes to precision agriculture applications but is also valuable for land-surface dynamic monitoring.
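The two vegetation indices named in the abstract have standard per-pixel definitions (NDVI after Tucker, 1979; the two-band EVI2 after Jiang et al., 2008). The sketch below is illustrative only, not the authors' released code; it assumes band values are surface reflectances in [0, 1]:

```python
def ndvi(nir: float, red: float) -> float:
    """NDVI = (NIR - Red) / (NIR + Red)  (Tucker, 1979)."""
    return (nir - red) / (nir + red)


def evi2(nir: float, red: float) -> float:
    """Two-band EVI2 = 2.5 * (NIR - Red) / (NIR + 2.4 * Red + 1)
    (Jiang et al., 2008); approximates EVI without a blue band."""
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1.0)


# Example: a dense winter-wheat canopy (high NIR, low red reflectance).
print(ndvi(0.50, 0.10))  # high NDVI for vigorous vegetation
print(evi2(0.50, 0.10))
```

EVI2 is convenient here because many UAV multispectral cameras lack the blue band required by the original three-band EVI, while still reducing the saturation NDVI shows over dense canopies.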

Funding information

Funders

  • Natural Science Foundation of Hebei Province of China
    • D2022407001
  • Soft Science Project of Hebei Science and Technology Program
    • 22557672D
    • 22557674D
