Summary: | Capturing spatial and temporal dynamics is a key issue for many remote-sensing-based applications. Consequently, several image-blending algorithms that can simulate surface reflectance at high spatio-temporal resolution have been developed in recent years. However, the sensitivity of these algorithms to the length of the temporal interval between the base and simulation dates has not been reported. In this study, our aim was to evaluate the effect of different temporal interval lengths on the accuracy of the widely used blending algorithm, the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM), using Landsat and Moderate Resolution Imaging Spectroradiometer (MODIS) images together with the National Land Cover Database (NLCD). Taking the southwestern continental United States as the study area, we conducted a series of experiments under two schemes: assessment of STARFM with (i) a fixed base date and varied simulation dates and (ii) varied base dates and a fixed simulation date. The results showed that the coefficient of determination (R²) and the root mean squared error (RMSE) varied, and that R² generally decreased as the temporal interval between the base and simulation dates increased for all six land-cover types. Under both schemes, cropland had the lowest mean R² and shrub the highest. These results may facilitate the selection of an appropriate temporal interval when applying STARFM.
|
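As a concrete illustration of the per-class evaluation described in the summary, the sketch below computes R² and RMSE between a STARFM-simulated and an observed Landsat reflectance image, grouped by NLCD land-cover class. This is a minimal NumPy sketch, not the study's actual code: the function name `per_class_accuracy`, the array shapes, the example NLCD class codes, and the use of squared Pearson correlation for R² (a common choice in such validations, though the paper may define R² differently) are all illustrative assumptions.

```python
import numpy as np

def per_class_accuracy(simulated, observed, landcover):
    """Compute R² and RMSE per land-cover class.

    simulated, observed : 2-D arrays of surface reflectance (same shape)
    landcover           : 2-D array of integer NLCD class codes (same shape)
    Returns {class_code: (r2, rmse)}.
    """
    metrics = {}
    for cls in np.unique(landcover):
        mask = landcover == cls
        sim, obs = simulated[mask], observed[mask]
        # R² as the squared Pearson correlation between simulated and
        # observed reflectance (an assumption; see lead-in above).
        r2 = np.corrcoef(sim, obs)[0, 1] ** 2
        rmse = np.sqrt(np.mean((obs - sim) ** 2))
        metrics[int(cls)] = (r2, rmse)
    return metrics

# Example with synthetic data standing in for one band of an
# observed/simulated Landsat pair and an NLCD class map.
rng = np.random.default_rng(0)
obs = rng.uniform(0.0, 0.5, size=(100, 100))         # observed reflectance
sim = obs + rng.normal(0.0, 0.02, size=obs.shape)    # simulated reflectance
nlcd = rng.choice([41, 52, 71, 82], size=obs.shape)  # forest/shrub/grass/crop
for cls, (r2, rmse) in sorted(per_class_accuracy(sim, obs, nlcd).items()):
    print(f"class {cls}: R2={r2:.3f}, RMSE={rmse:.4f}")
```

Repeating this computation for simulation dates at increasing temporal distance from the base date would yield the per-class R² trends the study reports.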