Article ID: 6346609
Journal: Remote Sensing of Environment
Published Year: 2014
Pages: 12
File Type: PDF
Abstract
The baseline radiometer brightness temperature (Tb) downscaling algorithm for NASA's Soil Moisture Active Passive (SMAP) mission, scheduled for launch in January 2015, is tested using an airborne simulation of the SMAP data stream. The algorithm synergistically uses 3 km Synthetic Aperture Radar (SAR) backscatter (σ) to downscale the 36 km radiometer Tb to 9 km. While the algorithm has already been tested using experimental datasets from field campaigns in the USA, it is imperative that it be tested over a comprehensive range of land surface conditions (i.e. in different hydro-climatic regions) before global application. Consequently, this study evaluates the algorithm using data collected from the Soil Moisture Active Passive Experiments (SMAPEx) in south-eastern Australia, which closely simulate the SMAP data stream for a single SMAP radiometer pixel over a 3-week period, with repeat coverage every 2-3 days. The results suggest that the average root-mean-square error (RMSE) of the downscaled Tb at 9 km resolution is 3.1 K for h-polarization and 2.6 K for v-polarization, increasing to 8.2 K and 6.6 K, respectively, when downscaling is applied at 1 km resolution. Downscaling over the relatively homogeneous grassland areas resulted in an RMSE approximately 2 K lower than over the heterogeneous cropping area. Overall, the downscaling error at 9 km resolution was around 2.4 K for five of the nine days, meeting the 2.4 K error target of the SMAP mission.
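As a hedged illustration only (not the authors' implementation): active-passive downscaling of this kind typically adjusts the coarse-scale Tb by the fine-scale deviation of SAR backscatter from its coarse-scale average, weighted by a regression slope β between Tb and σ. The minimal sketch below assumes this simple linear form; the function name, the synthetic backscatter field, and the slope value are hypothetical.

```python
import numpy as np

def downscale_tb(tb_coarse, sigma_fine, beta):
    """Sketch of a linear active-passive Tb downscaling step.

    tb_coarse  : coarse-scale (e.g. 36 km) brightness temperature [K]
    sigma_fine : fine-scale (e.g. 9 km) SAR backscatter values [dB]
                 within the coarse radiometer pixel
    beta       : assumed regression slope of Tb against backscatter [K/dB]
    Returns fine-scale Tb estimates [K].
    """
    sigma_coarse = sigma_fine.mean()                 # coarse-scale backscatter average
    return tb_coarse + beta * (sigma_fine - sigma_coarse)

# Example: one 36 km radiometer pixel split into a 4 x 4 grid of 9 km sub-pixels
tb_36km = 260.0                                      # K, coarse h-pol brightness temperature
sigma_9km = np.random.uniform(-14, -8, (4, 4))       # dB, synthetic backscatter field
tb_9km = downscale_tb(tb_36km, sigma_9km, beta=-0.9) # hypothetical slope value
print(tb_9km.round(1))
```

The fine-scale estimates average back to the coarse-scale Tb by construction, so the sketch preserves the radiometer observation while redistributing it according to the backscatter pattern.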
Related Topics
Physical Sciences and Engineering › Earth and Planetary Sciences › Computers in Earth Sciences