Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
6346609 | Remote Sensing of Environment | 2014 | 12 Pages |
Abstract
The baseline radiometer brightness temperature (Tb) downscaling algorithm for NASA's Soil Moisture Active Passive (SMAP) mission, scheduled for launch in January 2015, is tested using an airborne simulation of the SMAP data stream. The algorithm synergistically uses 3 km Synthetic Aperture Radar (SAR) backscatter (σ) to downscale 36 km radiometer Tb to 9 km. While the algorithm has already been tested using experimental datasets from field campaigns in the USA, it is imperative that it be tested over a comprehensive range of land surface conditions (i.e. in different hydro-climatic regions) before global application. Consequently, this study evaluates the algorithm using data collected during the Soil Moisture Active Passive Experiments (SMAPEx) in south-eastern Australia, which closely simulate the SMAP data stream for a single SMAP radiometer pixel over a 3-week interval, with repeat coverage every 2-3 days. The results suggest that the average root-mean-square error (RMSE) of the downscaled Tb is 3.1 K and 2.6 K for h- and v-polarizations respectively when downscaled to 9 km resolution, increasing to 8.2 K and 6.6 K when the algorithm is applied at 1 km resolution. Downscaling over the relatively homogeneous grassland areas resulted in an RMSE approximately 2 K lower than for the heterogeneous cropping area. Overall, the downscaling error at 9 km resolution was around 2.4 K for five of the nine days, meeting the 2.4 K error target of the SMAP mission.
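The core idea described in the abstract — redistributing a coarse radiometer Tb across a fine grid using the spatial pattern of SAR backscatter — can be illustrated with a minimal sketch. This is not the mission's actual baseline algorithm (which uses regression parameters estimated from time series of coarse-scale Tb and backscatter); here the sensitivity parameter `beta`, the grid sizes, and all numeric values are illustrative assumptions for a simple linear form Tb(fine) = Tb(coarse) + beta * [sigma(fine) - sigma(coarse)].

```python
import numpy as np

def downscale_tb(tb_coarse, sigma_fine, beta):
    """Disaggregate one coarse-pixel brightness temperature (K) across a
    fine grid using fine-scale radar backscatter (dB).

    tb_coarse  : scalar Tb of the coarse (e.g. 36 km) radiometer pixel
    sigma_fine : 2-D array of fine-scale (e.g. 3 km) backscatter (dB)
    beta       : assumed Tb sensitivity to backscatter (K/dB); in the
                 real algorithm this is estimated from time series
    """
    sigma_coarse = sigma_fine.mean()  # coarse-scale backscatter proxy
    return tb_coarse + beta * (sigma_fine - sigma_coarse)

# Toy example: one 36 km pixel split into a 12 x 12 grid of 3 km cells
rng = np.random.default_rng(0)
sigma = -12.0 + rng.normal(0.0, 1.5, size=(12, 12))  # synthetic backscatter
tb_fine = downscale_tb(260.0, sigma, beta=-0.5)      # downscaled Tb field (K)

# By construction, the mean of the downscaled field reproduces the
# coarse observation, so the coarse-scale signal is conserved.
print(round(tb_fine.mean(), 6))
```

A useful property of this linear form is that the downscaled field averages back to the original coarse Tb, so the disaggregation only redistributes the signal spatially rather than altering it.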
Related Topics
Physical Sciences and Engineering
Earth and Planetary Sciences
Computers in Earth Sciences
Authors
Xiaoling Wu, Jeffrey P. Walker, Narendra N. Das, Rocco Panciera, Christoph Rüdiger,