Principal Investigator: Sassan Saatchi, NASA's Jet Propulsion Laboratory
Active remote sensing techniques are the only viable measurement approach for quantifying vegetation structure globally. However, the observational configurations of satellite sensors, and their differing ranges of sensitivity to vegetation structure, introduce uncertainty about whether any individual sensor can meet the requirements of different applications.
Lidar sensors on satellite platforms have proven capable of providing direct estimates of vegetation vertical profiles, but they operate only in a sampling configuration and for limited periods (e.g., NASA's Global Ecosystem Dynamics Investigation [GEDI] mission). Radar sensors, such as the European Space Agency's Sentinel and the upcoming joint NASA-Indian Space Research Organisation Synthetic Aperture Radar (NISAR) mission, on the other hand, provide indirect estimates of vegetation structure with ranges of sensitivity that depend on wavelength and observation geometry.
Radar sensors are capable of mapping vegetation structure and changes through time from disturbance and recovery processes only when trained and calibrated with a large number of samples (from ground and lidar measurements). These two active sensing approaches are, therefore, inherently synergistic in quantifying the dynamics of global vegetation structure.
This project will develop global training data for vegetation structure and multi-scale machine learning (ML) algorithms to integrate the training data and imagery from active sensors and demonstrate the measurement of landscape-scale changes of vegetation structure and biomass globally. The objectives of the work are:
- To develop quality-controlled training data of vegetation structure at multiple landscape scales (i.e., 50, 100, 250, 500, and 1000 m) from composite lidar waveforms from ground, airborne, and spaceborne lidar observations.
- To assemble multi-objective clustering and feature-extraction ML models that develop multi-scale (landscape-scale) feature spaces from satellite radar imagery (Sentinel, ALOS, and the future NISAR).
- To demonstrate ML model implementation for large-scale vegetation structure prediction using lidar training data and radar multi-scale feature space on open platforms and cloud computing.
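The first objective aggregates fine-resolution lidar-derived structure estimates onto progressively coarser grids. The sketch below illustrates that multi-scale aggregation idea with synthetic data; the grid sizes follow the 50-1000 m targets above, but the variable names, block-averaging scheme, and synthetic canopy heights are illustrative assumptions, not the project's actual processing chain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for lidar-derived canopy height (m) on a fine grid of
# 50 m cells covering a 1 km x 1 km tile (20 x 20 cells).
canopy_50m = rng.uniform(0.0, 45.0, size=(20, 20))

def aggregate(grid, factor):
    """Block-average a fine grid by an integer factor (e.g., 50 m -> 100 m)."""
    h, w = grid.shape
    assert h % factor == 0 and w % factor == 0
    return grid.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# Multi-scale composites matching the 50/100/250/500/1000 m landscape scales.
scales = {50: canopy_50m}
for factor, res in [(2, 100), (5, 250), (10, 500), (20, 1000)]:
    scales[res] = aggregate(canopy_50m, factor)

for res, grid in sorted(scales.items()):
    print(f"{res} m grid: {grid.shape}")
```

Block averaging preserves the tile mean at every scale, so training targets stay consistent as resolution coarsens.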
The project will deliver science data products and software that can be readily integrated into NASA's Earth Observing System Data and Information System (EOSDIS) and enable the science community to perform large-area data processing and product generation on different platforms, including the cloud, without data download. By creating a large number of training data sets at multiple scales from available ground and airborne data and from satellite measurements of vegetation structure (GEDI, and potentially ICESat-2), the project will provide data sets for calibrating and implementing algorithms for other active (and potentially passive) sensors.
The multi-objective clustering and feature-extraction machine learning technique derives texture and spatio-temporal features from radar backscatter measurements (Sentinel, ALOS, NISAR) to enable large-scale estimation of vegetation structure and its changes globally. The results of this work will significantly advance the consistent and systematic quantification of global carbon stocks and of human- and climate-induced changes to vegetation.
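To make the texture-plus-clustering idea concrete, the minimal sketch below computes simple local texture features (windowed mean and variance) from a synthetic backscatter image and groups them with a bare-bones k-means. The window size, cluster count, and synthetic scene are illustrative assumptions only; the project's actual multi-objective models are not specified here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic backscatter image (dB): two regions with different mean and texture,
# standing in for contrasting vegetation structure classes.
img = np.concatenate([rng.normal(-12.0, 1.0, (32, 64)),
                      rng.normal(-6.0, 3.0, (32, 64))], axis=0)

def texture_features(image, w=4):
    """Local mean and variance over non-overlapping w x w windows."""
    h, wd = image.shape
    blocks = image.reshape(h // w, w, wd // w, w)
    mean = blocks.mean(axis=(1, 3))
    var = blocks.var(axis=(1, 3))
    return np.stack([mean.ravel(), var.ravel()], axis=1)

def kmeans(X, k=2, iters=20, seed=0):
    """Minimal k-means on standardized features (illustrative, not optimized)."""
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    rs = np.random.default_rng(seed)
    centers = X[rs.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        # Keep a center unchanged if its cluster goes empty.
        centers = np.stack([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

feats = texture_features(img)          # (256, 2): one feature row per window
labels = kmeans(feats, k=2)            # unsupervised grouping of windows
print(feats.shape, np.unique(labels))
```

In the project's setting, such unsupervised groupings would be trained against the lidar-derived structure samples to turn radar feature space into calibrated structure estimates.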