While most scientists using remote sensing are familiar with passive, optical images from the U.S. Geological Survey's Landsat, NASA's Moderate Resolution Imaging Spectroradiometer (MODIS), and the European Space Agency's Sentinel-2, another type of remote sensing data is making waves: Synthetic Aperture Radar, or SAR. SAR is a type of active data collection where a sensor produces its own energy and then records the amount of that energy reflected back after interacting with the Earth. While interpreting optical imagery is much like interpreting a photograph, SAR data require a different way of thinking: the signal responds instead to surface characteristics like structure and moisture.
For more information on passive and active remote sensing, view What is Remote Sensing?
What's Synthetic about SAR?
The spatial resolution of radar data is directly related to the ratio of the sensor wavelength to the length of the sensor's antenna. For a given wavelength, the longer the antenna, the higher the spatial resolution. From a satellite in space operating at a wavelength of about 5 cm (C-band radar), in order to get a spatial resolution of 10 m, you would need a radar antenna about 4,250 m long. (That's over 47 football fields!)
An antenna of that size is not practical for a satellite sensor in space. Hence, scientists and engineers have come up with a clever workaround — the synthetic aperture. In this concept, a sequence of acquisitions from a shorter antenna are combined to simulate a much larger antenna, thus providing higher resolution data (view geometry figure to the right).
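To make the numbers concrete, here is a back-of-the-envelope sketch of the trade-off. The 5 cm wavelength and 10 m target resolution come from the text above; the 850 km slant range is an assumed value chosen so the result reproduces the article's 4,250 m figure, and the 12 m physical antenna is purely illustrative.

```python
# Back-of-the-envelope radar resolution. The slant range and physical
# antenna length below are illustrative assumptions, not mission specs.
wavelength_m = 0.05       # C-band, ~5 cm (from the text)
slant_range_m = 850e3     # assumed sensor-to-target distance

# Real aperture: azimuth resolution ~ wavelength * range / antenna length,
# so the antenna length needed for a target resolution is:
target_resolution_m = 10
antenna_length_m = wavelength_m * slant_range_m / target_resolution_m
print(f"Real antenna needed: {antenna_length_m:,.0f} m")  # 4,250 m

# Synthetic aperture: combining echoes acquired along the flight path
# yields an azimuth resolution of roughly half the physical antenna
# length, independent of range.
physical_antenna_m = 12.0  # illustrative
print(f"SAR azimuth resolution: ~{physical_antenna_m / 2:.0f} m")
```

The key point the numbers show: with a real aperture, resolution degrades with range, while the synthetic aperture sidesteps the problem entirely.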
The Role of Frequency and Wavelength
Optical sensors such as Landsat's Operational Land Imager (OLI) and Sentinel-2's Multispectral Instrument (MSI) collect data in the visible, near-infrared, and short-wave infrared portions of the electromagnetic spectrum. Radar sensors utilize longer wavelengths at the centimeter to meter scale, which gives them special properties, such as the ability to see through clouds (view electromagnetic spectrum to the right). The different wavelengths of SAR are often referred to as bands, with letter designations such as X, C, L, and P. The table below lists each band with its associated frequency and wavelength, and the applications typical for that band.
| Band | Frequency | Wavelength | Typical Application |
|------|-----------|------------|---------------------|
| Ka | 27-40 GHz | 1.1-0.8 cm | Rarely used for SAR (airport surveillance) |
| K | 18-27 GHz | 1.7-1.1 cm | Rarely used (H2O absorption) |
| Ku | 12-18 GHz | 2.4-1.7 cm | Rarely used for SAR (satellite altimetry) |
| X | 8-12 GHz | 3.8-2.4 cm | High resolution SAR (urban monitoring; ice and snow; little penetration into vegetation cover; fast coherence decay in vegetated areas) |
| C | 4-8 GHz | 7.5-3.8 cm | SAR workhorse (global mapping; change detection; monitoring of areas with low to moderate penetration; higher coherence); ice, ocean, maritime navigation |
| S | 2-4 GHz | 15-7.5 cm | Little but increasing use for SAR-based Earth observation; agriculture monitoring (NISAR will carry an S-band channel; expands C-band applications to higher vegetation density) |
| L | 1-2 GHz | 30-15 cm | Medium resolution SAR (geophysical monitoring; biomass and vegetation mapping; high penetration; InSAR) |
| P | 0.3-1 GHz | 100-30 cm | Biomass. First P-band spaceborne SAR will be launched ~2020; vegetation mapping and assessment. Experimental SAR. |
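Frequency and wavelength are two views of the same wave, linked by the speed of light (wavelength = c / frequency). A quick sketch, using Sentinel-1's published C-band center frequency of 5.405 GHz as the example:

```python
# Convert radar frequency to free-space wavelength.
C = 299_792_458  # speed of light, m/s

def wavelength_cm(frequency_hz: float) -> float:
    """Return the free-space wavelength in centimeters."""
    return C / frequency_hz * 100

# C-band spans roughly 4-8 GHz:
print(f"{wavelength_cm(4e9):.1f} cm")      # ~7.5 cm
print(f"{wavelength_cm(8e9):.1f} cm")      # ~3.7 cm
# Sentinel-1 operates at 5.405 GHz:
print(f"{wavelength_cm(5.405e9):.1f} cm")  # ~5.5 cm
```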
Wavelength is an important feature to consider when working with SAR, as it determines how the radar signal interacts with the surface and how far a signal can penetrate into a medium. For example, an X-band radar, which operates at a wavelength of about 3 cm, has very little capability to penetrate into broadleaf forest, and thus mostly interacts with leaves at the top of the tree canopy. An L-band signal, on the other hand, has a wavelength of about 23 cm, achieving greater penetration into a forest and allowing for more interaction between the radar signal and large branches and tree trunks. Wavelength doesn't just impact the penetration depth into forests, but also into other land cover types such as soil and ice.
For example, scientists and archaeologists are using SAR data to help "uncover" lost cities and urban-type infrastructures hidden over time by dense vegetation or desert sands. For information on the use of SAR in space archaeology, view NASA Earth Observatory's Peering through the Sands of Time and Secrets beneath the Sand.
Polarization and Scattering Mechanisms
Radar sensors can also collect signals in different polarizations, because the polarization can be precisely controlled in both the transmit and receive paths. Polarization refers to the orientation of the plane in which the transmitted electromagnetic wave oscillates. While the orientation can occur at any angle, SAR sensors typically transmit linearly polarized waves: horizontal polarization is indicated by the letter H, and vertical polarization by the letter V.
A signal emitted in vertical (V) polarization and received in horizontal (H) polarization is indicated as VH; likewise, a signal emitted in horizontal (H) and received in horizontal (H) polarization is indicated as HH, and so on. Examining the signal strength in these different polarizations carries information about the structure of the imaged surface, based on the following types of scattering: rough surface, volume, and double bounce (view figure below).
- Rough surface scattering, such as that caused by bare soil or water, is most sensitive to VV scattering.
- Volume scattering, for example, caused by the leaves and branches in a forest canopy, is most sensitive to cross-polarized data like VH or HV.
- The last type of scattering, double bounce, is caused by buildings, tree trunks, or inundated vegetation and is most sensitive to an HH polarized signal.
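In practice, backscatter in different polarizations is usually compared on a decibel scale. The sketch below is illustrative only (the backscatter values are made up, not from any sensor): it shows the standard linear-to-dB conversion and a simple VH/VV cross-polarization ratio, which is often used as a rough indicator of volume scattering.

```python
import math

# Convert linear backscatter power (sigma-nought) to decibels.
def to_db(sigma_linear: float) -> float:
    return 10 * math.log10(sigma_linear)

sigma_vv = 0.10  # hypothetical surface-dominated VV backscatter
sigma_vh = 0.01  # hypothetical cross-pol backscatter

print(f"VV: {to_db(sigma_vv):.1f} dB")  # -10.0 dB
print(f"VH: {to_db(sigma_vh):.1f} dB")  # -20.0 dB

# Cross-pol ratio (VH/VV) in dB: higher values suggest more volume
# scattering (e.g., vegetation canopy); lower values suggest rough
# surface scattering (e.g., bare soil, water).
ratio_db = to_db(sigma_vh) - to_db(sigma_vv)
print(f"VH/VV ratio: {ratio_db:.1f} dB")  # -10.0 dB
```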
It is important to note that the amount of signal attributed to different scattering types may change as a function of wavelength, as wavelength changes the penetration depth of the signal. For example, a C-band signal penetrates only into the top layers of the canopy of a forest, and therefore will experience mostly rough surface scattering mixed with a limited amount of volume scattering. However, an L-band or P-band signal penetrates much deeper and therefore experiences strongly enhanced volume scattering, as well as increasing amounts of double-bounce scattering caused by tree trunks (view canopy penetration figure below).
SAR data can also enable an analysis method called interferometry, or InSAR. InSAR uses the phase information recorded by the sensor to measure the distance from the sensor to the target. When at least two observations of the same target are made, the distance, with additional geometric information from the sensor, can be used to measure changes in land surface topography. These measurements are very accurate (up to the centimeter level!) and can be used to identify areas of deformation from events like volcanic eruptions and earthquakes (view interferogram to the right).
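The phase-to-distance relationship at the heart of InSAR can be sketched in a few lines: one full fringe (a 2π phase cycle) in a repeat-pass interferogram corresponds to half a wavelength of motion along the line of sight. Sign conventions differ between processors, so treat this as illustrative rather than a definitive implementation:

```python
import math

# Repeat-pass InSAR: line-of-sight displacement from a phase change,
# delta_d = delta_phi * wavelength / (4 * pi). Sign convention varies.
wavelength_m = 0.0555  # Sentinel-1 C-band, ~5.55 cm

def los_displacement_m(delta_phase_rad: float) -> float:
    """Convert an interferometric phase change (radians) to meters of
    line-of-sight displacement."""
    return delta_phase_rad * wavelength_m / (4 * math.pi)

# One full fringe (2*pi) corresponds to half a wavelength of motion:
print(f"{los_displacement_m(2 * math.pi) * 100:.2f} cm")  # ~2.8 cm
```

This half-wavelength sensitivity per fringe is why InSAR can resolve centimeter-scale deformation from hundreds of kilometers away.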
Only recently have consistent SAR datasets become widely available for free, starting with the launch and open data policy of the European Space Agency's (ESA) Sentinel-1A in 2014. Other sensors have historic data, imagery that is only available for certain areas, or policies that require the purchase of data. The table below lists the SAR sensors that have produced or are currently producing data, along with the data parameters and where to access them.
| Software | Developer | Description | Works With |
|----------|-----------|-------------|------------|
| Sentinel Application Platform (SNAP) Sentinel-1 Toolbox | European Space Agency | A graphical user interface (GUI) used for both polarimetric and interferometric processing of SAR data. Start-to-finish processing includes algorithms for calibration, speckle filtering, coregistration, orthorectification, mosaicking, and data conversion. | |
| pyroSAR | John Truckenbrodt, Friedrich-Schiller-University Jena / Deutsches Zentrum für Luft- und Raumfahrt (German Aerospace Center) | A Python framework for large-scale SAR satellite data processing that can access GAMMA and SNAP processing capabilities. Specializes in handling acquisition metadata, formatting preprocessed data for further analysis, and exporting data to Data Cube. | Sentinel and various past and present satellite missions |
| Generic Mapping Tools Synthetic Aperture Radar (GMTSAR) | ConocoPhillips, Scripps Institution of Oceanography, and San Diego State University | GMTSAR adds interferometric processing capabilities to Generic Mapping Tools (GMT), command line tools used to manipulate geographic data and create maps. GMTSAR includes two main processors: (1) an InSAR processor that focuses and aligns stacks of images, maps topography into phase, conducts phase unwrapping, and forms complex interferograms; and (2) a postprocessor that filters the interferogram and creates coherence, phase gradient, and line-of-sight displacement products. | |
| Delft object-oriented radar interferometric software (Doris) | Delft Institute of Earth Observation and Space Systems of Delft University of Technology | Interferometric processing from single look complex (SLC) data to complex interferogram and coherence map. Includes geocoding capability, but does not include phase unwrapping. | Single Look Complex data from ERS, ENVISAT, JERS, RADARSAT |
| Statistical-Cost, Network-Flow Algorithm for Phase Unwrapping (SNAPHU) | Stanford Radar Interferometry Research Group | Software written in C that runs on most Unix/Linux platforms. Used for phase unwrapping (an interferometric process). The SNAPHU algorithm has been incorporated into other SAR processing software, including ISCE. | Interferograms formatted as rasters with single-precision (float, real*4, or complex*8) floating-point data types |
| Hybrid Pluggable Processing Pipeline (HyP3) | Alaska Satellite Facility | Online interface for InSAR processing, including steps such as phase unwrapping (using the Minimum Cost Flow algorithm). Includes access to some GAMMA and ISCE processing capabilities for interferometry. Also includes Radiometric Terrain Correction (RTC) and change detection tools. | Dependent on process |
| InSAR Scientific Computing Environment (ISCE) | Jet Propulsion Laboratory and Stanford University | Interferometric processing packaged as Python modules. Interferometric processing from raw or SLC data to complex interferogram and coherence map. Includes geocoding, phase unwrapping, filtering, and more. | |
| ASF MapReady | Alaska Satellite Facility | A GUI used to terrain-correct, geocode, and apply polarimetric decompositions to multi-polarimetric synthetic aperture radar (PolSAR) data. | ALOS PALSAR and other older datasets in ASF's catalog (not to be used for Sentinel-1; SNAP S1TBX is recommended for Sentinel-1) |
| Python Radar Tools (PyRAT) | | A GUI implemented in Python for post-processing of both airborne and spaceborne SAR imagery. Includes various filters, geometrical transformations, and capabilities for both interferometric and polarimetric processing. | Airborne and spaceborne SAR data |
| Polarimetric SAR Data Processing and Education Toolbox (PolSARpro) | European Space Agency | A GUI for high-level polarimetric processing. Includes analysis capabilities for PolSAR, PolInSAR, PolTomoSAR, and PolTimeSAR data, with functionalities such as elliptical polarimetric basis transformations, speckle filters, decompositions, parameter estimation, and classification/segmentation. Includes a fully polarimetric coherent SAR scattering and imaging simulator for forest and ground surfaces. | |
Several new sensors are also planned for launch in the next few years. These include the joint NASA-Indian Space Research Organisation SAR (NISAR) satellite, which will collect L-band SAR data, with more limited coverage of S-band. All data will be free and openly available to the public. ESA is also launching the P-band BIOMASS mission, which will have an open data policy as well. View a list of the upcoming SAR missions and data parameters.
All free and publicly available SAR data can be accessed in Earthdata Search.
Data Processing and Analysis
One of the limitations of working with SAR data has been the somewhat tedious preprocessing that lower-level SAR data require. Depending on the type of analysis you want to do, these preprocessing steps can include: applying the orbit file, radiometric calibration, de-bursting, multilooking, speckle filtering, and terrain correction. These steps are described in more detail in this SAR Pre-Processing one-pager.
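As an illustration of one of these steps, multilooking averages neighboring pixels in linear power to suppress speckle at the cost of spatial resolution. Below is a minimal NumPy sketch on a synthetic, speckle-like image (not real SAR data); operational processors do this with sensor-specific geometry.

```python
import numpy as np

# Synthetic single-look intensity image: exponentially distributed values
# mimic the heavy speckle of single-look SAR power data.
rng = np.random.default_rng(0)
intensity = rng.exponential(scale=1.0, size=(100, 100))

def multilook(img: np.ndarray, looks: int) -> np.ndarray:
    """Average non-overlapping looks x looks blocks of pixels."""
    rows, cols = img.shape
    r, c = rows // looks * looks, cols // looks * looks  # trim to fit
    blocks = img[:r, :c].reshape(r // looks, looks, c // looks, looks)
    return blocks.mean(axis=(1, 3))

ml = multilook(intensity, looks=4)
# Standard deviation drops sharply after multilooking, while the mean
# backscatter level is preserved.
print(f"before: std={intensity.std():.2f}, after: std={ml.std():.2f}")
```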
Special software is required to process SAR data, depending on the data provider, the starting level of the data, and the desired output. The software table above shows a selection of freely available packages, what they can be used for, and where you can download them.
More recently, data repositories like NASA's Alaska Satellite Facility Distributed Active Archive Center (ASF DAAC) are starting to provide radiometrically terrain-corrected products for select areas, reducing the amount of time and effort the user has to put into preprocessing on their own.
- SERVIR SAR Handbook: Comprehensive Methodologies for Forest Monitoring and Biomass Estimation
- NISAR Science Users' Handbook
- NASA Applied Remote Sensing Training (ARSET) Program Courses:
- Introduction to Synthetic Aperture Radar (available in English and Spanish)
- Advanced Webinar: Radar Remote Sensing for Land, Water, and Disaster Applications (available in English and Spanish)
- Advanced Webinar: SAR for Disasters and Hydrological Applications
- ASF DAAC SAR Data Recipes
- ESA EO College Echoes in Space Course
- University of Alaska Fairbanks Microwave Remote Sensing Course
- Woodhouse, I. H., 2006, Introduction to Microwave Remote Sensing, Boca Raton, FL, CRC Press, Taylor & Francis Group
Much of the information from this page is drawn from the following chapters in The SAR Handbook: Comprehensive Methodologies for Forest Monitoring and Biomass Estimation:
Meyer, Franz. "Spaceborne Synthetic Aperture Radar – Principles, Data Access, and Basic Processing Techniques." SAR Handbook: Comprehensive Methodologies for Forest Monitoring and Biomass Estimation. Eds. Flores, A., Herndon, K., Thapa, R., Cherrington, E. NASA. 2019.
Kellndorfer, Josef. "Using SAR Data for Mapping Deforestation and Forest Degradation." SAR Handbook: Comprehensive Methodologies for Forest Monitoring and Biomass Estimation. Eds. Flores, A., Herndon, K., Thapa, R., Cherrington, E. NASA. 2019. doi:10.25966/68c9-gw82
Saatchi, Sassan. "SAR Methods for Mapping and Monitoring Forest Biomass." SAR Handbook: Comprehensive Methodologies for Forest Monitoring and Biomass Estimation. Eds. Flores, A., Herndon, K., Thapa, R., Cherrington, E. NASA. 2019. doi:10.25966/hbm1-ej07
Article by Kelsey Herndon, Franz Meyer, Africa Flores, Emil Cherrington, and Leah Kucera in collaboration with the Earth Science Data Systems. Graphics by Leah Kucera.