Cropland mapping with L-band UAVSAR and development of NISAR products

https://doi.org/10.1016/j.rse.2020.112180

Highlights

  • CoV and machine learning methods are deployed to understand L-band crop monitoring.

  • CoV achieves an overall accuracy greater than 80% for the NISAR L2 product.

  • Time series and the cross-pol term are the most useful for discrimination.

Abstract

Planned satellite launches will provide open access, operational L-band radar data streams at space-time resolutions not previously available. To prepare, the Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) platform was used to observe cropland sites across the southern United States in support of the development of L-band (24 cm) prototype science products. Time series flights were flown over four independent areas during the 2019 growing season while crop ground measurements were collected. Major crops included corn, cotton, pasture, peanut, rice, and soybean. A suite of cropland classification experiments applied a set of machine learning (ML) algorithms (random forest, feedforward fully connected neural network, support vector machine), the recently developed Multi-temporal Binary Tree Classification (MBTC), and a phenology-based Coefficient of Variation (CoV) approach to synergistically assess performance, scattering mechanisms, and limitations. Specific objectives of this research application included 1.) evaluation of L-band mapping performance across multiple independent agricultural production areas with field-scale training data, and 2.) assessment of the CoV approach for the generation of prototype NISAR Level 2 science products. Collectively, SAR terms sensitive to volume scattering performed well and consistently across CoV mapping experiments, achieving accuracy greater than 80% for cropland vs. non-cropland. Dynamic phenology classes, such as herbaceous wetlands, showed some confusion with the CoV agriculture class, requiring further regionalized training optimization. Volume scattering and cross-pol terms were the most useful across the different ML techniques, with overall accuracy and Kappa consistently over 90% and 0.85, respectively, for crop type by late growth stages for L-band observations. As expected, time series information was more valuable than any single ML technique, site, or crop schema. Ultimately, as more SAR platforms launch, the user community should leverage the physical contributions of different wavelengths and polarizations along with growing open access time series archives to produce efficient and meaningful agricultural products.

Introduction

Crop type and crop area are Essential Agricultural Variables (EAVs) for inventory, food security, and production forecasts at multiple scales. For decades, the National Aeronautics and Space Administration (NASA) has supported the use of Earth Observations (EO) for monitoring agriculture. Because billions of people depend upon agriculture for their livelihood, agriculture is an important policy driver. Traditionally, research and decision support tools have focused on optical data from satellite missions such as NASA's Landsat and the Moderate Resolution Imaging Spectroradiometer (MODIS), while the use of Synthetic Aperture Radar (SAR) for agricultural monitoring has lagged. New opportunities now exist with the recent launches of Sentinel-1A & B and the planned launch of the NISAR (NASA Indian Space Research Organization SAR) Mission (NISAR, 2018; Kellogg et al., 2020), which will provide operational and open access radar data streams.

SAR instruments offer unique capabilities for assessing crops due to their all-weather operation and their sensitivity to crop and field characteristics (e.g., dielectric constant, roughness, orientation) that differ from those derived from optical instruments. Historically, SAR has been used less than optical data for crop monitoring applications. These gaps exist due to limited data availability, the lack of consistent large-area operational acquisition strategies at appropriate scales, uncertainty in the historical digital elevation models required for processing, more complex data structures relative to optical data, and a lack of standardized workflows. Even more progressive data acquisition plans have not met the requirements for operational monitoring of crop landscapes because of the rapid growth and phenological transitions in these managed landscapes. For example, the Advanced Land Observing Satellite (ALOS-1) L-band acquisition strategy acquired an image in ScanSAR mode only once every 46 days for the same area and viewing geometry. For major crops and typical rotations, this Baseline Observation Scenario (BOS) yields only a handful of images during the cropping season, making higher-order products or practical monitoring applications challenging.
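For a sense of scale (the 120-day season length below is an illustrative assumption, not a value from the study), a 46-day exact-repeat cycle bounds the number of same-geometry acquisitions available within a single cropping season:

$$N_{\text{acq}} \approx \left\lceil \frac{T_{\text{season}}}{T_{\text{revisit}}} \right\rceil = \left\lceil \frac{120\ \text{days}}{46\ \text{days}} \right\rceil = 3$$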

A few select institutions, such as Agriculture and Agri-Food Canada (AAFC), have used SAR for operational, national-scale crop inventory efforts, benefiting from access to ample national C-band Radarsat microwave data (McNairn et al., 2009; Fisette et al., 2013). The consensus from the few national systems and many research studies utilizing SAR data for crop mapping is that SAR alone can effectively drive crop inventory where available, and that SAR data blended with optical data can improve classification accuracies, address gaps due to cloud cover, and provide more comprehensive information. In particular, the combination of optical and SAR imagery usually produces accuracies at least 5%–25% higher than optical approaches alone and can help capture unique structural features undetectable by optical instruments (Bouvet et al., 2009; Forkuor et al., 2014; Kussul et al., 2016; McNairn et al., 2009; Nelson et al., 2014; Skakun et al., 2016; Torbick et al., 2010, 2017, 2018).

Among the important characteristics of the NISAR mission are its wide swath (240 km), moderate spatial resolution (<30 m), 12-day repeat orbit cycle, and dual-frequency (24 cm L-band and 12 cm S-band) capability. The open access mission includes a large-diameter (12 m) deployable reflector and a dual-frequency antenna feed to implement SweepSAR wide-swath mapping technology, which allows for global access and relatively fast revisits (NISAR, 2018). With SweepSAR, the entire incidence angle range is imaged as a single strip-map swath, at full resolution depending on the mode, and with full polarization capability planned in some strategic regions. Most of the Earth's land surface will be imaged using a 20 MHz bandwidth that, once terrain geocoded, provides a range resolution similar to current moderate spatial resolution platforms (e.g., Sentinel-2 and Landsat-8 at 10–30 m). These resolutions will thus provide a unique ability to generate critical cropland metrics not previously achievable.
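As a rough check on the quoted figures (the 40° incidence angle below is an illustrative assumption, not a mission specification), the standard relations for slant- and ground-range resolution from a 20 MHz chirp bandwidth yield values in this range:

$$\delta_{r,\text{slant}} = \frac{c}{2B} = \frac{3\times10^{8}\ \text{m/s}}{2 \times 20\ \text{MHz}} = 7.5\ \text{m}, \qquad \delta_{r,\text{ground}} = \frac{\delta_{r,\text{slant}}}{\sin\theta_i} \approx \frac{7.5\ \text{m}}{\sin 40^{\circ}} \approx 11.7\ \text{m}$$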

The polarimetric capability of the NISAR system provides dual-polarized global background observations and select coverage with quad-polarized observations. Most regions will be observed with the dual-pol system, which transmits a horizontally or vertically polarized waveform and receives signals in both polarizations. Over land surfaces, the transmit polarization will principally be horizontal (H), with reception in both vertical (V) and horizontal polarizations, resulting in the polarization combinations known as HH and HV. For a limited set of targets, the NISAR mission will make fully or quasi quad-polarimetric measurements by alternating between transmitting H- and V-polarized waveforms and receiving both H and V (giving HH-, HV-, VH-, and VV-polarized imagery).

To help prepare for the oncoming growth of L-band SAR, a dedicated airborne campaign was carried out in 2019 using the Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) platform. The campaign design included observations to support the ‘Ecosystems’ disciplines, including the ‘agriculture’ component. For example, time series UAVSAR observations were carried out to support development of the NISAR baseline science requirement to measure cropland area seasonally with 80% or better accuracy at the hectare scale. The L-band UAVSAR has a noise equivalent better than −30 dB and produces single-look imagery with a resolution better than 2 m (Fore et al., 2015). Its range swath width is approximately 20 km, and it typically acquires data over hundreds of km² in a single data-take. The aircraft it flies on is specially modified to fly within a 10 m tube for the duration of a data-take, and the antenna is electronically steerable to compensate for aircraft yaw. Maximum flight duration is 6 h, and the aircraft flies at 13,000 m to avoid other civilian air traffic.

Several previous studies have utilized UAVSAR for crop mapping. Whelen and Siqueira (2017a, 2017b, 2018) evaluated the validity of a CoV algorithm, defined as the ratio of the standard deviation to the mean of a time series of orthorectified radar cross-section data, as a metric for distinguishing active cropland from non-cropland. They also showed the use of polarimetric parameters from the H/A/Alpha decomposition to classify California croplands with an overall accuracy of around 80%. Transferability of the CoV approach to dual-polarization Sentinel-1 was further evaluated across North Dakota and achieved an overall accuracy as high as 90% (Whelen and Siqueira, 2018).
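A minimal sketch of the CoV computation as described above, written in Python with NumPy; the array layout and the cropland threshold are illustrative assumptions rather than values taken from the study.

    import numpy as np

    def coefficient_of_variation(stack):
        """Per-pixel coefficient of variation over a radar time series.

        stack : ndarray of shape (n_dates, rows, cols), orthorectified radar
                cross-section in linear power units (e.g., an HV stack).
        Returns an array of shape (rows, cols) with
        CoV = temporal standard deviation / temporal mean.
        """
        mean = stack.mean(axis=0)
        std = stack.std(axis=0)
        # Avoid division by zero over no-data or persistently dark pixels.
        return np.divide(std, mean, out=np.zeros_like(mean), where=mean > 0)

    def cropland_mask(stack, threshold=0.5):
        """Label pixels with high temporal variability (growth, harvest,
        tillage) as cropland. The 0.5 threshold is purely illustrative,
        not a value calibrated in the study."""
        return coefficient_of_variation(stack) > threshold

In practice, the stack would be built from co-registered, terrain-geocoded acquisitions spanning the growing season, and the threshold would be tuned per region against field training data.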

A few recent studies have also implemented machine learning (ML) techniques to leverage UAVSAR for cropland monitoring. For example, Chen and Tao (2018) proposed a polarimetric-feature-driven Convolutional Neural Network (CNN) with UAVSAR and, similarly, Li et al. (2020) applied a random forest (RF) technique to UAVSAR decompositions and achieved accuracies upwards of 90% for crop type classes. This study builds upon these recent efforts to address existing challenges and help shape L-band agricultural strategies. Technical objectives of this research application include 1.) describing scattering mechanisms and the physical interpretation of cropland SAR parameters; 2.) evaluating the efficacy of L-band mapping performance across multiple independent agricultural production areas with field-scale training data; and 3.) assessing machine learning and time series phenology algorithms for generating prototype NISAR Level 2 products.
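As a hedged illustration of how such ML experiments are commonly set up (the feature layout, hyperparameters, and helper name below are assumptions for illustration, not the configuration used in this study), a random forest crop-type classifier over multi-temporal backscatter features might look like:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score, cohen_kappa_score

    def train_crop_type_classifier(X_train, y_train, X_test, y_test):
        """X_* : (n_samples, n_dates * n_channels) matrices of time series
                 backscatter (e.g., HH and HV in dB) sampled from training fields.
        y_*   : integer crop-type labels (corn, cotton, rice, soybean, ...)."""
        rf = RandomForestClassifier(n_estimators=500, n_jobs=-1, random_state=0)
        rf.fit(X_train, y_train)
        y_pred = rf.predict(X_test)
        # Report overall accuracy and Kappa, the metrics used in this paper.
        return rf, accuracy_score(y_test, y_pred), cohen_kappa_score(y_test, y_pred)

Stacking the per-date polarization channels into one feature vector is what lets such a classifier exploit the time series signal that this work identifies as more valuable than any single technique, site, or crop schema.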

Section snippets

Study sites and training data

Four strategic sites were selected based on landscape composition, ongoing activities and local partnerships, and flight operations (Fig. 1). These sites are independent and considered agricultural production hot spots in their respective regions. Crop type observations were collected within each site, leveraging ongoing agronomic research experiments supplemented with drive-by windshield survey transects during the growing season. These crop type observations served as accurate and precise …

Mapping approach

A complementary set of algorithms for analyzing and classifying the time series radar imagery was selected based on NISAR mission objectives and recent trends in crop mapping. An underlying goal of this approach was to understand the advantages, limitations, interpretation, and performance of these data and models, as opposed to identifying which performed best. The algorithms can be divided into three broad categories (Fig. 2): cropland vs. non-cropland phenology protocols, crop type using ML …

Results and discussion

Hundreds of classification experiments were run to assess scattering mechanisms, the performance of L-band SAR parameters, and the robustness of techniques for cropland monitoring. This was done with UAVSAR, simulated NISAR, and the field-scale training data. For brevity, results and discussion are presented together given the sheer number of classification experiments.

Conclusion

This research application evaluated time series L-band observations using UAVSAR and simulated NISAR observations for cropland and crop type mapping. Classification experiments showed that CoV using HV polarization performed consistently across independent sites with an overall accuracy (OA) around 80%. CoV HV largely captured the extent and distribution of major row crops across the four regions using the extracted time series phenology information, while ancillary data such as a water extent mask will likely help …

Declaration of Competing Interest

None.

Acknowledgment

This work is supported by NASA NISAR Award #80NSSC19K1511 and the NASA Harvest Consortium (No. 80NSSC17K0625). We thank the JPL UAVSAR team for the UAVSAR data and the collaborators who helped collect the ground measurement data, including A. Zadoorian and B. Moreno-García in Arkansas.

References (26)

  • G. Forkuor et al. (2014). Integration of optical and synthetic aperture radar imagery for improving crop mapping in northwestern Benin, West Africa. Remote Sens.

  • T. Hastie et al. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction.

  • Y. Huang (2009). Advances in artificial neural networks–methodological development and application. Algorithms.