Mapping Rangelands with Unmanned Aircraft

At Jornada Experimental Range in southern New Mexico, ongoing research is aimed at determining the utility of UAS for rangeland mapping and monitoring and developing an operational UAS-based remote sensing program for ecological applications.
 
By Andrea Laliberte

Unmanned aerial vehicles (UAV), also known as unmanned aircraft systems (UAS), are widely used for military purposes. However, civilian applications have increased considerably in recent years due to the greater availability of small unmanned aircraft, the miniaturization of sensors, and the flexibility of deployment. UAS are increasingly being used for natural resource applications such as fire and natural disaster monitoring, wildlife observation, and vegetation measurements in vineyards, crops, forests, and rangelands. The ability to deploy a UAS relatively quickly, repeatedly, and at low altitudes allows for acquiring remotely sensed data that capture changes in landscape processes and dynamics at very high resolution.

USDA scientists Al Rango and I are leading a team to conduct ongoing research at the USDA Agricultural Research Service’s (ARS) Jornada Experimental Range in southern New Mexico. The goal of this research is to determine the utility of UAS for rangeland mapping and monitoring and to develop an operational UAS-based remote sensing program for ecological applications. Over the last five years, we have developed workflows for acquisition, processing, and analysis of fine-detail UAS imagery and for relating remotely sensed information to ground-based measurements. 
 

UAS and Sensors

We currently operate two BAT 3 unmanned aircraft acquired from MLB Company in California. The BAT system consists of a fully autonomous GPS-guided UAS with a six-foot wingspan, a catapult launcher, a ground station with mission planning and flight software, and a telemetry system.

The UAS carries three sensors (Figure 1): a color video camera with live video downlink to the ground station, a Canon SD 900 10-megapixel digital camera, and a multispectral camera (from Tetracam, Inc., another California company) that acquires images in six narrow bands ranging from the blue to the near infrared.

The UAS is launched from a catapult and can be landed autonomously or manually via radio control. We generally fly at 700 feet (213 meters) above ground level, which results in a nominal ground-resolved distance of six centimeters for the Canon imagery and 14 centimeters for the multispectral imagery (Figure 2).
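As a quick sanity check, the Canon's ground-resolved distance can be estimated by dividing the image footprint (about 213 meters across, as noted below) by the pixel width of a frame. The pixel dimensions used here for a 10-megapixel frame are an assumption, not a specification from our workflow:

```python
# Back-of-envelope ground sample distance (GSD) check.
# Assumption: a 10-megapixel frame is ~3648 pixels wide; the ~213 m
# across-track footprint comes from the flight parameters above.

footprint_width_m = 213.0   # footprint width at 700 ft AGL
image_width_px = 3648       # assumed pixel width of a 10-MP frame

gsd_cm = footprint_width_m / image_width_px * 100.0
print(f"Approximate GSD: {gsd_cm:.1f} cm")  # ~5.8 cm, consistent with the stated 6 cm
```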

Imagery is acquired with a forward lap of 75% and side lap of 40% for photogrammetric processing into seamless image mosaics. The onboard computer records a timestamp, GPS location, elevation, pitch, roll, and heading for each image.
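The overlap settings translate directly into exposure spacing and flight-line separation. The short sketch below works these out, assuming the 160-meter footprint dimension lies along-track and the 213-meter dimension across-track (the camera orientation is an assumption):

```python
# Exposure spacing and flight-line separation implied by the overlaps.
# Assumption: 160 m of the footprint lies along-track, 213 m across-track.

along_track_m  = 160.0   # footprint length in the flight direction (assumed)
across_track_m = 213.0   # footprint width perpendicular to flight (assumed)
forward_lap = 0.75       # 75% forward overlap
side_lap    = 0.40       # 40% side overlap

photo_base_m   = along_track_m * (1.0 - forward_lap)   # distance between exposures
line_spacing_m = across_track_m * (1.0 - side_lap)     # distance between flight lines
print(f"Exposure every {photo_base_m:.0f} m, flight lines {line_spacing_m:.0f} m apart")
```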
 

Orthorectification and Mosaicking

Challenges associated with orthorectification of this type of imagery include:
  • the relatively small image footprints (213 meters × 160 meters), 
  • image distortion due to the use of non-metric cameras, 
  • difficulty in locating ground control points and in automatic tie point generation (initial data-processing tests with commercial photogrammetric software showed that auto tie-point matching performed poorly or failed), and
  • relatively large errors in exterior orientation (EO) parameters (X, Y, Z, roll, pitch, heading) due to low accuracy GPS/INS data.
Our goal was to reduce the cost and turnaround time of UAS image mosaic creation and to minimize human input. Our solution was to develop a semi-automated orthorectification approach suitable for handling large numbers of individual, small-footprint UAS images without manual tie points and with few or no ground control points.

We calibrate our cameras to determine the interior orientation parameters. Next, our approach uses a combination of off-the-shelf and custom software. We obtain tie points from the image-stitching software Autopano Pro, which has the advantage of automatically deriving tie points without requiring image sequence or flight line information.

The tie points and initial EO data are used as a starting point for our custom software PreSync that employs an image-matching approach between the UAS imagery and a one-meter digital orthoquad (DOQ). Imagery is matched individually, by flight line, and as a block to the DOQ, which provides accurate X, Y data for each tie point. Height information is derived from a five-meter IFSAR DEM. The PreSync process improves the accuracy of the initial EO parameters for subsequent orthorectification and mosaicking in Leica Photogrammetry Suite.
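PreSync is custom software, but the core operation it performs, matching a UAS image chip against a reference orthoimage, can be illustrated with normalized cross-correlation. The sketch below uses OpenCV template matching as a stand-in, not the PreSync implementation, and assumes single-band inputs already resampled to a common resolution:

```python
import cv2
import numpy as np

# Illustration only: locate an image chip within a reference orthoimage
# window by normalized cross-correlation. Not the PreSync implementation.

def match_chip(reference: np.ndarray, chip: np.ndarray):
    """Return the (x, y) offset of the best match and its correlation score."""
    result = cv2.matchTemplate(reference, chip, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    return top_left, score

# Synthetic usage: cut a 64x64 chip out of a random reference image.
reference = (np.random.rand(400, 400) * 255).astype(np.uint8)
chip = reference[100:164, 150:214].copy()
offset, score = match_chip(reference, chip)
print(offset, round(score, 2))   # expect (150, 100) with a score near 1.0
```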

Mosaics created with this process and using no ground control points have a positional accuracy of approximately one to two meters for mosaics consisting of 200 to 400 images. If higher accuracy is required, we use targets and ground control points measured with high-accuracy GPS to improve the accuracy to decimeters. To date, we have acquired over 36,000 UAS images, which have been processed into 125 image mosaics. 
 

High-resolution Terrain Data

For routine processing, we use the five-meter DEM in the orthorectification process. However, we can also extract fine-detail digital surface models (DSM) from the imagery (Figure 3). DSMs at 0.5 to 1 meter resolution offer valuable insights for hydrological studies: they allow us to evaluate elevation, slope, aspect, and curvature; derive watershed boundaries and channel networks; and calculate flow pathways. Even finer detail can be derived from dense DSMs based on 3D point cloud extraction at the pixel level, and we are currently evaluating the derivation of shrub and tree heights from these dense DSMs.
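As a minimal sketch of how terrain derivatives fall out of such a surface, the snippet below computes slope from a gridded DSM with numpy; the synthetic surface and 0.5-meter cell size are stand-ins, not our actual data. Aspect and curvature follow from the same elevation gradients.

```python
import numpy as np

# Minimal sketch: slope from a gridded DSM via finite differences.
# The DSM array below is a synthetic stand-in for a real 0.5 m tile.

def slope_degrees(dsm: np.ndarray, cell_size: float) -> np.ndarray:
    """Slope in degrees computed from per-cell elevation gradients."""
    dz_dy, dz_dx = np.gradient(dsm, cell_size)       # rise per meter, row/col
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

dsm = np.random.rand(200, 200).cumsum(axis=0) * 0.05  # gently tilted surface
slope = slope_degrees(dsm, cell_size=0.5)
print(f"Mean slope: {slope.mean():.1f} degrees")
```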

Image Processing and Classification

When we acquire multispectral imagery, we use radiometric targets and obtain ground-based reflectance measurements for the targets with a spectroradiometer. An empirical line calibration method is used to convert the digital numbers in the imagery to at-ground reflectance. Comparisons of image-derived reflectance values for vegetation and soils with ground-based measurements have shown good correlations, allowing us to compare data obtained at different times for change detection studies.
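A minimal sketch of the empirical line method follows: fit a linear gain and offset per band between the targets' image digital numbers (DN) and their field-measured reflectance, then apply the fitted line to the whole band. The target values below are hypothetical:

```python
import numpy as np

# Empirical line calibration sketch; the four target DN/reflectance
# pairs are hypothetical, not field data.

target_dn          = np.array([30.0, 95.0, 160.0, 220.0])  # mean DN per target
target_reflectance = np.array([0.04, 0.22, 0.45, 0.63])    # spectroradiometer values

gain, offset = np.polyfit(target_dn, target_reflectance, deg=1)  # per-band line fit

def dn_to_reflectance(band_dn: np.ndarray) -> np.ndarray:
    """Convert a band of DNs to at-ground reflectance with the fitted line."""
    return gain * band_dn + offset
```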

We use an object-based image analysis (OBIA) approach implemented in Trimble's eCognition for image classifications. OBIA is more suitable than pixel-based classification for high- and very-high-resolution imagery because it can mimic a human image-interpretation approach, taking into account not only spectral but also spatial, textural, and contextual image information. This is especially useful when analyzing imagery acquired only in the visible bands with the Canon camera.

The OBIA classification scheme is often hierarchical and employs a rule-based masking approach, where an image is first classified into easy-to-define classes (bare/vegetated, shadow/non-shadow) using rules, followed by further classification to the vegetation species level with a nearest neighbor classification (Figure 4).
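The sketch below illustrates the masking idea outside of eCognition: split the scene into vegetated and bare with a simple rule, then hand only the vegetated portion to a finer classifier. The excess-green index and its threshold are illustrative stand-ins for a real ruleset:

```python
import numpy as np

# Conceptual sketch of hierarchical rule-based masking (not eCognition).
# Rule 1 separates vegetated from bare; later steps would classify only
# the vegetated pixels or segments, e.g. with a nearest neighbor classifier.

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """Excess-green index (2G - R - B) from a float RGB array in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 2.0 * g - r - b

rgb = np.random.rand(256, 256, 3)     # stand-in for a visible-band image tile
veg_mask = excess_green(rgb) > 0.1    # illustrative threshold for "vegetated"
```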

Decision trees can be used to select suitable features for classification. For a study evaluating the relationship between ground- and image-based measurements in arid rangelands, we flew a mission in southwestern Idaho. We used OBIA to classify UAS imagery of 50-meter x 50-meter plots measured concurrently on the ground using standard rangeland monitoring procedures.

Image- and ground-based estimates of percent cover for bare ground, shrubs, and grass/forbs were highly correlated, with correlations ranging from 0.86 to 0.98. Time estimates indicated greater efficiency for the image-based method because a large number of plots could be extracted from the mosaics, while ground-based measurements required more personnel time and travel. 
 

Advantages and Limitations

There is growing interest in using UAS for remote sensing applications in natural resources. At the Jornada, we have found that UAS show great promise for remote sensing of rangelands due to their low flying heights, the resulting high-resolution imagery, and lower per-image acquisition costs compared to manned aircraft.

In addition, the ability to fly over a study area multiple times in a season and at multiple flying heights offers great opportunities for change detection studies. Current limitations include initial acquisition costs for the UAS, crew training requirements, limited availability of high-quality and lightweight sensors, and FAA regulations for operating a UAS in the national airspace.

Unmanned aircraft operations in the National Airspace System (NAS) fall under the jurisdiction of the Federal Aviation Administration (FAA). Public entities (local, state, and federal agencies) must apply for a Certificate of Authorization (COA); civil entities require a Special Airworthiness Certificate. The COA provides guidelines for operator qualifications, airworthiness, aircraft maintenance, flying altitudes, communication with air traffic control, visual line of sight, and visual observer requirements.

Current regulations limit UAS flights to visual line of sight, even though many UAS are capable of operating autonomously for several hours and at large distances from the ground station. These regulations limit the size of the area that can be mapped and add crew-training costs. Personnel must be trained in the safe operation of the UAS and must fulfill requirements for private pilot ground school and/or private pilot licensing and FAA medical certification. UAS regulations are subject to continuous review and are updated as required. The FAA maintains a website with the latest UAS regulations and policies at http://www.faa.gov/about/initiatives/uas/cert/.

Our six-member UAS team is trained in all UAS operations and fulfills the FAA requirements for operating a UAS in the NAS. Two of our team members hold private pilot licenses. In accordance with the regulations, our missions are conducted within visual range of the UAS, which limits us to about a half-mile horizontal distance at our flying height of 700 feet above ground.

We work closely with New Mexico State University's UAS flight test center, which has a COA for southern New Mexico. At the Jornada Experimental Range field station, we operate under this COA. Because portions of the Jornada are located in restricted airspace, we also operate the UAS in military airspace controlled by the adjacent White Sands Missile Range. In addition, we have obtained our own COAs for remote sensing missions at other USDA ARS field sites in Idaho and Arizona, where we flew the UAS to conduct rangeland mapping and evaluate invasive species. 
 

Outlook

Even under the current FAA limitations, we have found that UAS can be used to successfully obtain imagery for rangeland monitoring and that the remote sensing approach can either complement or replace some ground-based rangeland measurements. The accuracies achieved with our orthorectification approach are sufficient for rangeland monitoring purposes.

Our UAS imagery has been used for various purposes: adapting field-sampling approaches to very high-resolution imagery, deriving parameters for hydrologic models, supporting repetitive data analysis for a phenology pilot study, assisting in the evaluation of disturbance experiments, detecting invasive species, and contributing high-resolution information to archeological studies. Future plans include streamlining the processing of point clouds for improved 3D data extraction at very high resolution and scaling the image analysis approaches to larger areas.

We are also in the process of acquiring a larger UAS with a 13-foot wingspan and a 35-pound payload capacity. This will allow us to integrate and test additional sensors, such as thermal, hyperspectral, and lidar instruments.

Andrea Laliberte is a remote sensing scientist, formerly with the USDA ARS Jornada Experimental Range in Las Cruces, New Mexico, and currently with Earthmetrics in Oregon. She can be contacted at andrea.laliberte@earthmetrics.com.
