
The proof is in the vegetation map


Coastal marshland, such as that in Terrebonne Parish’s area of interest, is notoriously difficult to survey.
Photo: Jenneke Visser


One scientist is combining drone data with image analysis technology to map coastal marshlands.

By Mary Jo Wagner

It isn’t often that “exhilarating” is used to describe vegetation mapping surveys. But Whitney Broussard, Ph.D., a senior scientist at JESCO, an environmental and geotechnical services company in Jennings, La., chose that word, twice, in describing a drone-based survey along the state’s coastal marsh.

“Our area of interest (AOI) was 4 kilometers south of the nearest terra firma,” he said. “We strapped the catapult down on our research boat and launched the drone off the bow over open water. It was like launching a mini airplane off a mini aircraft carrier. We chased it home doing 40 mph down the canal and then right as the drone was finishing, we sent the final command for it to land on open grass near a boat launch. It was exhilarating.”

Part of Broussard’s excitement might have stemmed from that being his first commercial unmanned aircraft system (UAS) flight. But a larger contributor to that enthusiasm was the fact that the flight was the foundational survey for a successful pilot project testing the viability of using hyperspatial imagery with object-based image analysis (OBIA) technology to improve the accuracy and detail of vegetation mapping in coastal wetlands — environments that are notoriously devilish to survey.

“Coastal marshland is extremely difficult to map, both from the air and the ground,” Broussard said. “Accessing it via airboat and foot can disrupt the vegetation you’re trying to protect. Accuracy is an issue because it can be difficult to establish control and often the GPS technology used isn’t survey grade. And traditional aerial surveys and image processing techniques aren’t fine enough to precisely classify the land/water boundary and varied vegetation.”

Broussard’s UAS-OBIA application, however, aims both to resolve these unique wetlands-mapping problems and to improve on traditional survey methodologies. By pairing the spectral richness of UAS imagery with OBIA’s rapid, intelligent classification, his integrated solution is not only beginning to yield new revenue streams for JESCO; it may also help redefine the business of vegetation mapping for state and local authorities.

Bucking tradition

A comparison of the hyperspatial resolution of UAS data (above) and the traditional 1-meter aerial imagery over the same site at the Rockefeller Wildlife Refuge.

Louisiana’s coast is home to 2.5 million residents — more than half the state’s population — as well as 37 percent of all the coastal marshes and habitat in the continental U.S. Despite the importance of such vital, natural resource assets, Louisiana has lost nearly 1,900 square miles of land since the 1930s. Without action, the coast could lose up to another 4,100 square miles during the next 50 years.

To help combat the natural challenges of its coastline, the state’s Coastal Protection and Restoration Authority (CPRA), with funding and support from the U.S. Geological Survey (USGS) and the Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA), established the Coastwide Reference Monitoring System (CRMS), a network of 390 stations across the entire Louisiana coast that measures the health of the coastal marsh vegetation and the impacts of erosion.

Although the CPRA and USGS have been routinely mapping and monitoring the CRMS sites with 1-meter aerial imagery, that coarse resolution only enables them to map the land/water interface, not vegetation. Ground surveys have been used to measure the vegetation: teams of scientists walk the marsh along predetermined transects, drop a 1-square-meter PVC quadrat at random locations, record the location with a handheld GPS, and visually determine the types of vegetation and their coverage values.

That traditional approach is time-consuming, and the 1-meter imagery analysis tends to overestimate land cover percentages. Broussard has been using the CRMS data in his research to calibrate and validate a modeling approach that automatically classifies UAS imagery and creates vegetation models using Trimble’s eCognition OBIA technology. eCognition works by following user-defined processing workflows called rule sets to automatically detect, classify, and map specified objects.

“The traditional CRMS data detail and accuracy haven’t been exact enough to create precise vegetation models,” Broussard said. “When I began working with JESCO’s drone data, I thought the hyperspatial, fine-scale imagery would be a natural fit for developing an OBIA-based technique. Unlike traditional image-processing methodologies, the OBIA software could handle the high spectral variance and subtleties of the hyperspatial data. I wanted to test the feasibility of combining the two technologies to produce meaningful coastal vegetation maps that could supplement the state’s traditional monitoring programs.”

In the spring of 2016, Broussard got his chance. JESCO had been contracted to fly its Trimble UX5 Multispectral UAS over a restoration site in Terrebonne Parish, a dense marshland region near the Gulf of Mexico. With guidance and field support from his postdoctoral mentor Jenneke Visser, Ph.D., at the University of Louisiana at Lafayette, Broussard successfully pitched the idea of extending the field work to fly over a CRMS site in the same area to test his proof of concept.

All the colors of the marsh

Broussard chose the nearest 1-square-kilometer CRMS site for his AOI. Its moderate-sized footprint and simple vegetation pattern offered a nice testbed for the pilot. The UX5 flights were scheduled for late August 2016, allowing them to capture data during the peak biomass season.

To ensure the reliability and accuracy of the UAS data, they set out five ground control points (GCPs) for each flight block. Carrying a handheld GPS, they navigated to each predefined location, laid down an elevated target designed not to disturb the vegetation, and used a Trimble Geo 7X GNSS handheld unit to record the GCP’s position via RTK corrections from a VRS network. After a day’s work, the team had placed 11 GCPs throughout the three overlapping flight blocks, with some targets serving more than one block.

Broussard uses a Trimble Geo 7X with a Zephyr antenna and VRS real-time corrections to survey the “marsh elevation,” a height measurement on the surface of the marsh soil.

Based on his experience using UAS data for wetlands-mapping research, Broussard knew that collecting imagery over patches of open water would be a photogrammetry challenge. So in addition to outfitting the UX5 with a Sony Alpha 5100 sensor for natural color (RGB) imagery, he added a modified Sony NEX-5 near-infrared (NIR) sensor to the drone’s payload. The NIR reflectance values would help them better delineate the land/water interface and differentiate between vegetation species.

Using Trimble’s Aerial Imaging flight-planning software, Broussard established three flight blocks over the AOI. All three were flown with the RGB sensor; the middle block was also flown with the NIR sensor. Launched from the boat about 4 kilometers from the landing site, the UAS flew at an altitude of 75 meters and speeds of 50 mph. The team followed the drone by boat, maintaining constant communication and line-of-sight for each flight, and then guided it back to the ground. In three hours of total flight time, the UX5 collected 4,106 images over the entire AOI at a ground sample distance (GSD) of 2.5 centimeters, a level of detail that equates to roughly 98.5 billion raw pixels captured over the 1-square-kilometer AOI.
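A quick back-of-envelope check squares those numbers, assuming the Alpha 5100’s 6,000-by-4,000-pixel frame (an assumption; the article gives only the totals). The 98.5 billion figure counts every pixel captured across all images; the unique ground coverage at 2.5 centimeters is far smaller, because photogrammetry demands heavy forward and side overlap:

```python
# Back-of-envelope check of the survey's pixel totals (frame size assumed).
IMAGES = 4106               # images collected over the AOI (from the article)
FRAME_PIXELS = 6000 * 4000  # assumed 24 MP Sony Alpha 5100 frame
GSD = 0.025                 # ground sample distance, meters per pixel
AOI_M2 = 1_000_000          # 1 square kilometer, in square meters

raw_pixels = IMAGES * FRAME_PIXELS   # ~98.5 billion pixels captured
ground_pixels = AOI_M2 / GSD**2      # ~1.6 billion unique ground pixels
print(f"{raw_pixels / 1e9:.1f} billion raw pixels captured")
print(f"{ground_pixels / 1e9:.1f} billion unique ground pixels at 2.5 cm GSD")
```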

Of the four flights, Broussard chose the overlapping RGB and NIR flights, totaling 1,984 images, as his test-case imagery sources. Using Trimble’s Inpho UASMaster software, he first generated two digital surface models (DSMs), one from the RGB data and one from the NIR data, and then used the DSMs to produce orthomosaics of each. The orthomosaics and DSMs had horizontal and vertical accuracies of 2.4 centimeters. All of those products were used as source data for eCognition.

“A key advantage of drone data over traditional aerial photography is the spatial resolution,” Broussard said. “You can see a shadow behind a leaf, the individual plant stems, and the different color tones from one leaf to the next. Those intricate reflectance and elevation values enable you to build point clouds and elevation models that the OBIA technology can use to accurately delineate land from water and classify vegetation.”

A mapping success

For the eCognition process, Broussard began by building a rule set that would instruct the software to methodically isolate and classify image objects according to his user-defined plan. Taking the DSM and color-infrared orthomosaic as inputs, he developed a two-tiered rule set to first delineate land and water, and then further divide the land into three vegetation classes: Grass (Spartina patens), Reed (Phragmites australis), and Other.

The rule set first segmented the data stack into meaningful objects using a multispectral segmentation algorithm defined by NIR, red, green, and DSM thresholds along with object scale, shape, and compactness specifications; the segmentation gave the most weight to NIR reflectance and object compactness while also considering vegetation height. Relying predominantly on the NIR information, Broussard manually defined a water-mask threshold for the software: objects with values below the threshold were classified as water, and objects above it were classified as land. To refine the water/land classification, he applied a minimum mapping unit (MMU) of 1.3 square meters and instructed eCognition to reclassify all water features smaller than the MMU as land.
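Rule sets live inside eCognition’s own environment, but the logic of that water/land step can be sketched in Python with NumPy and scikit-image. This is a minimal illustration of the same idea, not Broussard’s actual rule set; the NIR threshold below is a hypothetical placeholder:

```python
import numpy as np
from skimage.morphology import remove_small_objects

GSD = 0.025    # ground sample distance, meters per pixel
MMU_M2 = 1.3   # minimum mapping unit from the article, square meters
MIN_PIXELS = int(MMU_M2 / GSD**2)  # 1.3 m^2 at 2.5 cm GSD = 2,080 pixels

def land_water_mask(nir: np.ndarray, nir_threshold: float = 0.1) -> np.ndarray:
    """Classify water where NIR reflectance falls below a user-defined
    threshold (water absorbs NIR strongly), then reclassify water patches
    smaller than the MMU as land. Returns True for land, False for water."""
    water = nir < nir_threshold                          # hypothetical threshold
    water = remove_small_objects(water, min_size=MIN_PIXELS)
    return ~water
```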

Broussard produced a digital surface model (DSM) of the Rockefeller Wildlife Refuge area of interest based on the UAS natural color and near infrared datasets. The DSM elevation data was key to differentiating vegetation classes.
eCognition delineated land from water and classified the Rockefeller Wildlife Refuge area of interest into four vegetation classes in less than two hours.

The land-class objects were then merged and re-segmented: the rule set first ran a nearly identical multispectral segmentation algorithm, and then, following user-defined spectral rules, the software identified and combined new objects with similar spectral signatures. This created large, compact, roughly circular objects that accounted for the height of the vegetation. Reeds were classified first by analyzing the average height of each object (reeds are taller than the surrounding grasses), and then by the average height difference between each object and its neighbors.
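That height test can be approximated outside eCognition by averaging the DSM within each segmented object. A sketch, assuming an integer-labeled segmentation raster aligned with the DSM:

```python
import numpy as np
from scipy import ndimage

def mean_object_heights(labels: np.ndarray, dsm: np.ndarray) -> dict:
    """Mean DSM height per segmented land object (label 0 = background).
    Objects standing well above their neighbors would be flagged as Reed."""
    ids = np.arange(1, labels.max() + 1)
    heights = ndimage.mean(dsm, labels=labels, index=ids)
    return dict(zip(ids.tolist(), heights))
```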

To classify the Other vegetation, Broussard integrated a Normalized Green-Red Difference Index with Grey Level Co-occurrence Matrix (GLCM) dissimilarity and contrast indices to define texture values. eCognition used that information to identify the Other vegetation objects by a combination of their greenness values and their texture. The remaining land objects were then classified as Grass, the dominant vegetation type for this landscape.
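Both measures have standard formulations. A minimal sketch with scikit-image, assuming band arrays scaled to [0, 1] (the per-object patch extraction is omitted):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from skimage.util import img_as_ubyte

def ngrdi(green: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Green-Red Difference Index: (G - R) / (G + R)."""
    return (green - red) / (green + red + 1e-9)

def glcm_texture(patch: np.ndarray) -> tuple[float, float]:
    """GLCM dissimilarity and contrast for one grayscale image-object
    patch (values in [0, 1], quantized to 8 bits)."""
    glcm = graycomatrix(img_as_ubyte(patch), distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    return (graycoprops(glcm, "dissimilarity")[0, 0],
            graycoprops(glcm, "contrast")[0, 0])
```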

“The magic of eCognition is in its segmentation,” Broussard said. “Once you have set your parameters within the segmentation process, eCognition uses those parameters to group pixels into units that share similar attributes and categorize them. It mimics the human brain’s process of identifying objects through pattern recognition. Using a rule set ensures that you are capturing the objects you want classified because the software won’t deviate from the rules. That’s why OBIA is able to methodically and repeatedly do something that humans can’t do.”

Broussard exported the classifications as shapefiles and used ArcGIS to finalize the cartography and perform a spatial analysis, calculating the percent coverage for each vegetation type and for land versus water. Based on a comparative analysis with the CPRA’s 2012 data, he identified 100 percent of the vegetation types, calculated plant heights with 88 to 94 percent accuracy, and produced a land-water interface map that was “strikingly more detailed.”
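The percent-coverage step itself is a straightforward pixel count over the classified raster; a sketch with hypothetical class codes:

```python
import numpy as np

def class_coverage(classified: np.ndarray, codes: dict[str, int]) -> dict[str, float]:
    """Percent of the AOI covered by each class in a classified raster."""
    total = classified.size
    return {name: 100.0 * np.count_nonzero(classified == code) / total
            for name, code in codes.items()}

# Hypothetical class codes for the CRMS pilot's classes:
# class_coverage(raster, {"Water": 0, "Grass": 1, "Reed": 2, "Other": 3})
```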

“With drones and OBIA technology, instead of producing point data every few hundred feet, we produce models every few centimeters,” Broussard said. “That gives users an incredibly data-rich product to help them better assess vegetation health and quantify the rate of wetland loss and changes in the coastal zone.”

Indeed, after the team presented its maps to the CPRA in the spring of 2017, a wetland scientist expressed interest in developing a new drone-based method for marsh-creation monitoring and incorporating it into the agency’s traditional monitoring campaigns this fall.

“That kind of response and validation says that the project was a success,” Broussard said. “And it was a significant test-case success for JESCO too, which hadn’t been focused on vegetation mapping previously. It’s given us the opportunity to take on more of this work.”

New business takes flight

That new business, in fact, began only a few months after the Terrebonne Parish project was completed, when survey company C.H. Fenstermaker tasked JESCO with surveying and classifying a wetlands mitigation bank at the Rockefeller Wildlife Refuge (RWR), a 71,500-acre wildlife and fisheries refuge in southwestern Louisiana.

To offset ecological losses from infrastructure improvements, the RWR developed a 107-acre wetlands mitigation bank and, in 2010 and 2012, planted a variety of wetland grasses across the site. The refuge first surveyed the area in 2016, using airboats and visual inspections to determine the vegetation species present and their coverages within a 2-meter-by-2-meter PVC quadrat on the ground.

Based on that initial survey, scientists were concerned that repeated measurements of the vegetation with this methodology could harm the wetlands’ health; airboat trails can cause permanent damage. The RWR wanted a less invasive, more accurate approach. In November 2017, Broussard went to the RWR to fly and map the mitigation bank’s vegetation.

Working in tandem with Fenstermaker surveyor Ricardo Johnson, Broussard established eight GCPs, setting them at 300- to 600-meter intervals and recording their positions with a Trimble R7 base station and an R6-4 rover RTK GNSS receiver.

They flew four UX5 flights — two RGB missions and two NIR — using the same Sony sensors. Given the size and location of the site, Broussard could control and monitor each 30-minute flight from one location. In total, the UX5 collected 4,899 images with a 2.5-cm GSD. Each dataset was used to produce a DSM and an orthomosaic for input into eCognition.

Having created a “master” rule set on the CRMS project, Broussard needed only to slightly modify it to accommodate the different vegetation classes. In less than two hours, eCognition had delineated the land/water boundaries and classified the objects into four classes: Grass (Spartina sp.), Reed (Phragmites australis), Shrub/Scrub, and Impervious.

“One of the key elements that made it possible to distinguish this particular vegetation was the DSM, which you only get with the drone data,” Broussard said. “Having the elevation values allowed me to differentiate the species, particularly the Phragmites australis and the shrub, which are really challenging because they have similar textures and NIR reflectance.”

Although Broussard is currently analyzing RWR’s 2016 survey report to evaluate the accuracy of the classification, the refuge managers’ response to the OBIA-based vegetation map has been incredibly positive.

“A detailed, OBIA-based vegetation map gives managers a meaningful measurement of their wetlands environment,” Broussard said. “It also provides a highly accurate map of their wetland acreage — rather than just an estimation — which they can use in their required reporting to authorities.”

Based on the early successes of this new operational application, Broussard sees a bright future for UAS and OBIA technology.

“Combining these technologies opens up tremendous possibilities for monitoring changes on the ground. Coastal environments will still be challenging and dynamic. But with this integrated approach, we can not only replicate and supplement traditional monitoring methodologies, we can produce precise vegetation and land cover maps at scales and speeds that we couldn’t ever imagine or do as a human.”

If Broussard’s vision proves correct, he may experience more exhilarating fieldwork.


Mary Jo Wagner is a freelance writer, editor, and media consultant based in Vancouver, B.C.