The aftermath of a wildfire looks grim. Flames leave behind black soil, charred trunks, and crisped brown leaves. But every burn scar transforms with time. Some landscapes recover so fully that traces of the blaze disappear. Other sites can lose their original identity, so that stands of shrubs are replaced by grasses, or native plants are succeeded by invasive weeds.
The fate that awaits burned lands can be hard to predict. That's especially true for the oak savannas, grasslands, and chaparral surrounding California's most populous communities. Little research has been done on this topic, even though wildfire risks have intensified with climate change.
A new project will bridge this gap by applying artificial intelligence to analyze wildfire impacts on natural ecosystems. Drawing from a library of aerial images taken at some of the 41 reserves in the 51勛圖窪蹋 Natural Reserve System, the project will identify characteristics that help ecosystems rebound after wildfire. The researchers will package their findings into an online toolkit available to anyone seeking to analyze the fire vulnerability of their own landscapes.
"Land managers will be able to take annual snapshots of what's happening in their ecosystems by flying a drone, plugging the images into this pipeline, and gaining a long-term picture of how things are progressing over the landscape," says Gary Bucciarelli of 51勛圖窪蹋 Davis. The lead principal investigator on the project, Bucciarelli is director of the NRS's Lassen Field Station.
"Knowing how a system will respond naturally without any human intervention will let people know whether they need to get involved to sustain landscape health," says Derek Young, a 51勛圖窪蹋 Davis professional researcher who co-leads the project.
Climate research funding
Additional researchers include 51勛圖窪蹋 Davis professor Andrew Latimer, faculty director of the 51勛圖窪蹋 Davis Natural Reserve System; Shane Waddell, associate director of the 51勛圖窪蹋 Davis Natural Reserve System; and Todd Dawson, a 51勛圖窪蹋 Berkeley plant ecophysiologist.
The project is funded by two grants supporting climate resilience research. $230,000 comes from a grant that is part of the $100 million the University of California Office of the President received from the state of California to support climate innovation research. An additional $60,000 comes from a Seed Award from the Center for Information Technology Research in the Interest of Society and the Banatao Institute (CITRIS).
Building on existing imagery
The project capitalizes on a treasure trove of drone data gathered by the NRS's California Heartbeat Initiative (CHI). In 2018, CHI researchers including Dawson began surveying reserves with drones to correlate aerial image data with vegetation health. Wildfires burned four of those reserves in 2020. CHI researchers resurveyed the reserves immediately after the blazes to document fire impacts.
The current project will conduct additional flyovers to capture conditions several years post-fire. "We can compare those data to ask, how did oak trees, grasslands, and chaparral survive? And what is their survivorship rate over time after the fire?" Bucciarelli says.
Drones are particularly useful for surveying landscapes because they can cover a lot of ground quickly. A team of people might take an entire summer to study plots spanning a range of variables like vegetation type, sun exposure, and degree of slope. A drone can survey an area containing all of those characteristics, with enough duplication of conditions to provide representative samples, in a single afternoon.
Analyzing images with AI
The scientists will teach a computer to compare how each surveyed area has changed over time. AI has entered a golden era where the best-available technology is accessible to non-computer scientists. "Now there are software libraries that package these tools in a way that disciplinary experts like ecologists can use," Young says.
The first step is to identify the types of vegetation depicted in each drone image. To do this, a machine learning algorithm divides an overhead view of a reserve into sections based on visual characteristics of vegetation. This semantic segmentation process overlays colors on the image to distinguish different types of vegetation.
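As a loose illustration of what dividing an image by visual characteristics can look like, here is a minimal Python sketch that uses color clustering as a stand-in for the trained segmentation model a project like this would likely use; the file names and class count are hypothetical:

```python
# Minimal sketch: cluster drone-image pixels into vegetation classes by color.
# This is a stand-in for true semantic segmentation (the project would likely
# use a trained neural network); file names and class count are hypothetical.
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

image = np.asarray(Image.open("reserve_orthomosaic.png").convert("RGB"))
h, w, _ = image.shape
pixels = image.reshape(-1, 3).astype(float) / 255.0  # flatten to (n_pixels, RGB)

# Group pixels into 4 visually similar classes (e.g. grass, shrub, tree, bare ground).
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(pixels)
labels = kmeans.labels_.reshape(h, w)

# Overlay a distinct color per class, mimicking a segmentation map.
palette = np.array([[46, 139, 87], [154, 205, 50], [139, 90, 43], [200, 200, 200]])
overlay = palette[labels].astype(np.uint8)
Image.fromarray(overlay).save("segmentation_overlay.png")
```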
If the algorithm runs into difficulty determining what a group of pixels represents, it can consult a second source of information: canopy height. This approach takes advantage of the fact that drones gather at least two views of each point in their surveys. While surveying, drones fly a striped pattern akin to the path of a lawnmower, where each stripe partly overlaps the last. The computer can calculate how much the top of the vegetation shifts relative to the ground around it from one image to the next. This photogrammetry method produces a map of vegetation height. Low vegetation would be grassland; high vegetation, trees; and mid-height plants, shrubs.
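Turning such a height map into vegetation classes is then a matter of thresholding. A minimal sketch, with illustrative height cutoffs rather than any project-specified values:

```python
# Minimal sketch: classify a canopy height model (CHM) into vegetation layers.
# The CHM would come from photogrammetry software; the thresholds (in meters)
# are illustrative assumptions, not the project's actual cutoffs.
import numpy as np

def classify_by_height(chm: np.ndarray) -> np.ndarray:
    """Map per-pixel canopy height (meters) to 0=grass, 1=shrub, 2=tree."""
    classes = np.zeros(chm.shape, dtype=np.uint8)
    classes[(chm >= 0.5) & (chm < 4.0)] = 1   # mid-height vegetation: shrubs
    classes[chm >= 4.0] = 2                   # tall vegetation: trees
    return classes

# Example: a tiny synthetic height map (meters above ground).
chm = np.array([[0.1, 0.3, 2.5],
                [1.2, 6.0, 8.3],
                [0.0, 3.9, 5.1]])
print(classify_by_height(chm))
```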
Identifying resilience factors
Once semantic segmentation is complete, the computer can readily identify differences between images. Comparing scans from different times will reveal how ecosystems recover from wildfire's effects.
"We might in our pre-fire image see a lot of green in, say, a slope of chaparral, and in the immediate post-fire image that might become red like bare ground. And then in another few years it might be back to green as the chaparral resprouted and started to recover," Young says.
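Once two surveys are segmented and aligned, quantifying that kind of change reduces to per-pixel bookkeeping. A minimal sketch, assuming hypothetical class maps from before and after a fire:

```python
# Minimal sketch: per-pixel change detection between two aligned class maps
# (0=grass, 1=shrub, 2=tree, 3=bare ground). Maps and class codes are
# hypothetical examples, not the project's data.
import numpy as np

def transition_matrix(before: np.ndarray, after: np.ndarray, n_classes: int = 4):
    """Count how many pixels moved from each class to each other class."""
    counts = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(counts, (before.ravel(), after.ravel()), 1)
    return counts

pre_fire  = np.array([[1, 1, 2], [1, 2, 2], [0, 0, 1]])  # mostly shrub and tree
post_fire = np.array([[3, 3, 2], [1, 3, 2], [0, 0, 3]])  # burned patches -> bare

m = transition_matrix(pre_fire, post_fire)
shrub_survival = m[1, 1] / m[1].sum()  # fraction of shrub pixels still shrub
print(f"Shrub survivorship: {shrub_survival:.0%}")
```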
Tracking those changes will allow the researchers to pinpoint characteristics that jeopardize fire recovery. Slope, aspect (the cardinal direction a slope faces), and elevation are just a few of the attributes that can affect fire resilience.
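Terrain attributes like slope and aspect can be derived directly from the same photogrammetric elevation data and tested as predictors of recovery. A minimal sketch of that derivation, using a synthetic elevation grid and one common aspect convention (not any project-specific code):

```python
# Minimal sketch: derive slope and aspect from a digital elevation model (DEM)
# so they can be tested as recovery predictors. DEM values are synthetic, and
# the aspect convention assumes raster rows run north to south.
import numpy as np

def slope_aspect(dem: np.ndarray, cell_size: float = 1.0):
    """Return per-cell slope (degrees) and aspect (degrees clockwise from north)."""
    dzdy, dzdx = np.gradient(dem, cell_size)            # elevation change per meter
    slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
    aspect = np.degrees(np.arctan2(-dzdx, dzdy)) % 360  # downslope direction
    return slope, aspect

dem = np.array([[10.0, 11.0, 12.0],
                [10.5, 11.5, 12.5],
                [11.0, 12.0, 13.0]])  # terrain rising toward the southeast
slope, aspect = slope_aspect(dem, cell_size=10.0)
print(slope.round(1), aspect.round(0), sep="\n")  # gentle slopes facing northwest
```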
Vegetation type, too, plays a role in fire vulnerability. For example, scientists are particularly concerned about chaparral recovery. Fires tend to incinerate these native shrubs down to the root crown. Increasingly frequent fires may leave species such as manzanita and coffeeberry insufficient time to recover and resprout.
Land management guidance
The project team plans to package their findings into a handy toolkit that anyone can download from the web for free. The kit will include algorithms for analyzing drone images of chaparral, oak woodlands, and grassland habitats, as well as protocols for how to conduct drone surveys. As part of the project, the researchers will work out optimal flight altitudes, percent overlap of survey lines, and times of day when the light is best for surveying.
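To give a rough sense of the geometry involved, flight altitude and camera optics jointly determine how far apart survey stripes can be for a given overlap. A small sketch with hypothetical camera parameters (not the project's recommended settings, which the study aims to determine):

```python
# Minimal sketch: flight-line spacing needed for a target side overlap.
# Camera parameters and altitude below are hypothetical examples.
def line_spacing(altitude_m, sensor_width_mm, focal_length_mm,
                 image_width_px, side_overlap):
    """Return flight-line spacing (m) and ground sample distance (cm/px)."""
    footprint_m = altitude_m * sensor_width_mm / focal_length_mm  # ground width of one image
    gsd_cm = footprint_m / image_width_px * 100                   # cm of ground per pixel
    spacing_m = footprint_m * (1 - side_overlap)                  # adjacent-stripe spacing
    return spacing_m, gsd_cm

spacing, gsd = line_spacing(altitude_m=100, sensor_width_mm=13.2,
                            focal_length_mm=8.8, image_width_px=5472,
                            side_overlap=0.7)
print(f"Fly stripes {spacing:.0f} m apart; ~{gsd:.1f} cm of ground per pixel")
```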
The toolkit should enable anyone with access to a $2,000 drone to analyze the wildfire resilience of their ecosystems.
"People can say, this area is doing really well, but this area is not, and locate where they need to invest efforts and resources to improve recovery," Bucciarelli says. For example, if the project finds that invasive plant species put lands at higher burn risk, land managers can decide to control those plants or conduct prescribed burns to reduce fuels.
"They will have the data to actually improve circumstances, so that maybe their house doesn't burn down, or the ecosystem that surrounds them functions a little better and is healthier over the long term," he adds.
Digital infrastructure for the future
In addition to providing a resource for land managers, the project will also establish a repository for current and future NRS drone data. Both the new drone footage and the files gathered by CHI will be organized, annotated, and archived in an online portal.
"The dream is for this to be the seed of a much bigger effort where we have imagery over the majority of the NRS from different time points, all accessible through this portal," Young says.
Freely available to all, the repository should enable scientists to ask new questions, and help California gain a statewide perspective on ecosystem change.
Says Young, "We're really excited about building this methodology, where anyone could go and collect some data and process it into these sorts of maps for relatively low cost."
This article was originally published by the 51勛圖窪蹋 Natural Reserve System.
Media Resources
- Kathleen Wong, 51勛圖窪蹋 Natural Reserve System director of communications, kathleen.wong@ucop.edu.
- Kat Kerlin, 51勛圖窪蹋 Davis News and Media Relations, 530-750-9195, kekerlin@ucdavis.edu