Part 2 – Why Bother Doing Prescribed Burn Monitoring?

The ecological effects of prescribed burns can vary a lot depending on seasonality, frequency, and fire intensity. In recent years, we have been using a standardized protocol to monitor the intensity of prescribed burns; see Part 1 of this blog post for details. Basically, it combines post-burn photos taken from permanent photo point locations with records of the percent burn coverage, average char height, and fire intensity category in the vegetation and substrate layers. Standardized monitoring that characterizes burn intensity gives us a more nuanced way to assess how prescribed burns affect plant communities.
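
For a concrete sense of what each observation captures, here is a minimal sketch in Python of a monitoring record. The field names and types are my own illustrative shorthand for the protocol described in Part 1, not its official data sheet.

```python
from dataclasses import dataclass

@dataclass
class BurnMonitoringRecord:
    """One post-burn observation at a permanent photo point.

    Field names are illustrative shorthand for the Part 1 protocol.
    """
    photo_point_id: str           # permanent photo point location
    burn_date: str                # e.g. "2001-03-28"
    percent_burn_coverage: float  # 0-100, within the monitored unit
    avg_char_height_ft: float     # average char height on woody stems
    vegetation_intensity: float   # fire intensity category, vegetation layer
    substrate_intensity: float    # fire intensity category, substrate layer
    photo_filename: str           # post-burn photo from the photo point
```

Even a simple, consistent record like this, repeated burn after burn, is what makes later comparisons possible.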

I wish we had started using a standardized protocol to assess fire severity decades ago. Here’s why: one of the most interesting and fun projects of my career was a 20-year repeat of a 1996 photo point monitoring study. It was a landscape-scale scavenger hunt combined with an ecologically interesting snapshot of two decades of change. When I analyzed the changes between 1996 and 2016, a dominant trend was that most of our woodlands had an increased density of understory trees. In general, the woodland floors had gotten darker and herbaceous vegetation had become sparser during that period. One exception to this trend was at Raccoon Grove Nature Preserve in Monee, IL. Unlike all the other woodlands, Raccoon Grove had fewer understory trees in 2016 than in 1996 and was visibly sunnier and brighter.

This sunnier understory at Raccoon Grove was mystifying to me because the only ecological management we had done during that period was routine prescribed burning. When I pulled the burn records for the period between 1996 and 2016, two of the burns at Raccoon Grove had been done in the spring (3/28/01 and 4/3/03) under relatively “cool” conditions (rH 50%+, temps 35-50°F). One fall burn (11/24/98) was done under “moderate” conditions (rH 24-50%, 50°F) in dry leaf litter. All three burns were reported to have 75-90% burn coverage, and they all seemed to be very normal prescribed burns. Based on the general descriptions given in the reports, none of the burns was the type of truly hot fire that I would expect to kill trees and thin the understory density.

An earlier burn’s report had some crumbs that indicated a potentially hotter fire: this burn was done in the spring (4/2/1996), and conditions were described as “hot” in the prairie and along the forest edge but “cool” in the interior forest. Although weather conditions were not unusual for prescribed burns (rH 35-40%, temps 50-70°F), the burn report references smoldering logs and indicates that the fire reignited overnight and had to be put out on a neighbor’s property the next day.

I wish we had used a better protocol to characterize the severity of prescribed burns back then. I would love to know whether a more complete and intense burn in the substrate layer from just that one burn was responsible for the understory tree thinning that followed. Or perhaps the frequency (four “solid” burns in seven years) was the bigger factor. With better monitoring, we could tease apart the different factors contributing to a fire’s effects, and we would know better how to prescribe burn conditions to meet specific management outcomes.

Prescribed burning is one of the ecological management activities with higher risks to human and wildlife safety, which makes it important to verify that prescribed burns are meeting management objectives. It doesn’t take much extra time to monitor prescribed burns and to accumulate data on how fire intensity influences vegetation communities. No need to wait: start using a standardized protocol to monitor your fires’ effects during this burn season. Our future selves and future ecologists will thank us!

Comments

Fire intensity also varies a lot within cover types depending on fuel load, which in turn depends on time since the previous burn. Recognizing that is critical when trying to predict how a fire will behave.
Significant portions of many upland prairies, savannas, and woodlands become fuel-limited if burned annually or very frequently, meaning fuel loads and continuity are inadequate for the entire site to burn. A 3- or 4-year interval, by contrast, allows fuels to reach levels closer to equilibrium, which often means fire can burn quite intensely unit-wide. In the short term, opening things up (and, in very degraded sites, preparing a good seed bed…degraded woods, savannas, etc. need it) may hinge more on high intensity, but maintaining ecological integrity in woods and prairies depends more on maintaining herbaceous sod health, which generally means lower intensity. Above, 1996 looks like it was the first fire, and probably the most intense. That’s often the case when sites receive regular fire at reasonably short intervals afterward. There were probably both more light and heavy fuels when the 1996 burn was conducted than there were for any of the subsequent burns.
So what I’m saying is: don’t just look at the weather, fuel type, or post-fire effects. To understand fire effects, you’ve also got to look at the pre-burn fuel load in the context of how long it has been since the site last burned, if it has burned at all in recent times.
Thanks Chris. Your comments about fire intensity and fire frequency remind me of Gerry Wilhelm’s talk on fire, where he touts the benefits of “annual autumnal fire” in which the fire “scuds across the landscape,” leaving micro-patches of cool burned spots, rather than “par boiling” a seldom-burned site with an intense fire. It is an interesting topic.
This is Dan Carter. Yeah, Wilhelm and Rericha make the same point in their report/plan for the Hitchcock Nature Center in the Loess Hills, and everything in my experience agrees with that. I think the annual, dormant-season part is most important, but in fall in particular the bases of bunchgrasses (and sedges) and the coarse, pithy stems of many other herbaceous plants aren’t cured to the point that they burn…unless there are high fuel loads. …and also, in mesic (yes, mesic) to dry prairies, soil crusts develop, which create immense surface complexity, but only where litter doesn’t build up, and fire is cool and transient enough not to kill those crusts only when fire is quite frequent (except on the driest prairie). The bases of bunchgrasses, pithy stems, and nooks and crannies in the soil surface pretty much cover the invertebrates thought to be vulnerable. I personally don’t think flightless planthoppers were managing much dispersal even in an intact landscape. I think the fires that occurred were *mostly* of the benign type Wilhelm talks about.
Hi prairiebotanist, I didn’t include this information, but the area at Raccoon Grove had been burned in 1990 and 1993, so the 1996 burn referenced in the post wasn’t the area’s first burn. I agree that fuel loading is a factor that influences a fire’s intensity (in addition to weather, seasonality, firing techniques, etc.), but fuel loading can be roughly estimated from fuel model type and time since last burn. I still wish the ecologists at the time had done basic monitoring and documentation of fire intensity in a standardized, systematic way for those 1990s-era burns, because it’s a basic but missing piece of data. It would be useful to be able to tease out whether the plant community shifts resulted from a particular burn (due to intensity, seasonality, or both) or were a cumulative effect of multiple burns (frequency plus intensity, seasonality, etc.).
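
To make that “rough estimate” concrete, here is a minimal sketch in Python of the kind of back-of-the-envelope calculation I mean, assuming an Olson-style negative-exponential fuel accumulation curve. The steady-state load and accumulation constant below are hypothetical placeholders, not measured values for any particular fuel model.

```python
import math

def estimated_fuel_load(years_since_burn: float,
                        steady_state_load: float = 5.0,  # tons/acre, hypothetical
                        k: float = 0.35) -> float:
    """Olson-style negative-exponential fuel accumulation.

    Fuel load approaches steady_state_load as litter input and
    decomposition come into balance; k controls how quickly.
    """
    return steady_state_load * (1.0 - math.exp(-k * years_since_burn))

# Rough comparison: an annually burned unit vs. a 3-4 year interval
for t in (1, 2, 3, 4, 10):
    print(f"{t} yr since burn: ~{estimated_fuel_load(t):.1f} tons/acre")
```

With these placeholder numbers, a unit burned every year carries well under a third of its equilibrium fuel load, while a 3- or 4-year interval gets most of the way there, which is the fuel-limitation pattern described in the comments above.
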
You’re all illustrating my point beautifully! Are we collectively doing enough to monitor and document which burns are “par boiling” (our category 2) versus “scudding across the landscape” (category 3.5 or 4), and all of the gradations in between? If we take the time to do some easy monitoring/documentation in a systematic way, we’ll have better information to analyze and discuss the effects of burns.