Part 2 – Why Bother Doing Prescribed Burn Monitoring

The ecological effects of prescribed burns vary widely depending on seasonality, frequency, and fire intensity. In recent years, we have been using a standardized protocol to monitor the intensity of prescribed burns (see Part 1 of this blog post for details). In short, it combines post-burn photos taken from permanent photo point locations with records of percent burn coverage, average char height, and fire intensity category in the vegetation and substrate layers. Standardized monitoring that characterizes burn intensity gives us a more nuanced way to assess how prescribed burns are affecting plant communities.
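For readers who keep their monitoring data digitally, the observations above could be captured in a simple structured record. The sketch below is purely illustrative: the field names, units, and intensity-category labels are my assumptions for the example, not the protocol's official data sheet.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical category labels for this example, not the official protocol's.
INTENSITY_CATEGORIES = ("unburned", "low", "moderate", "high")

@dataclass
class BurnObservation:
    photo_point_id: str           # permanent photo point location
    burn_date: date
    percent_burn_coverage: float  # 0-100, estimated at the photo point
    avg_char_height_cm: float     # average char height on woody stems
    vegetation_intensity: str     # fire intensity category, vegetation layer
    substrate_intensity: str      # fire intensity category, substrate layer

    def __post_init__(self):
        # Basic validation keeps field-sheet transcription errors out of the data.
        if not 0 <= self.percent_burn_coverage <= 100:
            raise ValueError("percent_burn_coverage must be between 0 and 100")
        for cat in (self.vegetation_intensity, self.substrate_intensity):
            if cat not in INTENSITY_CATEGORIES:
                raise ValueError(f"unknown intensity category: {cat}")

# Example record (hypothetical values):
obs = BurnObservation("RG-04", date(2003, 4, 3), 85.0, 30.0, "moderate", "low")
```

Keeping the vegetation and substrate layers as separate fields matters here: as the Raccoon Grove story below suggests, a burn can look routine above ground while behaving very differently in the substrate.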


I wish we had started using a standardized protocol to assess fire severity decades ago. Here’s why: one of the most interesting and fun projects of my career was a 20-year repeat of a 1996 photo point monitoring study. It was a landscape-scale scavenger hunt combined with an ecologically fascinating snapshot of two decades of change. When I analyzed the changes between 1996 and 2016, the dominant trend was that most of our woodlands had developed a denser understory of trees. In general, the woodland floors had grown darker and the herbaceous vegetation sparser over that period. One exception to this trend was Raccoon Grove Nature Preserve in Monee, IL. Unlike all the other woodlands, Raccoon Grove had fewer understory trees in 2016 than in 1996 and was visibly sunnier and brighter.


This sunnier understory at Raccoon Grove mystified me, because the only ecological management we had done during that period was routine prescribed burning. When I pulled the burn records for the period between 1996 and 2003, two of the burns at Raccoon Grove had been done in the spring (3/28/01 and 4/3/03) under relatively “cool” conditions (RH 50%+, temps 35-50°F). One fall burn (11/24/98) had been done under “moderate” conditions (RH 24-50%, 50°F) in dry leaf litter. All three burns were reported to have 75-90% burn coverage, and all appeared to be routine prescribed burns. Based on the general descriptions in the reports, none of them was the kind of truly hot fire I would expect to kill trees and thin the understory.


An earlier burn’s report contained clues pointing to a potentially hotter fire: that burn was done in the spring (4/2/1996), and conditions were described as “hot” in the prairie and along the forest edge but “cool” in the interior forest. Although weather conditions were not unusual for prescribed burns (RH 35-40%, temps 50-70°F), the burn report mentions smoldering logs and notes that the fire reignited overnight and had to be extinguished on a neighbor’s property the next day.

I wish we had used a better protocol to characterize the severity of prescribed burns back then. I would love to know whether a more complete and intense burn in the substrate layer from that single burn was responsible for the understory tree thinning that followed. Or perhaps the burn frequency, four “solid” burns in seven years, was the bigger factor. With better monitoring, we could tease apart the different factors contributing to a fire’s effects, and we would know better how to prescribe burn conditions to meet specific management outcomes.


Prescribed burning is one of the ecological management activities that carries higher risks to human and wildlife safety. That makes it especially important to ensure that prescribed burns are meeting management objectives. It takes little extra time to monitor prescribed burns and to accumulate data on how fire intensity influences vegetation communities. No need to wait: start using a standardized protocol to monitor your fire’s effects this burn season. Our future selves and future ecologists will thank us!