Fire Paper Challenging Traditional Wildfire Science

Much of the debate in fire science is about the historical fire regime in dry conifer forests such as these ponderosa pine in the Ochoco Mountains of Oregon. Photo George Wuerthner 

A critical paper discussing fire ecology and, by implication, fire policy was published in the journal Fire.

The paper’s title, “Countering Omitted Evidence of Variable Historical Forests and Fire Regime in Western USA Dry Forests: The Low-Severity-Fire Model Rejected,” hints at its substance. The authors assert that their research reveals “a broad pattern of scientific misrepresentations and omissions by government forest and wildfire scientists.”

The paper’s lead author is William Baker, formerly of the University of Wyoming’s Program in Ecology, who has published dozens of peer-reviewed studies. Baker also wrote what I consider the single best book on fire ecology, “Fire Ecology in Rocky Mountain Landscapes.” If you were to read only one book on the topic, I would recommend his volume.

Baker is joined by other prominent scientists, including Chad Hanson, Mark Williams, and Dominick DellaSala.

Ponderosa pine in Wild Basin of Rocky Mountain National Park, Colorado, where research suggests mixed- to high-severity fires that kill many trees are the norm for these forests. Photo George Wuerthner 

The basic premise of their paper is that the dominant view of the historical fire regime in western forests held by many state and federal agencies, politicians, and even too many conservation groups needs revision because it overlooks contrary evidence suggesting that mixed- to high-severity blazes have always been characteristic of western forests.

This misinformation is compounded by an ecologically illiterate media that assumes foresters must be experts in forest ecology and repeats the dominant paradigm that forests are overly dense due to fire suppression and a lack of “active forest management.”

The dominant worldview they are critiquing holds that western forests, particularly lower-elevation forests dominated by ponderosa pine and other dry conifers, were shaped primarily by frequent low-severity fire regimes that did not kill trees but created open, park-like stands dominated by large old-growth trees.

Ponderosa pine forest on the Coconino National Forest in Arizona is the archetype for the Southwest Fire Model, which posits that frequent low-severity blazes consumed fuel but did not kill mature trees. Photo George Wuerthner 

Instead, they present evidence that this model, sometimes termed the Southwest Fire Model because it originated in Arizona and New Mexico pine forests, may overlook the mixed- to high-severity fires that occasionally occurred in these woodlands.

The Southwest Fire Model presupposes that fires (often attributed to Native American ignitions) occurred so frequently (typically every 1 to 20 years) that the flames seldom killed the larger trees. Baker and his colleagues challenge this version of events.

The reason this is important has to do with US forest policy. Based on the flawed assumption that nearly all dry forests were dominated by frequent low-severity burns, agencies often prescribe logging, or what I term chainsaw medicine, to “cure” what they perceive as overly dense forest stands, which they argue contribute to the large wildfires we have seen in recent years. As a result, public agencies have engaged in an aggressive logging program using euphemisms like forest restoration, fuel reduction thinning, and building resilience in forests.

Logging at Kirk Hill on the Custer Gallatin National Forest near Bozeman, justified as “restoring” historic forest conditions. Photo George Wuerthner 

So widespread is this biased version of the frequent low-severity fire model that agencies apply it even to other forest types, such as higher-elevation fir, spruce, and lodgepole pine forests that were clearly dominated by stand-replacing, high-severity blazes, to justify logging.

The Baker paper focuses on a previously published article by Hagmann et al. (2021), Evidence for widespread changes in the structure, composition, and fire regimes of western North American forests, which, among other things, critiqued much of the contrary published research, including studies by the authors of Baker et al.

Historic photo from the Deschutes National Forest in Oregon showing dense forest behind the loggers, suggesting that not all dry conifer forests were “open and park-like.” Photo Deschutes County Historical Society 

Hagmann et al. posit, “The cumulative results of more than a century of research document a persistent and substantial fire deficit and widespread alterations to ecological structures and functions.” They assert that forests are denser than they were historically due to fire suppression and are thus more prone to disease, insects, and wildfires (ironically, all natural processes that reduce forest density). At least some of the paper’s co-authors believe the solution is “active forest management,” or logging, to recreate what they believe were historical forest conditions.

Hagmann et al. is co-authored by many of the most prominent fire scientists in the country, including some who contributed to my book Wildfire: A Century of Failed Forest Policy. So it is easy to see why most in the media, who may not be familiar with fire science, might believe the Hagmann paper is the final word on fire.

The authors of Baker et al. believe this narrative of fire suppression and logging-as-cure diverts funding and attention from community safety measures like home hardening, evacuation planning, and defensible space, which are proven to reduce home vulnerability to wildfire. They also argue that logging contributes to carbon emissions that aggravate climate warming, which is the primary driver of large blazes.

There is a lot of damage from logging, including the spread of weeds and sedimentation from logging roads, that is often ignored or minimized by logging advocates. Photo George Wuerthner 

The focus on logging as the preferred response to fire has many other ecological impacts, many of them far worse for forest ecosystems than any wildfire. Most western plant communities are adapted to wildfire to some degree, but logging is an entirely new influence that brings many ecologically harmful effects, such as sedimentation from logging roads, disturbance of sensitive wildlife, and loss of biomass and structure from forested landscapes. It is also important to note that logging degrades forest genetics by indiscriminately removing trees that may be resilient to drought, disease, insects, and fire.

Among the criticisms Baker and co-authors level at the Hagmann paper is that most fire research and interpretation of historical forest conditions is based on fire scar studies. In several papers, Baker outlines some of the uncertainty associated with fire scar studies. For example, in his book Fire Ecology in Rocky Mountain Landscapes, Baker details the methodological problems, which are recounted in the recent paper.

I also outline numerous methodological problems with fire scar research.

Baker et al. counter that other proxy methods used to reconstruct fire history, such as air photo interpretation, pollen studies, charcoal studies, sediment studies, General Land Office reports, fire atlases, and early forest inventories, tend to find more examples of mixed- to high-severity fires in all forest types, including dry forests. A commonality of these alternative methods is that they tend to record and present a “landscape”-scale perspective on wildfire events.

One problem common to this fire ecology debate is the tendency to compare the findings of studies with dissimilar temporal and spatial scales. For instance, looking back at fire occurrences over 200 years, you may find no evidence of significant high-severity blazes; but if your temporal scale is 500-1,000 years, you may detect several large high-severity fires.

Most fire scar studies involve small, non-random plots whose results are then extrapolated to characterize larger landscapes. Many earlier fire scar studies also used the “composite” method of combining all the years in which any fire was detected on any plot. This tends to shorten the apparent fire interval. However, as some critics note, it merely counts fires without indicating the geographical extent influenced by those fires.
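To see why compositing shortens the apparent interval, here is a minimal sketch with invented fire years (not data from any actual study): each hypothetical plot burns only every 45 to 55 years, yet pooling the fire years across plots yields a much shorter “composite” interval.

```python
# Hypothetical illustration (invented fire years, not data from any study):
# why "compositing" fire-scar records across plots shortens the apparent
# fire interval.
from statistics import mean

# Fire years recorded at three separate, hypothetical fire-scar plots.
plots = {
    "plot_A": [1700, 1745, 1790, 1840],
    "plot_B": [1712, 1760, 1815],
    "plot_C": [1725, 1770, 1830],
}

def mean_interval(years):
    """Mean number of years between successive recorded fires."""
    years = sorted(years)
    gaps = [later - earlier for earlier, later in zip(years, years[1:])]
    return mean(gaps)

# Interval computed plot by plot: each plot burned only every ~45-55 years.
for name, years in plots.items():
    print(name, round(mean_interval(years), 1))

# "Composite" interval: pool every year in which *any* plot recorded a fire.
composite_years = sorted({y for years in plots.values() for y in years})
print("composite", round(mean_interval(composite_years), 1))

# The composite interval (~16 years here) is far shorter than what any single
# spot on the ground experienced, and it says nothing about how much area
# each of those fires actually covered.
```

The point of the toy numbers is only that the composite record counts every fire anywhere in the network of plots, so the more plots you pool, the shorter the apparent interval becomes, regardless of how often any given patch of forest actually burned.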

One critique leveled at Baker’s work concerned ponderosa pine forests near Flagstaff, Arizona. However, Baker et al. do not dispute that the frequent low-severity fire model may characterize the Southwest’s driest forests; rather, they dispute its applicability to all dry forests.

Most of these alternative means of reconstructing fire influence on plant communities conclude that the fire interval, even in dry conifer stands, was longer than fire scar studies suggest. These findings matter because if fire intervals were longer, then even if fire suppression were effective (another assumption that can be challenged), today’s forests may not be far outside their historical condition and thus may not need any “active forest management.”

To give just one example of contrary results, Paul Hessburg and colleagues used early aerial photography to reconstruct historical fire regimes from forest structure across 178,902 ha of ponderosa pine and Douglas-fir cover types in eastern Washington and Oregon mixed-conifer forests.

Old growth ponderosa pine, Joseph Canyon, Wallowa-Whitman National Forest, Oregon. Photo George Wuerthner 

Hessburg and colleagues concluded: “The structure of mixed conifer patches, in particular, was formed by a mix of disturbance severities . . . Evidence for low-severity fires as the primary influence, or of abundant old park-like patches, was lacking in both the dry and moist mixed conifer forests. The relatively low abundance of old, park-like or similar forest patches, high abundance of young and intermediate-aged patches, and widespread evidence of partial stand and stand-replacing fire suggested that variable fire severity and non-equilibrium patch dynamics were primarily at work.” They further noted that “ . . . before any extensive management had occurred, the influence of fire in the dry forest was of a frequency and severity that intermittently regenerated rather than maintained large areas of old, fire-tolerant forest.”

Baker’s work using General Land Office (GLO) survey notes finds similar results. As someone who was once a cadastral surveyor for the BLM, I can attest that surveyor notes recorded the condition of the vegetation community at the time of the survey.

Since many of these GLO surveys occurred in the late 1800s, before any significant settlement, livestock grazing, logging, fire suppression, or other influences typically cited as the driving forces in altering forest stand composition, they provide an actual account of the dominant forest condition.

Baker et al. also challenge Hagmann et al.’s conclusion that high-severity fires today exceed the historical condition. Again, spatial scale is the problem. To evaluate whether there are more high-severity fires now, one needs an adequate sample area. As Baker et al. note, study areas need to be several times larger than the largest recent fires, generally on the order of 250,000 ha or more, because if only one or a few fires nearly fully burned an area, then n = 1 (or some other small number) and the sample is inadequate. Nearly all the studies cited by Hagmann et al. fail to meet this requirement. In the three studies that did meet this spatial-scale requirement, Baker et al. found that high-severity fires were burning at rates similar to or lower than historical rates.
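As a rough, hypothetical illustration of this sampling problem (all numbers below are invented, not from Baker et al. or any cited study), the following sketch treats the number of independent fires a study area can sample as roughly the number of “largest recent fires” that would fit within it, and shows how wildly the estimated high-severity fraction swings when that number is near one.

```python
# Hypothetical illustration (all numbers invented, not from Baker et al. or any
# cited study): why a study area must be several times larger than the largest
# recent fire before an estimated high-severity burn rate means much.
import random

random.seed(0)

LARGEST_FIRE_HA = 80_000   # assumed size of the largest recent fire
TRUE_HS_FRACTION = 0.3     # assumed true fraction of fires burning mostly at high severity

def spread_of_estimate(study_area_ha, n_trials=2000):
    """Return (fires sampled, std. dev. of the estimated high-severity fraction)."""
    # Crude proxy: the number of independent fires a study area can capture
    # scales with how many "largest fires" would fit inside it.
    n_fires = max(1, study_area_ha // LARGEST_FIRE_HA)
    estimates = []
    for _ in range(n_trials):
        # Each sampled fire is treated as either mostly high severity or not.
        high_sev = sum(1 for _ in range(n_fires) if random.random() < TRUE_HS_FRACTION)
        estimates.append(high_sev / n_fires)
    m = sum(estimates) / n_trials
    sd = (sum((e - m) ** 2 for e in estimates) / n_trials) ** 0.5
    return n_fires, sd

for area in (100_000, 250_000, 500_000):
    n_fires, sd = spread_of_estimate(area)
    print(f"{area:>7} ha study area: ~{n_fires} fire(s) sampled, estimate swings by +/- {sd:.2f}")

# With only ~1 fire sampled, the estimate jumps between 0 and 1 (the "n = 1"
# problem); only much larger areas average over enough fires to yield a
# stable rate comparable to historical reconstructions.
```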

One of the critical findings of Baker et al. is that a forest can experience both frequent low-severity blazes and occasional mixed- to high-severity fires, the latter responsible for much of the acreage burned during extreme fire weather. They are not mutually exclusive.

One must remember that the climate and weather conditions that favor high-severity fires, including high winds, make such fires impossible to control. While blazes burning under more moderate fire weather are often suppressed, or self-extinguish, high-severity fires almost always burn until the weather and climate conditions change. This may bias conclusions, because high-severity blazes are responsible for most of the acreage burned.

Thinned forest on the Deschutes National Forest near Bend, Oregon. Note the nearly even age of the stand and the lack of any understory vegetation. This is a degraded and sanitized forest stand. Photo George Wuerthner 

The other important idea to keep in mind is that the climatic conditions we are experiencing today with climate warming are different from the conditions that created the forest composition of a century or two ago. The only recent analog to today’s climatic conditions occurred during the Medieval Warm Period (ca. 800-1300 AD), during which extensive areas of forest burned.

The Baker et al. paper also explains how Hagmann et al. misrepresented other studies, including those by Baker and colleagues. This portion of the paper may be too detailed for most readers outside the fire science community, but it is still worth reviewing if you are interested in the debate over fire policy.

I find the arguments of Baker et al. compelling. Anyone who questions the Forest Service’s (and other agencies’) fire policies should at least be familiar with this paper as it raises questions about the ongoing justifications for “active forest management” on public lands.

 

Comments

  1. David

    Hey George,
    Would you be willing to write an article in response to the recent High Country News piece on wildfire management in the West? The reporter rather quickly dismissed any critique of the current standard explanation for our forest wildfire problems.

  2. Michael A. Lewis

    Thanks for bringing this to the fore, George. It’s high time we get these myths put away for good.

  3. Jeff Hoffman

    Since there is this disagreement between experts in this area, there’s only one solution: no more fire suppression and no more logging! If forests are unnaturally dense due to fire suppression, natural processes like wildfires will fix that soon enough if we just let them take their course. If they’re not overly dense despite fire suppression, then there’s no need to log them.

    1. Maggie Frazier

      Jeff – be real! That’s just too simple…

      1. Jeff Hoffman

        Sorry, I forgot.

  4. Rambling Dave

    The other day I found this little nugget in Cormac McCarthy’s latest:

    “The world does not know that you are here. You think that you understand this. But you don’t. Not in your heart you don’t. If you did you would be terrified.”

    One of the most terrifying things for a human to be is irrelevant. And that is essentially what Baker, Hanson, Wuerthner, etc. are saying when they publish these kinds of research papers and essays. These ancient landscapes don’t need us and they never have. Some people (maybe most people) see that as a kind of threat.

    1. Jeff Hoffman

      I don’t understand what you mean. The research papers are reporting facts. Humans are not at all necessary for the Earth or any life here. We provide no ecosystem services, and in fact do nothing but needlessly destroy and kill.

      Do you think that true facts should be censored? Or are you just saying that humans are a problem?

  5. Ida Lupine

    You’re both right! I’m kind of happy to realize that I am irrelevant in the grand scheme. Humans meddle too much and think their input is much more important and helpful than it really is, and we rationalize the damage we do.

    Science and facts are very important, but philosophy and art have an interpretation of these concepts too. The Road by Cormac McCarthy is one of the most beautiful books I have ever read, a favorite.

    1. Jeff Hoffman

      “Science” is a very broad term. Do you mean all science, western science (which has a reductionist and mechanistic view of life), helpful science like conservation & wildlife biology and ecology? Do you mean things like theoretical physics and astronomy? Do you mean just the search for truth?

      “Science” originally meant the search for truth, which is fine. But if you overemphasize the intellect, which humans have grossly done, then you end up with a view of life that’s so myopic that it’s totally inaccurate. Facts that can be seen through the intellect are only one of many views of life, and shouldn’t be seen as the end all and be all of life.

      1. Ida Lupine

        Absolutely. For me, probably the search for the truth, regardless of any species superiority bias. That way, we encompass it all.

    2. Rambling Dave

      The Road is one of my favorites too. I always saw it as an environmental book, although a cautionary one. The movie was also quite good.

      If you liked McCarthy’s The Road you’d probably find Marlen Haushofer’s The Wall interesting.

      1. Ida Lupine

        Yes. As time goes on I see it very much as an environmental/climate/nuclear war cautionary tale.

        I’ll look forward to reading The Wall! Thanks!

        1. Jeff Hoffman

          I tried to find the movie, but apparently it’s not available in the U.S.

  6. rastadoggie

    Important science for y’all to use in the NEPA process for the big, dumb cuts proposed by your local cutters and their paid enviro flunkies. Another perk is that journalists – formerly swayed by foresters (loggers) and their unhealthy forest rhetoric – can wrap their heads around this. It’s already making a difference. Thank you for this!

Author
George Wuerthner is an ecologist and writer who has published 38 books on various topics related to environmental and natural history. He has visited over 400 designated wilderness areas and over 200 national park units.
