Summary: | Inside many U.S. federally designated wilderness areas, fire suppression is the dominant management strategy, largely because of the risk that fires pose to resources adjacent to the wilderness boundary. Opportunities to exploit the fuel-treatment and risk-mitigation benefits of allowing wilderness fires to burn are forgone when ignitions are suppressed. Existing risk-based metrics (e.g. burn probability) produced by wildfire simulation models were not designed to inform the management of wilderness fires; they focus on fuels management or on suppression resource allocation, rather than on managing ignitions through monitoring strategies for resource benefit. The purpose of this research was to develop a risk-based decision-support metric for wilderness fire management. The metric, escape probability, was developed using the Bob Marshall Wilderness Complex, Montana, USA (BMWC) as the case-study landscape and was applied to evaluate previous management decisions to suppress ignitions within the BMWC. The outputs from two wildfire simulation models, FARSITE (Finney, 1998) and FSim (Finney et al., 2011), were used to map escape probability for two landscape scenarios in 2007: (1) an observed landscape reflecting fuel conditions resulting from actual wildfire management strategies; and (2) a treated landscape reflecting hypothetical fuels and vegetation, assuming the ignitions suppressed in 2007 had instead been allowed to burn. First, wildfire spread and behavior for the suppressed 2007 ignitions were retrospectively simulated using FARSITE. Hypothetical fuels layers were created for each retrospectively simulated fire by modifying the observed pre-fire fuel conditions within the simulated perimeter based on modeled burn severity. The observed and hypothetical fuels layers were then used as inputs to FSim, a large-wildfire modeling system commonly used in quantitative wildfire risk analyses. Differences in the likelihood of future wilderness fire escape between the observed and treated landscape scenarios were examined both within the simulated area burned by the suppressed ignitions (i.e. the treated area) and within several kilometers of the simulated wildfire perimeters (i.e. the off-site effects). Results suggest that larger treated areas arising from ignitions closer to the wilderness boundary had the greatest effect on reducing the likelihood of wilderness fire escape within the treated area. The relationship between ignition location, fire size, and the reduction in escape probability outside the treated area was variable. Fire and fuels managers can use escape probability information during strategic decision-making and pre-season planning to allow natural fires to burn without suppression, as well as to evaluate the effectiveness of different risk-mitigation strategies based on how those strategies affect future opportunities to allow natural ignitions to burn.
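The summary does not state the exact formulation of escape probability, so the following is only a minimal sketch assuming one plausible pixel-level definition: the share of simulated fires burning a given location whose perimeters also extend beyond the wilderness boundary. The file names, field name, and example coordinate are hypothetical, and the simulated perimeters are assumed to come from an FSim run.

    # Illustrative sketch only; not the authors' implementation.
    # Assumes simulated fire perimeters (e.g. exported from FSim) and the
    # wilderness boundary share a common projected coordinate system.
    import geopandas as gpd
    from shapely.geometry import Point

    perimeters = gpd.read_file("fsim_fire_perimeters.shp")   # hypothetical file
    wilderness = gpd.read_file("bmwc_boundary.shp").unary_union  # hypothetical file

    # Flag each simulated fire that "escapes", i.e. burns outside the boundary.
    perimeters["escaped"] = ~perimeters.geometry.within(wilderness)

    def escape_probability(point):
        """Share of simulated fires burning `point` that also escape the boundary."""
        burning = perimeters[perimeters.intersects(point)]
        if burning.empty:
            return 0.0
        return burning["escaped"].mean()

    # Example: escape probability at a single (hypothetical) location of interest.
    print(escape_probability(Point(290000.0, 5280000.0)))

Under this assumed definition, comparing the observed and treated landscape scenarios amounts to running the same calculation on the two sets of simulated perimeters and differencing the results inside and outside the treated area.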