Written by Chris Goodell, P.E., D. WRE | WEST Consultants
Copyright © RASModel.com. 2010. All rights reserved.
I would appreciate any feedback from you all on this topic. It's something I've been thinking about for a while now. My biggest concern with the current practice of dam breach modeling is the overwhelming uncertainty associated with dam breach parameters: not only the ultimate breach shape and development time, but also things like the initiation mechanism of the breach, the discharge coefficients (both weir and orifice), and the progression rate. The deterministic approach we use leaves a bit to be desired, in my opinion. Sensitivity analyses have shown that the breach outflow hydrograph can easily vary by 100% or more, depending on the set of parameters used.
I've been considering ways to generate a breach outflow hydrograph based on probabilistic methods. The idea is that instead of providing our "best conservative guess" for the breach hydrograph, we can produce a 95% (or whatever percent) conditional non-exceedance hydrograph based on both overall peak discharge and timing. In other words, this is the dam breach hydrograph whose peak value will not be exceeded 95% of the time, given a dam failure for a given failure mechanism (overtopping or piping). This is done by assigning a probability distribution function to each breach parameter, then running a Monte Carlo simulation using random assignments (within the minimum and maximum bounds and following the prescribed distribution function) for each breach parameter. We can then plug the resulting 95% hydrograph (or the associated set of breach parameters used to create that hydrograph) into our HEC-RAS unsteady flow model and resume our deterministic approach. At least we will have taken the deterministic selection of breach parameters out of the analysis. I suppose at some point the entire model could be approached with probabilistic methods, but first things first. In fact, HEC is currently working on implementing Monte Carlo simulation capabilities into HEC-RAS for a future release.
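To make the idea a little more concrete, here is a minimal sketch of what one pass through that kind of Monte Carlo loop might look like. The distributions, bounds, and the crude weir-equation proxy for peak outflow are all hypothetical placeholders, not values for any real dam; a real study would route the reservoir through the growing breach (for example, with HEC-RAS) for every realization.

import numpy as np

rng = np.random.default_rng(seed=1)
n = 10_000  # number of Monte Carlo realizations

# Sample each breach parameter within assumed min/max bounds, following an
# assumed distribution (triangular and uniform here, purely as placeholders).
breach_width = rng.triangular(left=30.0, mode=60.0, right=120.0, size=n)  # m
weir_coef = rng.uniform(low=1.3, high=1.7, size=n)  # SI broad-crested weir coefficient
head = rng.triangular(left=10.0, mode=15.0, right=18.0, size=n)  # m, head on breach at peak

# Crude proxy for peak breach outflow: broad-crested weir flow through the
# fully developed breach.
q_peak = weir_coef * breach_width * head ** 1.5  # m^3/s

# 95% conditional non-exceedance peak discharge, given failure.
q95 = np.percentile(q_peak, 95)
print(f"95% non-exceedance peak breach outflow: {q95:,.0f} m^3/s")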
I wonder if any state Dam Safety office is ready for this type of analysis for preparing inundation maps for emergency action plans. I think it makes more sense than the purely deterministic approach we use now.
I am interested in hearing more about your idea, but good luck getting any state dam safety office to listen. Have you done an analysis like this and compared results to current methods?
It's a good idea.
But it would increase the cost associated with a dam-break study.
Assigning probability distribution functions to each breach parameter could be really questionable.
But... good idea!
Sounds good, but assigning probabilities to breach parameters will be tricky.
In this case I don't know if we have enough breach data to create reliable probability distributions of the breach parameters... but I'm sure there are solutions for that.
I agree with Anonymous: without enough data, the basis for the probabilistic model will be poor, and we risk obscuring the lack of knowledge behind statistics.
One of my other concerns is that not only do the breach parameters have a large impact, but the effects of turbulence and water entrainment downstream of the breach are also not well understood.
My next comment would be on the 95%: that still leaves odds of 1 in 20 that my feeling of safety downstream of a dam could be unjustified. I would like to be more certain, given the enormous consequences of a dam break.
That said, I really think that probabilistic modeling is the way to go to better model risk, but I think the technical complexities are only the tip of the iceberg.
The lack of historical breach data is what bothers me most about the deterministic approach. I think it makes much more sense to acknowledge the uncertainty by assigning a probability range to the breach parameters. Some parameters, like the breach width, might logically be assigned a Gaussian distribution with the standard deviation defined by the spread in the results from a handful of breach width regression equations. I think we all would agree that assigning a single breach width based on a spread of regression equations is illogical, but we all do it because it is the only way. Then we make ourselves feel better by running a sensitivity analysis with different breach widths. But we still end up with one breach width that we ultimately use for the final flood map. I guess in simple terms, I feel better about saying I think, with a high degree of certainty, that the breach width for a given dam will be between X and Z, rather than saying, with a high degree of uncertainty, that the breach width will be Y.
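As a rough sketch of that breach-width idea, the spread of several regression-equation predictions can define the Gaussian directly. The width values below are hypothetical placeholders, not results for any particular dam.

import numpy as np

# Predicted average breach widths (m) from several regression equations
# for the same hypothetical dam.
predicted_widths = np.array([45.0, 62.0, 55.0, 70.0, 58.0])

mu, sigma = predicted_widths.mean(), predicted_widths.std(ddof=1)

rng = np.random.default_rng(seed=2)
sampled_widths = rng.normal(mu, sigma, size=10_000)
# Keep samples inside physically reasonable bounds (here, the equation spread).
sampled_widths = np.clip(sampled_widths, predicted_widths.min(), predicted_widths.max())

print(f"breach width ~ N({mu:.1f}, {sigma:.1f}) m, clipped to "
      f"[{predicted_widths.min():.0f}, {predicted_widths.max():.0f}] m")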
Simon, I agree with you. And I only think of this as a start (tip of the iceberg). But you have to start somewhere. Eventually, we should include all of the uncertainty in the probabilistic approach, including downstream roughness, discharge coefficients, boundary conditions, etc.
Chris, you have an excellent idea. All modeling involves some degree of uncertainty with parameters, and dam breach analysis certainly involves quite a bit of uncertainty. HEC already incorporates uncertainty considerations (via a Monte Carlo simulation) in HEC-FDA. I don't see why this feature couldn't be incorporated into HEC-RAS. EM 1110-2-1619 and ER 1105-2-101 provide some good discussion on the consideration of risk and uncertainty for flood damage reduction projects. If the USACE ever produced an EM or ER discussing dam breach analysis, it may be easier to get dam safety officials to accept a change to analysis procedures. Or maybe it could be a good research project for a grad student somewhere to present at the ASDSO conference one year.
Chris...Good post, I think you are on the right track. I completely agree with you; the purpose of probabilistic models is to expose the uncertainties in what you are trying to model. There is plenty of dam breach data available; it is just highly questionable data due to the uncertainty associated with it.
I think this post brings up a bigger topic in that we rarely present the uncertainty associated with our work...but I'll leave that one for another day.
A very similar example to what you are proposing is the HEC-FDA program which uses Monte Carlo simulations for flood damage assessment. The model was developed to integrate RAS WSPs with structure inventory data to produce stage-damage functions; basically to assign damage amounts to flood insurance studies.
Creating models to spit out TR-55 Qps and associated WSELs based on open-channel flow calcs is one thing...the uncertainty of that modeling is something we know and live with. Attaching dollar amounts to flood risk assessments is a whole new game. HEC likely knew that the results of these models would be used to direct federal expenditures, and they knew it was necessary to take the uncertainty of these models into account.
In FDA, uncertainty is assigned to exceedance probability (usually through Log Pearson III analysis); to stage-discharge by estimating the difference in WSELs due to debris and changes in Manning's "n"; and to structure depth-damage functions through data provided by the Corps' IWR.
H&H and economic inputs are brought together and run through several hundred thousand Monte Carlo simulations, and the product is a range of damage exceedance probabilities (i.e. 75% prob. damage exceeds "X"...50% prob. exceeds "Z"...as you suggested in your comment).
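A toy illustration of that structure (flow frequency, stage-discharge, and depth-damage, each with uncertainty, combined by Monte Carlo) might look like the sketch below. This is NOT HEC-FDA; every distribution, curve, and number here is a made-up placeholder just to show how the pieces chain together.

import numpy as np

rng = np.random.default_rng(seed=3)
n = 200_000

# 1. Flow-frequency uncertainty (a lognormal stand-in for a fitted frequency curve).
q = np.exp(rng.normal(loc=np.log(800.0), scale=0.4, size=n))  # cms

# 2. Stage-discharge with uncertainty (debris, changes in Manning's n, etc.).
stage = 2.0 * q ** 0.4 + rng.normal(0.0, 0.3, size=n)  # m

# 3. Depth-damage function with uncertainty in the damage estimate.
flood_depth = np.clip(stage - 25.0, 0.0, None)                      # m above damage threshold
damage = 50_000.0 * flood_depth * rng.lognormal(0.0, 0.2, size=n)   # $

# Damage exceedance probabilities, as in "75% prob. damage exceeds X".
for p in (75, 50, 25):
    print(f"{p}% probability damage exceeds ${np.percentile(damage, 100 - p):,.0f}")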
The model is not perfect and a lot of controversy has been directed at these studies (mostly towards the admittedly questionable IWR damage curves); but in my opinion, it is one of the more technically sound predictive models I've used.
As an engineer, I would much rather have a range of probable outputs with which to make an informed decision about the risk of a dam breach than use a "plug and chug" equation based on storage volume and breach height; and I don't see much "big picture" difference in assessing flood risk and dam breach risk...just different inputs.
If you haven't already, I would suggest taking a look at FDA and its associated methodology. I am very familiar with the model and would be happy to answer any questions you might have.
Very good post...Good luck and I will stay tuned to see what you come up with.
Thanks, CJ, for the very useful information. I appreciate your feedback. And FDA was exactly what I had in mind when I started thinking about this approach. I may come back to you for some help as I get into this more. Thanks for the offer!
ReplyDeleteYou ought to take a look at the following journal paper that deals with just this subject:
Froehlich, D. C. (2008). "Embankment dam breach parameters and their uncertainties." Journal of Hydraulic Engineering, 134(12), 1708-1721.
Keep in mind that modeling a dam breach as a trapezoid that grows in time as the dam erodes is a greatly simplified approximation of what actually takes place when a dam fails. While more exact models of dam breaching are possible (both within one-dimensional cross-section averaged AND two-dimensional depth-averaged hydrodynamic solution frameworks), the breach development algorithms that are coded in HEC-RAS and HEC-HMS are going to be with us for a long time to come.
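As a simple illustration of that trapezoidal idealization, the sketch below grows the breach bottom width and depth linearly over the development time and computes outflow with a weir-type relation. The linear growth assumption, the constant pool, the weir coefficients, and all dimensions are illustrative placeholders; this is not what HEC-RAS or HEC-HMS does internally.

import numpy as np

t_dev = 1.5      # breach development time, hr
b_final = 60.0   # final bottom width, m
h_final = 15.0   # final breach depth, m
z = 1.0          # breach side slope (H:V)
pool = 16.0      # assumed constant pool elevation above the final breach invert, m

for t in np.linspace(0.0, t_dev, 7):
    f = t / t_dev                        # linear breach progression, 0 -> 1
    b, h = f * b_final, f * h_final      # current bottom width and depth
    head = max(pool - (h_final - h), 0)  # head on the dropping breach invert
    # Weir-type relation for a trapezoidal breach: rectangular part plus the
    # triangular side areas (SI coefficients here are rough placeholders).
    q = 1.7 * b * head ** 1.5 + 1.35 * z * head ** 2.5
    print(f"t = {t:4.2f} hr   Q ≈ {q:8.0f} m^3/s")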
Thanks David. I will certainly read your paper.
ReplyDeleteHey Chris,
Sorry to revive this seemingly dead thread, but I had a few thoughts. I too have similar frustrations due to the wide range of empirical breach equations. From studying the USBR 1998 report "comparing peak flow relations to case studies", it seems evident that some equations have a better regression correlation to dam failures within a certain height range.
For example, Froehlich's peak flow equation has (correct me if I am wrong) one of the best R-values; it typically gives lower values than Danny Fread's equation (what HEC-RAS uses) for medium to large dams, but higher values for small dam failures.
If a range of "best fit parameters" could be adopted to suit different-sized dams, could that provide a more accurate inundation envelope?
Hi Tim-
It may be "dead" in this thread, but this work has been progressing, and in fact I've used it on a number of projects. I even have software now that runs HEC-RAS dam breach models in a Monte Carlo analysis. I've also put out some papers that demonstrate the technique. They are listed below.
To answer your question, yes, I think a range of best-fit parameters could (and should) be used in the probabilistic approach. My thought is that we should run a number of different breach parameter equations to help define the uncertainty range, but favor the breach parameter equations that best fit your dam (see the sketch after the paper list below). Also, I'm not aware of HEC-RAS incorporating any peak flow breach equations. The latest beta version of 4.2 does have breach parameter equations, but I don't recall seeing peak flow equations.
Goodell, Christopher R., "Moving Towards Risk-Based Dam Breach Modeling", Proceedings, Dam Safety 2013 Conference, Providence, Rhode Island, September 2013.
Goodell, Christopher R., "A Probabilistic Approach to Dam Breach Modeling", Proceedings, FloodRisk 2012 Conference, Rotterdam, The Netherlands, November 2012.
Goodell, Christopher R.; Froehlich, David C., “Comparison of Reservoir Routing Methods Used to Calculate Dam Breach Outflows”, Presentation, World Environmental & Water Resources Congress (EWRI) 2012, Albuquerque, New Mexico, May 2012.
Goodell, Christopher R.; Froehlich, David C., “Comparison of Dam Breach Flood Uncertainty Calculations”, Presentation, United States Society on Dams 2012, New Orleans, Louisiana, April 2012.
Froehlich, David C.; Goodell, Christopher R., “Breach of Duty (Not): Evaluating the Uncertainty of Dam-Breach Flood Predictions”, Proceedings, United States Society on Dams 2012, New Orleans, Louisiana, April 2012.
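As a sketch of the weighting idea mentioned above: inside the Monte Carlo loop, the equation used for each realization can be picked with probabilities that favor the equations believed to fit the dam best. The equation names are real breach-width relations, but the weights and predicted widths below are hypothetical placeholders.

import numpy as np

rng = np.random.default_rng(seed=4)

equations = ["Froehlich 2008", "MacDonald & Langridge-Monopolis", "Von Thun & Gillette"]
weights = np.array([0.5, 0.3, 0.2])  # judgment-based preference for this hypothetical dam
widths = {"Froehlich 2008": 62.0,    # m, predicted average breach width per equation
          "MacDonald & Langridge-Monopolis": 48.0,
          "Von Thun & Gillette": 70.0}

# Pick an equation for each realization, favoring the better-fitting ones.
picks = rng.choice(equations, size=10_000, p=weights)
sampled = np.array([widths[e] for e in picks])
# Add a relative error term so each equation contributes a spread, not a spike.
sampled *= rng.normal(1.0, 0.15, size=sampled.size)

print(f"mean width {sampled.mean():.1f} m, 5th-95th pct "
      f"{np.percentile(sampled, 5):.1f}-{np.percentile(sampled, 95):.1f} m")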
Any way to download/buy the MCBreach software?
Please send me an email so I can discuss options with you.
Chris.Goodell@KleinschmidtGroup.com
Chris