I have written previously about risk mitigation in the context of the L’Aquila earthquake. A panel discussion about the Wivenhoe Dam case at the HWRS conference has brought many of the underlying issues to the forefront yet again.
At issue is who can judge, and who should have the power to judge, the actions of experts in the wake of disasters.
I cannot speak to the specific instance of the Wivenhoe case, but I think it’s generally true in hydrology that if decision errors are going to be made, those decision errors will be made during extreme events. This, I believe, is because of uncertainty and because the human mind is poorly conditioned to optimize decisions in the face of uncertainty.
The data that inform decisions during extreme events (e.g. estimates of reservoir inflow) have typically been generated from rating curves extrapolated to flows that most likely have never been gauged. The extrapolation of the curve to the extreme condition carries significant, but unknown, uncertainty.
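To make the extrapolation problem concrete, here is a minimal sketch (with entirely illustrative numbers, not data from any real gauge) of fitting a power-law rating curve to a handful of gaugings and then extrapolating it to a flood stage well beyond the gauged range:

```python
import numpy as np

# Hypothetical gauged stage/discharge pairs (stage in m, flow in m^3/s).
# All values are illustrative only.
stage = np.array([0.5, 0.8, 1.2, 1.7, 2.3, 3.0])
flow = np.array([4.0, 9.5, 22.0, 48.0, 95.0, 170.0])

# Fit a power-law rating curve Q = a * h^b in log space.
b, log_a = np.polyfit(np.log(stage), np.log(flow), 1)

def rated_flow(h):
    """Flow estimated from the fitted rating curve."""
    return np.exp(log_a) * h ** b

# Residual scatter (in log space) within the gauged range.
resid = np.log(flow) - np.log(rated_flow(stage))
sigma = resid.std(ddof=2)

# At a flood stage of 6 m (double the highest gauging) the naive
# 2-sigma band derived from within-range residuals is, at best, a
# lower bound on the true uncertainty, because channel geometry and
# hydraulics may change entirely beyond the gauged range.
flood_stage = 6.0
estimate = rated_flow(flood_stage)
low, high = estimate * np.exp(-2 * sigma), estimate * np.exp(2 * sigma)
print(f"Extrapolated flow at {flood_stage} m: "
      f"{estimate:.0f} m^3/s (naive band {low:.0f}-{high:.0f})")
```

The point of the sketch is that the formal error band looks reassuringly narrow precisely where it is least trustworthy.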
During conditions when the data have low uncertainty, we're conditioned to optimize decisions, which we can do because the risk of a decision error is low. When data have high uncertainty, however, there is a high probability that any decision will go sideways and result in unexpected, undesirable outcomes.
A measure of uncertainty on data is an index of the probability of decision error.
The engineers responsible for operational decisions are likely unaware of the uncertainties in the data, and they almost certainly have no direct experience with the uncertainties of the most extreme data. Without this information there will be a tendency to try to optimize decisions using the same rule curves that apply under ‘normal’ operating conditions.
Uncertainty as actionable information.
An alternate model to consider is for uncertainty to be communicated with the data and used as actionable information. When uncertainty is low, decisions can be optimized with high confidence in their outcomes. When uncertainty is high, the precautionary principle can be triggered so that decisions are weighted toward the least likelihood of harm, which may result in sub-optimal use of the reservoir storage capacity for its primary purpose. Any water that is unnecessarily spilled for precautionary reasons is simply a cost of data uncertainty.
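One way to picture this model is a decision rule in which the reported uncertainty itself selects the strategy. The sketch below is hypothetical: the function, its arguments, and the trigger threshold are my own inventions, not from any operating manual.

```python
def choose_release(inflow, inflow_sigma,
                   optimal_release, precautionary_release,
                   trigger=0.25):
    """Return (strategy, release) given an inflow estimate and its uncertainty.

    When the relative uncertainty of the inflow estimate is below the
    trigger, optimise storage with the normal rule curve; otherwise
    invoke the precautionary principle and release more water, trading
    stored water for a lower likelihood of harm.
    """
    relative_uncertainty = inflow_sigma / inflow
    if relative_uncertainty <= trigger:
        return "optimise", optimal_release
    return "precaution", precautionary_release

# Low-uncertainty inflow: optimise storage.
print(choose_release(500.0, 50.0,
                     optimal_release=300.0,
                     precautionary_release=600.0))
# High-uncertainty inflow (extrapolated rating curve): spill precautionarily.
print(choose_release(2000.0, 800.0,
                     optimal_release=300.0,
                     precautionary_release=600.0))
```

The design choice worth noting is that uncertainty enters the rule explicitly, rather than being left for the operator to weigh under pressure.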
The good thing is that once a cost can be attributed to data uncertainty, an argument can be made for the additional investment in monitoring needed to reduce that uncertainty. Currently, uncertainty that is unknown is unmanaged.
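A back-of-envelope illustration of that argument, with entirely made-up numbers:

```python
# Attributing a cost to data uncertainty. Every figure here is
# illustrative, not drawn from any real reservoir.
spill_events_per_year = 0.2   # precautionary spills triggered per year
volume_spilled = 50_000.0     # ML spilled per precautionary event
water_value = 100.0           # $ per ML of stored water

annual_cost_of_uncertainty = (spill_events_per_year
                              * volume_spilled
                              * water_value)
print(f"Annual cost of uncertainty: ${annual_cost_of_uncertainty:,.0f}")
# If better gauging (costing, say, $200k per year) would halve the
# precautionary spills, the investment case compares that $200k
# against $500k of avoided cost.
```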
Another factor is that many cognitive biases can interfere with evidence-based decision making in the face of uncertainty. Atir, Rosenzweig and Dunning recently came to the surprising conclusion that experts are prone to over-claiming their knowledge. Epistemic error from this heuristic bias should never be a problem for an expert working with familiar events and with data with low uncertainty. Such error is highly likely, however, when data are uncertain.