Mitigating Risk for Risk Mitigation: What L’Aquila Means for Water Monitoring

On October 22, 2012, six Italian scientists and a government official were sentenced to six years in prison, given lifetime bans on holding public office, and ordered to pay compensation of €7.8m in connection with the L’Aquila earthquake. The magnitude-6.3 earthquake of April 6, 2009 injured over 1,000 people and resulted in more than 300 deaths. The judgment was based on statements made at a press conference following an emergency meeting convened to evaluate whether an episode of 400 tremors over a four-month period should be cause for major concern in the region.

Nature provides more details on the specifics of the case, but I am more interested in what can be learned from the general circumstances. The AGU raises the concern that this verdict could “ultimately be harmful to international efforts to understand natural disasters and mitigate associated risk”.

I think this case is illustrative of a cultural shift that is not limited to Italy. In the past, natural disasters were attributed to ‘acts of god’. Natural disasters are increasingly seen as evidence of incompetence for which someone (other than god) needs to be held accountable. The problem is that there is plenty of blame to share. Who is to blame for inadequate building codes? Who is to blame for real estate development on unprotected floodplains? Who is to blame for insufficient budgets for maintenance of dams, levees and dikes?

The blamestorming that follows any natural disaster is exposing everyone involved in the delivery of any component in the data-to-decisions chain to elevated risk. We (i.e. the community of stream hydrographers) need to be prepared to manage this risk, and that means paying attention to how we communicate with the end user.

  1. The best defense is to be able to demonstrate that our data publication processes meet the highest standards in the industry for quality, accuracy and timeliness. There should be no systematic (i.e. preventable) reason for our data to be either inaccessible or untrustworthy.
  2. Communication of uncertainty is important. A generic disclaimer on provisional data is not sufficient to fully communicate the increased probability of significant error during unusual events. Although methods for explicit quantification of uncertainty are not yet sufficiently robust for operational use, categorical grading of data based on subjectively determined confidence ranges can be managed in a real-time data publication process (see the sketch after this list).
  3. All aspects of a monitoring system are particularly stressed during extreme events. Data gaps and identifiable problems require special attention. Timely, proactive communication is usually the best way to manage expectations of end users.
  4. Communicating with the communicators is important. From the reports I have read, it appears that the experts convicted in the L’Aquila case were not blamed for what they said per se but for their failure to correct misreporting of what they said.
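As a rough illustration of point 2, here is a minimal sketch of how a categorical, confidence-based grade might be attached to provisional readings at publication time. The grade bands, labels, and field names are hypothetical assumptions for illustration only, not drawn from any particular agency’s standard.

```python
# Minimal sketch: attaching a categorical quality grade to provisional readings.
# The thresholds, labels, and record fields below are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical grade bands: estimated relative uncertainty -> grade label
GRADE_BANDS = [
    (0.05, "A"),   # within ~5%: high confidence
    (0.15, "B"),   # within ~15%: moderate confidence
    (0.30, "C"),   # within ~30%: low confidence
]
DEFAULT_GRADE = "E"  # estimate only; treat with caution

@dataclass
class Reading:
    station_id: str
    timestamp: datetime
    discharge_m3s: float          # computed discharge
    est_rel_uncertainty: float    # hydrographer's subjective estimate, e.g. 0.2 = +/-20%

def grade(reading: Reading) -> str:
    """Map a subjectively estimated uncertainty to a categorical grade."""
    for threshold, label in GRADE_BANDS:
        if reading.est_rel_uncertainty <= threshold:
            return label
    return DEFAULT_GRADE

def publish(reading: Reading) -> dict:
    """Build the record that would be pushed to the real-time feed."""
    return {
        "station": reading.station_id,
        "time": reading.timestamp.isoformat(),
        "discharge_m3s": reading.discharge_m3s,
        "grade": grade(reading),
        "provisional": True,   # still subject to revision after review
    }

if __name__ == "__main__":
    # Flood-range reading where the rating curve is extrapolated: grade "C"
    r = Reading("08MF005", datetime.now(timezone.utc), 1250.0, 0.22)
    print(publish(r))
```

The point of a sketch like this is not the specific thresholds but that the grade travels with every published value, so the end user sees elevated uncertainty at the moment it matters rather than discovering it later in a generic disclaimer.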

We should not be afraid to do our job. We must not hesitate to provide timely data and information needed to mitigate disasters. We do need to be able to demonstrate that we use best practices for all aspects of the data production process, from gauge to page. We do need to make communication of uncertainty an integral part of our products and services. We do need to pay attention to how our products and services are reported on and used. We can, and we should, take great pride in what we do, why we do it, and in the very tangible benefits to society that result from our effort.
