More Water Resource Extremes? The New Normal Doesn’t Look Like Your Old Average Anymore

The theme of the Canadian Water Resources Association (CWRA) 2015 conference in Winnipeg this year was “More Extremes? Preparing for future challenges to Canada’s water resources.”

Fittingly, the North American Stream Hydrographers (NASH) held three sessions dominated by discussions of how to resolve several of the daunting challenges inherent in measuring water.

All of the other sessions – whether talking about water quality; policy, planning and management; flood and drought management; developments in flood forecasting; water and climate change; or transboundary water issues – were on topics dependent on accurate, timely, and reliable information about the variability of water in time and across space.

For example, in the second plenary talk, Dr. Greg McCullough spoke about the effect of climate as a forcing variable for water quality in Lakes Winnipeg and Erie. The effects are profound and the process dynamics are non-intuitive. Massive trophic surges are degrading water quality in these very large lakes. One might expect the dominant effect of a changing climate to be rising temperature, but in fact it is the recent change in precipitation patterns that is driving the delivery of the nutrients causing these disruptive algal blooms. The toxic levels of microcystin have resulted in beach closures and have even been known to kill livestock.

The intensity and timing of precipitation events are resulting in more overbank flooding onto farmland than would have occurred in the old normal. In the new normal, overbank water becomes saturated with nutrients that have been applied to the fields, resulting in high nutrient concentrations on the descending limb of the hydrograph.

Adaptation to the new normal requires characterization of what the new normal is, which brings us back to the NASH sessions.

New policies, planning, flood management strategies, and agricultural best practices all need to be informed by how much water there is, where it is, and when.

The session ‘Advancements in Stream Hydrography’ covered rating curve development; the use of Acoustic Doppler Current Profilers under ice cover; the value of hydrologic information; and the improvements to the Environment Canada Data Explorer – a powerful data search, discovery, and access tool for Water Survey of Canada data.

The session ‘Innovations in Velocity Index Methods’ covered the role of VI for storm water monitoring; the application of entropy theory for calibration of VI models; simplified methods of discharge estimation for inter-connecting channels with reversing flow; and AVM experience in the Water Survey of Canada.

The session ‘Reference Hydrometric Basin Network’ covered the development of the National Hydro Network; plans for a Canadian geospatial database of basin characteristics; a century of change in the Bow River in Banff National Park; and a discussion on a path forward for reference networks.

All of the topics covered in the NASH sessions – from preserving the integrity of long-term records, to creating new contextual value through integration with geospatial data, to discovering the abilities and limitations of sophisticated technologies, to re-imagining methodology to reduce costs, to developing and promoting best practices – are activities necessary to ensure enough data for a secure water future. All of the activities reported on were innovative projects undertaken by highly motivated individuals working with tightly limited budgets. Water monitoring is not funded at the level needed to comply with best practices for environmental data management.

As a case in point, new methods for measuring streamflow under an ice cover are being developed on the premise that the data will be better. If the data are better, then they must be different. A best practice for data management is entrenched as a principle by the Global Climate Observing System: “a suitable period of overlap for new and old observing systems is required.” In the case of under-ice streamflow, the annual low flow almost always occurs under ice. The new ADCP method produces data that are less sensitive to errors due to under-ice flow angles. This means that, in general, the new method would likely report lower flow than the old mechanical-meter method, which is highly sensitive to errors due to flow angles. However, there is no funding for a ‘suitable period of overlap’, so the influence of technology on the Canadian low-flow signal will never be known.
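The GCOS overlap principle can be made concrete with a short calculation. The sketch below pairs hypothetical discharge values from an old and a new method over an imagined overlap period and estimates the systematic offset a change of method would introduce; every number here is invented purely for illustration.

```python
# Sketch: quantifying the bias between two measurement methods during
# an overlap period, as the GCOS principle recommends. The paired values
# below are synthetic and purely illustrative.
from statistics import mean

# Hypothetical paired under-ice discharge measurements (m^3/s):
# old mechanical-meter method vs. new ADCP method at the same site and time.
old_method = [12.4, 9.8, 15.1, 11.2, 8.7, 13.6]
new_method = [11.9, 9.5, 14.4, 10.8, 8.4, 13.0]

# Mean relative difference: a simple first estimate of the systematic
# offset the change of method would introduce into the low-flow record.
rel_diffs = [(n - o) / o for n, o in zip(new_method, old_method)]
bias = mean(rel_diffs)
print(f"mean relative difference: {bias:.1%}")
```

Without an overlap period there are no such pairs, and no amount of after-the-fact analysis can recover this offset.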

The long-term consequences of our ignorance will be expensive.

The cost of replacing ignorance with actionable evidence is relatively small. My new eBook articulates the many and varied benefits of water monitoring. I wrote it to help water professionals form persuasive arguments and business cases that are sensitive to local politics and priorities to help close the funding gap for water monitoring – you can read it here.

Additional water monitoring is needed so that society can adapt to a changing climate.

Monitoring is needed so that society can adapt to change. Funding is needed to ensure that monitoring can adapt to the changing needs. We control the data legacy that will guide society through more extremes. What will our legacy be?

Photo Credit: Justin Henry  |  “mmm, algae bloom”


eBook: The Value of Water Monitoring

There is a solution … you understand the value of water monitoring but need additional, sustainable funding. Know that you are not alone. The gap between water monitoring capability and the rapidly evolving need for evidence-based policies, planning, and engineering design is growing. Learn how to form persuasive arguments that are sensitive to local politics and priorities to address this global deficit in funding. The benefits of hydrological information DO vastly outweigh investments in water monitoring. Read this free eBook today!

6 responses to “More Water Resource Extremes? The New Normal Doesn’t Look Like Your Old Average Anymore”

  1. Jaime Saldarriaga June 24, 2015 at 4:31 am

    Why not make a statistical distinction between climate and weather, according to the definition of the World Meteorological Organization that uses a 30-year weather average as the definition of climate?

    • Hi Jaime,
      Climate Normals are very helpful for establishing a 30 year mean baseline as long as you have a ‘normal’ (i.e. Gaussian) distribution that is independent at a 30 year lag. I don’t know who came up with the idea that climate normals should be 30 years and not 10, 20, 50 or any other number of years. My suspicion is that the choice of 30 is informed by the domain of statistics (where 30 is a widely used rule of thumb for a point of diminishing returns on increasing degrees of freedom) rather than climatology where decadal scale variability (e.g. the Pacific Decadal Oscillation) is ‘normal’ resulting in risk of serial auto-correlation over 30 years.

      That is beside the point of the title of this blog, which is that ‘averages’ are not so useful anymore. The average streamflow is not changing much. What is changing are the extremes. There are a large number of statistics available to describe the location (e.g. mean, median, mode), dispersion (e.g. max, min, variance, standard deviation), and shape (e.g. skewness, kurtosis) of any distribution. Like it or not, we are being forced to admit that the most relevant metrics for water management in the 21st Century are those that describe the dispersion and shape of the distribution. A more narrowly constrained focus on defining what is ‘normal’ got us through the 20th Century, but it is time to move on. This simple observation has big implications for how we measure, monitor, analyze and manage water resources.
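The point about location versus dispersion can be shown with a toy calculation. The two synthetic annual-flow series below are invented for illustration only: they share exactly the same mean, but their dispersion differs dramatically – which is precisely what a mean-based ‘climate normal’ would fail to reveal.

```python
# Sketch: two synthetic annual streamflow series with identical means
# but very different dispersion, illustrating why location statistics
# alone can hide a change in extremes. All values are invented.
from statistics import mean, pstdev

old_normal = [100, 95, 105, 98, 102, 101, 99, 100]
new_normal = [100, 60, 140, 85, 115, 150, 50, 100]

for label, series in (("old", old_normal), ("new", new_normal)):
    print(label, "mean:", round(mean(series), 1),
          "std dev:", round(pstdev(series), 1))
```

Both series report the same ‘normal’, yet only the dispersion statistic exposes the swing between flood and drought years.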

      • Difficulties in limited space to respond, so I will abbreviate.
        A known characteristic of streamflow patterns is that they can show persistence,
        wherein events in subsequent water-years are not necessarily statistically independent –
        e.g. the 1928–1932 period used for PNW (USA) power planning purposes. I would suggest 50 years would be a better
        ‘normal’ measure, but of course in many instances the data do not exist.

        • Hi Paul,
          The amount of time you need to sample in order to characterize a signal depends on the characteristic you are interested in. It might be possible to identify the annual mean with only a few years of data, but if the characteristic you want to design for is the magnitude and frequency of 5-year-duration droughts, you might need centuries of data. It has always been tough to provide society with enough water data to guide highly impactful water resources decisions. Our task is being made that much harder by the moving goalposts of what we need to be monitoring for and how much more data is required to make an impact. Fortunately, the people who get into the business of water monitoring are always up for a good challenge. Unfortunately, meeting these new challenges will be expensive. We have 20th Century funding for a 21st Century monitoring problem.
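A back-of-the-envelope calculation shows how record length interacts with the rarity of the event of interest. The sketch below assumes independent years – a simplification, given the persistence noted earlier in this thread – and asks how likely a record of a given length is to contain even one event as rare as a 1-in-20-year low flow; the numbers are illustrative only.

```python
# Sketch: probability that a record of a given length contains at least
# one event as rare as a 1-in-20-year low flow. Assumes independent
# years (a simplification, given persistence); illustrative only.
p_event = 0.05  # annual probability of the 1-in-20-year low flow

for n_years in (10, 30, 100):
    p_seen = 1 - (1 - p_event) ** n_years
    print(f"{n_years:>3}-year record: {p_seen:.0%} chance of containing one")
```

A 10-year record has a better-than-even chance of never touching such an event at all, which is one way of seeing why rare extremes demand far longer records than means do.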

  2. Hi Stu
    It appears to me that your insightful comments on the lack of funding for the flow monitoring that is required to make reasonable decisions in the future are right on the money. Our lack of monitoring today will cause uncertainty in the decisions of tomorrow and an increasing certainty of failures from those decisions in the far future.

    It would be wonderful if we could even have inflation adjusted monitoring budgets from the 1990s. Instead we have modernization and efficiency and the continuing reduction in capabilities to make accurate projections based on new climate normals.

    Extremes are today, and will continue to be, the pinch points of the 21st, 22nd etc. centuries. We already know from tree ring data that we have no recorded streamflow data from the extreme 40 year plus droughts of the prairies. We wonder if there are similar periods of extreme floods that we don’t recognize because of the lack of previous data, or that are still to come.
    R Ross

    • Hi Rick,
      Reflecting on the absence of relevant data about drought, it is easy to imagine that anyone promoting water monitoring during past episodes of drought would be dismissed as a nutcase. The notion that the data would be incredibly valuable in 50 years’ time would not be seen as a persuasive argument in the context of the pressing concerns of the day.

      We have stayed the course of not monitoring for the future – as if the future will never come – ever since. However, the future is unfolding before our very eyes and we can only imagine how different our policies, planning, infrastructure and attitudes would be if informed by hard data rather than wishful thinking.

      Who is the nutcase? The one who willfully repeats past mistakes? Or the one who considers what could have been and strategically engineers a better future? It is comforting to know that there is at least one other person who agrees with my opinion about the value of water monitoring.
