Ducks on an algae-covered pond.

More Water Resource Extremes? The New Normal Doesn’t Look Like Your Old Average Anymore

The theme of the Canadian Water Resources Association (CWRA) 2015 conference in Winnipeg was “More Extremes? Preparing for future challenges to Canada’s water resources.”

Fittingly, the North American Stream Hydrographers (NASH) held three sessions dominated by discussions of how to resolve several of the daunting challenges inherent in measuring water.

All of the other sessions – whether talking about water quality; policy, planning and management; flood and drought management; developments in flood forecasting; water and climate change; or transboundary water issues – were on topics dependent on accurate, timely, and reliable information about the variability of water in time and across space.

For example, in the second plenary talk Dr. Greg McCullough spoke about climate as a forcing variable for water quality in Lakes Winnipeg and Erie. The effects are profound and the process dynamics are non-intuitive. Massive trophic surges are degrading water quality in these very large lakes. One might expect the dominant effect of a changing climate to be rising temperature, but in fact it is the recent change in precipitation patterns that is driving the delivery of nutrients causing these disruptive algal blooms. Toxic levels of microcystin have resulted in beach closures and have even been known to kill livestock.

The intensity and timing of precipitation events are resulting in more over-bank flooding onto farmland than would have occurred in the old normal. In the new normal, overbank water becomes saturated with nutrients that have been applied to the fields, resulting in high nutrient concentrations on the descending limb of the hydrograph.

Adaptation to the new normal requires characterization of what the new normal is, which brings us back to the NASH sessions.

New policies, planning, flood management strategies, and agricultural best practices all need to be informed by how much water there is where, and when.

The session ‘Advancements in Stream Hydrography’ covered rating curve development; the use of Acoustic Doppler Current Profilers under ice cover; the value of hydrologic information; and the improvements to the Environment Canada Data Explorer – a powerful data search, discovery, and access tool for Water Survey of Canada data.

The session ‘Innovations in Velocity Index Methods’ covered the role of VI for storm water monitoring; the application of entropy theory for calibration of VI models; simplified methods of discharge estimation for inter-connecting channels with reversing flow; and AVM experience in the Water Survey of Canada.

The session ‘Reference Hydrometric Basin Network’ covered the development of the National Hydro Network; plans for a Canadian geospatial database of basin characteristics; a century of change in the Bow River in Banff National Park; and a discussion on a path forward for reference networks.

All of the topics covered in the NASH sessions – preserving the integrity of long-term records, creating new contextual value through integration with geospatial data, discovering the abilities and limitations of sophisticated technologies, re-imagining methodology to reduce costs, and developing and promoting best practices – are activities necessary to ensure a future with enough data for a secure water future. All of the projects reported on were innovative efforts undertaken by highly motivated individuals working with tightly limited budgets. Water monitoring is not funded at the level needed to comply with best practices for environmental data management.

As a case in point, new methods for measuring streamflow under an ice cover are being developed on the premise that the data will be better. If the data are better, then they must be different. A best practice for data management is one entrenched as a principle by the Global Climate Observing System: “a suitable period of overlap for new and old observing systems is required.” In the case of under-ice streamflow, the annual low flow almost always occurs under ice. The new ADCP method produces data that are less sensitive to errors from under-ice flow angles. This means that, in general, the new method would likely report lower flows than the old mechanical-meter method, which is highly sensitive to flow-angle errors. However, there is no funding for a ‘suitable period of overlap’, so the influence of the change in technology on the Canadian low-flow signal will never be known.
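The value of an overlap period can be illustrated with a toy calculation: given paired measurements from both methods at the same site and time during an overlap period, the systematic difference between methods can be estimated and the two records homogenized. The following is a minimal sketch in Python with invented numbers – not real ADCP or mechanical-meter data – assuming a simple mean relative difference as the bias estimator:

```python
# Sketch: why an overlap period matters when changing measurement methods.
# All flow values are illustrative, not real hydrometric data.

paired = [  # (mechanical-meter flow, ADCP flow) in m^3/s, same site and time
    (4.10, 3.95),
    (3.60, 3.48),
    (5.20, 5.01),
    (2.90, 2.81),
]

# Mean relative difference between methods over the overlap period.
rel_diffs = [(new - old) / old for old, new in paired]
bias = sum(rel_diffs) / len(rel_diffs)
print(f"estimated method bias: {bias:+.1%}")

# With a bias estimate, a reading from the old record can be expressed on the
# new method's scale, so the long-term record stays comparable. Without an
# overlap period, a step change caused by the method switch is
# indistinguishable from a real change in the low-flow signal.
old_record_flow = 3.2  # m^3/s, hypothetical historical reading
adjusted = old_record_flow * (1 + bias)
print(f"old-method reading {old_record_flow} m^3/s "
      f"is about {adjusted:.2f} m^3/s on the new scale")
```

In practice the adjustment would be flow-dependent and site-specific, which is exactly why a funded, multi-season overlap is needed rather than a single correction factor.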

The long-term consequences of our ignorance will be expensive.

The cost of replacing ignorance with actionable evidence is relatively small. My new eBook articulates the many and varied benefits of water monitoring. I wrote it to help water professionals form persuasive arguments and business cases that are sensitive to local politics and priorities to help close the funding gap for water monitoring – you can read it here.

Additional water monitoring is needed so that society can adapt to a changing climate. Funding is needed to ensure that monitoring can adapt to changing needs. We control the data legacy that will guide society through more extremes. What will our legacy be?

eBook: The Value of Water Monitoring

There is a solution. You understand the value of water monitoring but need additional, sustainable funding – know that you are not alone. The gap between water monitoring capability and the rapidly evolving need for evidence-based policies, planning, and engineering design is growing. Learn how to form persuasive arguments that are sensitive to local politics and priorities to address this global deficit in funding. The benefits of hydrological information DO vastly outweigh the investments in water monitoring.

  • Jaime Saldarriaga
    Posted at 4:31 am, June 24, 2015

    Why not make a statistical distinction between climate and weather according to the definition of the World Meteorological Organization, which uses a 30-year weather average as the definition of climate?

      • mpaul hansen
        Posted at 7:58 pm, June 28, 2015

        I have limited space to respond, so I will abbreviate.
        A known characteristic of streamflow patterns is persistence, wherein
        subsequent water-year events are not necessarily statistically independent –
        e.g., the 1928–1932 period used for power planning in the Pacific Northwest (USA).
        I would suggest 50 years would be a better ‘normal’ measure, but of course in many instances the data do not exist.

  • Rick Ross
    Posted at 1:57 pm, August 24, 2015

    Hi Stu,
    It appears to me that your insightful comments on the lack of funding for the flow monitoring required to make reasonable decisions in the future are right on the money. Our lack of monitoring today will cause uncertainty in the decisions of tomorrow and an increasing certainty of failures from those decisions in the far future.

    It would be wonderful if we could even have inflation-adjusted monitoring budgets from the 1990s. Instead we have modernization and efficiency, and a continuing reduction in the capability to make accurate projections based on new climate normals.

    Extremes are today, and will continue to be, the pinch points of the 21st, 22nd, and later centuries. We already know from tree-ring data that we have no recorded streamflow data from the extreme 40-plus-year droughts of the prairies. We wonder whether there were similar periods of extreme floods that we don’t recognize because of the lack of earlier data, or whether they are still to come.
    R Ross
