Almost everything we know about our global freshwater resources is due to the humble stage-discharge rating curve. The vast majority of all flow data ever produced is the derived result of a transform from a variable that is easy to monitor continuously (stage) to a variable that is impossible to directly measure continuously (discharge). This means we are dependent on rating curves for advancements in hydrological science; for flood forecasting; for drought management; for engineering designs that provide us with physical safety, transportation, water supply and waste disposal; for water management policies and decisions that ensure energy and food security.
Hydrology field work done today, if managed well, becomes part of a legacy of information that will serve for generations to come. As an avid canoeist and whitewater kayaker I was easily drawn into a career in hydrometry in spite of an undergraduate education in biology. Shortly after graduating from the University of Alaska I started work with the Water Survey of Canada in Whitehorse, Yukon. The initial appeal was the freedom to travel extensively to some of the most beautiful landscapes on the planet to measure streamflow. The highlight of my career was measuring 7,040 m³/s of flow on the Porcupine River using a small, under-powered aluminum skiff, a Kevlar tagline, and a 150-pound sounding weight. It took four tries to string the line while uprooted trees and large ice floes came down the river. I am guilty of being a data philosopher. I think we have to first be able to clearly articulate what an ideal data set should look like, and then we can influence the direction of technological development to make that ideal achievable.
I have been playing around with Paul Whitfield and Jennifer Dierauer’s FlowScreen R package, designed for detecting trends and changepoints in hydrological time series, and it got me thinking about how time series data analysis may be becoming an endangered activity. The immediate priority for any monitoring agency is to provide data for urgent requirements. Real-time data dissemination is king. You need to go well down the list of urgencies before you come to the requirements of future generations of hydrologists who have not yet been born.
There are several highly sophisticated technologies available for measuring streamflow. However, no amount of electronic wizardry will guarantee that you come home with a good discharge measurement. There are many things that can go wrong, such as problems with the electrical power or the electronic communication between system components. In the event of an electronic failure you are screwed, because these devices are so expensive that no one can afford to carry a spare.
The theme of the CWRA 2016 conference in Montreal was “Water Management at all Scales: Reducing Vulnerability and Increasing Resilience”. Three days of presentations related to this theme got me thinking about what we need to be doing better in order to be better custodians of damaged, threatened and pristine water systems. We are the inheritors of a legacy of misguided decisions that have left many water sources (e.g. hillslopes, springs, wetlands), waterways, and sinks (e.g. oceans and deep aquifers) in an unhealthy state.
A different point of view changes nothing, but it can change everything. Last week I wrote about how Unmanned Aerial Vehicles (UAVs), also known as drones, could be used for Large-Scale Particle Image Velocimetry (LSPIV) to get surface velocity measurements, which, when combined with surveyed cross-sections, can produce extreme flow gaugings. That same drone, equipped with the same camera, can also provide the cross-sectional information needed to complete the job.
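As a rough illustration of how the two pieces combine, here is a minimal velocity-area sketch in Python. This is not from the original post: the surface-to-mean velocity coefficient of 0.85 is a commonly used default (real values vary with the channel), and all the stations, velocities and depths below are hypothetical.

```python
# Mid-section velocity-area sketch: combine LSPIV surface velocities
# with a surveyed cross-section to estimate discharge.
# The 0.85 surface-to-mean coefficient is a commonly used default,
# not a universal constant; all numbers below are hypothetical.
SURFACE_TO_MEAN = 0.85

def discharge(stations, surface_velocities, depths):
    """Mid-section velocity-area discharge (m^3/s).

    stations: distances across the section (m)
    surface_velocities: LSPIV surface velocities at each station (m/s)
    depths: surveyed depths at each station (m)
    """
    q = 0.0
    n = len(stations)
    for i in range(n):
        # Width attributed to station i: half the gap to each neighbour.
        left = stations[i] - stations[i - 1] if i > 0 else 0.0
        right = stations[i + 1] - stations[i] if i < n - 1 else 0.0
        width = (left + right) / 2
        q += SURFACE_TO_MEAN * surface_velocities[i] * depths[i] * width
    return q

q = discharge(
    stations=[0.0, 2.0, 4.0, 6.0, 8.0],          # hypothetical section
    surface_velocities=[0.0, 0.9, 1.4, 1.0, 0.0],
    depths=[0.0, 0.8, 1.5, 0.9, 0.0],
)
```

The drone supplies the velocities; the surveyed cross-section supplies the stations and depths; the rest is arithmetic.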
Extreme flows are extremely hard to gauge, hence we get very few gaugings to accurately define the top-end of stage-discharge rating curves. This is a problem. Whereas empirically calibrated functional relationships can be trustworthy for the purpose of interpolation, they can be notoriously unreliable for extrapolation. One needs to be very careful about extrapolating any rating curve to an ungauged extreme.
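To see why extrapolation is so risky, consider a small Python sketch (not from the original post, and all gaugings are hypothetical): two power-law ratings fitted to the same mid-range gaugings, differing only in the assumed cease-to-flow offset h0, agree closely within the gauged range yet diverge badly at an extreme stage.

```python
import math

def fit_power_law(gaugings, h0):
    """Least-squares fit of log Q = log C + beta * log(h - h0),
    returning (C, beta) for the rating Q = C * (h - h0)**beta."""
    xs = [math.log(h - h0) for h, _ in gaugings]
    ys = [math.log(q) for _, q in gaugings]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - beta * mx), beta

# Hypothetical gaugings covering only low-to-medium stage: (stage m, Q m^3/s).
gaugings = [(0.4, 2.1), (0.6, 5.0), (0.8, 9.2), (1.0, 14.8), (1.2, 21.9)]

# Two defensible choices of the cease-to-flow offset h0.
for h0 in (0.0, 0.2):
    c, beta = fit_power_law(gaugings, h0)
    q_gauged = c * (1.0 - h0) ** beta   # within the gauged range
    q_extreme = c * (4.0 - h0) ** beta  # extrapolated to an extreme stage
    print(f"h0={h0}: Q at 1.0 m = {q_gauged:.1f}, Q at 4.0 m = {q_extreme:.1f}")
# Both fits reproduce the gauged range almost identically, but their
# extrapolated estimates at 4.0 m differ by roughly a factor of two.
```

Every extreme gauging we manage to capture collapses that spread, which is exactly why hard-won top-end measurements are so valuable.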
It is increasingly the case that when I am talking to people about what AQUARIUS software ‘should’ do, I find there are multiple motivations behind what ‘should’ means. There are many different ways that value can be perceived, and product development depends on this perception. The concept of “shared value” is that companies can solve society’s problems and make a profit at the same time.
Water monitoring is a place-based activity. The work is wherever the water is, which is all over the planet. A stream hydrographer can cover a very large geographic area, so regional offices typically concentrate only a small number of hydrographers at any one location, and there are many locations. Water monitoring agencies have limited resources available to develop specialized training material or to send hydrographers on specialized courses, so the most prevalent mode of career development is on-the-job training.
Imagine, for a minute, your stereotype of a person of learning; especially one with detailed knowledge in some specialized field of science. I expect the person filling your mind’s eye is not a ruddy-faced bloke with a substantial belly and a thick Queenslander accent wearing shorts and R.M.Williams boots. Appearances are deceiving. I first met Ray ‘Rainman’ Maynard in Nelson, New Zealand.
Stage-discharge rating curves define a unique relation between water level and discharge, enabling continuous derivation of streamflow from a water level record. This is important because water level (which is relatively easy to monitor) is only locally relevant, whereas discharge (which is relatively difficult to measure directly) is the integral of all runoff processes upstream of the gauge. The vast majority of all streamflow data that has ever been produced is the derived result of a rating curve. In other words, almost everything that we know (or rather that we think we know) about hydrology is a result of rating curves.
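In practice the relation is commonly expressed as a fitted power law, Q = C(h − h0)^β. Here is a minimal sketch of the stage-to-discharge transform; the coefficients are illustrative placeholders, not from any real station, since in practice they are fitted to discharge gaugings at each gauge.

```python
def rating_curve_discharge(stage_m, c=35.0, h0=0.20, beta=1.7):
    """Convert a stage reading (m) to discharge (m^3/s) using a
    power-law rating curve Q = C * (h - h0)**beta.

    c, h0 and beta are illustrative placeholder coefficients; in
    practice they are fitted to discharge gaugings at each station.
    """
    head = stage_m - h0
    if head <= 0:
        return 0.0  # stage at or below the cease-to-flow level
    return c * head ** beta

# Derive a continuous discharge series from a continuous stage record.
stage_record = [0.45, 0.62, 0.88, 1.10]  # hypothetical stage readings, m
flows = [rating_curve_discharge(h) for h in stage_record]
```

The entire value of a continuous stage record hangs on those few fitted coefficients, which is why gaugings and rating-curve maintenance matter so much.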
Many people believe that it makes no sense to store data at a resolution more precise than the resolution at which it can be observed. For example, it is believed that if you round water level to the nearest millimeter, the value will never be more than half a millimeter from the original. This idea was accepted as a reasonable compromise in the 20th century, and data management systems from that era were designed around it as a core concept. Modern data processing requirements, however, demand a different approach.
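The half-millimeter bound is easy to verify with a short sketch; the readings below are hypothetical.

```python
def round_to_mm(stage_m):
    """Round a stage value (m) to the nearest millimetre, as a
    20th-century archive might have stored it."""
    return round(stage_m * 1000) / 1000

# Hypothetical high-resolution stage values, m.
readings = [0.48731, 1.20458, 0.99962]
for h in readings:
    stored = round_to_mm(h)
    # The stored value is never more than half a millimetre from the original.
    assert abs(stored - h) <= 0.0005
```

The arithmetic is sound; the problem is what the rounding throws away once you start filtering, differencing, or re-deriving data from the archived values.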
If there is one theme that dominated water news in February, it must be innovation. It started with Microsoft taking water cooling to a whole new level by creating fully scalable data centers under ocean waters. I don’t think we can assume that the waste heat in the receiving waters will be totally benign, but it is entirely possible that this is a less impactful solution than any of the other mass computing options. So what if computing and data storage get much, much cheaper as a result of this technology?
Last month was a busy one for water news. The biggest story of the month has to be that 2015 was the hottest year on record. This is true globally, with the WMO reporting that 2015 broke all previous records by a strikingly wide margin at 0.76 °C above the 1961–1990 average. This marks the first time that global temperatures have been 1 °C above the pre-industrial era.
One objective of the Hydrology Corner is to provide a forum where hydrometric problems can be discussed and clever solutions to those problems can be shared. The stream gaugers vs. beavers post is a good example of a discussion of a difficult problem. Not only have several people posted on the blog but the post also resulted in an email exchange with Jeff Watson from Horizons Regional Council who realized that New Zealand may have a solution to a North American problem.