Laboratory analysis of a water quality sample ties a large amount of data and metadata to a single point in time and space. However, the objectives for monitoring may span spatial and temporal scales from point sampling (e.g. at an outfall) to watershed assessment (e.g. to characterize waters; identify trends; assess threats; inform pollution control; guide environmental emergency response; and support the development, implementation, and assessment of policies and regulations). Reconciling data- and metadata-dense analytical results with watershed-scale outcomes is a work in progress for many monitoring agencies.
Hydrology field work done today, if managed well, becomes part of a legacy of information that will serve for generations to come. As an avid canoeist and whitewater kayaker I was easily drawn into a career in hydrometry in spite of an undergraduate education in biology. Shortly after graduating from the University of Alaska I started work with the Water Survey of Canada in Whitehorse, Yukon. The initial appeal was the freedom to travel extensively to some of the most beautiful landscapes on the planet to measure streamflow. The highlight of my career was measuring 7,040 m³/s of flow on the Porcupine River using a small, under-powered aluminum skiff, a Kevlar tagline, and a 150-pound sounding weight. It took four tries to string the line while uprooted trees and large ice floes came down the river. I am guilty of being a data philosopher. I think we first have to be able to clearly articulate what an ideal data set should look like; then we can influence the direction of technological development to make that ideal achievable.
Dr. Ellen Wohl provoked the audience to give thought to how we evaluate river health in her keynote address to the Riverflow 2016 Conference in St. Louis, Missouri. Mindfulness of one’s own motivations and openness to a broader perspective is a good starting place for this discussion. Whereas a concrete-lined drainage ditch is highly functional in one worldview, a poorly drained swamp can be highly functional in a different worldview. In order to understand rivers we need to be aware of our own place in the natural order of things.
While there must be an underlying true relation between water level at a given place and time and the corresponding discharge, our experience of that truth is limited to gauging observations, from which we must infer the totality of the relationship. It is generally true that if you give the same set of data to “n” different hydrographers, they will produce “n” different discharge hydrographs. There is no assurance that any of the hydrographs is actually true. Each hydrographer is making an inference about what they believe to be true based on relatively few gaugings.
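To make that concrete, here is a minimal sketch using the standard power-law rating model Q = C(h − h0)^b. The gaugings are hypothetical, and the two values of the cease-to-flow stage h0 simply stand in for two defensible analyst judgments; the point is that the same five gaugings yield meaningfully different curves.

```python
import numpy as np

# Hypothetical gaugings: stage h (m) and measured discharge Q (m^3/s).
h = np.array([0.8, 1.1, 1.6, 2.3, 3.0])
q = np.array([2.1, 5.0, 14.0, 38.0, 75.0])

def fit_rating(h, q, h0):
    """Fit log Q = log C + b*log(h - h0) by least squares, for an assumed h0."""
    b, log_c = np.polyfit(np.log(h - h0), np.log(q), 1)
    return np.exp(log_c), b

# Two hydrographers, two plausible assumptions about the stage of zero flow.
for h0 in (0.0, 0.4):
    c, b = fit_rating(h, q, h0)
    # Both rate the same peak stage of 3.5 m, but get different discharges.
    print(f"h0 = {h0} m: C = {c:.2f}, b = {b:.2f}, "
          f"Q(3.5 m) = {c * (3.5 - h0) ** b:.1f} m^3/s")
```

With these invented numbers the two estimates at a 3.5 m stage differ by well over ten percent, even though both curves fit the gaugings closely.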
Almost everything we know about our global freshwater resources is due to the humble stage-discharge rating curve. The vast majority of all flow data ever produced is the derived result of a transform from a variable that is easy to monitor continuously (stage) to a variable that is impossible to directly measure continuously (discharge). This means we are dependent on rating curves for advancements in hydrological science; for flood forecasting; for drought management; for engineering designs that provide us with physical safety, transportation, water supply and waste disposal; for water management policies and decisions that ensure energy and food security.
I have been playing around with Paul Whitfield and Jennifer Dierauer’s FlowScreen R package, designed for detecting trends and changepoints in hydrological time series, and it got me thinking about how time series data analysis may be becoming an endangered activity. The immediate priority for any monitoring agency is to provide data for urgent requirements. Real-time data dissemination is king. You need to go well down the list of urgencies before you come to the requirements of future generations of hydrologists who have not yet been born.
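FlowScreen itself is in R; purely as an illustration of the kind of screening such tools automate, here is a minimal Python sketch of the Mann-Kendall S statistic, a standard nonparametric building block for trend detection. The flow series below is synthetic, invented only to exercise the function.

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: count of upward minus downward pairs.
    S >> 0 suggests an increasing trend, S << 0 a decreasing one."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return int(s)

# Synthetic annual-mean flows with a gradual upward drift plus noise.
rng = np.random.default_rng(1)
flows = 100 + 0.8 * np.arange(40) + rng.normal(0, 3, 40)
print(mann_kendall_s(flows))  # strongly positive for this drifting series
```

A real screening workflow would go on to compute the variance of S and a significance level, which is exactly the bookkeeping a package like FlowScreen handles for you.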
There are several highly sophisticated technologies available for measuring streamflow. However, no amount of electronic wizardry will guarantee that you come home with a good discharge measurement. Many things can go wrong, such as failures in the electrical power or in the electronic communication between the system components. In the event of an electronic failure you are screwed, because these devices are so expensive that no one can afford to carry a spare.
The theme of the CWRA 2016 conference in Montreal was “Water Management at all Scales: Reducing Vulnerability and Increasing Resilience”. Three days of presentations related to this theme got me thinking about what we need to be doing better in order to be better custodians of damaged, threatened and pristine water systems. We are the inheritors of a legacy of misguided decisions that have left many water sources (e.g. hillslopes, springs, wetlands), waterways, and sinks (e.g. oceans and deep aquifers) in an unhealthy state.
A different point of view changes nothing, but it can change everything. Last week I wrote about how Unmanned Aerial Vehicles (UAVs), also known as drones, could be used for Large-Scale Particle Image Velocimetry (LSPIV) to get surface velocity measurements, which, when combined with surveyed cross-sections, can produce extreme flow gaugings. That same drone, equipped with the same camera, can also provide the cross-sectional information needed to complete the job.
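As a sketch of how the two data sets combine, here is a minimal mean-section velocity-area calculation. All the station, depth, and velocity numbers are invented, and the 0.85 velocity index, which converts surface velocity to an approximate depth-averaged velocity, is a common rule-of-thumb assumption rather than a site-specific calibration.

```python
import numpy as np

# Hypothetical LSPIV output: surface velocities (m/s) at stations across the
# section, with surveyed depths (m) and station distances from the bank (m).
stations = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
depth    = np.array([0.0, 0.9, 1.6, 1.8, 1.2, 0.0])
v_surf   = np.array([0.0, 0.7, 1.1, 1.2, 0.8, 0.0])

K = 0.85  # velocity index: assumed ratio of mean to surface velocity

# Mean-section method: each panel uses the average of adjacent verticals.
widths = np.diff(stations)
d_mid = 0.5 * (depth[:-1] + depth[1:])
v_mid = K * 0.5 * (v_surf[:-1] + v_surf[1:])
q = float(np.sum(widths * d_mid * v_mid))
print(f"Discharge = {q:.2f} m^3/s")
```

In practice the drone supplies both arrays: v_surf from the imagery, and depth from a photogrammetric survey of the section.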
Extreme flows are extremely hard to gauge, hence we get very few gaugings to accurately define the top-end of stage-discharge rating curves. This is a problem. Whereas empirically calibrated functional relationships can be trustworthy for the purpose of interpolation, they can be notoriously unreliable for extrapolation. One needs to be very careful about extrapolating any rating curve to an ungauged extreme.
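A toy example, entirely synthetic, shows the failure mode: if the true rating steepens above bankfull because the channel geometry changes, a single power law fitted to in-bank gaugings interpolates almost perfectly yet badly underestimates a flood discharge.

```python
import numpy as np

# "True" rating (invented): the exponent shifts above a 2.0 m bankfull stage.
def true_q(h):
    return np.where(h <= 2.0,
                    10 * h ** 1.6,
                    10 * 2.0 ** 1.6 * (h / 2.0) ** 2.4)

# Gaugings only cover low-to-medium, in-bank stages.
h_obs = np.array([0.5, 0.9, 1.3, 1.7, 2.0])
q_obs = true_q(h_obs)

# Fit a single power law Q = C*h^b in log space to those gaugings...
b, log_c = np.polyfit(np.log(h_obs), np.log(q_obs), 1)

def fit_q(h):
    return np.exp(log_c) * h ** b

# ...then extrapolate to a flood stage well beyond the gauged range.
print(fit_q(4.0), true_q(4.0))  # fitted curve falls far short of truth
```

Within the gauged range the fitted curve is essentially exact; at twice bankfull it misses the true discharge by tens of percent, and nothing in the gaugings themselves warns you.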
Increasingly, when I am talking to people about what AQUARIUS software ‘should’ do, I find that there are multiple motivations behind what ‘should’ means. There are many different ways that value can be perceived, and product development depends on this perception. The concept of “shared value” is the idea that companies can solve society’s problems and make a profit at the same time.
Water monitoring is a place-based activity. The work is wherever the water is, which is all over the planet. A single stream hydrographer can cover a very large geographic area, so regional offices typically concentrate only a small number of hydrographers at any one location, and there are many locations. Water monitoring agencies have limited resources available to develop specialized training material or to send hydrographers on specialized courses, so the most prevalent mode of career development is on-the-job training.
Imagine, for a minute, your stereotype of a person of learning; especially one with detailed knowledge in some specialized field of science. I expect the person filling your mind’s eye is not a ruddy-faced bloke with a substantial belly and a thick Queenslander accent wearing shorts and R.M.Williams boots. Appearances are deceiving. I first met Ray ‘Rainman’ Maynard in Nelson, New Zealand.
Stage-discharge rating curves define a unique relation between water level and discharge, enabling continuous derivation of streamflow from the water level record. This is important because water level (which is relatively easy to monitor) is only locally relevant, whereas discharge (which is relatively difficult to measure directly) is the integral of all runoff processes upstream of the gauge. The vast majority of all streamflow data that has ever been produced is a derived result of a rating curve. In other words, almost everything that we know (or rather, that we think we know) about hydrology is a result of rating curves.
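In code, the transform is a one-liner once a rating is calibrated. This sketch applies a hypothetical power-law rating Q = C(h − h0)^b, with invented coefficients, to a short stage record:

```python
import numpy as np

# A continuous stage record (m), e.g. 15-minute logger values.
stage = np.array([1.20, 1.22, 1.31, 1.48, 1.62, 1.55, 1.40])

# Hypothetical calibrated rating: Q = C * (h - h0)^b
C, h0, b = 24.0, 0.35, 1.7

# Clip at h0 so stages at or below cease-to-flow map to zero discharge.
discharge = C * np.clip(stage - h0, 0.0, None) ** b
print(np.round(discharge, 1))  # derived discharge series (m^3/s)
```

Every value in the derived series inherits whatever error lives in the rating, which is why rating-curve maintenance matters so much more than it might appear.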
Many people believe that it makes no sense to store data at a resolution more precise than the resolution at which it can be observed. For example, it is believed that if you round water level to the nearest millimeter then the stored value will never be more than half a millimeter from the original. This idea was accepted as a reasonable compromise in the 20th century, and data management systems from that era were designed around it as a core concept. Modern data processing requirements, however, demand a different approach.
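The half-millimeter claim is true for the stage value itself, but not for everything derived from it. A small sketch, using an invented power-law rating Q = C(h − h0)^b, shows a sub-half-millimeter rounding of stage turning into a discharge error of several percent near the cease-to-flow stage:

```python
C, h0, b = 24.0, 0.35, 1.7  # hypothetical rating: Q = C * (h - h0)^b

h_true = 0.3574               # true stage (m), just above cease-to-flow
h_stored = round(h_true, 3)   # rounded to the nearest millimeter

q_true = C * (h_true - h0) ** b
q_stored = C * (h_stored - h0) ** b

# The stage error is under half a millimeter, but the derived discharge
# error is far larger in relative terms, and the rounding is irreversible.
print(abs(h_stored - h_true))            # stage error (m)
print(abs(q_stored - q_true) / q_true)   # relative discharge error
```

The closer the stage sits to h0, the more the rating amplifies the rounding, which is one reason to archive observations at full sensor resolution and round only on output.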