The sessions and presentations at the AWRA conference in Orlando, Florida, reinforced many observations I have been making about the water sector. Long gone are the days when the conference was dominated by the stereotypical engineer with a pocket protector and a slide rule. There are no sessions on the nuances of flood frequency analysis or the shear stress of riprap. There is obviously still a need for water data for conventional engineering purposes, but that need has been overwhelmed by a new reality. The application of water science is changing.
Hydrology field work done today, if managed well, becomes part of a legacy of information that will serve for generations to come. As an avid canoeist and whitewater kayaker, I was easily drawn into a career in hydrometry in spite of an undergraduate education in biology. Shortly after graduating from the University of Alaska I started work with the Water Survey of Canada in Whitehorse, Yukon. The initial appeal was the freedom to travel extensively to some of the most beautiful landscapes on the planet to measure streamflow. The highlight of my career was measuring 7,040 m³/s of flow on the Porcupine River using a small, under-powered aluminum skiff, a Kevlar tagline, and a 150-pound sounding weight. It took four tries to string the line while uprooted trees and large ice floes came down the river. I am guilty of being a data philosopher. I think we have to first be able to clearly articulate what an ideal data set should look like, and then we can influence the direction of technological development to make that ideal achievable.
Stream hydrographers from all around Oceania gather for the biennial Australian Hydrographers Association Conference, which was held this year in Canberra, the capital of Australia. Water monitoring is a place-based activity, meaning that hydrographers are widely dispersed across the landscape with very little opportunity to interact, build community, share experiences, and develop best practices.
The most passionate people involved in the water monitoring industry all care deeply about the preservation of traceable provenance for their data. To people on the outside this can seem like an indulgence that adds a burden of work to the data management process with little apparent benefit. The benefit is 'verifiable truth', a distinction that seems to have little value. Until it matters!
In the United States, the Endangered Species Act of 1973 (ESA) defines endangered species as “any species which is in danger of extinction throughout all or a significant portion of its range… “ and critical habitat as “the specific areas within the geographical area occupied by the species … on which are found those physical or biological features (I) essential to the conservation of the species and (II) which may require special management considerations or protection.” However, when the Endangered Species Act talks about conservation it refers to instruments such as “research, census, law enforcement, habitat acquisition and maintenance, propagation, live trapping, and transplantation …” Those instruments may have been the best available at the time, but times have changed.
The Riverflow 2016 conference had a full session on recent research in image-based measurements and video analysis. It is exciting to watch innovation in process as these researchers learn to exploit the capabilities of emerging consumer technologies. Never mind that the primary use of these technologies is so that people can instantly share their sense of place in the ‘real world’ within the virtual world where they really spend their lives. Without the billions of people motivated to lay claim to their physical existence with photos and videos, the technology for water monitoring using digital imaging would neither be accessible nor affordable.
I was stunned when I looked at NASA’s plot of monthly temperature anomalies showing that not only was July the hottest July ever recorded, it was the hottest month ever recorded, and the most recent (maybe not the last!) in a streak of new monthly records going back 10 months. There is shock value in seeing the whitespace on this graph separating the last ten months from the warmest weather of the past 136 years.
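The idea of a trailing run of record-setting months is easy to make concrete in code. The sketch below is a simplification (it checks for all-time records in a single series, rather than NASA’s per-calendar-month comparison), and the numbers in it are illustrative values only, not NASA’s actual anomaly data.

```python
def record_streak(anomalies):
    """Return the length of the trailing run of values that each set a
    new all-time record within the series up to that point."""
    running_max = float("-inf")
    flags = []
    for a in anomalies:
        flags.append(a > running_max)
        running_max = max(running_max, a)
    # Count consecutive record-setting values at the end of the series.
    streak = 0
    for is_record in reversed(flags):
        if not is_record:
            break
        streak += 1
    return streak

# Illustrative series: the last three values each set a new record.
series = [0.40, 0.55, 0.48, 0.60, 0.62, 0.71]
streak = record_streak(series)  # 3
```

A ten-month streak, in this framing, means the last ten flags are all records, which is exactly the whitespace you see on the plot.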
Laboratory analysis of a water quality sample links a lot of data and metadata to a singular point in time and space. However, the objectives for monitoring may span spatial and temporal scales from point sampling (e.g. at an outfall) to watershed assessment (e.g. to characterize waters; identify trends; assess threats; inform pollution control; guide environmental emergency response; and support the development, implementation, and assessment of policies and regulations). Reconciling data- and metadata-dense analytical results with watershed-scale outcomes is a work-in-progress for many monitoring agencies.
Dr. Ellen Wohl provoked the audience to give thought to how we evaluate river health in her keynote address to the Riverflow 2016 Conference in St. Louis, Missouri. Mindfulness of one’s own motivations and openness to a broader perspective is a good starting place for this discussion. Whereas a concrete-lined drainage ditch is highly functional in one worldview, a poorly drained swamp can be highly functional in a different worldview. In order to understand rivers we need to be self-aware of our place in the natural order of things.
While there must be an underlying true relation between water level at a given place and time and the corresponding discharge, our experience of that truth is limited to gauging observations from which we must infer the totality of the relationship. It is generally true that if you give the same set of data to “n” different hydrographers they will produce “n” different discharge hydrographs. There is no assurance that any of the hydrographs are actually true. Each hydrographer is making an inference about what they believe to be true based on relatively few gaugings.
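One common way to formalize that inference is to fit a single-segment power-law rating to the gaugings by least squares in log-log space. This is a minimal sketch, assuming the offset of zero flow (h0) is already known rather than optimized, which is itself one of the judgment calls that makes “n” hydrographers produce “n” curves.

```python
import math

def fit_rating(gaugings, h0=0.0):
    """Fit Q = C * (h - h0)**b by linear least squares in log-log space.
    gaugings: list of (stage_m, discharge_m3s) pairs.
    h0: assumed stage of zero flow, taken as known here."""
    xs = [math.log(h - h0) for h, _ in gaugings]
    ys = [math.log(q) for _, q in gaugings]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope of the log-log regression is the exponent b.
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    c = math.exp(my - b * mx)
    return c, b

# Synthetic gaugings generated from Q = 5 * h**1.6 (noise-free), so the
# fit should recover those parameters almost exactly.
gaugings = [(h, 5.0 * h ** 1.6) for h in (0.5, 1.0, 2.0, 4.0)]
c, b = fit_rating(gaugings)
```

With real, noisy gaugings the fitted parameters shift with every choice of h0, segment breaks, and which measurements to trust, which is exactly where the hydrographer's judgment enters.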
Almost everything we know about our global freshwater resources is due to the humble stage-discharge rating curve. The vast majority of all flow data ever produced is the derived result of a transform from a variable that is easy to monitor continuously (stage) to a variable that is impossible to directly measure continuously (discharge). This means we are dependent on rating curves for advancements in hydrological science; for flood forecasting; for drought management; for engineering designs that provide us with physical safety, transportation, water supply and waste disposal; for water management policies and decisions that ensure energy and food security.
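The transform itself is simple once a rating is in hand. The sketch below applies a single-segment power-law rating to a stage record; the parameters are hypothetical, and real production ratings are typically multi-segment and shift-corrected, so this shows only the simplest case.

```python
def stage_to_discharge(stages, c, b, h0=0.0):
    """Derive a discharge series from a stage series using a
    single-segment power-law rating Q = C * (h - h0)**b.
    Stages at or below the zero-flow offset h0 map to zero discharge."""
    return [c * max(h - h0, 0.0) ** b for h in stages]

# Hypothetical rating (C=5.0, b=1.6, h0=0.0) and a short stage record
# in metres; the result is the derived discharge hydrograph in m3/s.
stages = [0.8, 1.0, 1.3, 1.1]
hydrograph = stage_to_discharge(stages, c=5.0, b=1.6)
```

Every value in the derived hydrograph inherits whatever error the rating carries, which is why the humble rating curve deserves the attention this paragraph gives it.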
I have been playing around with Paul Whitfield and Jennifer Dierauer’s FlowScreen R package, designed for detecting trends and changepoints in hydrological time series, and it got me thinking about how time series data analysis may be becoming an endangered activity. The immediate priority for any monitoring agency is to provide data for urgent requirements. Real-time data dissemination is king. You need to go well down the list of urgencies before you come to the requirements of future generations of hydrologists who have not yet been born.
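To give a flavour of the kind of screening such packages automate, here is a minimal Mann-Kendall trend test, a standard non-parametric test for monotonic trend in annual series. This sketch omits the tie correction and is not FlowScreen's implementation, just the bare statistic.

```python
import math
from itertools import combinations

def mann_kendall(series):
    """Mann-Kendall trend test (no correction for tied values).
    Returns the S statistic and its normal-approximation Z score;
    large positive Z suggests an increasing monotonic trend."""
    # S counts concordant minus discordant pairs, in time order.
    s = sum((later > earlier) - (later < earlier)
            for earlier, later in combinations(series, 2))
    n = len(series)
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly rising six-value annual series yields the maximum S of 15.
s, z = mann_kendall([1.0, 1.2, 1.5, 1.7, 2.0, 2.3])
```

Running tests like this across decades of record is precisely the analysis that depends on those long, carefully stewarded archives rather than on real-time feeds.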
There are several highly sophisticated technologies available for measuring streamflow. However, no amount of electronic wizardry will guarantee that you come home with a good discharge measurement. There are many things that can go wrong, such as with the electrical power or the electronic communication between the system components. In the event of an electronic failure you are stranded, because these devices are so expensive that no one can afford to carry a spare.
The theme of the CWRA 2016 conference in Montreal was “Water Management at all Scales: Reducing Vulnerability and Increasing Resilience”. Three days of presentations related to this theme got me thinking about what we need to be doing better in order to be better custodians of damaged, threatened and pristine water systems. We are the inheritors of a legacy of misguided decisions that have left many water sources (e.g. hillslopes, springs, wetlands), waterways, and sinks (e.g. oceans and deep aquifers) in an unhealthy state.
A different point of view changes nothing, but it can change everything. Last week I wrote about how Unmanned Aerial Vehicles (UAVs), also known as drones, could be used for Large-Scale Particle Image Velocimetry (LSPIV) to obtain surface velocity measurements, which, when combined with surveyed cross-sections, can produce gaugings of extreme flows. That same drone, equipped with the same camera, can also provide the cross-sectional information needed to complete the job.
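Combining those two pieces, surface velocities and a surveyed cross-section, is the classic velocity-area computation. The sketch below uses the midsection method with an assumed velocity index (the surface-to-mean velocity ratio, commonly taken near 0.85); the station data are hypothetical.

```python
def midsection_discharge(stations, alpha=0.85):
    """Velocity-area discharge by the midsection method, with LSPIV
    surface velocities scaled to mean velocity by an assumed index alpha.
    stations: (distance_m, depth_m, surface_velocity_ms) tuples ordered
    across the section, including the zero-depth edges of water."""
    q = 0.0
    for i in range(1, len(stations) - 1):
        left, (x, depth, v_surface), right = (
            stations[i - 1], stations[i], stations[i + 1])
        # Each vertical represents the half-widths to its neighbours.
        width = (right[0] - left[0]) / 2.0
        q += width * depth * alpha * v_surface
    return q

# Hypothetical 10 m wide section with three verticals and dry edges.
stations = [(0.0, 0.0, 0.0), (2.5, 1.0, 1.0), (5.0, 2.0, 1.2),
            (7.5, 1.0, 1.0), (10.0, 0.0, 0.0)]
Q = midsection_discharge(stations)  # 9.35 m3/s with alpha = 0.85
```

The choice of alpha is the weak link in practice: it varies with depth, roughness, and wind on the water surface, so a value calibrated for the site is better than the textbook default assumed here.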