The most passionate people in the water monitoring industry care deeply about preserving traceable provenance for their data. To outsiders this can seem like an indulgence, one that adds a burden of work to the data management process with little apparent benefit. The benefit is ‘verifiable truth’, a distinction that seems to have little value. Until it matters!
In the United States, the Endangered Species Act of 1973 (ESA) defines endangered species as “any species which is in danger of extinction throughout all or a significant portion of its range …” and critical habitat as “the specific areas within the geographical area occupied by the species … on which are found those physical or biological features (I) essential to the conservation of the species and (II) which may require special management considerations or protection.” However, when the Endangered Species Act talks about conservation, it refers to instruments such as “research, census, law enforcement, habitat acquisition and maintenance, propagation, live trapping, and transplantation …” Those instruments may have been the best available at the time, but times have changed.
The Riverflow 2016 conference had a full session on recent research in image-based measurements and video analysis. It is exciting to watch innovation in progress as these researchers learn to exploit the capabilities of emerging consumer technologies. Never mind that the primary use of these technologies is so that people can instantly share their sense of place in the ‘real world’ within the virtual world where they really spend their lives. Without the billions of people motivated to lay claim to their physical existence with photos and videos, the technology for water monitoring using digital imaging would be neither accessible nor affordable.
I was stunned when I looked at NASA’s plot of monthly temperature anomalies showing that not only was July the hottest July ever recorded, it was the hottest month ever recorded, and the most recent (maybe not the last!) in a streak of new monthly records going back ten months. There is shock value in seeing the whitespace on this graph separating the last ten months from the warmest weather of the past 136 years.
Laboratory analysis of a water quality sample links a lot of data and metadata to a single point in time and space. However, the objectives for monitoring may span spatial and temporal scales from point sampling (e.g. at an outfall) to watershed assessment (e.g. to characterize waters; identify trends; assess threats; inform pollution control; guide environmental emergency response; and support the development, implementation, and assessment of policies and regulations). Reconciling data- and metadata-dense analytical results with watershed-scale outcomes is a work-in-progress for many monitoring agencies.
Dr. Ellen Wohl provoked the audience to give thought to how we evaluate river health in her keynote address to the Riverflow 2016 Conference in St. Louis, Missouri. Mindfulness of one’s own motivations and openness to a broader perspective are a good starting place for this discussion. Whereas a concrete-lined drainage ditch is highly functional in one worldview, a poorly drained swamp can be highly functional in a different worldview. In order to understand rivers, we need to be aware of our own place in the natural order of things.
While there must be an underlying true relationship between water level at a given place and time and the corresponding discharge, our experience of that truth is limited to gauging observations from which we must infer the totality of the relationship. It is generally true that if you give the same set of data to “n” different hydrographers they will produce “n” different discharge hydrographs. There is no assurance that any of the hydrographs is actually true. Each hydrographer is making inferences about what they believe to be true based on relatively few gaugings.
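As a minimal sketch of how that inference plays out, the example below (all gaugings and parameters invented) fits a standard power-law rating, Q = C(h - h0)^b, by log-log least squares. Two analysts who assume different cease-to-flow offsets h0 will derive different curves, and different extrapolated flows, from the very same gaugings:

```python
import math

# Hypothetical gaugings: (stage in m, discharge in m^3/s) -- illustrative only
gaugings = [(0.8, 2.1), (1.1, 4.9), (1.5, 10.8), (2.0, 22.5)]

def fit_rating(gaugings, h0):
    """Least-squares fit of log Q = log C + b * log(h - h0)."""
    xs = [math.log(h - h0) for h, _ in gaugings]
    ys = [math.log(q) for _, q in gaugings]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    c = math.exp(my - b * mx)
    return c, b

# Two hydrographers assuming different cease-to-flow offsets h0
for h0 in (0.0, 0.3):
    c, b = fit_rating(gaugings, h0)
    q_est = c * (3.0 - h0) ** b  # extrapolate the fitted curve to a 3.0 m stage
    print(f"h0={h0}: Q(3.0 m) estimate {q_est:.1f} m3/s")
```

Both fits honour the gaugings closely, yet the curves pull apart as soon as they leave the gauged range, which is the “n hydrographers, n hydrographs” problem in miniature.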
The Great Lakes hold 21% of the world’s fresh surface water by volume and 84 percent of North America’s. Only the right information today can ensure the sustainable use of these waters for generations to come. Today, these lakes provide drinking water for over 40 million people. One and a half million U.S. jobs and $62 billion in U.S. wages depend on the health of the Great Lakes. While restoration efforts are progressing, climate change and water quality concerns still threaten their ecosystem. You’re invited to make a difference by participating in the 2016 Great Lakes Observing System (GLOS) Data Challenge!
Almost everything we know about our global freshwater resources is due to the humble stage-discharge rating curve. The vast majority of all flow data ever produced is the derived result of a transform from a variable that is easy to monitor continuously (stage) to a variable that is impossible to directly measure continuously (discharge). This means we are dependent on rating curves for advancements in hydrological science; for flood forecasting; for drought management; for engineering designs that provide us with physical safety, transportation, water supply and waste disposal; for water management policies and decisions that ensure energy and food security.
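The transform itself is simple. Here is a minimal sketch assuming a single-segment power-law rating (the parameters and stage readings are invented for illustration):

```python
def rating(h, C=4.2, h0=0.2, b=1.8):
    """Convert a stage reading h (m) to discharge (m^3/s)
    using a single-segment power-law rating Q = C * (h - h0)^b."""
    if h <= h0:
        return 0.0  # below the cease-to-flow stage
    return C * (h - h0) ** b

# Continuous stage record (hypothetical 15-minute readings, in metres) ...
stage_series = [0.95, 1.02, 1.31, 1.74, 1.60]

# ... becomes a continuous discharge record via the rating transform
discharge_series = [rating(h) for h in stage_series]
```

Every value in the derived discharge series inherits whatever error the rating curve carries, which is why the quality of the curve matters so much more than the quality of any individual stage reading.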
I have been playing around with Paul Whitfield and Jennifer Dierauer’s FlowScreen R package, designed for detecting trends and change points in hydrological time series, and it got me thinking about how time series data analysis may be becoming an endangered activity. The immediate priority for any monitoring agency is to provide data for urgent requirements. Real-time data dissemination is king. You need to go well down the list of urgencies before you come to the requirements of future generations of hydrologists who have not yet been born.
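FlowScreen itself is an R package, but the kind of screening it performs can be sketched in a few lines. Below is a bare-bones Mann-Kendall trend test, a standard non-parametric test for monotonic trends in hydrological series (the annual-minimum flow series is invented, and this simple version ignores tie corrections):

```python
import math

def mann_kendall(series):
    """Mann-Kendall S statistic and normal-approximation Z score
    (no correction for ties -- a simplification)."""
    n = len(series)
    # S counts concordant minus discordant pairs over all i < j
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1) for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18  # variance of S with no ties
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical annual minimum flows (m^3/s) showing a gradual decline
annual_min_flow = [12.1, 11.8, 11.5, 11.9, 11.2, 10.8, 10.9, 10.3, 10.1, 9.8]
s, z = mann_kendall(annual_min_flow)
```

A |Z| above roughly 1.96 suggests a trend at the 5% significance level. The point of the example is that none of this is possible without long, continuous, carefully archived records, which is exactly what gets neglected when real-time dissemination consumes all the attention.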
There are several highly sophisticated technologies available for measuring streamflow. However, no amount of electronic wizardry will guarantee that you come home with a good discharge measurement. There are many things that can go wrong, such as failures of the electrical power or of the electronic communication between system components. In the event of an electronic failure you are screwed, because these devices are so expensive that no one can afford to carry a spare.
The theme of the CWRA 2016 conference in Montreal was “Water Management at all Scales: Reducing Vulnerability and Increasing Resilience”. Three days of presentations related to this theme got me thinking about what we need to be doing better in order to be better custodians of damaged, threatened and pristine water systems. We are the inheritors of a legacy of misguided decisions that have left many water sources (e.g. hillslopes, springs, wetlands), waterways, and sinks (e.g. oceans and deep aquifers) in an unhealthy state.
A different point of view changes nothing but it can change everything. Last week I wrote about how Unmanned Aerial Vehicles (UAVs), also known as drones, could be used for Large-Scale Particle Image Velocimetry (LSPIV) to get surface velocity measurements, which, when combined with surveyed cross-sections, can produce gaugings of extreme flows. That same drone, equipped with the same camera, can also provide the cross-sectional information needed to complete the job.
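To show how the two pieces combine, here is a sketch of the mid-section velocity-area computation, assuming LSPIV surface velocities are scaled to depth-averaged velocities by a velocity index of about 0.85 (a commonly used default). All stations, depths, and velocities are invented:

```python
VELOCITY_INDEX = 0.85  # assumed surface-to-mean velocity coefficient

def midsection_discharge(stations, depths, surface_velocities):
    """Mid-section method: discharge is the sum over verticals of
    (mean velocity) * (depth) * (segment width)."""
    q = 0.0
    n = len(stations)
    for i in range(n):
        # Each vertical represents a segment reaching halfway to its neighbours
        left = stations[i - 1] if i > 0 else stations[0]
        right = stations[i + 1] if i < n - 1 else stations[-1]
        width = (right - left) / 2.0
        v_mean = VELOCITY_INDEX * surface_velocities[i]
        q += v_mean * depths[i] * width
    return q

# Hypothetical section: distances across (m), depths (m), surface speeds (m/s)
q = midsection_discharge(
    stations=[0.0, 2.0, 4.0, 6.0, 8.0],
    depths=[0.0, 1.2, 1.8, 1.1, 0.0],
    surface_velocities=[0.0, 0.9, 1.4, 0.8, 0.0],
)
```

The drone supplies both inputs: the surface velocity field from LSPIV and, with photogrammetry, the geometry that defines the stations and depths.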
Extreme flows are extremely hard to gauge, so we get very few gaugings with which to define the top end of stage-discharge rating curves accurately. This is a problem. Whereas empirically calibrated functional relationships can be trustworthy for the purpose of interpolation, they can be notoriously unreliable for extrapolation. One needs to be very careful about extrapolating any rating curve to an ungauged extreme.
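A toy illustration of the risk (all numbers invented): a power law calibrated on in-bank gaugings can agree perfectly with a hypothetical ‘true’ relation below bankfull, yet diverge badly once flow goes overbank and the floodplain adds conveyance the fitted curve never saw:

```python
def fitted_rating(h):
    """Power law calibrated from in-bank gaugings only (say h <= 2.0 m)."""
    return 5.0 * h ** 1.7

def true_relation(h, bankfull=2.0):
    """Hypothetical truth: above bankfull, floodplain conveyance adds
    discharge that the in-bank power law knows nothing about."""
    q = 5.0 * h ** 1.7
    if h > bankfull:
        q += 40.0 * (h - bankfull) ** 1.5  # invented floodplain term
    return q

h_extreme = 3.5  # an ungauged extreme stage, well above bankfull
underestimate = true_relation(h_extreme) - fitted_rating(h_extreme)
```

Within the gauged range the two functions are indistinguishable, so no amount of in-bank gauging would reveal the problem; only a gauging at the extreme itself can anchor the top of the curve.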