On my way home from the AWRA conference in Orlando I sat next to a fellow on his way home from the IAAPA Expo (International Association of Amusement Parks & Attractions), which had taken place at the Orange County Convention Center the same week. Even though he slept for most of the 7 hours we sat next to each other, I did learn a thing or two while he was awake. There were 35,000 people at the amusement park convention, and the expo was so large that the distance to walk around all of the vendor booths was 9 miles! It is hard for me to grasp the scale and meaning of this. There were, perhaps, 500 water professionals who could afford the time and money to come to the AWRA conference, a significant turnout for water professionals in North America.
The sessions and presentations at the AWRA conference in Orlando, Florida reinforced many observations I have been making about the water sector. Long gone are the days when the conference was dominated by the stereotypical engineer with a pocket protector and a slide rule. There are no sessions on the nuances of flood frequency analysis or the shear stress of riprap. There is obviously still a need for water data for conventional engineering purposes, but this need has been overwhelmed by a new reality. The application of water science is changing.
Stream hydrographers from all around Oceania gather for the biennial Australian Hydrographers Association Conference, which was held this year in Canberra, the capital of Australia. Water monitoring is a place-based activity, meaning that hydrographers are widely dispersed across the landscape with very little opportunity to interact, build community, share experiences, and develop best practices.
The most passionate people involved in the water monitoring industry all care deeply about the preservation of traceable provenance for their data. To people on the outside this can seem like an indulgence that adds a burden of work to the data management process with little apparent benefit. The benefit is ‘verifiable truth’, a distinction with little apparent value. Until it matters!
In the United States, the Endangered Species Act of 1973 (ESA) defines endangered species as “any species which is in danger of extinction throughout all or a significant portion of its range… “ and critical habitat as “the specific areas within the geographical area occupied by the species … on which are found those physical or biological features (I) essential to the conservation of the species and (II) which may require special management considerations or protection.” However, when the Endangered Species Act talks about conservation it refers to instruments such as: “research, census, law enforcement, habitat acquisition and maintenance, propagation, live trapping, and transplantation …” Those instruments may have been the best available at the time but times have changed.
The Riverflow 2016 conference had a full session on recent research in image-based measurements and video analysis. It is exciting to watch innovation in process as these researchers learn to exploit the capabilities of emerging consumer technologies. Never mind that the primary use of these technologies is so that people can instantly share their sense of place in the ‘real world’ within the virtual world where they really spend their lives. Without the billions of people motivated to lay claim to their physical existence with photos and videos, the technology for water monitoring using digital imaging would neither be accessible nor affordable.
I was stunned when I looked at NASA’s plot of monthly temperature anomalies showing that not only was July the hottest July ever recorded, it was the hottest month ever recorded, and the most recent (maybe not the last!) in a streak of new monthly records going back 10 months. There is shock value in seeing the whitespace on this graph separating the last ten months from the warmest weather of the past 136 years.
Laboratory analysis of a water quality sample links a lot of data and metadata to a singular point in time and space. However, the objectives for monitoring may span spatial and temporal scales from point sampling (e.g. at an outfall) to watershed assessment (e.g. to characterize waters; identify trends; assess threats; inform pollution control; guide environmental emergency response; and support the development, implementation, and assessment of policies and regulations). Reconciling data- and metadata-dense analytical results with watershed-scale outcomes is a work-in-progress for many monitoring agencies.
Dr. Ellen Wohl provoked the audience to give thought to how we evaluate river health in her keynote address to the Riverflow 2016 Conference in St. Louis, Missouri. Mindfulness of one’s own motivations and openness to a broader perspective is a good starting place for this discussion. Whereas a concrete-lined drainage ditch is highly functional in one world view, a poorly drained swamp can be highly functional in a different world view. In order to understand rivers we need to be self-aware of our place in the natural order of things.
While there must be an underlying true relationship between water level at a given place and time and the corresponding discharge, our experience of that truth is limited to gauging observations from which we must infer the totality of the relationship. It is generally true that if you give the same set of data to “n” different hydrographers, they will produce “n” different discharge hydrographs. There is no assurance that any of the hydrographs is actually true. Each hydrographer is making an inference about what they believe to be true based on relatively few gaugings.
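The “n hydrographers, n hydrographs” point can be made concrete with a toy experiment: fit the same assumed power-law rating form, Q = C(h - h0)^b, to the same set of gaugings while withholding a different gauging each time, and compare the discharge each fit infers at the same stage. Everything below (the rating form, the numbers, the assumed cease-to-flow stage) is hypothetical, a minimal sketch rather than any hydrographer's actual method:

```python
import numpy as np

GAUGINGS = [  # (stage in m, discharge in m^3/s) -- hypothetical field visits
    (0.8, 2.0), (1.1, 4.8), (1.6, 14.0), (2.3, 34.0), (3.0, 68.0),
]
H0 = 0.5  # assumed cease-to-flow stage (itself a judgment call in practice)

def fit(gaugings):
    """Fit log(Q) = log(C) + b*log(h - H0) by least squares."""
    h, q = zip(*gaugings)
    b, logC = np.polyfit(np.log(np.array(h) - H0), np.log(q), 1)
    return np.exp(logC), b

# Each "hydrographer" sees the same record minus one gauging.
estimates = []
for i in range(len(GAUGINGS)):
    subset = GAUGINGS[:i] + GAUGINGS[i + 1:]
    C, b = fit(subset)
    estimates.append(C * (2.0 - H0) ** b)  # inferred Q at a stage of 2.0 m

spread = max(estimates) - min(estimates)  # the inferred discharges disagree
```

Even with an identical model form and an identical estimation procedure, the inferred discharge at a given stage differs from fit to fit; real hydrographers also differ in model form, segmentation, and shift handling, so the disagreement is larger still.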
The Great Lakes hold 21 percent of the world’s fresh surface water by volume, and 84 percent of North America’s. Today, these lakes are the source of drinking water for over 40 million people. One and a half million U.S. jobs and $62 billion in U.S. wages depend on the health of the Great Lakes. While restoration efforts are progressing, climate change and water quality concerns still threaten their ecosystem. Only the right information today can ensure the sustainable use of these waters for generations to come. You’re invited to make a difference by participating in the 2016 Great Lakes Observing System (GLOS) Data Challenge!
Almost everything we know about our global freshwater resources is due to the humble stage-discharge rating curve. The vast majority of all flow data ever produced is the derived result of a transform from a variable that is easy to monitor continuously (stage) to a variable that is impossible to directly measure continuously (discharge). This means we are dependent on rating curves for advancements in hydrological science; for flood forecasting; for drought management; for engineering designs that provide us with physical safety, transportation, water supply and waste disposal; for water management policies and decisions that ensure energy and food security.
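The transform itself is simple enough to sketch in a few lines. Assuming the common power-law form Q = C(h - h0)^b and a handful of hypothetical gaugings (the function names and numbers below are illustrative, not any agency's standard practice):

```python
import numpy as np

def fit_rating_curve(stage, discharge, h0):
    """Fit log(Q) = log(C) + b*log(h - h0) by least squares.

    h0, the cease-to-flow stage, is assumed known here; in practice it
    must also be estimated from the gaugings or from a site survey.
    """
    x = np.log(np.asarray(stage) - h0)
    y = np.log(np.asarray(discharge))
    b, logC = np.polyfit(x, y, 1)
    return np.exp(logC), b

def stage_to_discharge(stage, C, b, h0):
    """Apply the rating: turn a continuous stage record into discharge."""
    return C * (np.asarray(stage) - h0) ** b

# Hypothetical gaugings: stage in metres, discharge in m^3/s
h = [0.8, 1.1, 1.6, 2.3, 3.0]
q = [2.1, 5.0, 13.5, 35.0, 66.0]
C, b = fit_rating_curve(h, q, h0=0.5)

# The cheap-to-monitor variable (stage) becomes the one we actually want.
hydrograph = stage_to_discharge([0.9, 1.4, 2.0], C, b, h0=0.5)
```

The sketch hides everything that makes real ratings hard: multiple segments, shifting channel controls, hysteresis, and extrapolation beyond the gauged range, which is exactly why the rating curve deserves more attention than it gets.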
I have been playing around with Paul Whitfield and Jennifer Dierauer’s FlowScreen R package, designed for detecting trends and changepoints in hydrological time series, and it got me thinking about how time series data analysis may be becoming an endangered activity. The immediate priority for any monitoring agency is to provide data for urgent requirements. Real-time data dissemination is king. You need to go well down the list of urgencies before you come to the requirements of future generations of hydrologists who have not yet been born.
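For readers without R at hand, the kind of analysis at stake can be sketched with a minimal Mann-Kendall trend test, written here in Python rather than with FlowScreen itself (which does far more, including changepoint detection). The annual mean flows below are hypothetical, and this bare-bones version ignores ties and serial correlation:

```python
import itertools
import math

def mann_kendall_s(series):
    """Mann-Kendall S statistic: upward pairs minus downward pairs."""
    return sum((xj > xi) - (xj < xi)
               for xi, xj in itertools.combinations(series, 2))

def mann_kendall_z(series):
    """Normal approximation of the test statistic (no tie correction)."""
    n = len(series)
    s = mann_kendall_s(series)
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var)
    if s < 0:
        return (s + 1) / math.sqrt(var)
    return 0.0

# Hypothetical annual mean flows (m^3/s)
flows = [12.1, 11.8, 12.6, 13.0, 12.4, 13.5, 13.9, 13.2, 14.1, 14.6]
z = mann_kendall_z(flows)  # |z| > 1.96 suggests a trend at the 5% level
```

A test like this only works if a long, consistently produced record exists to run it on, which is precisely the asset that real-time priorities put at risk.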
There are several highly sophisticated technologies available for measuring streamflow. However, no amount of electronic wizardry will guarantee that you come home with a good discharge measurement. There are many things that can go wrong, such as a failure of electrical power or of the electronic communication between system components. In the event of an electronic failure you are screwed, because these devices are so expensive that no one can afford to carry a spare.