At Aquatic Informatics we are encouraged to take an active role in the community, so I was quick to agree when I was recently asked to do a short course on hydrology for a conference on Global Stewardship at a local private school. It turned out to be a lot of fun, for me at least. I took a look at a map and realized that because the school is located at the crest of a hill and on the edge of a large forested park, it provided an ideal opportunity to compare and contrast forest and urban hydrology. There is a lovely stream running through the forest, which provided an excellent contrast to an adjacent stream course that has been completely built over with housing.
It’s with great pleasure that I’m hosting this month’s webinar on February 28th about some of the most common discrete data management challenges. This topic comes up repeatedly in the field of environmental data management. Regardless of the size of your organization, I’m sure some of the challenges that we’re going to outline will resonate with you and your colleagues.
Inattention and imperfect information cost individuals, organizations, and society in immeasurable ways. The relatively new field of information economics (infonomics) is revealing that great efficiencies can be gained by managing information as a strategic asset. All business decisions are made with the information available at the time. Yet, this availability is often the product of desperately scraping whatever data happens to be readily accessible in real time, resulting in sub-optimal business outcomes. The new insight emerging from the study of infonomics is that decisions can be materially improved by anticipating needs and nurturing the information required to meet those needs.
Ray Maynard calls me a peripatetic hydrologist. I had to look it up. There are two meanings: 1) a person who travels from place to place or 2) an Aristotelian philosopher. I think I fit both definitions. Aristotle placed great emphasis on the direct observation of nature and insisted that theory must follow fact. I also travel a lot. While I can’t deny the label, I have to wonder if it was meant as a compliment. After all, hydrology is a place-based, observational science. How can I be a real hydrologist if I am traveling all the time, and hence, not occupied with making direct observations at a place?
The United States Geological Survey (USGS) has replaced its custom, in-house Automated Data Processing System (ADAPS), originally designed in 1985, with the commercial-off-the-shelf (COTS) AQUARIUS Time-Series software. Alabama, part of the USGS Lower Mississippi Water Science Center, has now officially retired the ADAPS system, the first step in a scheduled rollout across all 50 states. This is a big deal, not only for Aquatic Informatics and for the USGS, but for the world.
In the field of hydrometry there is benefit that arises from global collaboration. Few monitoring agencies have the resources needed to invest in wide-ranging discovery of better ways of acquiring and producing streamflow data. However, local centres of expertise can develop to pursue any one of the many opportunities for significant advancement in the business of water measurement and monitoring.
On my way home from the AWRA conference in Orlando I sat next to a fellow on his way home from the IAAPA Expo (International Association of Amusement Parks & Attractions), which had taken place at the Orange County Convention Center the same week. Even though he slept for most of the 7 hours we sat next to each other, I did learn a thing or two while he was awake. There were 35,000 people at the amusement park convention, and the expo was so large that the distance to walk around all of the vendor booths was 9 miles! It is hard for me to grasp the scale and the meaning of this. There were, perhaps, 500 water professionals who could afford the time and money to come to the AWRA conference, and that is considered a significant turnout for water professionals in North America.
The sessions and presentations at the AWRA conference in Orlando, Florida reinforced many observations I have been making about the water sector. Long gone are the days when the conference was dominated by the stereotypical engineer with a pocket protector and a slide rule. There are no sessions on the nuances of flood frequency analysis or the shear stress of riprap. There is obviously still a need for water data for conventional engineering purposes, but this need has been overwhelmed by a new reality. The application of water science is changing.
Stream hydrographers from all around Oceania gather for the biennial Australian Hydrographers Association Conference, which was held this year in Canberra, the capital of Australia. Water monitoring is a place-based activity, meaning that hydrographers are widely dispersed across the landscape with very little opportunity to interact, build community, share experiences, and develop best practices.
The most passionate people involved in the water monitoring industry all care deeply about the preservation of traceable provenance for their data. To people on the outside this can seem like an indulgence that adds a burden of work to the data management process with little apparent benefit. The benefit is ‘verifiable truth’, a distinction that seems to have little value. Until it matters!
In the United States, the Endangered Species Act of 1973 (ESA) defines endangered species as “any species which is in danger of extinction throughout all or a significant portion of its range… “ and critical habitat as “the specific areas within the geographical area occupied by the species … on which are found those physical or biological features (I) essential to the conservation of the species and (II) which may require special management considerations or protection.” However, when the Endangered Species Act talks about conservation, it refers to instruments such as: “research, census, law enforcement, habitat acquisition and maintenance, propagation, live trapping, and transplantation …” Those instruments may have been the best available at the time, but times have changed.
The Riverflow 2016 conference had a full session on recent research in image-based measurements and video analysis. It is exciting to watch innovation in progress as these researchers learn to exploit the capabilities of emerging consumer technologies. Never mind that the primary use of these technologies is so that people can instantly share their sense of place in the ‘real world’ within the virtual world where they really spend their lives. Without the billions of people motivated to lay claim to their physical existence with photos and videos, the technology for water monitoring using digital imaging would be neither accessible nor affordable.
I was stunned when I looked at NASA’s plot of monthly temperature anomalies showing that not only was July the hottest July ever recorded, it was the hottest month ever recorded, and the most recent (maybe not the last!) in a streak of new monthly records going back 10 months. There is shock value in seeing the whitespace on this graph separating the last ten months from the warmest weather of the past 136 years.
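For readers curious how such a streak is identified, here is a minimal sketch: flag each month that beats every prior anomaly for the same calendar month, then count the unbroken run of record months at the end of the series. The anomaly values below are made up for demonstration; they are not NASA’s actual GISTEMP figures.

```python
# Sketch: detect record-setting months in a chronological anomaly series.
# Data below are illustrative placeholders, not real NASA values.

def record_months(anomalies):
    """Given chronological (year, month, anomaly) tuples, return a parallel
    list of booleans marking months that exceeded every prior anomaly
    for the same calendar month."""
    best = {}  # calendar month -> highest anomaly seen so far
    flags = []
    for year, month, value in anomalies:
        is_record = value > best.get(month, float("-inf"))
        flags.append(is_record)
        if is_record:
            best[month] = value
    return flags

def trailing_streak(flags):
    """Length of the unbroken run of record months at the end of the series."""
    n = 0
    for f in reversed(flags):
        if not f:
            break
        n += 1
    return n

series = [
    (2014, 10, 0.62), (2014, 11, 0.55), (2014, 12, 0.63),
    (2015, 10, 0.81), (2015, 11, 0.79), (2015, 12, 0.97),  # each beats 2014
    (2016, 7, 1.10),
]
flags = record_months(series)
print(trailing_streak(flags))  # every month here sets a record, so 7
```

The per-calendar-month comparison matters: a record July is judged against past Julys, not against the whole series.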
Laboratory analysis of a water quality sample links a lot of data and metadata to a singular point in time and space. However, the objectives for monitoring may span spatial and temporal scales from point sampling (e.g. at an outfall) to watershed assessment (e.g. to characterize waters; identify trends; assess threats; inform pollution control; guide environmental emergency response; and support the development, implementation, and assessment of policies and regulations). Reconciling data- and metadata-dense analytical results with watershed-scale outcomes is a work-in-progress for many monitoring agencies.
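To make the data-density point concrete, here is a minimal sketch of how one lab result binds analytical values and metadata to a single point in time and space. All class and field names are hypothetical illustrations for this post, not any agency’s actual schema.

```python
# Hypothetical structure for one water quality sample result:
# one analyte, one location, one timestamp, plus open-ended metadata.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class SampleResult:
    parameter: str            # analyte measured, e.g. "total phosphorus"
    value: float
    unit: str
    latitude: float           # the singular point in space...
    longitude: float
    collected_at: datetime    # ...and in time
    metadata: dict = field(default_factory=dict)  # method, lab, detection limit, etc.

result = SampleResult(
    parameter="total phosphorus",
    value=0.031,
    unit="mg/L",
    latitude=49.25,
    longitude=-123.10,
    collected_at=datetime(2016, 7, 14, 10, 30, tzinfo=timezone.utc),
    metadata={"method": "EPA 365.1", "lab": "example lab", "detection_limit": 0.002},
)
print(result.parameter, result.value, result.unit)
```

Rolling thousands of such point records up into watershed-scale trend or threat assessments is exactly the reconciliation problem described above.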