I think an appropriate analogy is “fight fire with fire.” In this approach, algae are farmed to produce useful biomass while removing excess nutrients downstream of sources of agricultural runoff. By cleaning the water, the beneficial algae mitigate the conditions that result in harmful algal blooms. I expect we will be hearing much more about this approach in the future.
One of my “go-to” reference books is the 975-page “Bibliography of Hydrometry” by Steponas Kolupaila, so this document about his contribution to hydrological science caught my eye. Throughout the 20th century, the effects of politics and geography isolated practitioners of hydrometry as effectively as if they were finches on the Galapagos Islands. The result was specialized regional methods for everything from discharge measurements to rating curves, which diverged to the point where cross-breeding of ideas was no longer possible. I have always admired Kolupaila’s quixotic attempts to encourage that cross-breeding through knowledge exchange, ensuring a robust and resilient future for hydrometry. This is a wonderful tribute to a great scholar.
This article caught my eye because just a few months ago I wrote a blog post that suggested the use of cloud seeding for glacier preservation. It will be very interesting to see if this experiment using artificial snow in Switzerland works out.
Stream hydrographers are highly attuned to fluvial morphodynamics. These are the processes that control the shape and evolution of our stage-discharge rating curves. This research is also relevant for understanding force dynamics when rivers are artificially constrained with levees or stop banks.
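To make the connection concrete: a rating curve is commonly modeled with the power-law form Q = C(h − h0)^b, where h is stage and h0 is the stage of zero flow. A minimal sketch (the parameter values here are invented for illustration) shows how a morphodynamic change to the channel control, such as scour lowering h0, shifts the curve:

```python
# Sketch of the standard power-law stage-discharge rating,
#     Q = C * (h - h0)**b
# where h is stage (m), h0 the stage of zero flow (m), and C, b are
# calibration parameters. All numbers below are hypothetical.

def rating_discharge(h, C=2.5, h0=0.30, b=1.8):
    """Discharge (m^3/s) estimated from stage h via a power-law rating."""
    if h <= h0:
        return 0.0
    return C * (h - h0) ** b

# A morphodynamic change to the control (e.g. scour lowering the riffle
# that forms the control) appears as a shift in h0: the same stage now
# maps to a larger discharge, and the rating must be re-derived.
q_before = rating_discharge(1.0)           # original control, h0 = 0.30 m
q_after = rating_discharge(1.0, h0=0.20)   # control lowered by 0.10 m
```

This is why hydrographers track channel change so closely: every shift in the control invalidates part of the rating and forces new gaugings.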
Concerns about irrigation generally center on whether there is enough water at the right time to support agricultural requirements. An issue that gets less attention is the quality of the water applied to the crops. This study finds that almost 900 million people are at risk from crops irrigated with water that received low levels of wastewater treatment. Are you measuring the quality of water used for irrigating the food you eat?
In order for data to be useful and hence have value, people have to be able to find it, evaluate it, and use it (i.e. search, discovery, access). This cross-disciplinary examination of the state of the art of data search is a useful reference for evaluating opportunities for improvements in the search for water data. Because “(…) data are distributed across numerous repositories and platforms” (Dow et al., 2015), users first need to discover the platform, and then must invest significant time and energy becoming familiar with each search environment (Ames et al., 2012; Beran et al., 2009). Given the diversity of search interfaces, it is not surprising that water scientists desire a “Google for data” (Megler & Maier, 2012).
How Uncertainty Analysis of Streamflow Data Can Reduce Costs and Promote Robust Decisions in Water Management Applications
Data are an uncertain measure of useful information, and realizing their full usefulness requires a measure of that uncertainty. This study demonstrates that uncertainty analysis has uncovered economic costs in the hydropower industry, improved public acceptance of controversial water management policy, and tested the accuracy of water quality trends. Easier access to, and improved usability of, flow uncertainty data are needed. Better water management requires better understanding and communication of uncertainty.
This book is beyond my price point to purchase, but I will promote it because it reminds me of Reg Dunkley, the meteorologist at Environment Canada, who was tasked with forensic investigations of extreme events for legal purposes. Rarely did he have reliable climate or hydrometric data for the time and place of an extreme event in a remote location. What he would rely on were reports of bridge failures and culvert washouts to determine the intensity, duration and spatial extent of events. Records of destruction may be an imprecise tool for measurement of streamflow and rainfall, but sometimes it is all that you have.
Planning a century in advance may seem futile. The problem with not planning for long-term objectives is that water infrastructure is expensive and once built it is very hard to detach from the dependencies it creates. It is easy to plan for the short term and build infrastructure that will support a vibrant economy today. It is harder to deconstruct that economy when the water supply is no longer reliable or the quality is no longer usable. The state of the art of climate prediction is still not great, but it is far better than it used to be. In ten years’ time, the forecast for 2117 will be different than our current 100-year projection, but it probably won’t be wildly different. The adjustments in planning that will be needed to stay on course will be minor (probably!). None of this is without uncertainty, but we are also getting much better at understanding, communicating and planning with uncertainty.
The use of the “conditional” event attribution approach for investigation of an unusual event is worth paying attention to. Most of our built infrastructure is designed to flood frequencies determined from the period of record. It is increasingly the case that such designs will fail. Building a more correct understanding of the reasons for failure will lead to more effective responses. Not all unusual extremes are due to climate change. Some may be due to land use change, some may be mere statistical outliers, and some may be due to an inadequate period of observation history. Correct attribution is critical to correct response.
Continuous real-time monitoring of drinking water throughout the water distribution system has not previously been feasible because of the high cost and low reliability of sensors. New miniaturized (micron-scale) sensors can provide reliable results in laboratory conditions but cannot withstand the high pressures of drinking water systems. Researchers at UBC have used 3D printing to create a conduit with embedded miniaturized sensors that is robust to operational conditions by controlling the flow rate over the pH, temperature and conductivity sensors. While this example is for drinking water supplies, it is an excellent example of how quickly newly emergent technologies can lead to solutions for monitoring problems. It surely won’t be long before miniaturized sensors change the way water is monitored and how we even conceive of water monitoring networks.
“Narragansett Bay is New England’s largest estuary with an area of about 385 square kilometers. Pollution led to algal blooms and a lack of dissolved oxygen in the bay – both of which harm fish and aquatic plants. In 2003, a fish kill resulted in the death of more than 1 million fish. Water quality improved after Rhode Island passed a law requiring major wastewater treatment plants to reduce the amount of nitrogen they were putting into the water by 50 percent of 1995 levels. In some parts of the bay, the oxygen available to marine life had returned to normal levels by 2014. Higher chlorophyll concentration means the water may contain elevated levels of nitrogen and other nutrients. A one-unit decline in chlorophyll concentration leads to a 0.1 percent increase in value for homes in the 100 meters (109 yards) closest to the water – or $200 for a house worth $200,000. Decreasing chlorophyll levels in the water by 25 percent would result in a 0.24 percent or $45.5 million increase in the aggregate value of homes within 1,500 meters (1,640 yards) of the water within one year.” Let us assume that the inverse is also true. Let’s make sure that governments understand the value of water monitoring to protect the capital assets of taxpayers.
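The quoted figures are easy to verify. A quick sketch (the function is mine, for illustration; the numbers come from the quote above):

```python
# Check the arithmetic in the quote: a one-unit chlorophyll decline
# raises nearshore home value by 0.1 percent.

def value_gain(home_value, chlorophyll_decline_units, pct_per_unit=0.1):
    """Dollar gain in home value for a given chlorophyll decline,
    at pct_per_unit percent of value per unit of decline."""
    return home_value * (pct_per_unit / 100.0) * chlorophyll_decline_units

# "... or $200 for a house worth $200,000":
gain = value_gain(200_000, 1)  # -> 200.0
```

Running the inverse of this logic for a monitoring-budget cut is exactly the argument I would put in front of a government: degraded water quality is a measurable write-down on taxpayers’ capital assets.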
The concept, as I understand it, is that gridded model output from land surface schemes coupled with numerical weather prediction can provide inputs of erosive rainfall, snowmelt and icemelt that have high predictive power for estimating suspended sediment concentration. This means that discharge data are no longer needed as a predictive variable. This also means that there can be a higher level of data interpretation than is possible with brute-force empirical rating approaches.
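In its simplest form, the idea amounts to regressing suspended sediment concentration (SSC) on the gridded drivers rather than on discharge. A minimal sketch, using entirely synthetic data and invented coefficients (not the study’s model or numbers):

```python
import numpy as np

# Sketch: predict SSC from gridded land-surface drivers -- erosive
# rainfall, snowmelt, icemelt -- with no discharge term. All data and
# coefficients below are synthetic, for illustration only.

rng = np.random.default_rng(0)
n = 200
rain = rng.gamma(2.0, 5.0, n)   # erosive rainfall (mm/day)
snow = rng.gamma(1.5, 3.0, n)   # snowmelt (mm/day)
ice = rng.gamma(1.2, 2.0, n)    # icemelt (mm/day)

# Synthetic "observed" SSC: a linear response in the drivers plus noise.
ssc = 4.0 * rain + 2.0 * snow + 1.0 * ice + rng.normal(0.0, 2.0, n)

# Fit SSC = b0 + b1*rain + b2*snow + b3*ice by ordinary least squares.
X = np.column_stack([np.ones(n), rain, snow, ice])
coef, *_ = np.linalg.lstsq(X, ssc, rcond=None)
```

The appeal for hydrometry is operational: the predictors come from the forecast system itself, so SSC estimates are available even at ungauged sites and ahead of events.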
Technical Note: Stage and Water Width Measurement of a Mountain Stream Using a Simple Time-Lapse Camera
Yet another use for relatively inexpensive gauge cameras. Image management is clearly the new frontier for hydrometric data management.
As front-line workers in the water sector, we are often asked for our opinions, particularly about the effects of climate change on local water resources. Attributing general cause to specific effect (i.e. connecting the global scale to the local scale) is a topic that most of us would like to avoid. However, it is also a useful starting point for improving public awareness of the importance of monitoring. This report focuses on weather and climate, but the following statement is also true for water: “More international collaboration with a focus on consistent climate monitoring will lead to improved climate services, helping us to better monitor, understand and anticipate climate extremes.”