Drought is a large-scale problem. Droughts of high severity, long duration, and broad spatial extent happen infrequently enough that ecological and economic dependencies on water are bound to develop during the intervals between droughts. The trick for drought-resilient societies is to develop a long-term memory that limits over-enthusiastic exploitation of water resources when supply is relatively abundant.
Most indigenous societies that are highly attuned to their environment build long-term memory into stories, songs, dances, and oral traditions that are passed from generation to generation, thereby ensuring that each successive generation has the wisdom to survive infrequent extreme events. There are many social, economic, and historical reasons why we have become a society that is de-tuned to our environment and disconnected from the wisdom of our elders. As a result, we are left with a problem:
How can we replace social memory of past experience with a new source of wisdom for modern decisions?
One thing I’m fairly confident of is that the droughts causing hardship today will pass. Even Lake Mead, which has been shrinking dramatically for the past 15 years, will eventually be replenished. What I’m not so confident about is that when drought returns, any of the lessons learned today will be remembered, or acted on, during the next inter-drought interval.
In a modern society we don’t turn to the village elders for advice on how to be healthy, happy, and productive; we turn to Google. Google is, fundamentally, a search engine that is very good at discovering relevant information in a world of Big Data. Big Data is our memory.
In a related post on The Other Extreme, Naomi contributed to the conversation with her infographic “Big Data, Small Footprint”, which is very enlightening. The world’s data centres use 30 billion watts of electricity (equal to the output of 30 nuclear power plants), 90% of the data in the world was created in the past two years, and any data centre older than seven years is considered obsolete. It’s predicted that 35 zettabytes of data will be produced annually by 2020.
What does Big Data have to do with the Measurement of Nothing, or Nothing Matters? Everything.
The measurement of small flow matters. Immediately, for informing and fine-tuning adaptive management strategies that can more effectively mitigate current conditions, like extreme droughts. In perpetuity, for creating a memory in Big Data that will be a source of wisdom to avoid future conflicts. This has several implications for how we operate water monitoring networks.
Funding for water monitoring networks is often motivated by flooding events. This means that everything from network density, to site location considerations, to station design and installation, to choices in sensing and measurement equipment, to operational procedures and field visit scheduling, to data processing and management is sub-optimal from a small-flow measurement perspective.
Further investment is required to build Big Water Data. We have a chance to build a memory that will serve future society as a vital source of wisdom that we currently lack. Will we? The answer to that question depends on our collective wisdom.