The OGC WaterML 2.0 standard is an industry game-changer. It makes it easier for hydrometric data producers to make their data accessible for timely decision making, and easier for data consumers to find the relevant data that drives evidence-based decisions.
Almost as important as the sharing of data is the sharing of data quality. Let me explain.
There are a great many perfectly valid ways of measuring almost anything. However, the quality of different measurements of the same thing will vary as a result of a wide variety of factors.
As a thought experiment, imagine you are having some new cabinets built. You could painstakingly make replicate measurements with a high-quality surveying instrument or you could simply pace off the needed measurements.
In either case, you could pass the measurements to a cabinet maker and get some cabinets built. In the first case, the craftsman would have the confidence to build exactly to specification and the resulting cabinets would be elegantly designed to fit perfectly. In the second case, the craftsman would need a more complicated design perhaps with sliding panels to hide adjustable expansion joints.
Now suppose you do not tell the craftsman about the quality of your measurements. She might assume (perhaps correctly) that the measurements are reliable and build to specification, or she might assume (perhaps correctly) that the measurements are not reliable and build for adaptation.
This same scenario plays out every day with water data. All reputable stream hydrographers strive to produce the best quality data possible, but the ‘best possible’ varies from place to place and from time to time for a large number of perfectly valid reasons.
Consumers of the data are placed in the situation of our hypothetical cabinet maker. The choice is either to assume the data are reliable and run the risk that they are not, or to assume the data are not reliable and accept the inevitable costs and consequences of under-specification. In either case, the likelihood that the solutions for the water problem at hand will be designed with optimal cost-efficiency and outcome effectiveness is low.
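The cabinet maker's dilemma is, at bottom, a fitness-for-purpose decision: the same series may be acceptable for one use and not another, but only if the producer's quality assessment travels with the data. A minimal sketch of that idea in Python (the grade codes, field names, and numbers here are illustrative placeholders, not an official WaterML 2.0 code list):

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """A single time-series value paired with a producer-assigned quality grade.

    The grades used below ("A" = approved, "E" = estimated) are hypothetical
    examples; real agencies publish their own grading schemes and code lists.
    """
    time: str      # ISO 8601 timestamp
    value: float   # e.g. discharge in cubic metres per second
    grade: str     # producer's quality grade for this point

def fit_for_purpose(observations, accepted_grades):
    """Keep only the observations whose quality grade the consumer trusts."""
    return [o for o in observations if o.grade in accepted_grades]

series = [
    Observation("2023-06-01T00:00:00Z", 12.4, "A"),
    Observation("2023-06-01T00:15:00Z", 12.6, "A"),
    Observation("2023-06-01T00:30:00Z", 40.1, "E"),  # gap-filled estimate
]

# A design study might accept only approved data...
approved = fit_for_purpose(series, {"A"})
# ...while a reconnaissance screening could accept estimates as well.
screening = fit_for_purpose(series, {"A", "E"})
```

The point of the sketch is that the filtering decision belongs to the consumer, for each use, and it is only possible when the producer shares the grade alongside every value rather than publishing the numbers alone.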
Understanding, characterizing, monitoring, and communicating data quality benefits data producers as well as data consumers. The adage ‘you manage what you measure’ is especially true for data quality. Over time, data quality will inevitably improve as a result of consistent monitoring.
I have a lot more to say about communicating data quality. Please read my new eBook on the subject.
The interoperable exchange of water data across agencies is unlocking information silos, but not all data are created equal. Sharing data quality is key to building trust, and making the right decisions requires data that are fit for purpose. The eBook examines the current standards for characterizing and communicating data quality, and shows how qualifying your data can build confidence and trust.