[Figure: graph of long-tail data]

Closing the Gap in Hydrometric Data – A Call for Your Participation

How long is the tail of hydrometry?

We need to close the gap between the data that are available and the impacts of water variability on people and the environment, across all scales of interest. One of my great hopes for the development of OGC standards for interoperable hydrometric data is that they will shed light on the dark data under the long tail of hydrometry. It is my opinion, unsubstantiated by quantitative surveys, that there are far more hydrometric data out there than are readily accessible from the major hydrometric data providers.

There are more projects with fewer than 10 gauges than there are agencies running more than 10,000 gauges. How many more: one hundred times? One thousand? Ten thousand? Most of our accessible data come from the large national hydrometric programs, so it is difficult to discover the size of the total data potential.
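To see why the tail could matter so much, here is a back-of-envelope sketch. Every number below is a hypothetical assumption for illustration only (no survey supports them); the point is simply that many small projects can rival a few large programs in total gauges.

```python
# Illustrative "long tail" arithmetic for hydrometry.
# All counts are hypothetical assumptions, not survey results.

tiers = {
    # tier name: (number of programs, gauges per program)
    "national programs (>10,000 gauges)": (10, 10_000),
    "regional agencies (~100 gauges)": (1_000, 100),
    "small projects (<10 gauges)": (100_000, 5),
}

for name, (n_programs, gauges_each) in tiers.items():
    total = n_programs * gauges_each
    print(f"{name}: {total:,} gauges")

# Under these assumptions the small-project tier alone holds 500,000 gauges,
# five times the gauges of the hypothetical national programs combined,
# and there are 10,000 small projects for every large program.
```

Changing any assumed count changes the totals, of course; the informal survey below is an attempt to get even a rough handle on the real numbers.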

Hydrology is location-based.

There is no ‘ideal’ density for a hydrometric network. More data are always better, because even closely placed gauges can represent quite different scaling, climatic, anthropogenic (e.g. effects of extractions, dams, and diversions), and landscape processes. Hydrologic misfortune is too often the result of sole reliance on synoptic-scale monitoring to predict hydrologic variability at a local scale for planning and management decisions. If you need to understand water at a local scale, you need data at a local scale.

This need is largely met by project-specific monitoring done at a very small scale, often by independent stream hydrographers running a handful of gauges. These hydrographers do not have ready access to the data management, archiving, and dissemination resources of the large data providers, so their data end up unsearchable, undiscoverable, and inaccessible. As a result, these data tend to be collected, often at considerable expense, for one-time use.

Re-use of such data could greatly expand our ability to understand and manage hydrological variability across all scales of interest. Data re-use implies effective metadata management: evaluating ‘fitness for purpose’ for third-party use of data requires relevant, reliable, and trustworthy information about the data.
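As a concrete sketch of what "information about the data" might mean, here is a minimal metadata record a third party could use to judge fitness for purpose. The field names and the station details are my own illustrative assumptions, not taken from any particular OGC standard or agency format.

```python
# A minimal, hypothetical metadata record for a single gauge.
# Field names and values are illustrative assumptions only.

gauge_metadata = {
    "station_id": "EXAMPLE-001",  # hypothetical identifier
    "location": {"lat": 49.25, "lon": -123.10, "datum": "WGS84"},
    "parameter": "discharge",
    "units": "m^3/s",
    "period_of_record": ("2010-05-01", "2014-08-15"),
    "method": "stage-discharge rating, manual gaugings",
    "uncertainty_note": "rating unverified above 2-year flood stage",
    "operator": "independent hydrographer (project-specific)",
    "qa_level": "provisional",
}

def is_discoverable(meta):
    """A record is minimally discoverable if it says what was measured,
    where, in which units, and over what period."""
    required = ("station_id", "location", "parameter", "units", "period_of_record")
    return all(meta.get(key) is not None for key in required)

print(is_discoverable(gauge_metadata))
```

Even this small set of fields would let a would-be re-user decide whether the record is worth a closer look; richer quality and provenance fields would be needed for a final fitness-for-purpose judgement.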

Quantifying the size of the opportunity for increasing our global hydrometric data asset is a daunting task.

I would like to get at least a small sense for this opportunity with an informal survey of readers of this blog.

Please take a few moments to answer a few questions.

A simple conversation about the opportunity to make the most of our global hydrometric data investment seems like a good place to start.

There will be no cost to participate in the WebEx teleconference, which I will schedule for some time in late September. If needed, the teleconference might be in two parts to accommodate diverse time zones.

The readers of this blog might be just the right group of people to start the conversation.

Please pass the link to this post along to any colleagues who you believe are knowledgeable about the problem and/or who should be part of the solution.

7 Comments
  • PIRLET
    Posted at 8:26 am, August 15, 2014

A solution to your problem requires both “standardization support” and the right methodology. I have some ideas that I could share with you. I am a standardization expert with extensive experience in EU research projects.

    Do not hesitate to contact me.

    Best regards,

    André PIRLET, MScE
    Standardization&Research Belgium

  • Chuck Dalby
    Posted at 4:54 pm, August 15, 2014

The quest to assemble hydrologic data in a single location accessible to all is a noble one, but it requires a HUGE effort that is generally beyond the scope, capability, and funding of any one entity. Almost in the realm of establishing a “World Government”. The USEPA has tried this with the STORET water-quality database over the last 30 years and met with some success. The US Geological Survey has managed this with the National Water Information System (NWIS), but this applies primarily to USGS data collection, and that was/is difficult enough. The problem is that almost everyone does their data collection, interpretation, and analysis a little differently and resists conformity, and many are generally incapable of or unwilling to provide the level of “metadata” required to support sufficient knowledge of data “quality” so that the end-user can make their own judgements. Of course, someone could offer to provide this service for a small fee… Chuck

  • Dave Gunderson
    Posted at 5:47 am, August 17, 2014

@Chuck Dalby’s comments are spot on. My own thoughts are:

1. It doesn’t matter how large or small the organization is. We all support the data collection that is important to our needs. Even within an organization like the USGS (which operates with a unified plan), it often defaults to how the local office conducts its own work. Some are better than others.

2. Chuck also mentions metadata within the collection. What constitutes the metadata from a measurement, and what are the standards for the data being collected? I’d LOVE to talk with others who would like to discuss this topic.

3. Where do we talk about best practices and the methods we employ? The best venue that I’ve seen was the USGS Surface Water Convention. Other venues usually rest with the vendors of our data collection equipment. The advancement of the webinar provides another way to connect people who have the interest but not the time or money to attend these special events. Our own agency is looking into video conferencing to conduct meetings of this type.

  • Chuck Dalby
    Posted at 10:35 am, August 18, 2014

    Hi Stuart,

You made several very good points. First, never give up! Second, in this era of “big data” and social media, the ability and desire to share data may trump some of the problems of the past. In addition, one of the bigger challenges of bringing diverse hydrologic data of known quality into a shareable “database” deals with information collected in the past. There is always the future, and with someone taking the lead in establishing the appropriate protocols, a new foundation can be built plumb and square. I am reminded of the early days of GIS: very little data available and very little metadata. But now it is a very different story, with good data everywhere. In that situation a software vendor (ESRI) took the lead and drove the process with assistance from state and federal agencies. Perhaps aquainformatics can do the same?
    Chuck
