The Internet of Things (IoT) & Better Water Data

In my Hallowe’en post I presented various ways in which better rigor in tracking data provenance can pay off, up to and including saving the world during a zombie apocalypse. Today, I would like to focus on a much more immediate and pragmatic benefit of improving the traceability of data to source.

Our negligence in communicating and preserving primary and intermediate states of our data is excusable because of our dependence on outdated communications protocols, such as serial data communication over an RS-232 port from our loggers to a laptop. The RS-232 baud rate is painfully slow compared to the speeds available over Ethernet: a file that previously took an hour to download by RS-232 now arrives in seconds. As our field devices bulk up with data and metadata (e.g., an ADVM), this is becoming increasingly important.
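To put rough numbers on that claim, here is a quick back-of-envelope sketch in Python. The 4 MB file size, 9600-baud link, and 100 Mbps Ethernet speed are illustrative assumptions, not measurements from any particular logger:

```python
# Rough comparison of download times: a serial link versus Ethernet.
# All sizes and speeds below are illustrative assumptions.

def serial_seconds(size_bytes: int, baud: int = 9600) -> float:
    """RS-232 sends roughly one byte per 10 bits (start + 8 data + stop)."""
    return size_bytes * 10 / baud

def ethernet_seconds(size_bytes: int, mbps: int = 100) -> float:
    """Ideal throughput; real transfers add protocol overhead."""
    return size_bytes * 8 / (mbps * 1_000_000)

size = 4 * 1024 * 1024  # a hypothetical 4 MB logger download
print(f"RS-232 @ 9600 baud: {serial_seconds(size) / 3600:.1f} hours")
print(f"Ethernet @ 100 Mbps: {ethernet_seconds(size):.2f} seconds")
```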

The Internet of Things (IoT) and machine-to-machine (M2M) interconnectivity are changing how we interact with our field devices.

Dave Gunderson from the US Bureau of Reclamation is an early adopter of IoT. He reports that communication with his gauges over Ethernet is fast, really fast. IP addressing over Ethernet is a point-to-multipoint topology, which also means the logger can have multiple connections open at a time. In the field, Dave interacts with his devices using web pages rich in meaningful content tailored to his needs. Meanwhile, back at the office, water managers have access to web pages rich in information tailored to their needs.
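As a rough illustration of what that point-to-multipoint capability buys you, the sketch below requests three pages from a logger at the same time. The address and endpoint paths are hypothetical placeholders, and it assumes the logger serves JSON:

```python
# A minimal sketch: several requests in flight to the same logger at once,
# something a single serial connection cannot do. Address and paths are invented.

import json
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

LOGGER = "http://192.0.2.10"                    # example address, not a real gauge
PAGES = ["/status", "/latest", "/diagnostics"]  # hypothetical endpoints

def fetch(path: str) -> dict:
    with urlopen(LOGGER + path, timeout=5) as resp:
        return json.loads(resp.read())

# Three concurrent connections to one logger.
with ThreadPoolExecutor(max_workers=3) as pool:
    results = dict(zip(PAGES, pool.map(fetch, PAGES)))
```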


The power of a web browser cannot be overstated.

Loads of data can be displayed in a custom format that was not possible with a vendor-supplied GUI or a terminal emulator. Tables, graphs, and lists allow the hydrographer to know, at a glance and in real time, exactly what’s going on with his gauge.
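For a sense of how simple that custom formatting can be, here is a toy sketch that turns a couple of readings into an HTML table any browser can render. The readings and field names are invented for illustration:

```python
# Toy example: render the latest readings as a small HTML status table.
# Values and field names are invented for illustration.

readings = [
    {"time": "08:00", "stage_m": 1.62, "battery_v": 12.6},
    {"time": "08:15", "stage_m": 1.64, "battery_v": 12.6},
]

rows = "".join(
    f"<tr><td>{r['time']}</td><td>{r['stage_m']:.2f}</td><td>{r['battery_v']:.1f}</td></tr>"
    for r in readings
)
page = (
    "<table><tr><th>Time</th><th>Stage (m)</th><th>Battery (V)</th></tr>"
    + rows + "</table>"
)
print(page)
```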

Ethernet is a superhighway for his data that can handle lots and lots of high-speed traffic compared to the dirt path of serial communication. This means he gets tons of relevant metadata along with each water level reading. He can view the pressure in PSI before it is converted to water level; PSI is independent of any calibration done in the field, so he has source data to work from if there is ever an error in the field work. He also gets a lot of information that reveals what he calls the personality of the gauge, which lets him anticipate whether a gauge is getting a bit grumpy about anything.
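For readers new to the raw-versus-derived distinction, here is a minimal sketch of the underlying formula, assuming fresh water and standard gravity. A real conversion would also account for barometric pressure, water density and temperature, and sensor offsets:

```python
# Pressure head from a raw PSI reading: h = P / (rho * g).
# Constants assume fresh water near 4 degrees C and standard gravity; real
# conversions also correct for barometric pressure and sensor offsets.

PASCALS_PER_PSI = 6894.757
WATER_DENSITY = 1000.0   # kg/m^3, fresh water
GRAVITY = 9.80665        # m/s^2

def psi_to_metres_of_water(psi: float) -> float:
    """Convert a raw pressure reading to metres of water column."""
    return psi * PASCALS_PER_PSI / (WATER_DENSITY * GRAVITY)

raw_psi = 2.37                       # raw reading, kept as source data
level_m = psi_to_metres_of_water(raw_psi)
print(f"{raw_psi} PSI ~= {level_m:.3f} m of water")
```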

Not only does Dave have diagnostics like battery voltage, he also has details of how hard the gauge was working when the voltage was sampled. If his gauges are misbehaving, he is alerted with a detailed message about the condition, generated and sent in real time to a status web page. Dave’s team may hate getting SMS messages on their cell phones in real time, but what they do like is a daily email of the events that have happened in the last 24 hours.
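Here is a simplified sketch of that alerting pattern: check each diagnostic sample against a threshold, push detailed events to a status page (a plain list here), and roll the last 24 hours into a daily digest. The threshold, field names, and gauge IDs are illustrative, not Dave’s actual configuration:

```python
# Simplified alerting sketch: threshold checks feed a status page (a list here)
# and a daily digest. Thresholds and field names are illustrative only.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Event:
    when: datetime
    gauge: str
    message: str

events: list[Event] = []   # stands in for the real-time status page

def check_battery(gauge: str, volts: float, load_amps: float, now: datetime) -> None:
    # Voltage alone is ambiguous; pairing it with load shows how hard the gauge was working.
    if volts < 11.8:
        events.append(Event(now, gauge, f"Low battery: {volts:.2f} V under {load_amps:.2f} A load"))

def daily_digest(now: datetime) -> str:
    recent = [e for e in events if now - e.when <= timedelta(hours=24)]
    lines = [f"{e.when:%H:%M} {e.gauge}: {e.message}" for e in recent]
    return "Events in the last 24 hours:\n" + ("\n".join(lines) or "none")
```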

Dave works in many high-turbulence locations and his clients need a smooth signal for operational decision making, so he uses averaging to prepare the signal for his clients. However, he’s able to see everything that’s going on during the sampling period: when the sample started, when it ended, how many values were ‘good,’ how many were rejected, and what the range and distribution of the readings were. All of this is highly relevant and valuable for troubleshooting problems, understanding the data, making continuous improvements to the monitoring plan, and giving peace of mind that when things are working well, they really are working well.
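A minimal sketch of that burst-averaging idea is below: sample over a window, reject values that fail a simple screen, and report the smoothed value alongside the quality-control details. The rejection rule is a placeholder; real screening criteria vary by site and sensor:

```python
# Burst averaging with QC details: smoothed value plus start/end times,
# good/rejected counts, range, and spread. The screen below is a placeholder.

from statistics import mean, pstdev
from datetime import datetime

def summarize_burst(samples: list[float], started: datetime, ended: datetime,
                    low: float, high: float) -> dict:
    good = [s for s in samples if low <= s <= high]   # placeholder acceptance screen
    return {
        "started": started,
        "ended": ended,
        "good": len(good),
        "rejected": len(samples) - len(good),
        "average": mean(good) if good else None,      # the smooth signal clients see
        "range": (min(good), max(good)) if good else None,
        "std_dev": pstdev(good) if len(good) > 1 else 0.0,
    }
```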

Dave gets more out of his data investment by improving his quality control, adaptively and continuously enhancing his processes, being more timely with proactive maintenance, and enriching the usefulness of his data while enhancing its credibility and defensibility. With real-time monitoring and reporting in place, Dave sees events and trends as they’re happening. It’s not only data about the environment that’s valuable; he is also able to see, and act on, data about his monitoring system.

In addition to tangible operational benefits, these improvements in data provenance and traceability will undoubtedly come in handy in the future.

We will have to wait and see whether Dave can also save us from a zombie apocalypse.


AQUARIUS Video: Faster Analysis. Better Decisions.

Today, water monitoring professionals are under more pressure than ever before. Learn how the dedicated team of hydrologists, scientists, and software engineers at Aquatic Informatics designed AQUARIUS: the world’s leading software suite for water data management. Watch Video.

3 responses to “The Internet of Things (IoT) & Better Water Data”

  1. Anything can be connected and sensor readings made available for anything measurable, but the usefulness of the data needs to be evaluated first and compared to the extra costs of installation and maintenance of the sensors and sending units, especially in the exterior environment. IoT and M2M will have their greatest application in the controlled and piped liquid industry. Many companies are nibbling at these edges, although with shifting focus, and rationalizing existing tech and situations may be useful. AI, with a clear focus, could look at IoT (water division) as a new growth industry but might have to consider expanding into sensor hardware as well.

    • Hi Nick,
      You are absolutely correct. The usefulness of the data has to be put in the context of additional costs. However, I think that Dave has demonstrated that, for his purposes, he has been able to use new-found capacity and bandwidth to generate timely information that improves the operation of his gauges. The problem is that for most people more data can conceal rather than reveal vital information. The data must have a metadata payload that enables effective search, filtering, sorting, ranking, categorization, and aggregation. The increase in metadata volume can itself be a burden unless it is sufficiently transparent, self-descriptive, and well managed. There are many new buzzwords that define the transition from a point-to-point topology to a cloud-based, Big Data paradigm. We need to, somehow, move beyond the buzzwords to discover new, effective workflows that result in meaningful and beneficial outcomes. Dave may be one of the first down this path but he won’t be the last.

  2. Technological convergence of digital monitors, M2M, IoT, measurements, data processing, modeling, and management is advancing very quickly. Expert systems that can take these inputs, create meaningful and useful metadata, offer easy-to-set limits, and include some fuzzy logic are going to be very powerful indeed.
