A New Way of Thinking … from Water Quality Data to Information

Water monitoring, Hydrology, Water data management software, Rating curve, Stage discharge curve

The water quality monitoring community needs to think differently about how we collect, store, and analyze all types of water quality data. Twenty years of field work in water quality and water resources have led me to the conclusion that to truly improve and protect the world’s waters, we need to think differently about how we plan, collect, analyze, and report water quality information.

For the majority of my career I have managed mountains of water quality and environmental data.

I’ve worked with more spreadsheets and custom solutions than I care to remember. I’ve wondered how my peers in other cities, states/provinces, agencies, and countries collected, analyzed, processed, and stored their data. I’ve seen and experienced the countless hours and budgets spent developing custom agency tools, only to have those tools become inoperable once the author was no longer available to support them. I have seen many levels of sophistication (or lack thereof) in water quality monitoring programs, as well as wide variation in how managers and policy makers perceive the value of water quality monitoring data. All of this exposure has led me to realize that many water quality monitoring programs are information poor, despite the amount of data a program or project may collect.

When I joined the City of Lake Oswego in 2005, I decided it was time to work at the local level and to connect “water quality monitoring and good science” with “policy.” That decision exposed me to a wide range of programmatic issues like enforcement, permit compliance, education, and program and project management, and, probably most importantly, to how water quality monitoring could… and should… support telling the true story of “restoring and maintaining the chemical, physical, and biological integrity of the Nation’s waters.” As the Water Quality Program Coordinator for the City, it was my job to comply with permit conditions, implement management plans, and show progress toward the reduction of water pollution.

In 2007, when the City was afforded the opportunity to implement two gaging stations with water quality sondes, I knew that traditional data processing methods were not going to support my program, and a custom solution was simply not practical. AQUARIUS was deployed to turn the large volumes of real-time data into usable information, alerts, and reports with the resources we had at hand, and by the one-person resource – me. In 2011, I integrated the City’s Weather Station Network into AQUARIUS and truly began integrating the water resource time series in the City of Lake Oswego. The City’s use of AQUARIUS was out of necessity. (See the AQUARIUS case study below for more information on my use of AQUARIUS for Stormwater, TMDL Monitoring, and Regulatory Compliance.)

Stormwater, TMDL Monitoring & Regulatory Compliance

City of Lake Oswego Improves Urban Water Resource Management with Real-Time Data Analysis

“Despite governing a small municipality, our policy makers want to know that they can trust our environmental data. By using AQUARIUS, we use the same methodologies and tools to build our rating curves and process our environmental data as the USGS and the Water Survey of Canada. That provides credibility.” – David A. Gilbey

“You can have data without information, but you cannot have information without data.”
– Daniel Keys Moran

So why did I jump ship from a so-called “cushy” public service job to work with this software company? I’m passionate about water. Period. I know there is a global need to change the way we plan, collect, analyze, and report water quality information… to ensure that we provide our stakeholders with usable, timely, and accurate information. Joining the Aquatic Informatics team has given me the opportunity to do just that, by contributing to protecting and enhancing the world’s water resources through inspiring monitoring solutions for water availability and water quality. This takes inspiration, innovation, and forethought – something that Aquatic Informatics has in abundance. The customers of Aquatic Informatics are just as amazing and inspirational… and now I have the opportunity to help build solutions for these professionals, utilizing their input on best practices.

I’m now working with a team that is passionate and dedicated to providing the best possible tools so “field guys (and women)” and water professionals can do their jobs more efficiently and effectively, enabling them to produce data of a known quality, connect the data with policy, and ultimately produce timely, accurate water information so that the world’s waters are available in the right quantities and qualities to support future generations. I have experienced first-hand how AQUARIUS helps organizations transform from being data rich and information poor to data rich and information rich. This sounds easy, but in water quality monitoring, how many tools can you name that truly accomplish this?

Stay tuned for more blog posts on watershed management, stormwater management, TMDLs, regulatory compliance, and related topics. I look forward to your comments as well. Please use this blog to share your ideas and success stories so that we can continue to evolve industry best practices together. See you on the water.

28 responses to “A New Way of Thinking … from Water Quality Data to Information”

  1. Carol Slaughterbeck July 25, 2013 at 7:42 pm

    Great post Dave–I love this quote as it relates to many aspects of water resources. Look forward to more posts!

    “You can have data without information, but you cannot have information without data”
    – Daniel Keys Moran

  2. Nicely said and done. Looks like the transition is going smoothly. Hope you don’t forget about us little people.
    Take good care. Love to your family
    Kevin

  3. Great post Dave! Please send me some information on the tools your team has developed!
    W2

  4. Absolutely. The current status of water quality within our watersheds is the true litmus test for how to apply corrective measures. You can think all you want, but the only way to succeed is to measure the results!!

    • @Jack. I couldn’t agree with you more. One of the largest hurdles for policy decisions and financial expenditures is “Is it working?” and “What does it cost to clean it up?”… all questions that HAVE TO include the use of monitoring information derived from data of a known quality.

    • Monitoring is the weak link. Cities/municipalities/owners are being pressured to jump into green infrastructure/source controls – but without monitoring programs and real performance information, many are resistant to spend limited resources or be the test case.

    • @Tim. Agreed. Are you familiar with any of the work that is being done on performance monitoring in the PNW? ( http://www.wastormwatercenter.org/ )… There are some really inspiring uses of water quality monitoring technology to really understand pollutant removal performance.

  5. I found this water quality data collection blog to be on the right track. We’ve practiced similar methods (field tablets with on-line access / other) for large Superfund sites here in SoCal, where we’re implementing Long-Term Monitoring Optimization (LTMO) through use of the MAROS ver. 2.2 modeling software – developed by AFCEE / others.

    Cheers,
    Mark.

    • @Mark. Thanks. It is incredible to look at all the tools for WQ sampling activities that are available today. The conversion of data into information is such a critical component of any monitoring program. Thanks for the comment! – Dave Gilbey

    • @Dave. Good feedback on improving GW data collection and utilization tools. Using e-tablets loaded with the appropriate software can eliminate many, if not all, transcription errors at the entry point of data collection. This approach solves a huge problem and greatly improves the accuracy of GW data. How you use the data after collection still relies on proper use of best practices.

      Cheers, Mark.

    • @Mark. EXACTLY! I have found that transcription is typically one of the greatest sources of unnecessary error. Agreed about best practices too. I think there is a great need to share these best practices and the way they are employed. The vast majority of water quality professionals receive their training on the job; if they are lucky enough to have a seasoned mentor, they will learn the best use of those practices. But what about those who don’t have a mentor? I think it’s incumbent on current professionals to provide a forum for those individuals… like here on LinkedIn, through professional organizations like AWRA, etc., and in the tools we use through training and open forums. What are your thoughts? Thanks Mark!

    • @Dave. Many states now require, or soon will require, scientists and others to obtain professional development units (PDUs) to maintain their licenses. In most cases, this should be the “carrot and the stick” needed to get practicing geologists / hydrologists to seek training / mentoring to improve their implementation of best practices. However, I agree that proactive forums offered by NGWA / AWRA / others can also fill in the gaps for those who do not have PDU requirements.
      Cheers, Mark

    • Jerrold Kazynski July 26, 2013 at 12:11 pm

      It’s great to see new technologies applied to earth science. I fondly remember conducting aquifer tests using a steel surveyor’s tape and blue chalk, then calling out to a 2nd person recording the data: “holding at 75 feet,” and “wet at 6.72 feet.” After two hours, one person could finish the test and record his own measurements. Then the electric sounders came along with depth registers every 5 feet, speeding up the depth measurement while introducing error potential when you measure from the depth register above or below where you held. Lots of manual data collection, transcribing, math evaluation, formula selection, curve-fitting, nomograph manipulation, etc. to obtain the answer to your hydrogeologic question.

      As a matter of fact, I recall using the archaic units of hogsheads per fortnight for our Transmissivity values!

    • @Jerrold. Thanks for the comments! I’m excited to help connect these new technologies to the practices… and to connect professionals like you with the new generations of hydrologists and water quality professionals in the years to come.

      I have to say that I had never heard of “hogsheads per fortnight” to describe transmissivity values… (1 hogshead per fortnight = 1.97156864 × 10⁻⁷ m³/s for those of you following along, thank you Google!)… but your post helped me win a bet this weekend… therefore I feel obligated to buy you a beer if our paths should ever cross.
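
      For anyone who wants to check that figure, here is a quick back-of-the-envelope sketch in Python – just an illustration, assuming the 63-US-gallon (wine) hogshead and a 14-day fortnight; other hogshead definitions would give a different number:

          # Sanity check of the hogsheads-per-fortnight conversion.
          # Assumes the 63-US-gallon (wine) hogshead and a 14-day fortnight.
          US_GALLON_M3 = 0.003785411784      # one US gallon in cubic metres (exact definition)
          HOGSHEAD_M3 = 63 * US_GALLON_M3    # one hogshead, assumed here to be 63 US gallons
          FORTNIGHT_S = 14 * 24 * 60 * 60    # one fortnight (14 days) in seconds

          flow_m3_per_s = HOGSHEAD_M3 / FORTNIGHT_S
          print(f"1 hogshead/fortnight = {flow_m3_per_s:.8e} m^3/s")  # ~1.97156864e-07 m^3/s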

      Cheers!

  6. George Livingston July 26, 2013 at 6:07 pm

    Data to information is not a new way of thinking; it’s as old as the hills. Data to information involves statistics, and the unfortunate problem is that many in management avoid the “S” word like the plague. To them, statistics goes no further than pie and bar charts, means and medians; in other words, simple descriptive statistics.

    In his book, “Statistical Methods in Hydrology,” Charles T. Haan (University of Iowa) states:

    “The random variability of hydrological variables has been recognized for centuries.”

    “The general field of hydrology was one of the first areas of science and engineering to use statistical concepts in an effort to analyze natural phenomenon.”

    Both quotes are from the Preface, page xiii.

    In their book, “Elements of Statistical Inference,” Huntsberger and Billingsley define statistics as follows: “Statistics is concerned with the development and application of methods and techniques for collecting, analyzing, and interpreting quantitative data in such a way that the reliability of conclusions based on the data may be evaluated objectively by means of probability statements.”

    Reliability involves the proper use of probability; probability depends on mathematics.

    Drawing information from water quality data or any other data starts with a problem definition and a quality sample plan design. Quality information from existing data results from data measurement quality.

    Excellent reference books: (1) *Statistical Methods For Environmental Pollution Monitoring, Richard O. Gilbert; (2) Statistics For Environmental Science And Management, Bryan F. J. Manly.

    Other excellent reference books: (1) Statistical Methods for Groundwater Monitoring, Robert D. Gibbons; (2) Statistical Methods in Water Resources, D.R. Helsel and R.M. Hirsch.

    *Classic book. Expensive.

    Many Statistics for Engineers and Scientist books are also excellent reference books. Older editions can be purchased from amazon.com for less than ten US dollars.

    My background includes data and statistical analysis, as well as time series analysis and forecasting of hydrological and water quality data; also geostatistics.

    • @George. Thank you so much for your thoughtful comments! Especially about the need for a quality sampling plan; this is the most important piece of any water quality monitoring project… and of converting that data to information.

      I have to say I have many of the references you mention in my library. I may have to make some time to track down some of the others you mention here. I would add that I have found the following statistical texts useful as well:

      * Statistical Methods for Water Quality Management, Graham McBride (2005)

      * Biostatistical Analysis, Zar (fourth edition) (…I’m not sure I know anyone who is involved with stats that doesn’t have this one… or at least one of the editions)

      * Bootstrap Methods and Their Applications, A.C. Davison and D.V. Hinkley (2006)

      * Statistical Framework for Recreational Water Quality Criteria and Monitoring, Larry Wymer (2007)

      and if you’re working in any facet of water resources, you have to have “Statistical Methods in Water Resources” by D.R. Helsel and R.M. Hirsch. Here is a link to the PDF for anyone who doesn’t have this reference:

      http://pubs.usgs.gov/twri/twri4a3/pdf/twri4a3-new.pdf

      I had a sheet of paper posted in my office for years that said:
      “It is easy to lie with statistics. It is hard to tell the truth without it.” – Andrejs Dunkels
      It is still my favorite quote, and I still think it’s useful today.

    • Thanks, David. Now I know the origin of the following saying: there are lies, bold lies, and statistics under the sky.

    • George Livingston August 12, 2013 at 12:09 pm

      Thanks for your response, David.

      I do have: (1) Biostatistical Analysis by Zar; (2) Statistical Methods in Water Resources, D.R. Helsel and R.M. Hirsch.

      I took Helsel and Hirsch’s Statistical Applications in Ground Water Pollution Monitoring continuing education course at the Colorado School of Mines.

      A couple of other good books are: (1) Modeling Hydrologic Change: Statistical Methods by Richard H. McCuen and (2) Nondetects and Data Analysis: Statistics for Censored Environmental Data by Dennis R. Helsel.

      Thanks for telling me about some of your books. I’ll be checking them out at my library.

  7. Rose Comstock COSHM July 29, 2013 at 2:34 pm

    How about actually addressing water quality problems with infrastructure upgrades where data has shown and verified it’s time to act, instead of adding more paper to the pile? Monitoring is important after some action has taken place, to determine if the action is effective.

    • In my experience, few projects budget monitoring as part of the project, so there is little opportunity to learn how to mitigate adverse effects. When monitoring opportunities come along, design the monitoring program not only to record what has occurred, but also to frame it so that inferential questions can be tested. Otherwise, how can we learn?

    • @Rose. I agree. Monitoring for the sake of monitoring is not helpful (and frankly wasteful, IMO), but I’d add that turning that pile of data into information digestible by those who need to pull the trigger on those infrastructure upgrades is typically a missed opportunity to educate and “tell the story.” Determining/evaluating effectiveness should be a key part of any large investment. Thanks for the comment!

    • @Stacy. Thanks for the comment! I have had the same experience. Monitoring budgets are typically the first things cut in any project, but I am a firm believer that we as industry professionals need to tell a better “story” with the information we collect. I also share your sentiment about making your monitoring useful to more than just the single project; I think this is where the incorporation of well-integrated metadata, data mining, and information sharing is key to the success of any project.

Join the conversation