Afghanistan Training & Water Data Security

Water monitoring, Hydrology, Water data management software, Rating curve, Stage discharge curve

I have just returned from a trip to India where I provided training to a group of hydrologists and water resource managers from the United Nations and from the Afghanistan Ministry of Energy and Water.

This trip was enlightening, challenging, and perspective changing.

I always thought of my work in the Arctic with Water Survey of Canada as some of the most challenging field work there was. The elements, isolation, and environment were unforgiving, but I never gave much thought to my office environment, network security, building security, or data storage. I, like most of my co-workers, complained about key fobs and too many passwords to remember, but we never had to think or worry about security. We gave our data away for free on the web.

Who would want to steal or destroy my data? On my trip I learned this was a sheltered view that I was privileged to have.

Chatting with the training group in India quickly opened my eyes to things I never had to think about. We talked about their historical water record, which spans roughly 1960 to 1980 and then 2007 to the present.

There is a data gap from 1980 to 2007 that is larger than their entire historical water record!

The dates tell quite a story if you know anything about the history of Afghanistan. The Soviet Union invaded Afghanistan in December 1979. Obviously, water monitoring is not a priority when fighting an occupation. This was followed by civil wars, Taliban rule, and then the U.S.-led intervention after September 11, 2001.

Fast-forward to 2007. Development funds became available to help rebuild the country, but projects for agriculture, power generation, and infrastructure rebuilding all need water data. The Ministry of Energy and Water has started to rebuild its network, and today it has 125 Automatic Hydrological Stations including 43 cableways, 26 Automatic Weather Stations, and 30 Automatic Snow Survey Stations. I asked them how safe Afghanistan is and how challenging their fieldwork is.

They modestly replied that some stations have “security concerns.”

When I think of “security concerns” with a station I think of vandals, but this is not what they were referring to and my perspective shifted abruptly. I asked about their historical records – this is where my perspective got turned upside down.

They have “water books” for 1960-1980.

These are the annual publications of water record, but much of the supporting data was destroyed between 1980 and 2007. That means no staff gauge readings, no measurement data, and no field visit reports. The published data can never be revisited; all that work is gone, preserved only in the annual summary reports.

Think about that: all the effort, time, and money that went into collecting that data is gone.

Now think about your data. How much do you know about your data storage and backup, both digital and hard copy? How often is your database backed up? Where is it backed up? How do you store your hard copies? Is all your data in your database? What would happen if there was a fire, flood, earthquake, etc.? Would all your data survive, including hard copies? If you have ever stood under a bridge in a rainstorm to collect a single data point or managed the budget for field programs, you know how important that data is.

Today we have the ability to capture environmental data digitally, either through direct digital collection or by scanning hard copies and storing them digitally.

Older databases and systems couldn’t manage metadata such as scanned sheets and pictures – these ended up being stored in different locations from time-series data. What if you stored all your data, metadata included, in a central database accessible to everyone within your organization? What if all that data and metadata was backed up off-site in one or more locations? These locations could be anywhere: a different building, a different country, the cloud, or all three. Central data entry and storage of all your data and metadata, along with proper back-up procedures, provides security and ensures that your data will survive for the long term.
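
As a rough illustration of what such a back-up routine can look like, here is a minimal sketch in Python. The database name, paths, and the pg_dump command are placeholders rather than a prescription for any particular system; the point is simply that one scheduled job can push the same copy to several independent locations.

    # Minimal off-site backup sketch (hypothetical names and paths).
    import shutil
    import subprocess
    from datetime import datetime
    from pathlib import Path

    DB_NAME = "hydro_timeseries"              # placeholder database name
    DESTINATIONS = [
        Path("/backups/onsite"),              # same building
        Path("/mnt/offsite_nas/backups"),     # different building or city
        Path("/mnt/cloud_sync/backups"),      # folder synced to a cloud service
    ]

    def backup():
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        dump_file = Path(f"/tmp/{DB_NAME}_{stamp}.sql")
        # Dump the central database (pg_dump shown; substitute your own DBMS tool).
        subprocess.run(["pg_dump", "-f", str(dump_file), DB_NAME], check=True)
        # Copy the same dump everywhere so no single event can destroy it.
        for dest in DESTINATIONS:
            dest.mkdir(parents=True, exist_ok=True)
            shutil.copy2(dump_file, dest / dump_file.name)

    if __name__ == "__main__":
        backup()  # schedule nightly with cron or a task scheduler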

Modern data management systems can protect valuable environmental data over the life of the data.

That provides peace of mind. It was a privilege to provide AQUARIUS training to the water professionals with the United Nations and the Afghanistan Ministry of Energy and Water, knowing that accurate and timely water data will be safe and available for generations to come.

5 responses to “Afghanistan Training & Water Data Security”

  1. Gerald Dörflinger November 7, 2013 at 2:55 am

    Hi Jamison,
    here in Cyprus we also lost our “supporting data” during the invasion by Turkey in 1974; it is really a bad thing if you are left with only the mean daily flows in the hydrological yearbooks.
    Another interesting point you raise is that “projects for agriculture, power generation, and infrastructure re-building all need water data”. This is obviously very true, but usually not recognized by decision makers responsible for the funding of monitoring programs. I would even say that some of them consider it “easier” (=less restrictive) to make decisions without that data, and that may also have a bearing on funding.
    And it is very important to work towards being able to store all the data, metadata, scanned historic sheets and tables, calculations, pictures, etc., in a central database with regular back-ups. I think that there are many countries that do not have such a facility yet, the country I am working in included.
    Keep up the posts, even though there is not much discussion and posting yet – it may come in the future.
    best,
    Gerald

    • Thanks Gerald,

      Unfortunately I fear the loss of data to such circumstances happens far more often than I ever realized. Not only is the data lost, but after such invasions/military actions there is usually a need to re-develop, and the data that should help guide that is gone. To make things worse, funding is usually most available immediately following such events, so quick decisions need to be made in order to access it.

      Through my work here I have been fortunate enough to see a lot of groups, agencies, governments, etc. committed to fact-based decision making and actively seeking information to help guide decisions. I think more and more people want policy guided by data. Unfortunately, there will always be decision makers who feel they are right regardless of what the data says, or who even actively suppress that data, but I think that is becoming harder and harder to do.

      Sometimes I think that data can be overwhelming, both in volume and variety. Central databases such as AQUARIUS are designed to handle large volumes and varieties of data and make data management less intimidating. Integrated tools, functions, and reports make getting information from your data easier.

      What I love about this topic is that we get to discuss all the information that surrounds the data. The data (e.g. stage, discharge) are just numbers without context. The supporting data and metadata provide context and allow that data to become information, and that is powerful.

      I wish you well

      Cheers
      Jamie

  2. What we know about security and field data collection takes a while to learn. Most of us associate security at a remote site with physical security. What makes your post interesting is that you realize that security is all about safeguarding the data collection process. No one thinks about the little steps that can be taken within the site that protect the collection. Outside of the physical component, what are the simple steps one can take to keep the collection operational?

    Battery management. This is really the lifeblood of the site. Lose the power, lose the collection. Monitor the battery’s charge in your collection. Looking at voltages and how they fluctuate over the course of an evening often lets you know the true nature of a battery’s condition. More proactive measures for battery management happen in the site’s electrical design – that is, use a solar regulator that isolates its load when the battery becomes discharged. We commonly wire the data logger to the battery and the telemetry equipment to the load side of the regulator. Yes, you lose the telemetry portion of the site, but you still have the data logger collecting data. Is this an aspect of security? You bet it is.
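
    To illustrate the idea (the numbers, threshold, and record layout below are made up, not from any particular logger), a simple check on the overnight voltage sag can flag a battery that is on its way out:

        # Sketch of a battery health check; 0.8 V is an assumed threshold
        # for a 12 V lead-acid battery, adjust for your own setup.
        def overnight_sag(readings, night_start=20, night_end=6):
            """readings: list of (hour_of_day, volts) from the battery channel."""
            night = [v for h, v in readings if h >= night_start or h < night_end]
            return max(night) - min(night) if night else 0.0

        readings = [(18, 13.1), (22, 12.6), (2, 12.2), (5, 11.7), (9, 13.0)]  # example data
        sag = overnight_sag(readings)
        if sag > 0.8:
            print(f"Battery sagging {sag:.2f} V overnight - plan a service visit")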

    Data Logging Files. Let’s think this out a bit. Most casual users size their logs too small in a modern data logger. Data can roll over in a small log if the site is not visited on a regular basis. Bigger is better when your visits don’t happen as often as you would like. What else? Newer data loggers have the ability to log to external media as well as the internal flash memory contained on the data logger’s motherboard. Log the same data to both locations. In the event of the data logger failing in the field, remove the external media and you have saved your collection. Is this data security? Yes.
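
    As a back-of-envelope example of sizing a log (record size and intervals are assumptions; use your own figures), the arithmetic is simple:

        # Rough log-sizing arithmetic: the log must hold more data than you
        # collect between visits, with margin.
        record_bytes   = 64      # assumed bytes per logged record
        records_per_hr = 4       # 15-minute logging interval
        visit_days     = 120     # worst-case gap between site visits

        needed = record_bytes * records_per_hr * 24 * visit_days
        print(f"Need at least {needed / 1024:.0f} KB of log space")  # ~720 KB here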

    Sensors. If you design your sites to have redundant sensing for your most critical measurements, you are ahead of the curve. If you lose a sensor on a site that only has one sensor, you’ve lost the collection. Having a second sensor collecting alongside the primary not only saves you if one fails but also provides a means of verifying anomalies a site may have. Is this more expensive? Yes and no. In the event of a sensor failure – it’s priceless. Is this data security? Yes.
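
    A redundancy check can be as simple as comparing the two channels and flagging where they disagree; this sketch uses a made-up tolerance and made-up readings, so set the tolerance from your sensors' stated accuracy:

        # Flag timestamps where primary and backup stage sensors disagree.
        def flag_disagreements(primary, backup, tolerance=0.02):
            """primary/backup: dicts of timestamp -> stage in metres."""
            return [t for t in primary
                    if t in backup and abs(primary[t] - backup[t]) > tolerance]

        primary = {"2013-11-07 00:00": 1.231, "2013-11-07 00:15": 1.229}
        backup  = {"2013-11-07 00:00": 1.233, "2013-11-07 00:15": 1.302}  # drifting?
        print(flag_disagreements(primary, backup))  # ['2013-11-07 00:15']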

    Metadata. From your post I’m thinking about the paper logbook that is maintained on the site. If anything qualifies as site metadata, the book is it. We learned firsthand, from a site break-in where the logbook was tossed into the river, that this is a serious loss. After this episode, we integrated key components of the written log into the flash memory files within the data logger (not an easy task). A note of interest here: the DCP in the photo (Sutron 9210B) has the ability to enter field notes directly into the collection log, an interactive form of metadata. This feature was developed by the vendor at the request of an end user. It is nice to know that a vendor listens to the needs of the community.
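
    The general idea, independent of any particular DCP, is to keep the field notes in the same files (and therefore the same backups) as the measurements. A generic sketch, with made-up file and field names:

        # Append a timestamped field note to the same log that holds the data.
        import csv
        from datetime import datetime

        def append_note(log_path, author, note):
            with open(log_path, "a", newline="") as f:
                csv.writer(f).writerow([datetime.now().isoformat(), "NOTE", author, note])

        append_note("site_log.csv", "DW", "Replaced desiccant; staff gauge reads 1.23 m")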

    A great post. Thanks for sharing.

    Dave

    • Hi Dave

      I couldn’t agree more. You have taken my post one step further and encompassed a cradle-to-grave approach to data security. You can’t protect data you didn’t collect. Thinking about each step in the collection process and safeguarding that step is so important, as is asking “What if?” over and over again and then planning and designing systems to address the answers to those questions.

      Unfortunately, a lot of these measures are overlooked because of cost. Managers understandably see the cost of equipment, systems, and employee time but likely don’t directly see the cost of missing data. It doesn’t mean they don’t worry about it or don’t do their best to avoid it, but budget constraints are very real.

      Digitizing data, in all forms, is so important to me. Memory is cheap, and with high-speed networks data can be stored in numerous locations easily, just like your example of internal and external logging. If observed/written data can be entered digitally to begin with, you limit transcription errors and avoid redundant data handling. That data can be immediately backed up on multiple devices in the field, which makes the workflow both more secure and more efficient.
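
      As a small sketch of that workflow (the paths and site code are placeholders), each observation entered in the field can be written to two storage locations the moment it is captured:

          # Write each field observation to internal storage and removable
          # media immediately, so one device failure cannot lose it.
          import json
          from datetime import datetime
          from pathlib import Path

          def save_observation(obs, targets=("./internal", "/media/sdcard")):
              record = json.dumps(dict(obs, recorded=datetime.now().isoformat()))
              for target in targets:
                  path = Path(target)
                  path.mkdir(parents=True, exist_ok=True)
                  with open(path / "observations.jsonl", "a") as f:
                      f.write(record + "\n")

          save_observation({"site": "KAB-01", "stage_m": 1.27, "observer": "field crew"})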

      I agree that it is important for vendors to engage, listen and respond to the end user. It is through that collaboration that innovative solutions and tools are developed for real world applications.

      Cheers
      Jamie

      • Hi Jamie,

        I agree that some managers look at base costs for the fielding of a site. I recall having a conversation with a colleague in the USGS a couple of years ago. He worked out that the cost to support his gages was $21K/yr per site. That was assuming no failures or vandalism. When servicing multiple gaging sites, we tend to work on them in a batch process. That is, we work a geographical area of co-located sites. This saves operating costs.

        When a site goes down due to a bad sensor, power issue, or vandalism, we have to react rather quickly to resolve the issue. An unscheduled repair trip costs big money. Think of the total expenses: employee time, O&M costs, rolling stock, travel. This can easily come to several thousand dollars to bring a site back online. Also, to compound the costs, the hydrographer then has to reconstruct the missing record. More time and money…

        The bottom line is that some thought in the planning and deployment of a site often can minimize the losses that can happen at a later time.

        Why are sites minimally designed for the most part? Many people who deploy sites don’t think the overall process through. In some cases I’ve seen, the site is wired up in the field in an ad hoc fashion (a makeshift, improvised installation with little planning). Has anyone here ever seen a guide on designing and deploying a site?

        You also mention high-speed networks. This usually happens once the data hits the server. Field telemetry is often the choke point, data-wise. Some modern data loggers support IP-based communications; however, there are limitations at a remote site. GSM/CDMA solutions are pricey for data transfers. You think of the monthly ‘bit load’ costs and work from there.
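
        As a rough worked example (all figures assumed), the monthly volume per station is easy to estimate before committing to a plan:

            # Back-of-envelope monthly 'bit load' for one station.
            bytes_per_transmission = 2_000   # assumed payload incl. overhead
            transmissions_per_day  = 24      # hourly reporting
            monthly_mb = bytes_per_transmission * transmissions_per_day * 30 / 1e6
            print(f"~{monthly_mb:.1f} MB per month per station")  # ~1.4 MB here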

        Anyway, the options for what we can have in a site are much greater today than what was available when I first got started. And that was a while ago.
