After a very lengthy and rigorous process, the USGS has selected AQUARIUS as the platform to replace its aging Automated Data Processing System (ADAPS). Obviously, this is very good news for Aquatic Informatics, but I would like to speak to why I believe the USGS decision to choose a commercial solution, rather than re-build in-house, is very good news for global hydrometry.
It should be no surprise to readers of this blog that I am a big fan of the USGS.
There is no comparable agency in the world in terms of developing and sharing the best available methods, techniques and technologies for stream gauging. If you want to talk to the foremost experts and thought leaders in almost any aspect of hydrometry, chances are pretty good you need to be talking to someone from the USGS.
This opportunity to work even more closely with the USGS to incorporate all of this collective wisdom into AQUARIUS will result in a process of rapid evolution. The commercial version of AQUARIUS will inherit this DNA, but without any requirement for hard-coded, agency-specific architecture. This means that every user of AQUARIUS will be able to benefit from USGS process engineering without compromising existing workflows, product suites, priorities, or service criteria. One obvious advantage will be the choice of units (English vs. SI).
What I would like to speak to is how a rising tide floats all boats.
I have been involved with migrating customers' data from all sorts of different systems, both competing commercial systems and custom, in-house legacy data management systems. In every case, the first thing I ask is: where are the reference standards documents? If I can understand the standard operating procedures for data processing, quality control, and quality assurance, then I can understand how to accommodate those standards in AQUARIUS. In almost every case, published standards documents either do not exist or are largely irrelevant to the operational software and database because they are so old and obsolete.
The USGS is the one agency that not only openly publishes its standards documents but writes them to be operationally meaningful and keeps them up to date with emerging technology. This means that any user of USGS standards-compliant software has the benefit of clear traceability to the relevant normative reference standard.
The USGS decision to forgo in-house redevelopment of ADAPS and choose a commercial solution will have a huge impact on the quality of all hydrometric data, as the standards for data handling are forced, by virtue of comparative advantage, to come out from behind proprietary fortresses and become open to scrutiny.
Water Survey of Canada, and other hydrometric operators that have chosen AQUARIUS, no longer need to be burdened with the responsibility of maintaining their own unique standards documents for data handling. The USGS will maintain its standards, and it will verify that AQUARIUS is fully compliant. Any end-user can be assured that data produced by USGS standards-compliant software differ only with respect to the hydrological signal being measured, not in the numerical recipes used for data production.
I do not expect that every data producer will choose to adopt the USGS standards for hydrometric data production. However, I do believe that the end-users of their data will start to raise more questions about the normative references upon which the credibility and defensibility of the data rely.
The new OGC WaterML2.0 standard for data interoperability makes hydrometric data search, discovery, and access easier. Increasingly, data searches will be feature-based (i.e. by basin, eco-region, geo-political region, or river length) rather than agency-specific. Inter-comparability of the data returned from several agencies will be of paramount importance. The most valuable indicator of data inter-comparability is the normative standards. In this new paradigm, if there is no apparent traceability to these references, end-users of the data will demand it.
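As a concrete illustration of what standards-based access looks like in practice, here is a minimal sketch of assembling a WaterML2.0 time-series request against the USGS Instantaneous Values web service. The specific site number and parameter code below are illustrative examples, not a prescribed recipe:

```python
from urllib.parse import urlencode

# Illustrative sketch: build a request URL for WaterML2.0 output from
# the USGS Instantaneous Values service. The station identifier
# (01646500, Potomac River near Washington, DC) and parameter code
# (00060, discharge) are example values chosen for demonstration.
BASE_URL = "https://waterservices.usgs.gov/nwis/iv/"

def waterml2_query(sites, parameter_cd, period="P7D"):
    """Return a query URL requesting WaterML2.0 time-series data."""
    params = {
        "format": "waterml,2.0",    # request OGC WaterML2.0 output
        "sites": ",".join(sites),   # one or more station identifiers
        "parameterCd": parameter_cd,
        "period": period,           # ISO 8601 duration (last 7 days)
    }
    return BASE_URL + "?" + urlencode(params)

url = waterml2_query(["01646500"], "00060")
print(url)
```

Because the response is a standards-defined document rather than an agency-specific export, the same parsing code can, in principle, consume WaterML2.0 served by any compliant agency.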
I also believe the increased use of USGS standards provides payback for the USGS as well. The techniques-and-methods papers will come under closer scrutiny as more agencies formally adopt them as normative references. This will result in more rigorous peer review, which is always a good thing, and will also create opportunities for partnering on maintenance of the standards. The Water Survey of Canada has already partnered with the USGS to write updated ADCP standards and has shared in the significant cost of the necessary research behind that standard.
To be perfectly clear, data credibility is larger than just software.
The entire Quality Management System is relevant, including: site selection and management; field procedures; field visit scheduling; controls and procedures for instrument procurement, calibration, and maintenance; training; accreditation; data handling procedures; data security; quality control procedures; quality assurance; and audits of the quality management system. USGS-compliant software ensures traceability to source and the ability to explain every transformation along the path from the sensor to the end-user. This is an important component of data credibility, but it does not imply that none of these other factors matter.
The USGS made a very good call to opt for a commercial solution over an in-house re-build. I cannot say whether any of these serendipitous benefits had any bearing on their choice or not. There are a host of very pragmatic reasons why agency-specific solutions are reaching the end of their life-cycle and have become uneconomical to maintain or re-build.