Modernizing Dredging Quality Management

By Jeffrey Mroz, CompTIA

A project to migrate a national dredging database and tracking system for the U.S. Army Corps of Engineers to a cloud service provider required accounting for several unique considerations to ensure seamless service and to address future capability and security needs.

Cloud technology saves organizations time and money, improves data security, and increases transparency in data and analytics. It scales easily, enhances data recovery, supports efficient collaboration, and benefits the environment. However, while the advantages are plentiful, the challenges of cloud migration abound as well. Migrating to the cloud is a lengthy, complex, and costly process. For government agencies, additional challenges exist, as was the case for the U.S. Army Corps of Engineers in its effort to move its database and tracking system for Dredging Quality Management (DQM) from an on-prem server to cloud service provider Microsoft Azure.

In 2020, the Corps contracted with Woolpert to assist with its cloud migration. The agency could not risk failures or interruptions in service, since the DQM database and tracking system plays a critical role in the nation’s infrastructure operations and engineering needs. Completing the move required specialized care and expertise, not only to avoid interruptions in service but also to position the system for the needs of the future.

Essential System

The DQM database and tracking system was developed more than a decade ago for the USACE National DQM Program. The platform is a standardized, remote monitoring and documentation system that gathers insights from multiple dredging vessels. This allows it to deliver timely analytics, reporting, dredge certifications, data quality control, and data management.

The database and tracking system is critical to the Corps, as the agency is tasked with maintaining and improving nearly 12,000 mi of shallow-draft inland and intracoastal waterways; 13,000 mi of deep-draft coastal channels; and 400 ports, harbors, and turning basins. Preserving these federal channels and waterways is crucial to maintaining a strong economy and bolstering the country’s national security and long-term competitiveness.

Approximately $2 trillion in annual commerce from the U.S. marine transportation industry uses these routes, and more than 48 percent of all consumer goods pass through these harbors. To keep these federal channels operable, the Corps spends billions of dollars on dredging activities; in 2020 alone, spending to remove dredged material surpassed $2.5 billion.

Standardizing Effort. The DQM database and tracking system delivers visibility into this complex process and provides data to help the Corps and the dredging industry better preserve federal channels and waterways. Hundreds of sensors on vessels nationwide gather dredging activity data and transmit near-real-time insights to the database, where they are processed, stored, and made available for viewing and analysis. The Corps uses this data to create standardized dredge project monitoring, enforce performance-based instrumentation requirements, track historical dredging operations, and hold contractors accountable through the review and analysis of dredging operations.
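As a rough illustration of the kind of record such a pipeline handles, the sketch below models a single near-real-time sensor reading in Python. The field names and the plausibility check are hypothetical and do not represent the actual DQM schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class DredgeReading:
    """One hypothetical near-real-time reading transmitted from a vessel."""
    vessel_id: str          # e.g., a hopper dredge or scow identifier
    recorded_at: datetime   # sensor timestamp, assumed timezone-aware (UTC)
    latitude: float
    longitude: float
    hopper_level: float     # illustrative payload measurement

    def is_plausible(self) -> bool:
        """Basic sanity check before the reading is stored for analysis."""
        return (
            -90.0 <= self.latitude <= 90.0
            and -180.0 <= self.longitude <= 180.0
            and self.recorded_at <= datetime.now(timezone.utc)
        )
```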

To keep the dredging database moving forward while waiting for features to be released for government entities, the project team wrote temporary code for the missing aspects. Image courtesy Orestegaspari (iStock).

The DQM database and tracking system currently receives data from over 320 scows, 24 hopper dredges, and more than 80 pipeline dredges. These numbers grow continuously as dredging contracts and permits are finalized, regularly increasing the volume of insights that the DQM database and tracking system receives.

Predictably, as the amount of data expanded, so did the need to migrate to a cloud platform. While the DQM was state-of-the-art when first developed, technology advancements, coupled with the system nearing the end of its lifecycle, necessitated an upgrade.

Migration Challenges

Migrating an on-prem system to the cloud presented multiple unique challenges that do not exist with cloud-native applications. Transitioning an on-prem system to the cloud without degrading service is difficult: it requires maintaining two systems simultaneously and keeping them synced, in parity, and tightly woven together, with the understanding that one of them will eventually be turned off.

Allowing the DQM to experience interruptions was not an option, given its criticality. Adding to the challenge was the amount of rewriting required to ensure a smooth transition.

The DQM database and tracking system was originally developed on Oracle, so its code was written for that platform. To successfully migrate the system to Azure, it was critical to rewrite the code, data structures, and procedures that kept the system running and effective.
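The article does not detail the actual DQM code, but a simplified, hypothetical illustration of that kind of rewrite is restating an Oracle-specific query in a more portable dialect such as Spark SQL before running it on the new platform. The table and column names below are invented.

```python
# Hypothetical example of translating Oracle-specific SQL into Spark SQL.
# Table and column names are illustrative, not the real DQM schema.

oracle_query = """
    SELECT vessel_id, NVL(load_volume, 0) AS load_volume, SYSDATE AS run_date
    FROM dredge_loads
    WHERE ROWNUM <= 100
"""

spark_sql_query = """
    SELECT vessel_id,
           COALESCE(load_volume, 0) AS load_volume,   -- NVL -> COALESCE
           current_timestamp()      AS run_date       -- SYSDATE -> current_timestamp()
    FROM dredge_loads
    LIMIT 100                                         -- ROWNUM -> LIMIT
"""

# On the new platform, the rewritten statement would run via Spark, e.g.:
# spark.sql(spark_sql_query)
```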

On top of that, unavoidable obstacles related to Azure’s commercial and government clouds surfaced during the migration. Government resources are often three or four versions behind commercial iterations. This meant that the team had to wait anywhere from six months to a year and a half for features and resources offered to commercial clients to become available to government entities.

The extended time was not wasted. The project team used those waiting periods to develop workarounds for the missing features. To keep the migration moving, the team implemented a creative solution: writing temporary code to compensate for absent tools until the necessary features were released for government use. This tactic kept the DQM migration on track without disrupting functionality or further delaying timelines.
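The specific workarounds are not described, but the pattern can be sketched as a simple fallback: a hypothetical wrapper checks whether a managed feature is yet available in the government cloud and, if not, routes the call to temporary in-house code that can be deleted once the feature ships. Everything below is illustrative.

```python
# Hypothetical shim pattern: use the managed cloud feature when it is
# available in the government cloud, otherwise fall back to temporary code.

MANAGED_FEATURE_AVAILABLE = False  # flip to True once the feature is released


def archive_readings(readings: list[dict]) -> int:
    """Archive a batch of sensor readings, returning the count archived."""
    if MANAGED_FEATURE_AVAILABLE:
        return _archive_with_managed_service(readings)   # the eventual path
    return _archive_with_temporary_code(readings)         # stopgap implementation


def _archive_with_managed_service(readings: list[dict]) -> int:
    raise NotImplementedError("Wire up the managed service once it is available.")


def _archive_with_temporary_code(readings: list[dict]) -> int:
    # Minimal stand-in: persist the batch locally so the pipeline keeps moving.
    import json
    import pathlib
    import time

    out_dir = pathlib.Path("temporary_archive")
    out_dir.mkdir(exist_ok=True)
    out_file = out_dir / f"batch_{int(time.time())}.json"
    out_file.write_text(json.dumps(readings))
    return len(readings)
```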

Attention to Security. Another concern during the database migration was security. As part of the transition to the cloud, the project team used Databricks, a big data platform that captures, stores, and enriches insights.
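A minimal sketch of that capture-and-store step, assuming PySpark on Databricks and an invented Azure storage account and paths, might look like the following. The data lands in the lake in its original form so it can be processed and enriched later.

```python
# Minimal sketch: land incoming sensor files in the data lake in their
# original form for later processing and analysis. The storage account,
# container names, and paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` already exists

raw = spark.read.json("abfss://incoming@examplestorage.dfs.core.windows.net/dqm/")
(raw.write
    .mode("append")
    .json("abfss://rawzone@examplestorage.dfs.core.windows.net/dqm/sensor-feeds/"))
```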

Being able to store insights in a data lake, a central repository that maintains data in its original form for processing and analysis, was effective. But, as with any emerging solution, security baselining was an issue. Without established security standards or technical integration guidelines, the project team faced both a problem and an opportunity: evaluating the security of the insights held in the data lake.

Because there were no existing security requirements to consider, the team took on the additional responsibility of learning the intricacies of the technology to create security best practices. This took time, but after completing a deep dive into the emerging technology, personnel were able to enhance the data lake’s security and improve the system overall.

Ensuring Success

As the cloud migration continued, the project team took things a step further to enhance the long-term success of the transition by moving the insights from the DQM beyond the data lake and into a data lakehouse. A data lakehouse is an open standards-based cloud storage solution that enables deep data analysis and processing as well as curation and publication of data for reporting and business intelligence purposes.
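As a rough sketch of that curation step, assuming the open-source Delta Lake table format that Databricks lakehouses commonly use (the article does not name the exact format), raw readings could be cleaned and published as a queryable table. All table, column, and path names below are invented.

```python
# Sketch: promote raw readings from the data lake into a lakehouse table
# that analysts and reporting tools can query directly. Names are illustrative,
# and the `dqm` database is assumed to exist.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.json(
    "abfss://rawzone@examplestorage.dfs.core.windows.net/dqm/sensor-feeds/"
)
curated = (
    raw.dropDuplicates(["vessel_id", "recorded_at"])       # basic data quality control
       .withColumn("ingested_at", F.current_timestamp())   # audit column
)
curated.write.format("delta").mode("append").saveAsTable("dqm.dredge_readings")
```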

With this evolution, the application’s compute power, storage, overhead, security, and auditing no longer existed on a single system. Instead, they were split into their own unique components. This allows partners or other authorized users to connect directly to the data without touching the compute resources. The data lakehouse also offers advanced security features, letting administrators limit who has access to certain data and dictate how much access each user has to the insights.
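A table-level grant of the kind such platforms support gives an outside partner read access to curated data without exposing the compute that produced it. The exact SQL depends on the platform’s access-control model; the group and table names here are hypothetical.

```python
# Hypothetical sketch of lakehouse access control: a partner analyst group
# gets read-only access to one curated table, with no access to the compute
# or raw storage behind it. Group and table names are invented.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is predefined

spark.sql("GRANT SELECT ON TABLE dqm.dredge_readings TO `partner_analysts`")
spark.sql("SHOW GRANTS ON TABLE dqm.dredge_readings").show()
```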

A data lakehouse for the new cloud system offers additional security features and greater insights into its collected data. Image courtesy Focus_On_Nature (iStock).

Another benefit is the opportunity to gather the highest-quality data, an essential advantage for the Corps. Those involved in the National DQM Program collect information from all the dredges, conduct deep data analysis and processing, and publish insights for the dredging industry before the cycle repeats.

Supporting Vessels. The data lakehouse provides insight into each dredging vessel’s progress. The National DQM Program receives many types of data, including probabilistic, deterministic, spatial, and latitudinal/longitudinal data. These insights are analyzed and processed to show which dredging phase each vessel is in, and that insight matters. Toward the end of a dredging cycle, vessels must offload the material they have gathered, and the sediment deposit location is critically important. Specific areas have been designated for offloading to prevent backfills or other unintended consequences. With the insights in the data lakehouse, the program can analyze a dredging vessel’s data to determine what stage it is entering and ensure the proper measures are taken.
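The actual DQM analytics are not described in detail, but the idea can be illustrated with a deliberately simplified classifier that infers a vessel’s phase from a couple of sensor values. The thresholds, field names, and phase labels are invented.

```python
# Deliberately simplified, hypothetical phase inference for a hopper dredge.
# Real DQM processing draws on far richer data; these thresholds are invented.

def infer_phase(hopper_fill_pct: float, speed_knots: float,
                in_placement_area: bool) -> str:
    """Guess which part of the dredging cycle a vessel is in."""
    if in_placement_area and hopper_fill_pct > 80:
        return "ready to place material"     # at the designated offload site, hopper full
    if hopper_fill_pct > 80:
        return "transit to placement area"   # loaded and under way
    if speed_knots < 2:
        return "dredging / loading"          # slow and working the channel
    return "transit / repositioning"


# Example: a loaded vessel moving at speed, not yet in a placement area.
print(infer_phase(hopper_fill_pct=92.0, speed_knots=6.5, in_placement_area=False))
# -> "transit to placement area"
```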

The data lakehouse even provides visibility into the number of loads that every dredging vessel dumps and how many cycles they have gone through, helping to keep the dredging contractors accountable.

Enhancing Operations

Migrating the DQM to the cloud and moving to a data lakehouse will enhance national dredging operations. Modernizing the legacy system with newer technology for data storage and processing will enable the Corps to reduce its cloud footprint while processing and enriching data less than a minute after it is collected from a dredging vessel.
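A hedged sketch of that low-latency path, assuming Spark Structured Streaming on Databricks with invented paths, schema, and table names, shows how readings could be enriched within about a minute of arrival by using a short trigger interval.

```python
# Sketch: continuously enrich incoming readings shortly after they arrive.
# Paths, the schema, and table names are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

incoming = (
    spark.readStream
         .format("json")
         .schema("vessel_id STRING, recorded_at TIMESTAMP, load_volume DOUBLE")
         .load("abfss://rawzone@examplestorage.dfs.core.windows.net/dqm/sensor-feeds/")
)

enriched = incoming.withColumn("processed_at", F.current_timestamp())

(enriched.writeStream
         .format("delta")
         .option("checkpointLocation",
                 "abfss://rawzone@examplestorage.dfs.core.windows.net/dqm/_checkpoints/")
         .trigger(processingTime="30 seconds")   # aim for sub-minute turnaround
         .toTable("dqm.dredge_readings_enriched"))
```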

The investment also will allow the agency to incorporate other analysis tools into its workflows that can deliver faster aggregation as well as more and better reports.

Cloud migration requires highly skilled and knowledgeable workers. For public sector organizations that are looking to transition their systems from on-prem servers to the cloud, additional challenges exist. However, with the right team in place, modernization efforts can be carried out to better position programs, and the data they rely on, for the needs of the future.


Jeffrey Mroz, CompTIA, is a geospatial technical consultant at Woolpert; jeff.mroz@woolpert.com.
