Bringing Government IT Modernization and Innovation to the Forefront

Data Quality: Is Your Agency Ready to Maintain It?

Data is one of the most valuable assets that government agencies hold, and not only from a cybersecurity perspective. With agencies inundated with petabytes of information, the ability to parse that data and extract the maximum amount of information for mission-critical activities is essential. If data quality is poor, however, the ability to manipulate and apply that data across activities, both within and between agencies, is compromised. In the end, both the time and the cost of delivering on the mission increase, as data must be accessed multiple times to derive value.

In a recent article featured on Reality Check, Dan Virgillito shared that “federal [agencies] are spending $1.4 billion to $7 billion on cloud computing systems annually.” Yet, while migrating data to the cloud is a smart move, the costs of cloud adoption soon outweigh the benefits if data quality is poor. Adopt cloud with vigor, but know that migration is not without challenges.

For Virgillito, the biggest challenges to maintaining data quality are:

• Fragmentation of critical data: Fragmentation can occur when the cloud is scaled to accommodate growth, and it can degrade data quality in terms of completeness, validity, format, accessibility and accuracy.
• Data redefinition: Data quality can decline as data is redefined to move it from one format to another. For example, synchronized data can become unsynchronized in the process, which greatly affects quality.
• Outsourcing functions: Outsourced functions may not be able to keep up with the data quality assured by cloud service providers.
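To make the first of these challenges concrete, here is a minimal sketch of the kind of automated check an agency might run after a migration. The field names (`record_id`, `agency`, `updated`) and the ISO date convention are hypothetical, chosen only to illustrate checks for completeness and format validity, two of the quality dimensions fragmentation can degrade:

```python
import re

# Hypothetical schema: every record should carry these fields.
REQUIRED_FIELDS = {"record_id", "agency", "updated"}
# Assume dates are expected in ISO format (YYYY-MM-DD).
DATE_FORMAT = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def quality_issues(record):
    """Return a list of data-quality problems found in one record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:  # completeness check
        issues.append(f"incomplete: missing {sorted(missing)}")
    updated = record.get("updated")
    if updated is not None and not DATE_FORMAT.match(str(updated)):
        issues.append(f"invalid format: updated={updated!r}")  # validity check
    return issues

records = [
    {"record_id": 1, "agency": "GSA", "updated": "2015-06-01"},
    {"record_id": 2, "updated": "06/01/2015"},  # missing field, wrong date format
]
report = {r.get("record_id"): quality_issues(r) for r in records}
```

Running checks like these routinely, rather than only at migration time, is one simple way to keep fragmentation from silently eroding quality.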

Virgillito continues, “Government agencies should also exercise discretion in selection of cloud service providers,” adding that demonstrated specialization and experience with the public sector should be top of mind when choosing a provider. For effective data quality management, agencies should consider the following:

• Combining quality management with business intelligence
• Modifying architecture to meet broader data quality needs
• Focusing on data stewardship

Although there are data management and quality concerns for agency IT teams to keep in mind when migrating to the cloud, help is at hand. By following the recommendations above and working with vendors that can demonstrate experience with complex public sector deployments, agencies can better manage their data while maintaining its quality.

To learn more about data quality management, read Dan Virgillito’s article in its entirety here.
