ModernGOV

Bringing Government IT Modernization and Innovation to the Forefront

How do you define Big Data?

Modernization and innovation are the new drivers for IT purchases in the federal government. In a recent broadcast on Federal News Radio, moderator Jim Flyzik, along with a panel of distinguished guests, set out to discuss these drivers and the most effective strategies for inspiring modernization. Flyzik turned to leading government technologists, including Zach Goldstein, Chief Information Officer and Director for High Performance Computing at NOAA; Jim St. Pierre, Deputy Director of the Information Technology Laboratory at NIST; and Duncan McCarthy, Technical Executive for Innovation at the National Geospatial-Intelligence Agency, to talk about some of the big issues that they, and their peers, are dealing with when addressing IT modernization.

Discussions about the challenges of IT modernization in government usually focus on how to achieve transformation while grappling with budgetary constraints and rules and regulations from another era. Another challenge, however, lies in handling the outputs of new IT, such as how to manage, or even define, big data. During the panel, Goldstein and McCarthy spoke at length about the difficulty of establishing which attributes we can rely on to accurately define big data, and how it differs from traditional data, large data and open data.

McCarthy explained that NGA “struggled with … what actually defines big data, whether it is many pieces of information or a large piece of information.” How the agency defines data, in turn, determines what its IT needs will be. In the NGA’s case, the rapid and substantial increase in the number of satellites collecting images is a growing concern. With private sector companies planning to launch 200+ satellites over the next decade, the number of images, and the amount of data, that must be collected and stored is increasing significantly. As a result of this increase, agencies will need to update their business processes in order to continue meeting their missions. McCarthy continued, “You can’t operate with the same paradigm where you take the imagery down and quickly disseminate it to people who look at it.”
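To get a feel for the scale McCarthy describes, a rough back-of-envelope estimate can help. The sketch below is illustrative only: the satellite count echoes the 200+ figure from the panel, but the per-satellite collection rate and image size are assumptions, not numbers from the broadcast.

```python
# Back-of-envelope estimate of new imagery data volume.
# All inputs are illustrative assumptions, not figures from the broadcast.

new_satellites = 200          # echoes the 200+ launches cited over the next decade
images_per_sat_per_day = 100  # assumed: collections per satellite per day
gb_per_image = 2              # assumed: size of one raw image in gigabytes

daily_gb = new_satellites * images_per_sat_per_day * gb_per_image
yearly_pb = daily_gb * 365 / 1_000_000  # gigabytes -> petabytes

print(f"~{daily_gb:,} GB/day, ~{yearly_pb:.1f} PB/year of new imagery")
```

Even with modest inputs like these, the arithmetic lands in the petabytes per year, which is why McCarthy argues the old take-it-down-and-disseminate-it paradigm cannot hold.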

Chris Borneman, Vice President, Software AG Government Solutions, shared that his team is focused on developing solutions that address the three core issues standing as the major impediments to defining and leveraging data for government agencies. The first is how to control access to data, which extends to understanding its life cycle and tracking programmatic access. The second is how to get big data online, and the third is how to correlate big data in real time. Without these elements firmly in place, agencies stand no chance of applying data in meaningful ways.
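As a concrete, if simplified, illustration of the first and third of those concerns, the sketch below logs every programmatic read of a record and correlates events arriving from different sources in real time. All class and field names here are hypothetical; this is a minimal toy, not a representation of Software AG Government Solutions’ products.

```python
# Minimal sketch of two of Borneman's three concerns: tracking
# programmatic access to data, and correlating data in real time.
# All names are hypothetical and for illustration only.

import time
from collections import defaultdict

class TrackedStore:
    """In-memory store that logs every programmatic read (concern #1)."""

    def __init__(self):
        self._data = {}
        self.access_log = []  # (timestamp, caller, key) tuples

    def put(self, key, value):
        self._data[key] = value

    def get(self, key, caller):
        # Record who touched what, and when, before returning the data.
        self.access_log.append((time.time(), caller, key))
        return self._data.get(key)

class StreamCorrelator:
    """Groups incoming events by a shared key as they arrive (concern #3)."""

    def __init__(self):
        self._by_key = defaultdict(list)

    def ingest(self, key, event):
        self._by_key[key].append(event)
        if len(self._by_key[key]) >= 2:  # a second source has arrived: correlate
            print(f"correlated {key}: {self._by_key[key]}")

# Usage: one tracked read, then two event streams correlated on a shared key.
store = TrackedStore()
store.put("image-42", {"sensor": "sat-A", "region": "coastal"})
store.get("image-42", caller="analyst-service")

corr = StreamCorrelator()
corr.ingest("region:coastal", {"source": "imagery"})
corr.ingest("region:coastal", {"source": "weather"})
print(store.access_log)
```

A real system would put durable audit storage and a streaming engine behind these interfaces, but the division of labor, an access-aware store plus a key-based correlator, mirrors the concerns Borneman lists.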

Along those lines, change agents within individual agencies are embracing the challenge as a means to innovate and establish a guide to big data. For example, NIST visionary Jim St. Pierre may have helped lay the groundwork for a long-lasting solution, stating that, “despite widespread agreement that big data will have a huge impact, there are still some fundamental questions that can be confusing for people who want to use or integrate Big Data.” In an attempt to bring a management framework to this challenging issue, NIST started a big data public working group in 2013. Since then, the group has created a roadmap establishing taxonomies, definitions, and use cases in order to offer a management blueprint for the big data environment.

To view the broadcast in its entirety, click here.
