
Revisiting the Promises and Headaches of Big Data

There are plenty of challenges facing the federal government’s exploitation of Big Data – from the quality and reliability of the data to the many disparate systems on which information is stored. But those challenges are common to most federal agencies, not unique to each one.

That is one conclusion to be drawn from the Advanced Technology Academic Research Center (ATARC) Big Data Summit, held in Washington on December 8th, 2015.

In the day’s first panel, Jeff Chen, Chief Data Scientist for the U.S. Department of Commerce, said his office, the Commerce Data Service (CDS), is focused on two concerns: establishing data as a core function and figuring out how to make use of it.

“On a day-to-day basis, two things emerge,” Chen said. The first is the question of “readiness,” which focuses on employees; the second is “usability: how to best use data to further the economy or better understand society.”

Readiness, Chen said, is about assessing skill levels among Commerce employees and establishing the requirements for disciplines such as data science. Working with Big Data takes training: knowing how to scrub raw data so that it’s usable, and having the skills to design analytics that tease out what the data can reveal.
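That kind of scrubbing is routine in practice. As a minimal sketch of what it looks like – in Python with pandas, against a hypothetical survey export whose file and column names are invented for illustration – consider:

```python
import pandas as pd

# Hypothetical raw export; the file name and column names are
# illustrative, not from any actual Commerce dataset.
raw = pd.read_csv("survey_raw.csv")

# Normalize column names and trim stray whitespace from text fields.
raw.columns = raw.columns.str.strip().str.lower().str.replace(" ", "_")
text_cols = raw.select_dtypes(include="object").columns
raw[text_cols] = raw[text_cols].apply(lambda s: s.str.strip())

# Coerce fields to proper types; malformed values become NaN/NaT
# instead of silently corrupting downstream analytics.
raw["response_date"] = pd.to_datetime(raw["response_date"], errors="coerce")
raw["income"] = pd.to_numeric(raw["income"], errors="coerce")

# Drop exact duplicates and rows missing the key identifier.
clean = raw.drop_duplicates().dropna(subset=["respondent_id"])
```

Steps like these are what separate raw data from analysis-ready data – and they are exactly the skills Chen says agencies need to train for.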

The broader question of usability hinges on building a partnership between CDS and end users, he said, “to get end users to help us understand what the data can be used for.”

Linda Powell, Chief Data Officer (CDO) for the Consumer Financial Protection Bureau, observed that data scientists and engineers have to understand the agency’s mission and lingo.

“I have to be able to talk to [users] in their language. I have to understand what they’re trying to accomplish,” Powell said.

Most government data remains siloed, stored in proprietary systems, and not thoroughly validated. Dan Morgan, CDO for the Department of Transportation, said that part of unlocking the value of Big Data is to “build a culture that doesn’t punish people for the flaws in their data … that doesn’t punish people for their problems.”

The second panel focused on how to utilize data and analytics to enhance agencies’ missions.

The granddaddy of Big Data generation in the federal government might be the National Oceanic and Atmospheric Administration (NOAA), which generates terabytes of information from sensors and instruments all over the world.

“We’re working with five big cloud providers [to move] datasets over for open access, entire historical datasets,” said Amy Gaskins, NOAA’s Big Data Project Director. “Traditionally our primary customers were the weather industry, [but] now we’re looking at agriculture, retail, oil and gas, and commodities traders,” to name a few.

“Big Data has huge potential for detecting fraud,” said Caryl Brzymialkiewicz, Assistant Inspector General and CDO at the Office of Inspector General for the Department of Health and Human Services. She described a case in which a physician told dozens of patients they had cancer and prescribed chemotherapy. Many of them didn’t have cancer but received chemo anyway, while other patients did have cancer and were told they were getting chemo but received placebos instead. From her perspective, the data revealing what was going on was so far outside expected parameters that, at first, HHS thought the problem was with the data itself.
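The OIG has not detailed the analytics behind that case, but the underlying idea – flagging providers whose billing sits far outside expected parameters – can be sketched with a simple outlier screen. The claims table and threshold below are illustrative assumptions, not HHS data:

```python
import pandas as pd

# Invented claims table: one row per provider, with a count of
# chemotherapy claims billed over some period.
claims = pd.DataFrame({
    "provider_id": ["A", "B", "C", "D", "E"],
    "chemo_claims": [42, 38, 51, 47, 412],
})

# A median/MAD screen is robust to the very outliers we want to catch;
# a plain mean/std z-score can be dragged upward by an extreme value.
median = claims["chemo_claims"].median()
mad = (claims["chemo_claims"] - median).abs().median()
claims["robust_z"] = 0.6745 * (claims["chemo_claims"] - median) / mad

# Scores far outside the expected range are leads for investigators,
# not proof of fraud on their own.
flagged = claims[claims["robust_z"].abs() > 3.5]
print(flagged)
```

Provider E’s score lands around 49 – so extreme that, as Brzymialkiewicz noted, an analyst’s first instinct might be to suspect a data error rather than fraud.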

“[We need] analytics to support evidence-based decision-making,” she said. “Do we have the data we need to support the mission? Do we have data quality?”

Tim Lowden, the acting program manager for the General Services Administration’s Digital Analytics Program, said his office is focused on a single question: “How [citizens] interact with government websites.”

Lowden said GSA has web analytics code that it hopes to include on all federal websites. The code reports what visitors are looking for, what they’re downloading, whether they’re using a desktop computer or a mobile device, and which browsers and operating systems they’re running – everything needed for a snapshot of citizen interaction with agencies.

He called up the live web analytics dashboard to demonstrate; at that moment, there were more than 89,000 visitors on the National Weather Service site, which “dominates,” he said.

There are more than 4,000 government websites spread across more than 45 agencies. “Across 1.5 billion page views, less than 2% were using IE 8.0 or older,” Lowden said – useful information for agencies wondering how long to support outdated browsers.
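The aggregation behind a figure like that is straightforward. Here is a sketch of how one might compute browser share from page-view records – the log schema is an assumption, not DAP’s actual pipeline:

```python
import pandas as pd

# Hypothetical page-view log: one row per hit, with a parsed
# browser family and version.
views = pd.DataFrame({
    "url": ["weather.gov", "irs.gov", "nasa.gov", "weather.gov"],
    "browser": ["Chrome 47", "IE 8.0", "Firefox 42", "Chrome 46"],
})

# Share of total page views per browser – the kind of figure agencies
# can use when deciding how long to keep supporting legacy browsers.
share = views["browser"].value_counts(normalize=True) * 100
print(share.round(1))

legacy = views["browser"].isin(["IE 6.0", "IE 7.0", "IE 8.0"])
print(f"{legacy.mean() * 100:.1f}% of views came from IE 8.0 or older")
```

At DAP scale the same computation would run over billions of rows rather than an in-memory frame, but the logic is the same.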

The day’s events provided clear evidence that federal agencies have their work cut out for them in the era of Big Data, both in managing data and in leveraging it to enhance mission capability and service delivery to all stakeholders. To learn more about the technologies that can support agencies as they pursue these goals, by streamlining analytics or enhancing in-memory computing capabilities, click here.

