Last month, some of the brightest minds in the IT business got together at the Adobe ColdFusion Summit to exchange ideas and best practices on how to successfully deliver web applications to market. One of the defining themes of this year’s summit was how best to address speed and scalability – the two most common complaints of both web application developers and end users.
Data is data, and at its most basic level it is merely lines of code. Yet we continue to become more connected each day, with 40% of the world’s population online and 1.75 billion of us using smartphones every day. Organizations in both the business world and the public sector are striving to determine how best to use Big Data, and by extension Big Data analytics, to improve efficiency and mitigate cost.
Data consolidation, an inherent aspect of agency modernization efforts, is an even more viable option for agencies as cloud and supercomputing take IT to levels barely imaginable 30 years ago.
The federal buying season is upon us, and while FITARA grants agency CIOs many new authorities, many are still approaching this buying season with apprehension.
According to research firm IDC, more than 200 billion devices will be connected by 2020 – far more than anyone considered possible just a few years ago. With such expansive connectivity, it is difficult to gauge what effects the IoT will have on businesses, local and federal governments, and even private citizens.
Data is one of the most valuable assets that government agencies hold, and not only from a cybersecurity perspective. With agencies inundated by petabytes of information, the ability to parse that data and extract the maximum amount of information for mission-critical activities is essential. If data quality is poor, however, the ability to manipulate and apply that data to different activities, both within and between agencies, is compromised. In the end, not only does time to deliver on the mission increase, but the costs of delivering on the mission also rise, as data must be accessed multiple times to derive value.
One of the common jokes about the Internet of Things (IoT) is that your refrigerator might one day hack your house and run off with the contents of your bank account. While the scenario may seem absurd, it might not be far from the truth. As more and more devices connect to the Internet, innocuous endpoints – refrigerators, washing machines, and televisions – are likely to become gateways to valuable data, opening the way for cyber attacks we’ve only begun to imagine.
It seems odd to think about blizzards in July as the summer heat takes its toll on Washington. Yet for most government IT leaders, the blizzard on their minds isn’t the next Snowpocalypse, but the data blizzard hitting them as more data is collected from citizens, from Internet-connected devices, and from agency activities. At the most basic level, most agencies are equipped to deal with this inundation, able to store petabytes of data on premises or in the cloud.
The topic of IT portfolio management as an essential component of an agency’s ability to meet its mission while mitigating costs came up frequently in the presentations delivered at Digital Government Institute’s (DGI) Enterprise Architecture conference in late April. As a follow-on, DGI is hosting a webinar on June 23rd focused on EA and IT portfolio management, titled “Fueling Transformational Success through IT Portfolio Management & Enterprise Architecture.”