Bringing Government IT Modernization and Innovation to the Forefront

Supercharging Web Applications for Speed and Responsiveness without Breaking the Budget

Last month, some of the brightest minds in the IT business gathered at the Adobe ColdFusion Summit to exchange ideas and best practices for successfully delivering web applications to market. One of the defining themes of this year’s summit was how best to address speed and scalability – the two most common complaints of both web application developers and end users.

Data is as Data Does: The Benefits of Big Data Analytics

Data is data, and at its most basic level it is merely lines of code. Yet we become more connected each day, with 40% of the world’s population online and 1.75 billion of us using smartphones daily. Organizations in both the business world and the public sector are striving to determine how best to use Big Data – and, by extension, Big Data analytics – to improve efficiency and reduce cost.

High-Performance Without a Supercomputer – Consider In-Memory Computing

When most people hear the term “High-Performance Computing,” or HPC, they immediately think of well-funded laboratories, CERN’s Large Hadron Collider facility, and global institutions routinely processing terabytes, petabytes, or even exabytes of data for specific scientific and research missions. While this kind of specialized supercomputing continues to evolve, many more organizations have demanding and growing requirements for larger-scale data processing – without the benefit of supercomputing capabilities.

What’s New for the Federal HPC Community?

In the last days of July 2015, the White House released a new Executive Order (EO) focused on creating a National Strategic Computing Initiative (NSCI). The purpose of this EO is to establish a “cohesive, strategic effort within the Federal Government and a close collaboration between the public and private sectors” that will “maximize the benefits of high-performance computing (HPC) research, development, and deployment.”

Software AG Government Solutions In the News: In-Memory Computing

Big data was big news in 2014, and for good reason—data is at the heart of every organization. But data that’s locked away, or hard for your users to access where and when they need it, doesn’t help anyone. In-memory computing is a transformative technology that drastically scales your current application environment so that users can access what they need, from multiple apps, as quickly as possible. In-memory computing is already becoming a core component of high-performing applications within government, with great results. Best of all, it lets you achieve this massive scale while supercharging the speed of your enterprise applications so you can tackle big data quickly.
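The core idea behind in-memory computing can be illustrated with a simple sketch (this is a generic, hypothetical example – the names `slow_lookup` and `InMemoryCache` are invented here and are not part of any Software AG product): data that would otherwise be fetched from a slow disk- or network-bound store on every request is instead held in RAM after the first access, so subsequent reads are served at memory speed.

```python
import time

def slow_lookup(key):
    """Simulates a disk- or network-bound data store (hypothetical)."""
    time.sleep(0.05)  # stand-in for I/O latency
    return {"record": key.upper()}

class InMemoryCache:
    """Keeps previously fetched records in RAM for fast repeat access."""

    def __init__(self, backing_store):
        self._store = backing_store
        self._cache = {}  # in-memory copy of hot data

    def get(self, key):
        # Serve from memory when possible; hit the slow store only once.
        if key not in self._cache:
            self._cache[key] = self._store(key)
        return self._cache[key]

cache = InMemoryCache(slow_lookup)
first = cache.get("alpha")   # first call pays the slow-store cost
second = cache.get("alpha")  # repeat call is served from memory
```

Production in-memory data grids add distribution, replication, and eviction on top of this basic pattern, which is how they scale across multiple applications rather than a single process.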

Data Center Consolidation in Government: Progress and Best Practices

Recently, Software AG Government Solutions Chief Solutions Architect Chris Steel took part in a panel discussion on Federal News Radio about data center consolidation in government. Other panelists included Mike Krieger, Deputy CIO, US Army; Ed Dorris, CIO, OCC/Department of Treasury; Mary Givvines, Deputy Director for Office of Information Services, NRC; Vaughn Stewart, Chief Technical Evangelist, Pure Storage; and Tony Evans, Director, Worldwide Defense Systems, Schneider Electric. The panel was moderated by Jim Flyzik of the Flyzik Group.
