ModernGOV

Bringing Government IT Modernization and Innovation to the Forefront

Software AG Government Solutions In the News: In-Memory Computing

Big data was big news in 2014, and for good reason: data is at the heart of every organization. But data that is locked away, or that users can't reach where and when they need it, doesn't help anyone. In-memory computing is a transformative technology that dramatically scales your current application environment so users can get what they need, from multiple applications, as quickly as possible. It is already becoming a core component of high-performing applications in government, with strong results. Best of all, it delivers that scale while supercharging the speed of your enterprise applications, so you can tackle big data quickly.

Software AG Government Solutions has been in the news recently, sharing insights on how in-memory computing can help bring agencies up to speed and what agencies should consider when evaluating in-memory computing solutions.

Fabien Sanglier, Principal Architect at Software AG Government Solutions, outlined six crucial attributes of a high-performance in-memory architecture on Data Center Knowledge. In the article, he addresses the most important concerns to weigh when evaluating in-memory data management solutions:

1) Predictable, Extremely Low Latency. Look for in-memory management solutions that can manage terabytes of data without suffering from garbage collection pauses; keeping bulk data off the garbage-collected heap is a common way to achieve this (see the sketch after this list).

2) Easy Scaling with Minimal Server Footprint. Scaling to terabytes of in-memory data should be easy and shouldn't require the cost and complexity of dozens of servers and hundreds of virtual machines. Your in-memory management solution should be able to scale up as far as possible on each machine so that you aren't saddled with managing and monitoring a 100-node data grid.

3) Fault Tolerance and High Availability. Mission-critical applications demand fault tolerance and high availability. The volatile nature of in-memory data requires a data management solution that delivers five-nines (99.999 percent) uptime with no data loss and no single point of failure.

4) Distributed In-Memory Stores with Data Consistency Guarantees. When data is distributed across an array of servers, the in-memory architecture must ensure the consistency and durability of critical data across that array. Ideally, you'll have the flexibility to choose the appropriate level of consistency guarantee, from eventual to strong to transactional consistency.

5) Fast Restartability. In-memory architectures must allow machines to be brought back online quickly after maintenance or other outages, rather than waiting for large in-memory data sets to be rebuilt from scratch.

6) Advanced In-Memory Monitoring and Management Tools. In dynamic, large-scale application deployments, visibility and management capabilities are critical for optimizing performance and reacting to changing conditions. Your in-memory architecture should be supplemented with a clear dashboard showing up-to-the-millisecond performance of in-memory stores, along with easy-to-use tools for configuring in-memory data sets.
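To make the first, second, and fifth attributes concrete, here is a minimal sketch using the open-source Ehcache 3 API, which supports tiered on-heap, off-heap, and persistent disk storage. This illustrates the general technique, not the specific product discussed in the article; the cache name, tier sizes, and storage path are assumptions chosen for the example:

    import java.io.File;

    import org.ehcache.PersistentCacheManager;
    import org.ehcache.config.builders.CacheConfigurationBuilder;
    import org.ehcache.config.builders.CacheManagerBuilder;
    import org.ehcache.config.builders.ResourcePoolsBuilder;
    import org.ehcache.config.units.EntryUnit;
    import org.ehcache.config.units.MemoryUnit;

    public class TieredStoreSketch {
        public static void main(String[] args) {
            // Cache manager with a persistent root directory (illustrative path).
            PersistentCacheManager cacheManager = CacheManagerBuilder.newCacheManagerBuilder()
                .with(CacheManagerBuilder.persistence(new File("/var/tmp/agency-store")))
                .withCache("records", CacheConfigurationBuilder
                    .newCacheConfigurationBuilder(Long.class, String.class,
                        ResourcePoolsBuilder.newResourcePoolsBuilder()
                            .heap(10_000, EntryUnit.ENTRIES) // small, GC-managed hot set
                            .offheap(4, MemoryUnit.GB)       // bulk data outside the GC's reach
                            .disk(40, MemoryUnit.GB, true))) // persistent tier survives restarts
                .build(true); // true = initialize the manager immediately

            cacheManager.getCache("records", Long.class, String.class)
                        .put(42L, "cached record");
            cacheManager.close();
        }
    }

Because the off-heap and disk tiers live outside the Java heap, growing the cache into the tens of gigabytes does not lengthen garbage collection pauses, and a single machine can hold far more data per node. The persistent disk tier lets the store come back online after a restart without rebuilding from the source database.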

Read more of Fabien’s insights about in-memory architecture here.

Darryn Graham, Chief Architect at Software AG Government Solutions, shared four ways in-memory computing can bring agencies up to speed in an article for FCW. He discusses how federal agencies with mission-critical applications are learning to unshackle themselves from slow, disk-bound databases and embrace the tremendous benefits that come with managing data in memory. He highlights four key reasons in-memory computing is making inroads at agencies:

1) Speed and acceleration. With in-memory computing, federal IT teams can analyze data at a speed that improves its relevance to decision-making, helping agencies meet ever-shrinking decision windows.

2) Easy to use and easy to add. In-memory computing satisfies the “need it now” demands of users waiting for tabulations and evaluations. There is also no simpler way to store data than in its native format in memory (see the sketch after this list).

3) Cost savings and enhanced storage capabilities. With the precipitous drop in the cost of RAM over the past decade, in-memory computing has become a budget-friendly option for federal agencies. When procurement officials can buy a 96-gigabyte server for less than $5,000, or roughly $52 per gigabyte, in-memory storage of data makes smart fiscal and technical sense.

4) Higher throughput with real-time processing. In-memory computing significantly lowers system latency, which leads directly to dramatically higher throughput: when each transaction spends less time in the data layer, the same hardware completes more transactions per second. Agencies that run high-volume transactions can use in-memory data to boost processing capacity without adding computing power.
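As a simple illustration of the second and fourth points, the sketch below stores records in their native object form using nothing more than a plain Java map; the record type and values are hypothetical. On the throughput side, the arithmetic is straightforward: a worker that spends 10 milliseconds per transaction can complete at most 100 transactions per second, while the same worker at 1 millisecond can complete 1,000, a tenfold gain with no added hardware.

    import java.util.concurrent.ConcurrentHashMap;

    public class NativeFormatStore {
        // Hypothetical domain record, kept exactly as the application uses it.
        record CaseFile(long id, String agency, String status) {}

        public static void main(String[] args) {
            // The simplest in-memory store: live objects keyed by id, in native format.
            ConcurrentHashMap<Long, CaseFile> store = new ConcurrentHashMap<>();
            store.put(1001L, new CaseFile(1001L, "GSA", "OPEN"));

            // Reads are plain method calls on in-memory objects:
            // no schema mapping, no deserialization, no disk round trip.
            CaseFile file = store.get(1001L);
            System.out.println(file.agency() + " / " + file.status());
        }
    }
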

Read more of Darryn’s insights here.

In-memory computing offers unprecedented opportunities for innovation within government. To learn more and to access a free trial offer, visit the Software AG Government Solutions website.

Category: Data Trove
