“Any time scalability and performance are an issue”

That’s the answer to the question, “When do we need in-memory computing?” The question came up in a meeting with a technology industry analyst today and his answer was so simple, so direct, that it needed to be repeated.

In-memory computing is the answer whenever scalability and performance are an issue.

Those close to technology probably know this, but for the marketplace as a whole, in-memory technology is still relatively unknown. Where it is known, it is often erroneously associated only with analytics. As if that weren't enough, there is confusion between two concepts: the in-memory data grid and the in-memory database.

It would be worthwhile to sort this out.

In-memory data grid (IMDG)

An in-memory data grid keeps data in a computer's main memory instead of reading from and writing to disk. Data is stored as name-value pairs, allowing far greater flexibility in the type of data being stored; no complex data schema is required. In a fast-changing technology landscape, this can be a life-saver.

In our Big Data world, in-memory computing allows a wide variety of data types to be managed easily, and on the fly. Most importantly, IMDGs scale horizontally (growing in storage and processing capacity) as resources are added or removed. This happens dynamically and, with some products, with no administration. It's no coincidence that interest in IMDG technology is growing rapidly alongside awareness of Big Data's challenges.
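To make the two ideas above concrete — schema-free name-value storage and horizontal scale-out — here is a deliberately tiny Python sketch of a grid. It is an illustration only, not any vendor's product: real IMDGs use consistent hashing and replication so that adding a node moves only a fraction of the keys, whereas this toy simply rehashes everything. The `MiniGrid` class and its node names are invented for the example.

```python
import hashlib

class MiniGrid:
    """Toy in-memory data grid: schema-free name-value pairs,
    hash-partitioned across a dynamic set of nodes."""

    def __init__(self, node_names):
        self.nodes = {name: {} for name in node_names}

    def _owner(self, key):
        # Deterministic hashing picks the node that owns a key.
        digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
        names = sorted(self.nodes)
        return names[digest % len(names)]

    def put(self, key, value):
        # Any value type under the same API: no schema to declare.
        self.nodes[self._owner(key)][key] = value

    def get(self, key):
        return self.nodes[self._owner(key)].get(key)

    def add_node(self, name):
        # Horizontal scale-out: add a node, then rebalance existing
        # entries. (Real grids avoid this full rehash via consistent
        # hashing; this is the naive version for clarity.)
        all_items = [kv for store in self.nodes.values() for kv in store.items()]
        self.nodes[name] = {}
        for store in self.nodes.values():
            store.clear()
        for k, v in all_items:
            self.put(k, v)

grid = MiniGrid(["node-a", "node-b"])
grid.put("user:42", {"name": "Ada", "tags": ["vip"]})  # nested structure
grid.put("sensor:7", 21.5)                             # a plain float
grid.add_node("node-c")                                # scale out; data survives
print(grid.get("user:42"))
```

The point of the sketch is the contract, not the internals: callers `put` and `get` by name, and capacity grows by adding nodes rather than by buying a bigger machine.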

IMDGs aren't as well known, and experienced developers need to get their heads around the new paradigm to take full advantage of its benefits. This will come with time. It has to: the marketplace for in-memory computing is expected to double annually for the next four years.

In-memory database (IMDB)

An IMDB also uses a computer's main memory, but it is built around the relational structure that has defined database architecture for decades. An IMDB supports Structured Query Language (SQL), the standard way to query data, and is familiar to application and database developers. Less flexible to scale horizontally, IMDBs are typically expanded by installing additional large servers and/or data appliances. This is neither as flexible nor as inexpensive, and it doesn't take advantage of commodity hardware the way IMDGs do.
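For a hands-on feel of what "relational, but in RAM" means, SQLite's in-memory mode is a convenient stand-in: connecting to `":memory:"` gives you a fully SQL-queryable database that never touches disk. (The `trades` table and its rows are invented for the example; commercial IMDBs are far more capable, but the programming model — declared schema, standard SQL — is the same.)

```python
import sqlite3

# ":memory:" creates a SQLite database that lives entirely in RAM:
# a declared relational schema, standard SQL, no disk I/O.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)"
)
conn.executemany(
    "INSERT INTO trades (symbol, qty) VALUES (?, ?)",
    [("ACME", 100), ("ACME", 250), ("INIT", 75)],
)
# The familiar part: any developer who knows SQL can query it immediately.
total = conn.execute(
    "SELECT SUM(qty) FROM trades WHERE symbol = ?", ("ACME",)
).fetchone()[0]
print(total)  # 350
conn.close()
```

Contrast this with the grid model: here the schema must be declared up front, which buys you SQL's expressiveness at the cost of the name-value store's flexibility.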

Early in-memory applications were essentially cache storage in front of a traditional database where speed was an issue. IMDB technology is well known, and companies find it easy to hire people who can work with it right away.

But here's a note of caution: IMDG products are chipping away at the features of IMDBs, and we can expect a single in-memory product to handle all of this functionality in the near future. IMDGs can do things that will make them the eventual winners, such as acting as a compute grid (in-memory MapReduce), 'pushing' events to clients on a network as data changes, and managing streaming queries.
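The "compute grid" idea mentioned above — in-memory MapReduce — means shipping the computation to where the data already sits in RAM, rather than moving the data. A toy sketch of the two phases (the `grid_map_reduce` function and the word-count example are invented for illustration; real grids run the map phase in parallel on each node):

```python
from collections import defaultdict

def grid_map_reduce(partitions, map_fn, reduce_fn):
    """Toy compute grid: map over each node's local partition,
    then reduce the partial results -- data never leaves memory."""
    # Map phase: each partition emits (key, value) pairs locally.
    partials = defaultdict(list)
    for partition in partitions:
        for record in partition:
            for key, value in map_fn(record):
                partials[key].append(value)
    # Reduce phase: fold each key's values into a single result.
    return {key: reduce_fn(values) for key, values in partials.items()}

# Word count across two "nodes" -- the canonical MapReduce example.
partitions = [
    ["in memory data grid", "in memory database"],  # node 1's data
    ["memory grid"],                                # node 2's data
]
counts = grid_map_reduce(
    partitions,
    map_fn=lambda line: [(word, 1) for word in line.split()],
    reduce_fn=sum,
)
print(counts["memory"])  # 3
```

Because both the data and the intermediate results stay in memory, there is no disk-bound shuffle stage — which is exactly what makes this feature hard for a pure database to match.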

Much more than analytics

The most powerful real-time analytics applications use in-memory data to allow very fast number crunching through drag-and-drop controls. Because this was one of the most visually impressive and business-facing uses, in-memory computing quickly became tightly associated with analytics.

In reality, in-memory computing supports high-volume transactions in telecom, financial markets and gaming. All three have incredibly high transaction rates that would overwhelm a traditional database. Though it started in outlier use cases like these, in-memory is becoming the de facto way to work with any data challenge. E-commerce, logistics and transportation, and commercial banks are adopting in-memory computing to manage data for things like fraud detection, sensor data and security.

Urgency’s best friend

And it isn't just in the back end that in-memory's speed and flexibility matter. In a Big Data world, as information arrives with velocity, volume, variety and volatility, there needs to be pre-processing that immediately sorts the interesting from 'that which can be figured out later'. In-memory computing acts as the filter for the things that need to be dealt with first, for any number of reasons. It is urgency's best friend.

The traditional relational database isn't dead, but in-memory computing is increasingly taking its place in business operations. This is a trend most analysts see continuing as the pressure of Big Data makes performance and scalability a major issue for most organizations.


Categories: Data Analytics / Big Data, Patterns / Rules / Events, Real-time

Author: Chris Taylor

Reimagining the way work is done through big data, analytics, and event processing. There's no end to what we can change and improve. I wear myself out...
