
SAP adds in-memory processing and greater Hadoop integration to keep existing customers, not to draw new ones

In-memory processing is one of those technologies that has a simple definition but can be implemented in multiple ways. The core idea is consistent: when processing one or more queries against a given data source, keep as much of the data resident in memory as possible, rather than copying it selectively from storage for each query.
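As a rough illustration of that idea (a minimal Python sketch with hypothetical file and column names, not any vendor's actual implementation), the difference is between re-reading data from storage on every query and loading it once so that repeated queries are served entirely from memory:

```python
import csv

# Storage-bound approach: every query re-reads the file from disk.
def total_sales_from_disk(path, region):
    total = 0.0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["region"] == region:
                total += float(row["amount"])
    return total

# In-memory approach: load the data once, keep it resident,
# and answer every subsequent query from memory.
class InMemorySales:
    def __init__(self, path):
        with open(path, newline="") as f:
            self.rows = list(csv.DictReader(f))  # dataset stays in RAM

    def total_sales(self, region):
        return sum(float(r["amount"])
                   for r in self.rows if r["region"] == region)

# Usage (hypothetical data file "sales.csv"):
# store = InMemorySales("sales.csv")
# store.total_sales("EMEA")   # no disk I/O after the initial load
# store.total_sales("APAC")   # repeated queries stay memory-resident
```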

In a previous related article posted on MSR Communications, Nikita Ivanov of GridGain gave us a simple definition of what he thinks in-memory computing is all about. InfoWorld notes that in-memory databases are fast becoming the rule rather than the exception. Between Oracle, IBM, Microsoft, and Pivotal, there is barely a major database vendor or analytics technology that isn't getting a shot of in-memory processing power.

SAP now joins that list with its Business Warehouse product, using the term “in-memory data fabric” to describe the kind of in-memory processing it has built, and that framing offers a whole new perspective on in-memory computing.

Click here to read the full article on InfoWorld featuring GridGain!