Big Data, Cheap Storage Bring In-Memory Analytics Into Spotlight

In-memory analytics, like virtualization and the cloud, is an old idea that's been given new life. In this case, the combination of big data, inexpensive commodity storage and parallel processing makes it possible to analyze terabytes of data without slowing systems to a crawl.

By Allen Bernard
Thu, December 06, 2012

CIO — If you're paying attention to big data, lately you've probably heard terms such as in-memory analytics or in-memory technologies. Like many tech trends that appear new only because their histories are obscured by newer and sexier tech, or because time has yet to catch up with them—server virtualization and the cloud are just reinventions from the mainframe days, after all—in-memory is a term being resurrected by two trends today: big data and cheap, fast commodity storage, particularly DRAM.

"In-memory has been around a long, long time," says David Smith, vice president of marketing and community for Revolution Analytics, a commercial provider of software, services and support for R, the open source programming language underpinning much of the predictive analytics landscape. "Now that we have big data, it's only the availability of terabyte (TB) systems and massive parallel processing [that makes] in-memory more interesting."

If you haven't already, you'll start to see offerings such as SAP HANA and Oracle Exalytics that aim to bring big data and analytics together on the same box. You can also get HANA as a platform hosted in the cloud on Amazon Web Services, or through SAP's NetWeaver platform, which includes Java and some middleware.


Meanwhile, analytics providers including SAS, Cognos, Pentaho, Tableau and Jaspersoft have all rolled out offerings to take advantage of the in-memory buzz, even if some of these offerings are mere bolt-ons to their existing product suites, says Gary Nakamura, general manager of in-memory database player Terracotta, a Software AG company.

"They're saying, 'Hey, we're putting 10 gigs of memory into our product capability because that's all it can handle, but we're calling it an in-memory solution,'" Nakamura says. The question, he adds, is whether they can scale to handle real-world problems and data flows. (To be fair, Terracotta has just released two competing products, BigMemory Max and BigMemory Go, the latter of which is free up to 32 GB. Both products scale into the TB range and can run on virtual machines or in distributed environments.)

In-Memory Technology Removes Latency From Analytics

"What it comes down to," says Shawn Blevins, executive vice president of sales and general manager at Opera Solutions, is that each product has "an actual layer where we can stage the data model itself, not just the data—and they exist in the same platform and the same box in flash memory."
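The latency argument is easy to demonstrate with ordinary tooling. The sketch below is purely illustrative and uses SQLite rather than any vendor product mentioned in this article: it runs the identical aggregation query against an in-memory database and a disk-backed one, the only difference being where the data lives.

```python
import os
import sqlite3
import tempfile
import time


def build_and_query(conn, rows=100_000):
    """Load a toy sales table, then time a simple aggregation over it."""
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        (("east" if i % 2 else "west", float(i)) for i in range(rows)),
    )
    conn.commit()
    start = time.perf_counter()
    total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
    return total, time.perf_counter() - start


# In-memory database: data and query engine share the same RAM.
mem_total, mem_seconds = build_and_query(sqlite3.connect(":memory:"))

# Disk-backed database: same schema and query, but pages come from a file.
db_path = os.path.join(tempfile.mkdtemp(), "sales.db")
disk_total, disk_seconds = build_and_query(sqlite3.connect(db_path))

# The results are identical; only where the data is staged differs.
assert mem_total == disk_total
print(f"in-memory: {mem_seconds:.4f}s  disk-backed: {disk_seconds:.4f}s")
```

The measured gap here is small because the operating system caches file pages aggressively for a tiny table; the in-memory advantage the vendors are selling shows up when the working set runs to terabytes and disk I/O, not CPU, becomes the bottleneck.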

