Not All In-Memory Analytics Tools Are Created Equal

I get many questions from clients interested in evaluating different in-memory technologies. My first piece of advice is not to mix apples and oranges: clearly understand the differences between in-memory indexes, in-memory OLAP, in-memory ROLAP, in-memory spreadsheets, and other approaches. See my recent blog entry "I forget: what's in-memory?" for more detail on the differences. Once you zero in on a particular segment, you can indeed make an apples-to-apples comparison. Let's say we pick the category of in-memory associative indexes, which includes Microsoft PowerPivot, QlikTech, and TIBCO Spotfire. We also sometimes run across Advizor Solutions, but typically in smaller clients (and we do not include them in The Forrester Wave™ process). I recommend a three-step approach to comparing these four tools:

  1. First, compare the commodity features of the vendors and tools: data integration, portal integration, and operational features like administration and security. If you are in a hurry, you can leverage the detailed evaluation behind our slightly outdated 2008 BI Forrester Wave, or you can wait another month or so for the 2010 update to be published (it's in the last stages of editing at this point). Or, if you are a Forrester IT client (not a vendor client), send me a note and I'll share a draft preview with you.
  2. Next, I recommend using our BI consultants directory to figure out how many trained professionals are out there for each tool, so that you can feel comfortable you'll be able to get help with the tools in your specific region and industry.
  3. Last but not least, evaluate features of these vendors that are truly differentiated. For example:
  • Memory optimization. What compression ratios does the tool achieve? What's the theoretical and practical limit on the size of its in-memory data models? What is the largest data model the vendor has in production? Since QlikTech has been in this game the longest, I typically hear from my clients that QlikTech has larger in-production models than the others.
  • Load time. How long does it take to load the model into memory (load speed per 1 GB, for example)? This will largely depend on your particular hardware and infrastructure, so you need to run your own tests.
  • Memory swapping. What is the vendor's approach for handling models that are larger than what fits into a single memory space? Microsoft's and QlikTech's in-memory models are limited to what fits into memory at one time. TIBCO Spotfire, however, can swap parts of the model in and out of memory, so you can indeed work with models larger than what would fit into memory at once.
  • Incremental updates. What is the vendor's approach for delta or incremental updates to the in-memory model without rebuilding or reloading the whole model? Here, only QlikTech has a solution: it can update its model row by row.
  • Thick and thin clients. What are the differences between the thick- and thin-client versions in terms of functionality, interactivity, etc.? What functionality do you lose when you run the in-memory model on a server and access it via a browser?
  • Access by third-party tools. Can you access the in-memory model via industry-standard SQL and MDX reporting and analysis tools? Only Microsoft lets you do that, by publishing PowerPivot models to SharePoint and turning them into an SSAS cube.
  • GUI. Microsoft users will find a familiar Excel-like interface, while QlikView and TIBCO Spotfire UIs are more visual.
  • Advanced analytics. Are you interested in combining advanced analytics (statistical analysis, predictive modeling, etc.) with your traditional BI reporting and analysis? QlikTech does not offer that capability. PowerPivot will leverage a few basic statistical and data mining functions that come with Excel and SQL Server. TIBCO Spotfire does offer advanced analytics functionality, via the S+ technology it acquired from Insightful in 2008.
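On the load-time point above, you can put together a minimal benchmark yourself. The sketch below (plain Python, using a synthetic temp file rather than a real extract) simply times how fast a file can be pulled into RAM, as a rough proxy for load speed per GB on your hardware:

```python
import os
import tempfile
import time

def measure_load_speed(path):
    """Read a file fully into memory and return (size in MB, MB/s).
    A rough proxy for in-memory model load speed on your hardware."""
    size_mb = os.path.getsize(path) / (1024 * 1024)
    start = time.perf_counter()
    with open(path, "rb") as f:
        f.read()  # pull the whole file into RAM
    elapsed = time.perf_counter() - start
    return size_mb, size_mb / elapsed

# Demo on a synthetic 16 MB file; point this at your own extract instead.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(16 * 1024 * 1024))
size_mb, mb_per_s = measure_load_speed(tmp.name)
os.remove(tmp.name)
print(f"Loaded {size_mb:.1f} MB at {mb_per_s:.1f} MB/s")
```

Real tools do parsing, compression, and indexing during load, so vendor numbers will be well below raw file-read throughput, but this gives you a per-machine baseline to compare against.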

Did I miss anything?


A few additional criteria for evaluating in-memory BI platforms


Nice article about in-memory technology. I'm glad to hear you make the point that not all in-memory solutions are the same. In-memory isn't an end in itself; it's an enabler. I respectfully submit a few other evaluation criteria we see our customers considering as they compare in-memory BI platforms:

* Which user interface makes the best use of the fact that the data is in memory? Does the product provide an associative experience -- allowing the user to wander through the data, making selections and observing interactions? Does it work the way the mind works?
* Can one single product be used to address all of the following: fully-fledged dashboards, sophisticated analysis, and reports?
* Does the product have all of the required functionality to extract data from all required sources, as well as keep the data updated in whatever way is required (for example, incremental updates -- as you mentioned)?
* Can end users create their own supplementary analysis through a zero-footprint client and share this analysis with other users? Or do developers or power users have to do this?


Erica Driver, QlikTech Inc.

In memory analytics - being different

This post demonstrates to me the excitement that a new technology can stir, although 'in-memory' is not new and, as stated above, 'BI' is not always the correct term to use. There is huge potential for the products in the in-memory analytics market space -- probably the biggest we have seen for a long time -- as the idea of users, not IT, accessing corporate data to build their own analytics becomes more widely accepted. At last the chasm of capability between desktop and centrally deployed BI solutions can be filled with user-driven in-memory analytic tools, whether on the desktop, on a server, or even in the cloud.
I am continually fascinated by what 'analysts' have to say about the IT world we live in. In-memory, to me, has the same constraints as the Formula 1 racing scene: we have to deal with suboptimal conditions. Those constraints are driven by hardware manufacturers and providers of internet services. IBM, for example, is making huge strides in building faster and better in-memory-ready machines, yet the use of memory and the memory type are constrained by manufacturers and by the OS, and further still by the deployment and speed of the application under varying browser capabilities, and so on.
We thus have, as ever, two conditions: the 'lab' and the 'real world'. In the lab, I am sure one product can be proven technically better than the rest; in the real world, however, we have to deal with the constraints imposed on us that affect the practicalities of doing business.
The company I work for sells only QlikView, and the reasons we work with QlikTech are in themselves reasons to select software: we like the company, we like the product, and our customers do too. Further, and much more importantly, we like the prospect of better things to come. The right way of utilising technology to enable analytics today is not going to be the same as tomorrow. With the demise of IE and the trend towards Ajax, the true flavour of this sector has not yet fully materialised. We are happy to be working with an innovative, progressive company that is not afraid and that cares.
The questions we live by every day are:
• Can we satisfy customers' needs with the software, and will it do more?
• Does the software provide the ROI that customers need?
The answer is of course yes, and beyond that, users simply like QlikView. How one measures this, and more importantly how one measures the passion and the 'Ferrari effect' (to use an F1 analogy) that contagious, viral applications have, I do not know.
Back in our constrained world, in-memory analytics does have a place in the strategic software stack, and every company should, in my opinion, keep an open mind about utilising at least one of the many tools available, to ensure that IT costs are not overrunning in order to build the analytics and reports that users have spent years crying out for.
As a footnote, I'd suggest that, unlike with traditional BI tools, more customers will end up with more than one in-memory product. When selecting, look for why the tools are different, not why they are the same. QlikView has patented technology in data association, and this makes for some of the most insightful data analytics. Insight, of course, is what drives business decisions and is clearly more beneficial than the innermost workings of memory optimisation. Do Ferrari fans care about braking or fuel efficiency, or just winning?

Adrian, thanks for sharing

Adrian, thanks for sharing your thoughts

A few additional criteria to consider

First let me say that I agree entirely with Andre; the intelligent part of business intelligence is people and not software. That's why I also fully agree with Erica; in-memory is just a technology.

A few additional points you might want to consider:

1) There are some features that are fundamental to pretty much any type of analysis, and I think the management of time data is one of these: the ability to do time-based analysis automatically, so long as the appropriate fields exist in the data, is fundamental. The ability to drill down into data is also an essential feature for data analysis software to have. Some consideration of whether these features exist and how user-friendly they are, while not particularly objective or easy to measure, would be useful.

2) I also think that some measure of the investment involved in implementing and maintaining the software and data would also be useful.

Does implementation take weeks, months, or years?
How much involvement is needed from IT when updating data? What about creating new views on data or adding data to analyses?

Again not easy to measure but important when selecting software.

Aran Nathanson

In-Memory BI


A few comments to the article itself and to some of the good folks who replied.

1. Boris, good categorization of in-memory products, although I believe many customers compare these tools to each other for lack of understanding of what they truly provide.

2. Erica, it is not very subtle to take an article that attempts to be unbiased and throw in "criteria" that happen to be QlikTech's mantras but do not necessarily have anything to do with in-memory technology. It took QlikTech some time to drop the "associative in-memory technology" nonsense and admit that the term "associative" applies only to how the user interface functions; and let me tell you, that is not the only way to get a decent BI experience out of in-memory technology. Although it is useful.

3. Andrew, QlikView's in-memory technology is not columnar. They take a table, place it in RAM, and apply token-based compression, which is basically another term for indexing strings.
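That kind of token-based compression is easy to sketch. The toy example below illustrates dictionary encoding in general (my own illustration, not QlikTech's actual implementation): each distinct string is stored once in a symbol table, and the column holds small integer tokens instead of repeated strings.

```python
def tokenize(column):
    """Dictionary-encode a column of strings: store each distinct
    value once and replace occurrences with integer tokens."""
    symbols = {}  # string -> token
    tokens = []
    for value in column:
        if value not in symbols:
            symbols[value] = len(symbols)
        tokens.append(symbols[value])
    # Invert the map so token -> string lookups are O(1).
    lookup = [None] * len(symbols)
    for value, tok in symbols.items():
        lookup[tok] = value
    return tokens, lookup

column = ["Germany", "Sweden", "Germany", "Germany", "Sweden", "USA"]
tokens, lookup = tokenize(column)
print(tokens)                       # [0, 1, 0, 0, 1, 2]
print([lookup[t] for t in tokens])  # round-trips to the original column
```

The compression ratio depends entirely on cardinality: a column with millions of rows but a handful of distinct values collapses to a tiny symbol table plus small integers, which is why vendor compression claims vary so much from dataset to dataset.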

4. The reason QlikTech doesn't bother connecting to OLAP (apart from the technical challenge of presenting multidimensional models in the flat form QlikView supports) is that most customers don't need it. Where OLAP exists, QlikView is either useless or used for tasks that are too much of a pain to do via the OLAP solution; they just connect to the underlying data warehouse.

5. PowerPivot is Microsoft's way of giving up on mass-marketing BI via the Analysis Services platform. Unless you are using it for personal analysis on a desktop, PowerPivot won't change much within the BI landscape. QlikTech understood this and started providing its personal version (which is pretty much Excel on steroids) for free. The other way to use PowerPivot, with SQL Server 2008 R2 and SharePoint, is just too much of a hassle compared with solutions that have significantly simpler product architectures.

6. Finally, the whole in-memory technology issue is way overrated. It has become commonly discussed since 64-bit computing became popular, but the truth is it is not the only way of getting "fast, powerful, easy-to-use" business intelligence. 64-bit computing, along with new column-based technologies that take advantage of 21st-century chipsets, does not require data to be fully memory-resident to provide a decent alternative to hefty OLAP projects.
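The column-store point can be illustrated in a few lines. In the toy sketch below (a generic illustration, not any vendor's engine), storing the table as one array per column means an aggregate scans only the column it needs, which is why columnar engines can answer such queries without holding the whole table in RAM:

```python
# Row-oriented layout: each record is a dict; summing revenue
# forces a walk over every whole record.
rows = [
    {"region": "EMEA", "product": "A", "revenue": 120},
    {"region": "APAC", "product": "B", "revenue": 95},
    {"region": "EMEA", "product": "B", "revenue": 200},
]

# Column-oriented layout: one array per column; summing revenue
# scans a single contiguous array and never touches region/product.
columns = {
    "region":  ["EMEA", "APAC", "EMEA"],
    "product": ["A", "B", "B"],
    "revenue": [120, 95, 200],
}

row_total = sum(r["revenue"] for r in rows)
col_total = sum(columns["revenue"])
assert row_total == col_total  # same answer, very different I/O pattern
```

On disk, the same layout lets an engine read only the "revenue" pages for this query, which is the core of the "decent alternative without full memory residence" argument.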

7. Please remember that OLAP is mainly good for multi-user applications, and that is not something PowerPivot will replace. QlikView focuses on these types of deals now (Personal is free, remember?), but the truth is that, technologically speaking, in-memory architecture is a very bad idea for scalable, multi-user BI apps. QlikView customers who bought it a year or two ago know exactly what that means (more hardware, less historical data, and/or fewer simultaneous users).

Founder of SiSense and
Author of 'The ElastiCube Chronicles'

QlikView data is stored in tabular format

To clear things up: QlikView data is stored in a tabular format -- a "record based table representation," specifically. This comes straight from the source: Håkan Wolgé, the braintrust behind much of QlikView. More info here:

Erica Driver, QlikTech