I forget: what's in-memory?

Boris Evelson

In-memory analytics is all the buzz right now, for multiple reasons. Speed of querying, reporting, and analysis is just one; flexibility, agility, and rapid prototyping are others. But whatever the reason, not all in-memory approaches are created equal. Let's look at the 5 options buyers have today:
 

1. In-memory OLAP. A classic MOLAP cube loaded entirely in memory.

Vendors: IBM Cognos TM1, Actuate BIRT
Pros

  • Fast reporting, querying, and analysis, since the entire model and data are all in memory.
  • Ability to write back.
  • Accessible by 3rd party MDX tools (IBM Cognos TM1 specifically)

Cons

  • Requires traditional multidimensional data modeling.
  • Limited to a single physical memory space (theoretical limit of 3TB, but we haven't seen production implementations of more than 300GB – this applies to the other in-memory solutions as well)
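
To make option 1 concrete, here is a minimal, illustrative Java sketch of the idea: every cell of the cube lives in memory, keyed by a dimension tuple, and write-back simply overwrites a cell in place. The class and dimension names are invented for the example; this is not how TM1 or BIRT is actually implemented.

    import java.util.HashMap;
    import java.util.Map;

    // Minimal illustration of the in-memory MOLAP idea: all cells live in RAM,
    // keyed by a dimension tuple, so queries never touch disk.
    public class InMemoryCube {
        private final Map<String, Double> cells = new HashMap<>();

        // Aggregate a fact into the cell addressed by (region, product, month).
        public void load(String region, String product, String month, double amount) {
            cells.merge(region + "|" + product + "|" + month, amount, Double::sum);
        }

        // Point query: return the aggregated measure for one cell, 0 if empty.
        public double get(String region, String product, String month) {
            return cells.getOrDefault(region + "|" + product + "|" + month, 0.0);
        }

        // Write-back: overwrite a cell value directly in memory.
        public void writeBack(String region, String product, String month, double value) {
            cells.put(region + "|" + product + "|" + month, value);
        }

        public static void main(String[] args) {
            InMemoryCube cube = new InMemoryCube();
            cube.load("EMEA", "Widgets", "2010-03", 1200.0);
            cube.load("EMEA", "Widgets", "2010-03", 800.0);
            System.out.println(cube.get("EMEA", "Widgets", "2010-03"));  // 2000.0
            cube.writeBack("EMEA", "Widgets", "2010-03", 2500.0);        // planning-style write-back
            System.out.println(cube.get("EMEA", "Widgets", "2010-03"));  // 2500.0
        }
    }

The memory-bound nature of the approach also falls out of the sketch: every cell has to fit in one process's heap, which is exactly the single-memory-space limit noted above.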

 

2. In-memory ROLAP. ROLAP metadata loaded entirely in memory.

Vendors: MicroStrategy
Pros

  • Speeds up reporting, querying and analysis since metadata is all in memory.
  • Not limited by physical memory

Cons

  • Only the metadata, not the entire data model, is in memory, although MicroStrategy can build complete cubes from a subset of data held entirely in memory.
  • Requires traditional multidimensional data modeling.
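
For contrast with option 1, here is a hedged sketch of the in-memory ROLAP idea: only the mapping from logical names to warehouse tables and columns sits in memory, and each request is translated into SQL that runs against the database, so the data itself never has to fit in RAM. The class, table, and column names are invented; this is not MicroStrategy's actual engine.

    import java.util.HashMap;
    import java.util.Map;

    // Illustrative sketch of in-memory ROLAP: only the metadata layer (logical
    // names mapped to warehouse tables and columns) is held in RAM; the data
    // stays in the relational warehouse and is fetched via generated SQL.
    public class RolapMetadata {
        private final Map<String, String> attributeToColumn = new HashMap<>();
        private final Map<String, String> metricToExpression = new HashMap<>();
        private final String factTable;

        public RolapMetadata(String factTable) {
            this.factTable = factTable;
        }

        public void mapAttribute(String logicalName, String column) {
            attributeToColumn.put(logicalName, column);
        }

        public void mapMetric(String logicalName, String sqlExpression) {
            metricToExpression.put(logicalName, sqlExpression);
        }

        // Resolve a logical request ("Revenue by Region") into warehouse SQL.
        public String generateSql(String metric, String attribute) {
            return "SELECT " + attributeToColumn.get(attribute) + ", "
                 + metricToExpression.get(metric)
                 + " FROM " + factTable
                 + " GROUP BY " + attributeToColumn.get(attribute);
        }

        public static void main(String[] args) {
            RolapMetadata model = new RolapMetadata("sales_fact");
            model.mapAttribute("Region", "region_code");
            model.mapMetric("Revenue", "SUM(sale_amount)");
            // SELECT region_code, SUM(sale_amount) FROM sales_fact GROUP BY region_code
            System.out.println(model.generateSql("Revenue", "Region"));
        }
    }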

 

3. In-memory inverted index. Index (with data) loaded into memory.

Vendors: SAP BusinessObjects (BI Accelerator), Endeca

Pros

  • Fast reporting, querying, and analysis, since the entire index is in memory
  • Less modeling required than with an OLAP-based solution
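
As a rough illustration of this option, the sketch below builds a simple inverted index in memory: each term maps to the set of record IDs that contain it, so keyword-style queries are answered straight from RAM without a predefined dimensional model. It is a toy example, not the actual BI Accelerator or Endeca implementation, and the sample records are invented.

    import java.util.Collections;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.Set;
    import java.util.TreeSet;

    // Illustrative in-memory inverted index: every term points to the set of
    // record IDs that contain it, so lookups never touch disk.
    public class InvertedIndex {
        private final Map<String, Set<Integer>> index = new HashMap<>();

        // Tokenize a record and register each term against the record ID.
        public void add(int recordId, String text) {
            for (String term : text.toLowerCase().split("\\W+")) {
                index.computeIfAbsent(term, t -> new TreeSet<>()).add(recordId);
            }
        }

        // Return the IDs of all records containing the term.
        public Set<Integer> search(String term) {
            return index.getOrDefault(term.toLowerCase(), Collections.emptySet());
        }

        public static void main(String[] args) {
            InvertedIndex idx = new InvertedIndex();
            idx.add(1, "Invoice 4711 late shipment EMEA");
            idx.add(2, "Invoice 4712 on-time shipment APAC");
            System.out.println(idx.search("shipment"));  // [1, 2]
            System.out.println(idx.search("emea"));      // [1]
        }
    }
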
Read more

Asset Virtualization – When Avatars Are Field Engineers

Holger Kisker

Smoke and fire are all around you, the sound of the alarm makes you dizzy, and people are running in panic to escape the inferno while you have to find your way to safety. This is not a scene from the latest video game but actual training (for field engineers, for example) in an exact virtual copy of a real-world environment such as an oil platform or a manufacturing plant.

In a recent discussion with VRcontext, a company based in Brussels that has specialized in asset virtualization for 10 years, I was fascinated by the possibility of creating virtual copies of large, extremely complex real-world assets simply from existing CAD plans or on-site laser scans. It's not just the 3D virtualization but the integration of the virtual world with Enterprise Asset Management (EAM), ERP, LIMS, P&ID, and other systems that allows users to track, identify, and locate every single piece of equipment in the real and virtual worlds.

These solutions are used today for safety training simulations as well as to increase operational efficiency, for example in asset maintenance processes. There are still areas for further improvement, such as the integration of RFID tags or sensor readings. However, as the technology matures further, I can see future use cases all over the place – from the virtualization of any kind of location that is difficult or dangerous to enter to simple office buildings for a 'company campus tour' or a 'virtual meeting'. And it doesn't require supercomputing power – it all runs on low-spec, 'standard' PCs, and the models take up only a few gigabytes of storage.

So if you are bored with running around in Second Life or World of Warcraft and you ever get the chance, exchange your virtual sword for a wrench and visit the 'real' virtual world of a fascinating oil rig or refinery.

Please leave a comment or contact me directly.

Kind regards,

Holger Kisker

The Battle Of Partner Eco-Systems

Holger Kisker

On the need to analyze, compare and rate partner eco-systems – please vote.

The world is becoming more and more complex, and so are the business challenges and their related IT solutions. Today no single vendor can provide complete end-to-end solutions from physical assets to business process optimization. Some large vendors, like IBM, Oracle, or HP, have extended their solution footprint to cover more and more of the four core IT markets (hardware, middleware software, business applications, and services) but still require complementary partner solutions to cover end-to-end processes. Two examples of emerging complex IT solutions:

  • Smart Computing integrates the physical world with business process optimization via four steps: Awareness (sensors, tags etc.), Analysis (analytic solutions), Alternatives (business applications with decision support) and Action (feedback loop into the physical world). A few specialized vendors such as Savi Technology can cover the whole portfolio from sensors to business applications for selected scenarios. However, in general a complete solution requires many partners working closely together to enable an end-to-end process.
  • Cloud Computing includes different IT resources (typically infrastructure, middleware, and applications) that are offered in pay-per-use, self-service models via the Internet. The seamless consumption of these resources by the end user, anytime and anywhere, however, requires multiple technologies, processes, and a challenging governance model, often with many different stakeholders involved behind the scenes.
Read more

Three Top Questions To Ask a BI Vendor

Boris Evelson

An editor from a leading IT magazine asked me this question just now, so I thought I'd also blog about it. Here it goes:

 

Q1: What are the capabilities of your services organization to help clients not just with implementing your BI tool, but with their overall BI strategy?

 

The reason I ask this as a top question is that most BI vendors these days have modern, scalable, function-rich, robust BI tools. So the real challenge today is not with the tools but with governance, integration, support, organizational structures, processes, etc. – something that only experienced consultants can help with.
 
Q2: Do you provide all of the components necessary for an end-to-end BI environment (data integration, data cleansing, data warehousing, performance management, portals, etc., in addition to reports, queries, OLAP, and dashboards)?
 
If a vendor does not, you'll have to integrate these components from multiple vendors yourself.
 
Read more

Number of people using BI

Boris Evelson

A number of clients ask me, "How many people do you think use BI?" It's not an easy question to answer, it will not be an exact science, and any answer comes with many caveats. But here we go:

 

  1. First, let's assume that we are only talking about what we all consider "traditional BI" apps. Let's exclude homegrown apps built using spreadsheets and desktop databases. Let's also exclude operational reporting apps that are embedded in ERP, CRM, and other applications.
  2. Then, let's cut out everyone who only gets the results of a BI report/analysis in a static form, such as a hardcopy or a non-interactive PDF file. So if you're not creating, modifying, viewing via a portal, sorting, filtering, ranking, drilling, etc., you probably do not require a BI product license, and I am not counting you.
  3. I'll just attempt to do this for the US for now. If the approach works, we'll try it for other major regions and countries.
  4. The number of US businesses with over 100 employees (a reasonable cutoff for the size of business that would consider using what we define as traditional BI) was 107,119 in 2004.
  5. The US Dept of Labor provides ranges such as "firms with 500-749 employees." For each range, I take the midpoint. For the last range, "firms with over 10,000 employees," I use an average of 15,000 employees.
  6. This gives us 66 million (66,595,553) workers employed by US firms who could potentially use BI.
  7. Next, we take the data from our latest BDS numbers on BI, which tell us that 54% of firms are using BI. That gives us roughly 36 million (35,961,598) workers employed by US firms that use BI (a quick sketch of the arithmetic follows below).
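
For readers who want to check the arithmetic, here is a small sketch of the method described above. The firm count in the midpoint example is an illustrative placeholder, not the actual Dept of Labor table; the 66,595,553 worker total and the 54% adoption rate are the figures cited in the list.

    // Back-of-the-envelope version of the estimate above.
    public class BiUserEstimate {
        public static void main(String[] args) {
            // Midpoint method for a labor-statistics size band such as "500-749 employees".
            long firmsInBand = 10_000;                 // illustrative count, not the actual table
            long midpointEmployees = (500 + 749) / 2;  // take the middle of the range
            System.out.printf("Workers in the 500-749 band: %,d%n", firmsInBand * midpointEmployees);

            // Totals from the post: 66,595,553 potential users, 54% of firms using BI.
            long potentialUsers = 66_595_553L;
            double adoptionRate = 0.54;
            long biUsers = (long) (potentialUsers * adoptionRate);
            System.out.printf("Estimated US BI users: %,d%n", biUsers);  // 35,961,598
        }
    }
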
Read more

Elastic Caching Platforms Balance Performance, Scalability, And Fault Tolerance

Mike Gualtieri

Fast Access To Data Is The Primary Purpose Of Caching

Developers have always used data caching to improve application performance. (CPU registers are data caches!) The closer the data is to the application code, the faster the application will run, because you avoid the access latency caused by disk and/or network. Local caching is fastest because you cache the data in the same memory space as the code itself. Need to render a drop-down list faster? Read the list from the database once, and then cache it in a Java HashMap. Need to avoid the performance-sapping disk thrashing of an SQL call made repeatedly to render a personalized user's Web page? Cache the user profile and the rendered page fragments in the user session.
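
As a minimal sketch of that local-cache pattern (the data-access call is a hypothetical stand-in for the real JDBC or ORM query):

    import java.util.Arrays;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Local cache for reference data: read it from the database once, then serve
    // every later request from a HashMap in the application server's own memory.
    public class CountryListCache {
        private final Map<String, List<String>> cache = new HashMap<>();

        public List<String> getCountries() {
            // computeIfAbsent hits the database only on the first call (cache miss).
            return cache.computeIfAbsent("countries", key -> loadCountriesFromDatabase());
        }

        // Hypothetical data-access method standing in for the real query.
        private List<String> loadCountriesFromDatabase() {
            return Arrays.asList("Belgium", "Germany", "United States");
        }

        public static void main(String[] args) {
            CountryListCache cache = new CountryListCache();
            System.out.println(cache.getCountries());  // first call loads from the "database"
            System.out.println(cache.getCountries());  // second call is served from memory
        }
    }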

Although local caching is fine for Web applications that run on one or two application servers, it is insufficient if any or all of the following conditions apply:

  • The data is too big to fit in the application server memory space.
  • Cached data is updated and shared by users across multiple application servers.
  • User requests, and therefore user sessions, are not bound to a particular application server.
  • Failover is required without data loss.

To overcome these scaling challenges, application architects often give up on caching and instead turn to the clustering features provided by relational database management systems (RDBMSes). The problem: This often comes at the expense of performance and can be very costly to scale up. So how can firms get improved performance along with scale and fault tolerance?
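
Before giving up on caching, it is worth seeing what the shared alternative looks like. Below is a hedged sketch of the cache-aside pattern against a cluster-hosted cache; the DistributedCache interface is hypothetical (not the API of any particular product). The point is only that every application server consults the shared cache first and falls back to the database on a miss, so cached data is shared across servers and survives the loss of any one of them.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Optional;

    // Hypothetical client interface for a cache that is partitioned and replicated
    // across a cluster of cache servers; this is not any specific product's API.
    interface DistributedCache {
        Optional<String> get(String key);
        void put(String key, String value);
    }

    // Cache-aside sketch: ask the shared cache first, fall back to the database
    // only on a miss, then populate the cache for the other application servers.
    public class ProfileService {
        private final DistributedCache cache;

        public ProfileService(DistributedCache cache) {
            this.cache = cache;
        }

        public String getUserProfile(String userId) {
            return cache.get("profile:" + userId).orElseGet(() -> {
                String profile = loadProfileFromDatabase(userId);  // stand-in for the real query
                cache.put("profile:" + userId, profile);
                return profile;
            });
        }

        private String loadProfileFromDatabase(String userId) {
            return "profile-data-for-" + userId;                   // placeholder data
        }

        public static void main(String[] args) {
            // Local stand-in for the remote cache grid, just to make the sketch runnable.
            Map<String, String> store = new HashMap<>();
            DistributedCache fake = new DistributedCache() {
                public Optional<String> get(String key) { return Optional.ofNullable(store.get(key)); }
                public void put(String key, String value) { store.put(key, value); }
            };
            ProfileService service = new ProfileService(fake);
            System.out.println(service.getUserProfile("42"));  // miss: loads and caches
            System.out.println(service.getUserProfile("42"));  // hit: served from the shared cache
        }
    }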

Elastic Caching Platforms Balance Performance With Scalability And Availability

Read more

SAP Middleware Directions: More Open Source, In-Memory Stuff

John R. Rymer

At the 15 March press and analyst Q&A by SAP co-CEOs Jim Hagemann Snabe and Bill McDermott, new middleware boss Vishal Sikka shed more light on the company's intentions for NetWeaver. Many of SAP's business applications customers use NetWeaver, both as a foundation for SAP's applications and to extend those applications using integration, portals, and custom-developed apps. For about a year, the question has been how much additional investment SAP will put into NetWeaver.

Sikka made two comments that indicate how he's thinking about the NetWeaver portfolio.

1. In response to my question about whether SAP is concerned that Oracle's ownership of Java will put it at a disadvantage, Sikka started by highlighting SAP's work on Java performance, but then noted the availability of good open-source Java software to support the requirements of SAP customers.

Read more

A Dire Year For Banking Platform Vendors

Jost Hoppermann

For the past couple of years, I have worked on the analysis of global banking platform deals at this time of the year. Currently, I’m again working on the results of a global banking platform deals survey, this time for the year 2009. Accenture and CSC did not participate in 2009, and former participants Fiserv and InfrasoftTech continued their absence from the survey, which started about two years ago. The 2009 survey began with confirmed submissions from a total of 19 banking platform vendors.

We would have been glad to see more participating vendors, in particular some of the more regionally oriented ones. However, US vendor Jack Henry & Associates as well as multiple regional vendors in Eastern Europe, Asia, and South America did not participate. Nevertheless, the survey saw some “newcomers” from the Americas, Europe, and the Middle East, for example, Top Systems in Uruguay, Eri Bancaire in Switzerland, and Path Solutions in Kuwait. Consequently, the survey now covers banking platform vendors in all regions of the world except Africa and Central America.

However, 19 was not the final vendor count: One of the 19 vendors, France-based banking platform vendor Viveo, dropped out of the survey because Temenos acquired it shortly before Viveo provided its data. Another vendor simply told us that it only saw business with existing clients and, in the absence of any business with new clients, it saw no sense in participating. While all other participating vendors won business with new clients (whether the rules of the game allowed Forrester to count that business or not), 2009 was not the best of times.

Read more

Progress Software’s Coming Out Party

John R. Rymer

We all need to revisit our understanding of Progress Software. On March 4, I was introduced to the “new and improved” Progress at the company’s annual briefing for industry and financial analysts. The company is a new enterprise software vendor with 25 years of experience. If you know about Progress, it is likely through an ISV solution based on the OpenEdge database/4GL. Or perhaps through the Sonic enterprise service bus ... or the Actional SOA management product.

How you should think about Progress Software now (see Figure):

First, Progress Software has a new mission, which it calls "operational responsiveness." To achieve this mission, Progress will primarily seek to help enterprises develop real-time, event-based architectures that extend existing systems. Real-time, event-based systems let companies see what's going on in their business processes at any given moment and act while transactions and interactions are in flight to fix problems, ensure compliance, add revenue opportunities, and/or cut costs. Example scenarios:

Read more

Who are the BI Personas?

Boris Evelson

The world is changing. The traditional lines of demarcation between IT and business, developers and end users, producers and consumers of information no longer work. But every time I attempted to create a matrix of BI personas in the new world, I ended up with so many dimensions (business vs. IT, consumers vs. producers, strategic vs. tactical vs. operational decisions, departmental vs. line-of-business vs. enterprise cross-functional roles, running canned reports vs. ad hoc queries, and many others) that the result was quite unreadable. But there still has to be something that, on the one hand, shows the realities of the new BI world, yet fits onto a single PPT slide. Here's my first attempt at it (click on the small image to see the full one).

 


In this diagram, I attempt to show:

  • Who's consuming vs. producing the information, and how heavy or light that task is. What's interesting is that all our research shows that most BI personas are now both consumers and producers of info.
  • Who's using which style of BI (reports, queries, dashboards, or OLAP)
  • Who is using BI only as reports and dashboards embedded in enterprise apps (such as ERP, CRM, and others), which usually means canned reports and prebuilt dashboards, vs. BI as a standalone app
  • Who's using nontraditional BI apps, such as the ones that allow you to explore (vs. just report and analyze) and to perform that analysis without the limitations of an underlying data model
  • Who's a producer and a consumer of advanced analytics
  • And, finally, the level of reliance on IT for every group

As always, all comments, suggestions, and criticism are very welcome!