Defining a successful BI strategy involves a lot more than gathering requirements and selecting a vendor. While it's been the subject of many books, I know few of you have time to read them, so here's a short version.
First, define what BI is and what it is not. Is it just reporting, analytics, and dashboards? Or does it also involve ETL, DW, portals, MDM, etc.?
If the former, you then need to define linkages, dependencies, overlaps, and integration with all of the latter (including - very importantly - integration and coordination with the higher-level enterprise architecture efforts). If the latter, it's a whole different subject, and you really do need to read a few thick books.
Ensure senior business executive commitment and a top-down mandate. If you cannot get that, do not proceed until you do. There are two ways to "sell BI" to them (even though that's not a good position to be in):
Educate them on BI ROI. Here's where you'd build a high level BI business case.
This year SAPPHIRE officially changed its name and became SAPPHIRE NOW. Why? Different answers from different people. Those who should know said: "The new name stresses the urgency." Urgency for whom, SAP? And will the next SAPPHIRE be named SAPPHIRE THEN? Never change a successful brand.
Another first for SAPPHIRE was the simultaneous show in Orlando, US, and Frankfurt, Germany. With 5,000 attendees in Frankfurt, 10,500 in Orlando, and 35,000 online participants, this was the biggest SAPPHIRE event ever. I must admit I was concerned about going to Frankfurt while everyone in Walldorf desperately tried to escape to Orlando. Who wants to attend a second-hand event? But now I'm a believer. SAP managed to balance the important parts of the show between Orlando and Frankfurt. Keynotes were held simultaneously in both locations via virtual video connection, with speakers in both cities. In general, I never had the feeling I was missing anything important in Frankfurt simply because it was the smaller event overall. It didn't make a difference whether I couldn't attend another 400 presentations in Frankfurt or 800 in Orlando out of the total of 1,200+ presentations - I had a packed agenda and got all that I expected and needed, including 1:1 meetings with SAP executives like Jim Snabe. The simultaneous, virtual set-up not only helped save a lot of money, it created a sense of a bigger virtual community and underlined SAP's ambitions for more sustainability. To all who traveled intercontinentally: shame on you; next year, stay in your home region!
Like every show SAPPHIRE 2010 had its stars as well:
I get this request almost on a weekly basis: "Boris, my BI vendor is offering me the following discount, is it a good deal or not?" The first question is: what are you comparing it to? It reminds me of an old joke. Q: How much is 5 times 5? A: Depends on whether you're buying or selling. Many vendors do not publish or reveal list prices, or even if they do, they are revealed only under NDA to each client, so good luck comparing what the vendor told you with what they told another client. So what ARE you comparing it to?
Another problem, IMHO, is that many vendors muddy the waters with CPU-based prices, clock-speed-based prices, etc. Yes, CPU-, server-, or core-based prices make sense if you are growing and want to lock in a good deal now, before you expand. But in the end you, the buyer, still need to figure out how much the software costs you per seat, per user. So with both of these challenges in mind, I looked through my 20+ years of notes on BI contracts and per-seat license costs and came up with the following. Notice an interesting X-factor (obviously, I rounded the numbers a bit to make the pattern look that neat):
BI output consumer, no interactivity $300
BI output consumer, with light (sort, filter, rank) interactivity $600 (or 2x)
BI output consumer with heavy interactivity (interactive dashboards, search, etc.) $1,200 (or 4x)
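To make the per-seat math concrete, here's a minimal sketch of normalizing a CPU-based quote into a per-seat figure you can compare against tiers like the ones above. The quote numbers are purely hypothetical, not actual vendor pricing:

```python
# Hypothetical example: normalize a CPU-based BI license quote to a
# per-seat cost. All figures are illustrative, not actual vendor prices.

def per_seat_cost(total_license_cost: float, named_users: int) -> float:
    """Convert an all-in license cost into a per-seat figure."""
    return total_license_cost / named_users

# Say a vendor quotes $30,000 per CPU for an 8-CPU server license.
cpu_based_quote = 8 * 30_000  # $240,000 total

# You expect 400 named users on that server today.
cost = per_seat_cost(cpu_based_quote, 400)
print(cost)  # 600.0 -- comparable to the "light interactivity" tier

# The rule-of-thumb tiers above double with each interactivity step:
tiers = {"no_interactivity": 300, "light": 600, "heavy": 1_200}
assert tiers["light"] == 2 * tiers["no_interactivity"]
assert tiers["heavy"] == 4 * tiers["no_interactivity"]
```

Run the same arithmetic against the user count you expect after growth, not just today's headcount, and the "good deal" question usually answers itself.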
My friend and highly respected colleague, Wayne Eckerson from TDWI, posted a great article called “Purple BI People”. In the article he described some of the best practices for business and IT alignment, and the cross-functional skill sets needed for successful and effective BI professionals. Wayne, I loved the blue cow analogy; you know that I always think in metaphors, analogies, similes, and associations. But, while I completely agree with Wayne in his near-term assessment, best practices, and recommendations, I would like to suggest another long-term point of view.
Can business and IT ever align on BI? Can business ever be satisfied with IT for delivering successful and effective BI applications? Is there such a thing as BT (Business Technology, the phrase that Forrester coined and promotes) in BI?
I used to think we could deliver on that promise. Not so sure it’s that straightforward now. Just look at some of the hopelessly diametrically opposed business and IT priorities. I hear the following complaints from my clients day in and day out:
Business is all about revenue generation. While IT can support that, much more often cost cutting is IT's highest priority.
Business wants solutions now. Not tomorrow. IT needs to go through the due diligence of testing and approving BI applications. "Right now" and "on demand" do not sit well with IT.
Business wants to react to constantly changing BI requirements. IT has to plan.
Business sometimes is willing to do something “quick and dirty” – even at the expense of potentially jeopardizing accuracy and compliance with standards. IT is all about compliance and sticking to standards.
I have long resisted and will continue to resist for the foreseeable future any notions that the BI market is commoditizing. A single simple look at BI maturity in enterprises and next-gen BI technologies is proof enough that we are far, very far, from any kind of commoditization. Consolidation is quite a different story. Last week's SAP acquisition of Sybase and my roaming of the exhibitor / partner floor at SAPPHIRE in Orlando are two more proofs. On the huge SAPPHIRE exhibition floor I could count software partners on the fingers of my hands. Why? Because everyone who matters has been acquired by a competitor! Most of the exhibitors were management consultancies, systems integrators, and other SAP implementation partners. Hence, a lesson to independent BI vendors: offer your own full BI stack or position yourself for an acquisition. No other long-term options in my mind.
But as always I welcome all and any comments and opposing views.
SAP gets its own relational (Sybase ASE) and analytical (Sybase IQ) DBMS. Why is this a positive, given that SAP already has tight partnerships with major DBMS and DW vendors such as Oracle, IBM, Microsoft, Teradata, and HP? Simple. First, SAP can now control the code. Second, SAP can now potentially reduce reliance on DBMS partners, most of whom (Oracle, IBM, Microsoft) have their own full software stacks and therefore compete, often putting a strain on partnership relationships. True, Sybase ASE has rather low market penetration other than on Wall St (see Stefan Ried's blog), but since SAP BW takes care of most of the traditional RDBMS design and implementation tasks, Sybase could be positioned as a black-box engine under BW that does not require a separate design, administration, and maintenance environment. *** Update. SAP just confirmed that each of its applications can run on an independent database, so having mixed DBMS platforms under ERP and BW will not be an issue.
SAP also gets highly relevant (for low-latency BI) and currently missing CEP technology from the Sybase Aleri acquisition and an OEM version of Coral8.
SAP customers may also benefit from advanced analytics from Fuzzy Logix, integrated and embedded in Sybase IQ.
Sybase gets a badly needed BI front end on top of its Sybase IQ analytical DBMS. While Sybase leads the columnar DBMS market, it is somewhat challenged in selling and positioning the product with business buyers, since they can’t really see, feel, or touch it.
What is BI? There are two prevailing definitions out there – broad and narrow. The broad definition (using our own) is that BI is a set of methodologies, processes, architectures, and technologies that transform raw data into meaningful and useful information used to enable more effective strategic, tactical, and operational insight and decision-making. But if we stick to this definition then shouldn’t we include data integration, data quality, master data management, data warehousing and portals in BI? I know lots of folks would disagree and fit these into data management or information management segments, but not BI.
Then, the narrow definition is used when referring to just the top layers of the BI architectural stack such as reporting, analytics and dashboards. But even there, as Jim Kobielus and I discovered as we were preparing to launch our BI TechRadar 2010 research, we could count over 20 (!) product categories such as Advanced Analytics, Analytical Performance Management, Scorecards, BI appliances and BI SaaS, BI specific DBMS, BI Workspaces, Dashboards, Geospatial analytics, Low Latency BI, Metadata Generated BI Apps, Non modeled exploration and In-memory analytics, OLAP, Open Source BI and SaaS BI, Packaged BI Apps, Process / Content Analytics, Production reports and ad-hoc query builders, Search UI for BI, Social Network / Media Analytics, Text analytics, Web Analytics.
To make matters worse, some folks out there are now trying to clearly separate BI and analytics, by trying to push a “core, traditional BI is commoditized, analytics is where differentiation is today” message. Hmmm, I thought I was building analytical apps using OLAP as far back as the early '80s.
There’s a lot of hype out there by many vendors who claim that they have tools and technologies to enable BI end user self service. Do they? When you analyze whether your BI vendor can support end user self service, consider the following types of “self service” and related BI tool requirements:
#1. Self service for average, casual users.
What do these users need to do?
Run and lightly customize canned reports and dashboards
Run ad hoc queries
Add calculated measures
Fulfill their BI requirements with little or no training (typically one needs a search-like, not point-and-click, UI for this)
What capabilities do they need for this?
Report and dashboard templates
Customizable prompts, sorts, filters, and ranks
Report, query, dashboard building wizards
Semantic layer (not all BI vendors have a rich semantic layer)
Prompting for columns (not all BI vendors let you do that)
Drill anywhere (only BI vendors with ROLAP and multisourcing / data federation provide this capability)
#2. Self service for advanced, power users
What do these users need to do?
Perform what-if scenarios (this often requires write back, which very few BI vendors allow)
Add metrics, measures, and hierarchies not supported by the underlying data model (typically one needs some kind of in-memory analytics capability for this)
Explore based on new (not previously defined) entity relationships (typically one needs some kind of in-memory analytics capability for this)
Explore without knowing exactly what one is looking for (typically one needs a search-like UI for this)
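As a toy illustration of the what-if point above (hypothetical numbers, and a drastic simplification of what real write-back or in-memory engines do), the core idea is adjusting a measure in an in-memory copy and recomputing, without touching the underlying warehouse:

```python
# Toy what-if scenario: adjust a measure in an in-memory copy and
# recompute, leaving the original data untouched (hypothetical data).
baseline = {"units": 1_000, "price": 50.0}
revenue = baseline["units"] * baseline["price"]

scenario = dict(baseline)   # the "write back" stays local to the copy
scenario["price"] += 5.0    # what if we raise the price by $5?
what_if_revenue = scenario["units"] * scenario["price"]

print(revenue, what_if_revenue)   # 50000.0 55000.0
assert baseline["price"] == 50.0  # baseline data is unchanged
```

In a real BI tool the same pattern plays out at scale: the scenario lives in the tool's memory space (or a sandboxed write-back partition), which is exactly why so few vendors support it well.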
How do you know if your BI application has high, low, or no ROI? How do you know that what the business users requested last month, and you spent countless hours and sleepless nights working on, is actually being used? How do you know if your BI applications are efficient and effective? I don't have all the answers, but here's what I recommend.
Start by collecting basic data about your BI environment. The data model (hint: it's a classic multidimensional modeling exercise) should have the following components:
Requests (these should be available from your help desk and project/portfolio management applications), such as
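As a minimal sketch of where this model leads (all field names and data here are hypothetical; a real implementation would be a proper star schema in your DW), the payoff is being able to ask which delivered reports are actually used:

```python
# Sketch of a BI usage-tracking model: report runs as facts, with users
# and departments as dimensions (names and data are hypothetical).
from collections import Counter
from dataclasses import dataclass

@dataclass
class ReportRun:            # fact: one execution of a BI report
    report_id: str
    user_id: str
    department: str         # dimension

runs = [
    ReportRun("sales_dash", "u1", "Sales"),
    ReportRun("sales_dash", "u2", "Sales"),
    ReportRun("hr_report", "u3", "HR"),
]

# Reports delivered last quarter (e.g. from project/portfolio tracking).
delivered = {"sales_dash", "hr_report", "ops_report"}

used = {r.report_id for r in runs}
never_used = delivered - used
print("Never used:", never_used)                   # {'ops_report'}
print("Runs per report:", Counter(r.report_id for r in runs))
```

Once requests, delivered reports, and actual usage sit in one model, the ROI questions above become simple queries rather than guesswork.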
In-memory analytics is all the buzz, for multiple reasons. Speed of querying, reporting, and analysis is just one. Flexibility, agility, and rapid prototyping are another. While there are many more reasons, not all in-memory approaches are created equal. Let’s look at the 5 options buyers have today:
1. In-memory OLAP. Classic MOLAP cube loaded entirely in memory
Vendors: IBM Cognos TM1, Actuate BIRT
Fast reporting, querying, and analysis since the entire model and data are all in memory.
Ability to write back.
Accessible by 3rd party MDX tools (IBM Cognos TM1 specifically)
Requires traditional multidimensional data modeling.
Limited to a single physical memory space (theoretical limit of 3TB, but we haven’t seen production implementations of more than 300GB – this applies to the other in-memory solutions as well)
2. In-memory ROLAP. ROLAP metadata loaded entirely in memory.
Speeds up reporting, querying and analysis since metadata is all in memory.
Not limited by physical memory
Only metadata, not the entire data model, is in memory, although MicroStrategy can build complete cubes from a subset of data held entirely in memory
Requires traditional multidimensional data modeling.
3. In-memory inverted index. Index (with data) loaded into memory
Vendors: SAP BusinessObjects (BI Accelerator), Endeca
Fast reporting, querying, and analysis since the entire index is in memory
Less modeling required than for an OLAP-based solution
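To illustrate what "index (with data) loaded into memory" means at its simplest, here's a toy Python sketch of an in-memory inverted index (not how BI Accelerator or Endeca are actually implemented): each attribute value maps to the set of matching row IDs, so a query becomes a set intersection rather than a table scan, with no cube model required.

```python
# Toy in-memory inverted index: map each (column, value) pair to the set
# of row IDs containing it; queries become set intersections, not scans.
from collections import defaultdict

rows = [
    {"id": 1, "region": "EMEA", "product": "A"},
    {"id": 2, "region": "AMER", "product": "A"},
    {"id": 3, "region": "EMEA", "product": "B"},
]

index = defaultdict(set)
for row in rows:
    for col in ("region", "product"):
        index[(col, row[col])].add(row["id"])

# Which rows have region = EMEA AND product = A?
hits = index[("region", "EMEA")] & index[("product", "A")]
print(hits)  # {1}
```

This is also why less modeling is needed up front: any indexed attribute can participate in a query, without first being declared as a dimension in a cube.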