BI on BI

Boris Evelson

How do you know whether your BI application has high, low, or no ROI? How do you know that what the business users requested last month, and what you spent countless hours and sleepless nights delivering, is actually being used? How do you know whether your BI applications are efficient and effective? I don't have all the answers, but here's what I recommend.

Start by collecting basic data about your BI environment. The data model (hint: it's a classic multidimensional modeling exercise) should have the following components; a rough schema sketch follows the list:

  •  Requests (these should be available from your help desk and project/portfolio management applications), such as
    • User provisioning
    • New applications
    • New data sources
    • Data model changes
    • New/changed metrics
    • New/changed reports
    • New report delivery options
  • Usage (these should be available from your DBMS and BI application log files, or from tools such as www.appfluent.com or www.teleran.com), broken down by
    • Person
    • Time of day
    • Database
    • BI application
    • Report
    • Index
    • Aggregate
    • KPI/KPM
  • Additional events to track, such as
    • Genuine application usage vs. opening an application/report only to download or export data
    • Incomplete/cancelled queries
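
To make this concrete, here is a minimal sketch of such a model using Python's built-in sqlite3 module. Every table and column name is an illustrative assumption, not a prescribed standard, and a real implementation would carry far more of the dimensions listed above.

```python
# A toy "BI on BI" star schema using Python's standard-library sqlite3.
# All table/column names are illustrative assumptions, not a standard.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_person (person_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_report (report_id INTEGER PRIMARY KEY,
                         report_name TEXT, bi_application TEXT, database_name TEXT);

-- One row per usage event harvested from DBMS / BI tool log files.
CREATE TABLE fact_usage (
    person_id  INTEGER REFERENCES dim_person(person_id),
    report_id  INTEGER REFERENCES dim_report(report_id),
    event_time TEXT,   -- supports the 'time of day' breakdown
    event_type TEXT    -- 'view', 'export', 'cancelled_query', ...
);

INSERT INTO dim_person VALUES (1, 'Ana'), (2, 'Raj');
INSERT INTO dim_report VALUES (10, 'Daily Sales', 'CognosApp', 'SalesDW');
INSERT INTO fact_usage VALUES (1, 10, '2010-03-29T09:15', 'view'),
                              (2, 10, '2010-03-29T09:40', 'export');
""")

# Example "BI on BI" question: how much of each report's traffic is
# just exporting data rather than genuine interactive usage?
for row in conn.execute("""
    SELECT r.report_name,
           AVG(CASE WHEN f.event_type = 'export' THEN 1.0 ELSE 0.0 END) AS export_share
    FROM fact_usage f JOIN dim_report r USING (report_id)
    GROUP BY r.report_name"""):
    print(row)   # ('Daily Sales', 0.5)
```

Once usage events land in a fact table like this, "BI on BI" questions (who uses what, when, and whether a report is genuinely used or merely exported) become ordinary BI queries.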
Read more

2008 Wasn’t Great, 2009 Was Worse: Global Banking Platform Deals 2009

Jost Hoppermann

Next week, I will present the first results of Forrester's 2009 global banking platform deals survey. A total of 17 banking platform vendors submitted their 2009 deals for evaluation. One year ago, the same set of deals would have represented at least 19 vendors: In the 2009 survey, FIS's deals include those of the acquired US-based Metavante, and Temenos' deals include those of the acquired French vendor Viveo. Together, these (theoretically) 19 participating vendors submitted a total of 1,068 banking platform deals for evaluation, a steep increase from the roughly 870 deals submitted for 2008.

We had to classify a large share of these 1,068 banking platform deals as extended business or even as simple renewed licenses, if the vendors had not already submitted them with the corresponding tag. Forrester's "rules of the game" did not allow us to count certain other deals, for example because a non-financial-services firm signed the deal. Overall, Forrester counted 269 of the submitted deals as 2009 new named customers, compared with 290 for 2008. In the past, Forrester sorted the vendors into four buckets: Global Power Sellers, Global Challengers, Pursuers, and Base Players. The Pursuers, and in particular the Global Challengers, saw only minor changes in previous years. 2009 shook up this stable structure, and we will see many vendors in groups they haven't been in before.

Read more

Third-party Database Tools Still Matter

Noel Yuhanna

Over the past year, I have received numerous inquiries asking me whether third-party database tools that focus on performance and tuning, backup and recovery, replication, upgrade, troubleshooting, and migration capabilities matter anymore, now that leading DBMS providers such as Oracle, IBM, and Microsoft are offering improved automation and broader coverage.

I find that third-party tools complement native database tools well, assisting DBAs, developers, and operational staff in their day-to-day activities. Last year, I had the opportunity to speak to dozens of enterprises that support hundreds or even thousands of databases across various DBMSes. Most enterprises reported at least a 20 percent IT staff productivity gain when using a third-party database tool.

Third-party vendor tools remain equally important because they support:

Read more

Future App Servers -- Radically Different

John R. Rymer

I was lucky enough last week [22 March 2010] to moderate a panel at EclipseCon on the future of application servers. The panelists did a great job, but I thought they were far too conservative in their views. I agree with them that many customers want evolutionary change from today's app servers to future ones, but I see requirements driving app servers toward radical change. Inevitably.

The changes I see:

  • Requirement: Get more value from servers; get responsive, agile, and flexible. Response: Virtualized everything, dynamic provisioning, automated change management.
  • Requirement: Govern rising application stack complexity. Response: Lean, fit-to-purpose app servers; profiles and other standard configurations; modeling and metadata-based development and deployment.
  • Requirement: Provide "Internet scale." Response: Scale-out app servers, data tiers, and network capacity; modular/layered designs; stateless architectures.

Read more

Natural User Interfaces - Notes From The Field

Jeffrey Hammond

Last week I was once again hustling through a brutal travel week (10,000 miles in the air and two packed red-eyes) when I came across something really interesting. It was around 9 AM, and I'd just gotten off AA flight 4389 from Toronto. I was a bit bleary-eyed from a 4 AM call with a Finnish customer and was just trying to schlep my way to the Admirals Club for a cup of coffee when I stumbled across Accenture's Interactive Network display at the junction of Terminals H and K.

 

[Image: a screen from the Accenture Interactive Network at American's terminal at O'Hare]

 

So what? you might ask; it's just a big screen, and we already know our future is Minority Report, right? Yes, those of us in the echo chamber might know that, but what really struck me was watching my fellow travelers and how they interacted with the display. I stood there for about 10 minutes (forgetting all about the sorely needed cup of joe) and watched people start to walk past, then pause, then go up to the screen and start playing with it. On average, folks would stay for a few minutes, read some of the latest news feeds, then hurry on to their next stop. But what I found really intriguing was how they interacted with the system:

 

Read more

I forget: what's in-memory?

Boris Evelson

In-memory analytics is generating plenty of buzz, for multiple reasons. The speed of querying, reporting, and analysis is just one. Flexibility, agility, and rapid prototyping are another. While there are many more reasons, not all in-memory approaches are created equal. Let's look at the five options buyers have today:
 

1. In-memory OLAP. Classic MOLAP cube loaded entirely in memory

Vendors: IBM Cognos TM1, Actuate BIRT
Pros

  • Fast reporting, querying, and analysis, since the entire model and data are all in memory.
  • Ability to write back.
  • Accessible by third-party MDX tools (IBM Cognos TM1 specifically)

Cons

  • Requires traditional multidimensional data modeling.
  • Limited to a single physical memory space (theoretical limit of 3TB, but we haven't seen production implementations of more than 300GB; this applies to the other in-memory solutions as well)
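
To make the idea concrete, here is a toy sketch of the in-memory MOLAP approach in Python. It is a simplification under my own assumptions, not how TM1's actual engine works: pre-aggregate measures along the modeled dimensions and keep the whole cube in memory, so queries (including drill-ups to "ALL") become lookups rather than database scans.

```python
# Toy in-memory MOLAP cube: every rollup combination is pre-aggregated
# and held in a dict, so slicing and dicing is a constant-time lookup.
from itertools import product

facts = [  # (region, product, month, revenue) -- sample fact rows
    ("EMEA", "Widgets", "2010-01", 120.0),
    ("EMEA", "Gadgets", "2010-01",  80.0),
    ("APAC", "Widgets", "2010-02",  95.0),
]

cube = {}
for region, prod, month, revenue in facts:
    # Roll up every combination of the member itself and the 'ALL' member.
    for key in product((region, "ALL"), (prod, "ALL"), (month, "ALL")):
        cube[key] = cube.get(key, 0.0) + revenue

print(cube[("EMEA", "ALL", "ALL")])           # 200.0, straight from memory
cube[("APAC", "Widgets", "2010-02")] = 90.0   # write-back is a plain assignment
```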

 

2. In-memory ROLAP. ROLAP metadata loaded entirely in memory.

Vendors: MicroStrategy
Pros

  • Speeds up reporting, querying and analysis since metadata is all in memory.
  • Not limited by physical memory

Cons

  • Only the metadata, not the entire data model, is in memory, although MicroStrategy can build complete cubes from a subset of data held entirely in memory
  • Requires traditional multidimensional data modeling.
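
By contrast, here is a minimal sketch of the ROLAP idea: only the semantic metadata lives in memory, and each request is compiled into SQL that the database executes. The names below are my own illustrative assumptions, not MicroStrategy's actual metadata model.

```python
# Toy in-memory ROLAP: metadata lookups happen in memory; the heavy
# lifting (scans, joins, aggregation) stays in the DBMS.
METADATA = {
    "measures":   {"revenue": "SUM(f.revenue)"},
    "dimensions": {"region": "d.region_name"},
    "from":       "fact_sales f JOIN dim_region d ON f.region_id = d.region_id",
}

def build_sql(measure: str, dimension: str) -> str:
    m = METADATA["measures"][measure]
    d = METADATA["dimensions"][dimension]
    return f"SELECT {d}, {m} FROM {METADATA['from']} GROUP BY {d}"

print(build_sql("revenue", "region"))
```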

 

3. In-memory inverted index. Index (with data) loaded into memory

Vendors: SAP BusinessObjects (BI Accelerator), Endeca

Pros

  • Fast reporting, querying, and analysis, since the entire index is in memory
  • Less modeling required than for an OLAP-based solution
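
Here, too, a toy Python sketch of the idea, under my own assumptions rather than BI Accelerator's or Endeca's actual engines: map every attribute value to the set of row IDs containing it, so filters become set intersections instead of table scans, and no dimensional model is needed up front.

```python
# Toy in-memory inverted index: attribute-value -> set of matching row ids.
from collections import defaultdict

rows = [
    {"id": 0, "region": "EMEA", "product": "Widgets"},
    {"id": 1, "region": "EMEA", "product": "Gadgets"},
    {"id": 2, "region": "APAC", "product": "Widgets"},
]

index = defaultdict(set)
for row in rows:
    for attr, value in row.items():
        if attr != "id":
            index[(attr, value)].add(row["id"])

# "Which EMEA rows involve Widgets?" -- answered by set intersection.
hits = index[("region", "EMEA")] & index[("product", "Widgets")]
print(hits)  # {0}
```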
Read more

Asset Virtualization – When Avatars Are Field Engineers

Holger Kisker

Smoke and fire are all around you, the sound of the alarm makes you dizzy, and people are running in panic to escape the inferno while you have to find your way to safety. This is not a scene from the latest video game but actual training for, e.g., field engineers in an exact virtual copy of a real-world environment such as an oil platform or a manufacturing plant.

In a recent discussion with VRcontext, a Brussels-based company that has specialized in asset virtualization for 10 years, I was fascinated by the possibility of creating virtual copies of large, extremely complex real-world assets simply from scanning existing CAD plans or from on-site laser scans. It's not just the 3D virtualization but the integration of the virtual world with Enterprise Asset Management (EAM), ERP, LIMS, P&ID, and other systems that allows users to track, identify, and locate every single piece of equipment in both the real and the virtual world.

These solutions are used today for safety training simulations as well as to increase operational efficiency, e.g., in asset maintenance processes. There are still areas for further improvement, like the integration of RFID tags or sensor readings. However, as the technology matures further, I can see future use cases all over the place: from the virtualization of any kind of location that is difficult or dangerous to enter, to simple office buildings for a 'company campus tour' or a 'virtual meeting.' And it doesn't require supercomputing power; it all runs on low-spec, 'standard' PCs, and the models take up only a few gigabytes of storage.

So if you are bored with running around in Second Life or World of Warcraft and you ever have the chance, exchange your virtual sword for a wrench and visit the 'real' virtual world of a fascinating oil rig or refinery.

Please leave a comment or contact me directly.

Kind regards,

Holger Kisker

The Battle Of Partner Eco-Systems

Holger Kisker

On the need to analyze, compare, and rate partner eco-systems. Please vote.

The world is becoming more and more complex, and so are the business challenges and their related IT solutions. Today, no single vendor can provide complete end-to-end solutions from physical assets to business process optimization. Some large vendors, like IBM, Oracle, or HP, have extended their solution footprint to cover more and more of the four core IT markets (hardware, middleware software, business applications, and services) but still require complementary partner solutions to cover end-to-end processes. Two examples of emerging complex IT solutions include:

  • Smart Computing integrates the physical world with business process optimization via four steps: Awareness (sensors, tags, etc.), Analysis (analytic solutions), Alternatives (business applications with decision support), and Action (a feedback loop into the physical world). A few specialized vendors, such as Savi Technology, can cover the whole portfolio from sensors to business applications for selected scenarios. In general, however, a complete solution requires many partners working closely together to enable an end-to-end process.
  • Cloud Computing includes different IT resources (typically infrastructure, middleware, and applications) offered in pay-per-use, self-service models via the Internet. The seamless consumption of these resources by the end user, anytime and anywhere, however, requires multiple technologies, processes, and a challenging governance model, often with many different stakeholders involved behind the scenes.
Read more

Three Top Questions To Ask a BI Vendor

Boris Evelson

An editor from a leading IT magazine just asked me this question, so I thought I'd also blog about it. Here goes:

 

Q1: What are the capabilities of your services organization to help clients not just with implementing your BI tool, but with their overall BI strategy?

 

The reason I ask this as a top question is that most BI vendors these days have modern, scalable, function-rich, robust BI tools. So the real challenge today is not with the tools but with governance, integration, support, organizational structures, processes, etc., something that only experienced consultants can help with.
 
Q2: Do you provide all the components necessary for an end-to-end BI environment (data integration, data cleansing, data warehousing, performance management, portals, etc., in addition to reports, queries, OLAP, and dashboards)?
 
If a vendor does not, you'll have to integrate these components from multiple vendors yourself.
 
Read more

Number of people using BI

Boris Evelson

A number of clients ask me, "How many people do you think use BI?" It's not an easy question to answer, it will never be an exact science, and any answer comes with many caveats. But here we go:

 

  1. First, let's assume that we are only talking about what we all consider "traditional BI" apps. Let's exclude homegrown apps built using spreadsheets and desktop databases. Let's also exclude operational reporting apps that are embedded in ERP, CRM, and other applications.
  2. Then, let's cut out everyone who only gets the results of a BI report/analysis in static form, such as a hardcopy or a non-interactive PDF file. So if you're not creating, modifying, viewing via a portal, sorting, filtering, ranking, drilling, etc., you probably do not require a BI product license, and I am not counting you.
  3. I'll just attempt to do this for the US for now. If the approach works, we'll try it for other major regions and countries.
  4. The number of businesses with over 100 employees (a reasonable cutoff for a business size that would consider using what we define as traditional BI) in the US in 2004 was 107,119.
  5. The US Dept of Labor provides ranges, as in "firms with 500-749 employees." For each range, I take the middle number. For the last range, "firms with over 10,000," I use an average of 15,000 employees.
  6. This gives us 66 million (66,595,553) workers employed by US firms who could potentially use BI.
  7. Next, we take the data from our latest BDS numbers on BI, which tells us that 54% of firms are using BI. This gives us 35 million (35,961,598) workers employed by US firms that use BI; see the worked calculation after this list.
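
Here is a worked version of that arithmetic, using only the figures cited above; the full Dept of Labor range table is not reproduced in the post, so the sketch starts from the worker total in step 6 and only illustrates the midpoint method of step 5.

```python
# Back-of-the-envelope: US workers at firms that use BI, per the steps above.
firms_over_100 = 107_119       # step 4: US businesses with >100 employees, 2004
workers_total  = 66_595_553    # step 6: workers at those firms (midpoint method)
bi_adoption    = 0.54          # step 7: share of firms using BI (BDS data)

def range_midpoint(low: int, high: int) -> float:
    """Step 5: for a Dept of Labor range such as 500-749, take the middle number."""
    return (low + high) / 2

assert range_midpoint(500, 749) == 624.5   # e.g., "firms with 500-749 employees"

workers_at_bi_firms = int(workers_total * bi_adoption)   # truncated, as in the post
print(f"{workers_at_bi_firms:,}")          # 35,961,598 -- i.e., ~35 million
```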
Read more