Not all BI self-service capabilities are created equal

Boris Evelson


There’s a lot of hype out there from vendors claiming they have the tools and technologies to enable BI end user self-service. Do they? When you analyze whether your BI vendor can support end user self-service, consider the following types of “self-service” and the related BI tool requirements:

#1. Self-service for average, casual users

  • What do these users need to do?
    • Run and lightly customize canned reports and dashboards
    • Run ad hoc queries
    • Add calculated measures
    • Collaborate
    • Fulfill their BI requirements with little or no training (typically this requires a search-like, not a point-and-click, UI)
  • What capabilities do they need for this? (a minimal sketch follows this list)
    • Report and dashboard templates
    • Customizable prompts, sorts, filters, and ranks
    • Report, query, dashboard building wizards
    • Portal
    • Semantic layer (not all BI vendors have a rich semantic layer)
    • Prompting for columns (not all BI vendors let you do that)
    • Drill anywhere  (only BI vendors with ROLAP and multisourcing / data federation provide this capability)
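As a rough illustration of what the first few of these capabilities (a report template plus customizable prompts, sorts, filters, and ranks on top of a semantic layer) might look like behind the scenes, here is a minimal, hypothetical Python sketch; the business-name mappings, column names, and sample data are invented for illustration and are not taken from any particular BI product.

```python
import pandas as pd

# Hypothetical "semantic layer": business-friendly names mapped to physical columns.
SEMANTIC_LAYER = {
    "Region": "sales_region",
    "Product": "product_name",
    "Revenue": "net_revenue_usd",
}

# Stand-in for a governed data source behind the semantic layer.
sales = pd.DataFrame({
    "sales_region": ["East", "West", "East", "South"],
    "product_name": ["A", "A", "B", "B"],
    "net_revenue_usd": [1200, 800, 450, 975],
})

def run_canned_report(columns, prompt_filter=None, sort_by=None, top_n=None):
    """Run a 'canned report' a casual user can lightly customize:
    pick columns (prompting for columns), apply a prompt filter,
    choose a sort, and rank the top N rows."""
    df = sales.rename(columns={v: k for k, v in SEMANTIC_LAYER.items()})[columns]
    if prompt_filter:                      # e.g. {"Region": "East"}
        for name, value in prompt_filter.items():
            df = df[df[name] == value]
    if sort_by:
        df = df.sort_values(sort_by, ascending=False)
    if top_n:
        df = df.head(top_n)
    return df

# A casual user's customization of the template: East region, top 2 by Revenue.
print(run_canned_report(["Region", "Product", "Revenue"],
                        prompt_filter={"Region": "East"},
                        sort_by="Revenue", top_n=2))
```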

#2. Self-service for advanced, power users

  • What do these users need to do?
    • Perform what-if scenarios (this often requires write-back, which very few BI vendors allow; see the sketch after this list)
    • Add metrics, measures, and hierarchies not supported by the underlying data model (typically one needs some kind of in-memory analytics capability for this)
    • Explore based on new (not previously defined) entity relationships (typically one needs some kind of in-memory analytics capability for this)
    • Explore productively even when they don’t know exactly what they’re looking for (typically one needs a search-like UI for this)
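To make the what-if need concrete: a write-back-style scenario can be as simple as copying the model into memory, overriding an assumption, and recomputing a measure. A minimal, hypothetical sketch (the drivers and numbers are invented):

```python
# Baseline plan held in memory: units and price per product (illustrative numbers only).
baseline = {"A": {"units": 1000, "price": 12.0},
            "B": {"units": 400, "price": 30.0}}

def revenue(plan):
    return sum(p["units"] * p["price"] for p in plan.values())

# What-if scenario: "write back" a 10% price increase on product A
# to a private in-memory copy, leaving the shared baseline untouched.
scenario = {k: dict(v) for k, v in baseline.items()}
scenario["A"]["price"] *= 1.10

print("Baseline revenue:", revenue(baseline))
print("Scenario revenue:", revenue(scenario))
```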
Read more

The Definition of Complexity Is A Complex Matter

Jost Hoppermann

Recently, I discussed complexity with a banker working on measuring and managing complexity at a North American bank. His approach is very interesting: he found a way to operationalize complexity measurement and thus to provide concrete data with which to manage it. While I’m not in a position to disclose any more details, we also talked about the nature of complexity. In the absence of any other definition, I offered a draft definition that I have assembled over time from a number of “official” definitions. Complexity is the condition of:

Read more

Is Risk-Based Testing Part of Your Test Planning?

Margo Visitacion

Recently, I’ve been getting more inquiries about risk-based testing. In addition to agile test methods and test estimation, test teams turning their attention to risk-based testing is another positive step in integrating quality throughout the SDLC. Yes, I still see QA engineers having to put on their evangelist hats to educate their developer brothers and sisters that quality is more than just testing (don’t get me wrong, consistent unit and integration testing is a beautiful thing). However, any time business and technology partners think about impact and dependencies in their approach to a solid, workable application, they elevate quality to the next level.
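One common way to make impact and dependencies actionable is to score each test candidate by likelihood of failure and business impact, and to spend test effort on the highest-scoring areas first. Here is a minimal, hypothetical sketch of that idea; the areas and scales are invented and this is not a Forrester framework:

```python
# Each candidate test area, with likelihood of failure and business impact on a 1-5 scale.
candidates = [
    {"area": "payment processing", "likelihood": 4, "impact": 5},
    {"area": "report export",      "likelihood": 2, "impact": 2},
    {"area": "user registration",  "likelihood": 3, "impact": 4},
]

# Risk score = likelihood x impact; test the riskiest areas first and deepest.
for c in sorted(candidates, key=lambda c: c["likelihood"] * c["impact"], reverse=True):
    print(f'{c["area"]:20s} risk={c["likelihood"] * c["impact"]}')
```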

Keep asking those questions about risk-based testing, and make sure that you’re covering all of the angles, including:

Read more

BI on BI

Boris Evelson


How do you know whether your BI application has high, low, or no ROI? How do you know that what the business users requested last month, and what you spent countless hours and sleepless nights working on, is actually being used? How do you know whether your BI applications are efficient and effective? I don't have all the answers, but here's what I recommend.

Start with collecting basic data about your BI environment. The data model (hint: it's a classic multidimensional modeling exercise) should have the following components (a rough sketch in code follows the list):

  •  Requests (these should be available from your help desk and project/portfolio management applications), such as
    • User provisioning
    • New applications
    • New data sources
    • Data model changes
    • New/changed metrics
    • New/changed reports
    • New report delivery options
  • Usage (this should be available from your DBMS and BI application log files, or from tools such as www.appfluent.com or www.teleran.com) by
    • Person
    • Time of day
    • Database
    • BI application
    • Report
    • Index
    • Aggregate
    • KPI/KPM
  • Track additional events like
    • Genuine application usage vs. opening an application/report just to download or export data
    • Incomplete/cancelled queries
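As a rough sketch of what a minimal "BI on BI" usage fact and a couple of roll-ups might look like (the field names and sample records are invented for illustration; real data would come from your DBMS and BI application log files):

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class UsageFact:
    """One row in a simple BI-on-BI usage fact table."""
    person: str
    hour_of_day: int
    database: str
    bi_application: str
    report: str
    exported_only: bool      # opened the report just to download/export data
    cancelled: bool          # query was incomplete or cancelled

facts = [
    UsageFact("alice", 9,  "sales_dw", "dashboards", "pipeline_summary", False, False),
    UsageFact("bob",   14, "sales_dw", "ad_hoc",     "pipeline_summary", True,  False),
    UsageFact("carol", 23, "hr_dw",    "dashboards", "headcount",        False, True),
]

# Simple roll-ups that start to answer "is this BI application actually used, and how?"
runs_per_report = Counter(f.report for f in facts)
export_only_rate = sum(f.exported_only for f in facts) / len(facts)
cancelled_rate = sum(f.cancelled for f in facts) / len(facts)

print(runs_per_report)
print(f"export-only: {export_only_rate:.0%}, cancelled: {cancelled_rate:.0%}")
```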
Read more

2008 Wasn’t Great, 2009 Was Worse: Global Banking Platform Deals 2009

Jost Hoppermann

Next week, I will present the first results of Forrester’s 2009 global banking platform deals survey. A total of 17 banking platform vendors submitted their 2009 deals for evaluation. One year ago, the same set of deals would have represented at least 19 vendors: in the 2009 survey, FIS’s deals include those of the acquired US-based Metavante, and Temenos’ deals include those of the acquired French vendor Viveo. These theoretically 19 participating vendors submitted a total of 1,068 banking platform deals for evaluation, a steep increase compared with the roughly 870 deals submitted for 2008.

We had to classify a large share of these 1,068 banking platform deals as extended business or even as simple renewed licenses if the vendors had not already submitted them with the corresponding tag. Forrester’s “rules of the game” did not allow us to count other deals, for example because a non-financial-services firm signed the deal. Overall, Forrester counted 269 of the submitted deals as 2009 new named customers, compared with 290 for 2008. In the past, Forrester sorted the vendors into four buckets: Global Power Sellers, Global Challengers, Pursuers, and Base Players. The Pursuers and in particular the Global Challengers saw only minor changes in previous years. 2009 has shaken this stable structure, and we will see many vendors in groups they haven’t been in before.

Read more

Third-party Database Tools Still Matter

Noel Yuhanna

Over the past year, I have received numerous inquiries asking whether third-party database tools that focus on performance and tuning, backup and recovery, replication, upgrade, troubleshooting, and migration still matter now that leading DBMS providers such as Oracle, IBM, and Microsoft offer improved automation and broader coverage.

I find that third-party tools complement native database tools well, assisting DBAs, developers, and operational staff in their day-to-day activities. Last year, I had the opportunity to speak with dozens of enterprises that support hundreds or thousands of databases across various DBMSes. Most enterprises reported at least a 20 percent improvement in IT staff productivity when using a third-party database tool.

Third-party vendor tools remain equally important because they support:

Read more

Future App Servers -- Radically Different

John R. Rymer

I was lucky enough last week [22 March 2010] to moderate a panel at EclipseCon on the future of application servers. The panelists did a great job, but I thought they were far too conservative in their views. I agree with them that many customers want evolutionary change from today’s app servers to future ones, but I see requirements driving app servers toward radical change. Inevitably.

The changes I see:

 

  • Requirement: Get more value from servers; get responsive, agile, and flexible.
    • Response: Virtualized everything, dynamic provisioning, automated change management.
  • Requirement: Govern rising application stack complexity.
    • Response: Lean, fit-to-purpose app servers; profiles and other standard configurations; modeling and metadata-based development and deployment.
  • Requirement: Provide “Internet scale.”
    • Response: Scale-out app servers, data tiers, and network capacity; modular/layered designs; stateless architectures.
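To make one of these responses, stateless architectures, concrete: the idea is that any app server instance can handle any request because session state lives in a shared external store rather than in server memory. A minimal, hypothetical Python sketch (the store and names are invented; in practice the store would be a distributed cache or database):

```python
# Stand-in for an external session store shared by all app server instances.
# In production this would be a distributed cache or database; a plain dict is
# used here only to keep the sketch runnable.
SESSION_STORE = {}

def handle_request(instance_id, session_id, item):
    """Any instance can handle any request because no state lives in the instance itself."""
    cart = SESSION_STORE.get(session_id, []) + [item]   # work on a copy; keep nothing locally
    SESSION_STORE[session_id] = cart                     # write state back to the shared store
    return f"instance {instance_id} handled session {session_id}, cart={cart}"

# Requests for the same session can land on different instances, as a load balancer would route them.
print(handle_request("app-1", "s-42", "book"))
print(handle_request("app-2", "s-42", "laptop"))
```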

 

Read more

Natural user interfaces - notes from the field

Jeffrey Hammond

Last week I was once again hustling through a brutal travel week (10,000 miles in the air and two packed red-eyes) when I came across something really interesting. It was about 9 AM and I'd just gotten off AA flight 4389 from Toronto. I was a bit bleary-eyed from a 4 AM call with a Finnish customer and was just trying to schlep my way to the Admirals Club for a cup of coffee when I stumbled across Accenture's Interactive Network display at the junction of terminals H and K.

 

[Image: the Accenture Interactive Network screen at American's terminal at O'Hare]

 

So what, you might ask? It's just a big screen, and we already know our future is Minority Report, right? Yes, those of us in the echo chamber might know that, but what really struck me was watching how my fellow travelers interacted with the display. I watched for about 10 minutes (while forgetting about the sorely needed cup of joe) as people started to walk past, then paused, then went up to the screen and started playing with it. On average, folks would stay for a few minutes and read some of the latest news feeds, then hurry on to their next stop. But what I really found intriguing was how they interacted with the system:

 

Read more

I forget: what's in-memory?

Boris Evelson


In-memory analytics is all abuzz for multiple reasons. Speed of querying, reporting, and analysis is just one. Flexibility, agility, and rapid prototyping are others. While there are many more reasons, not all in-memory approaches are created equal. Let’s look at the five options buyers have today:
 

1. In-memory OLAP. Classic MOLAP cube loaded entirely in memory

Vendors: IBM Cognos TM1, Actuate BIRT
Pros

  • Fast reporting, querying and analysis since the entire model and data are all in memory.
  • Ability to write back.
  • Accessible by 3rd party MDX tools (IBM Cognos TM1 specifically)

Cons

  • Requires traditional multidimensional data modeling.
  • Limited to a single physical memory space (theoretical limit of 3 TB, but we haven’t seen production implementations of more than 300 GB; this applies to the other in-memory solutions as well)
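For a rough feel for why option 1 is fast, here is a minimal sketch of a cube-style aggregation over data held entirely in memory, using pandas only as a stand-in; the dimensions, measure, and numbers are invented, and this is not how TM1 or BIRT are actually implemented:

```python
import pandas as pd

# Fact data loaded entirely into memory; dimensions: region, product, month; measure: sales.
facts = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "product": ["A", "B", "A", "B"],
    "month":   ["Jan", "Jan", "Jan", "Feb"],
    "sales":   [100, 150, 80, 120],
})

# A cube-style roll-up: every (region x product) cell aggregated in memory,
# with no database round trip.
cube = facts.pivot_table(index="region", columns="product",
                         values="sales", aggfunc="sum", fill_value=0)
print(cube)

# "Write back": adjust one cell and the in-memory copy immediately reflects it.
facts.loc[(facts.region == "East") & (facts.product == "A"), "sales"] = 110
```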

 

2. In-memory ROLAP. ROLAP metadata loaded entirely in memory.

Vendors: MicroStrategy
Pros

  • Speeds up reporting, querying and analysis since metadata is all in memory.
  • Not limited by physical memory

Cons

  • Only the metadata, not the entire data model, is in memory, although MicroStrategy can build complete cubes from a subset of data held entirely in memory
  • Requires traditional multidimensional data modeling.

 

3. In-memory inverted index. Index (with data) loaded into memory

Vendors: SAP BusinessObjects (BI Accelerator), Endeca

Pros

  • Fast reporting, querying and analysis since the entire index is in memory
  • Less modeling required than in an OLAP-based solution
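For a rough idea of what option 3 means, here is a minimal in-memory inverted index in plain Python; the records and terms are invented, and products such as BI Accelerator or Endeca are of course far more sophisticated:

```python
from collections import defaultdict

# A few "records" to index; in a BI context these could be rows, attributes, or report text.
records = {
    1: "quarterly revenue by region",
    2: "revenue forecast for product A",
    3: "headcount by region and department",
}

# Build the inverted index: term -> set of record ids, all held in memory.
index = defaultdict(set)
for rec_id, text in records.items():
    for term in text.lower().split():
        index[term].add(rec_id)

def search(query):
    """Return ids of records containing every query term (simple AND search)."""
    sets = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*sets) if sets else set()

print(search("revenue region"))   # -> {1}
```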
Read more

Asset Virtualization – When Avatars Are Field Engineers

Holger Kisker

Smoke and fire are all around you, the sound of the alarm makes you dizzy, and people are running in panic to escape the inferno while you have to find your way to safety. This is not a scene from the latest video game but actual training, for example for field engineers, in an exact virtual copy of a real-world environment such as an oil platform or a manufacturing plant.

In a recent discussion with VRcontext, a Brussels-based company that has specialized in asset virtualization for 10 years, I was fascinated by the possibility of creating virtual copies of large, extremely complex real-world assets simply from existing CAD plans or on-site laser scans. It’s not just the 3D virtualization but the integration of the virtual world with Enterprise Asset Management (EAM), ERP, LIMS, P&ID, and other systems that allows users to track, identify, and locate every single piece of equipment in both the real and the virtual world.

These solutions are used today for safety training simulations as well as to increase operational efficiency, for example in asset maintenance processes. There are still areas for further improvement, such as the integration of RFID tags or sensor readings. However, as the technology matures, I can see future use cases all over the place, from the virtualization of any location that is difficult or dangerous to enter to simple office buildings for a ‘company campus tour’ or a ‘virtual meeting’. And it doesn’t require supercomputing power: it all runs on low-spec, ‘standard’ PCs, and the models take up only a few gigabytes of storage.

So if you are bored with running around in Second Life or World of Warcraft and you ever have the chance, exchange your virtual sword for a wrench and visit the ‘real’ virtual world of a fascinating oil rig or refinery.

Please leave a comment or contact me directly.

Kind regards,

Holger Kisker