An Analyst's Day

Holger Kisker

5:30am, the family sleeps and it’s time to prepare – today is Analyst Day in Frankfurt. I’m on the road 2h45min before the event starts (1h20min should be sufficient), but sometimes the traffic is terrible. Last week I missed a flight because the highway was completely closed after an accident, and I had to give up after three hours of driving for nothing. When the concern of missing an appointment slowly turns into certainty, those are the moments that cost me some of my (remaining) hair.

(Of course) I arrive much too early, but other analysts are already there (probably they don’t sleep at all). Plenty of time to look through my presentation again for some final adjustments and for some small talk with customers who arrived early.

One minute before the kick-off, I make the last slide changes and load the deck onto the presentation laptop. Another analyst colleague goes first. I have seen some of the slides a hundred times and look around at the faces of the attendees. For most, it’s the first time they have seen, for example, our market sizing and forecasting data, and they take hectic notes in their notebooks. They don’t know yet that we will distribute all slides after the event. I’m getting a bit nervous, but I’m used to it. When I'm not nervous anymore before a presentation, it’ll get boring for me and the audience, and I should probably do something else.

Read more

Want to know what Forrester's lead data analysts are thinking about BI and the data domain?

Boris Evelson

What is BI? There are two prevailing definitions out there – broad and narrow. The broad definition (using our own) is that BI is a set of methodologies, processes, architectures, and technologies that transform raw data into meaningful and useful information used to enable more effective strategic, tactical, and operational insight and decision-making. But if we stick to this definition, then shouldn’t we include data integration, data quality, master data management, data warehousing, and portals in BI? I know lots of folks would disagree and fit these into data management or information management segments, but not BI.

Then, the narrow definition is used when referring to just the top layers of the BI architectural stack, such as reporting, analytics, and dashboards. But even there, as Jim Kobielus and I discovered while preparing to launch our BI TechRadar 2010 research, we could count over 20 (!) product categories, such as advanced analytics, analytical performance management, scorecards, BI appliances and BI SaaS, BI-specific DBMS, BI workspaces, dashboards, geospatial analytics, low-latency BI, metadata-generated BI apps, non-modeled exploration and in-memory analytics, OLAP, open source BI and SaaS BI, packaged BI apps, process/content analytics, production reports and ad hoc query builders, search UI for BI, social network/media analytics, text analytics, and web analytics.

 

To make matters worse, some folks out there are now trying to clearly separate BI and analytics by pushing a “core, traditional BI is commoditized; analytics is where differentiation is today” message. Hmmm, I thought I was building analytical apps using OLAP back in the early 1980s.

 

Read more

Application Assessments -- How Do You Decide What Matters?

Phil Murphy

So you need to formulate an application modernization decision -- what to do with a given application -- so how do you begin that decision-making process? In the past, modernization decisions were often simply declared -- "We are moving to this technology" -- for a number of reasons, such as that it:

  • Keeps us current on technology.
  • Provides a more acceptable user-interface or integration capability.
  • Increases access for external customers.
  • Increases the volume of business transactions we can process.
  • Trades custom/bespoke applications for standardized application packages such as ERP, payroll, human resources, etc.

Fast-forward to today -- you could simply go with your gut -- declare a solution based on what you currently know (or think you know) about the application in question. But it's a new day, baby -- a proposal like that, without proper justification, is likely to be met with one of two responses from management:

Read more

Not all BI self-service capabilities are created equal

Boris Evelson

There’s a lot of hype out there from vendors who claim that they have tools and technologies to enable BI end-user self-service. Do they? When you analyze whether your BI vendor can support end-user self-service, consider the following types of “self-service” and the related BI tool requirements:

#1. Self-service for average, casual users (a short illustrative sketch follows this list).

  • What do these users need to do?
    • Run and lightly customize canned reports and dashboards
    • Run ad hoc queries
    • Add calculated measures
    • Collaborate
    • Fulfill their BI requirements with little or no training (typically one needs search-like, not point-and-click UI for this)
  • What capabilities do they need for this?
    • Report and dashboard templates
    • Customizable prompts, sorts, filters, and ranks
    • Report, query, dashboard building wizards
    • Portal
    • Semantic layer (not all BI vendors have a rich semantic layer)
    • Prompting for columns (not all BI vendors let you do that)
    • Drill anywhere (only BI vendors with ROLAP and multisourcing/data federation provide this capability)
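
To make the casual-user case concrete, here is a minimal, hypothetical sketch in Python/pandas of what "lightly customize a canned report" can look like: apply a prompt, add a calculated measure, and sort the result. The data, the column names, and the run_report helper are illustrative assumptions, not any particular BI vendor's API.

```python
# A minimal, hypothetical sketch of casual-user "self-service": run a canned
# view, apply a prompt/filter, add a calculated measure, and sort the result.
# Data, column names, and the run_report helper are illustrative assumptions.
import pandas as pd

# Stand-in for the result set a canned report gets from the semantic layer.
sales = pd.DataFrame({
    "region":  ["East", "West", "East", "West"],
    "revenue": [120_000, 95_000, 87_000, 143_000],
    "cost":    [80_000, 70_000, 60_000, 99_000],
})

def run_report(data, region_prompt=None):
    """Lightly customize a canned report: prompt, calculated measure, sort."""
    view = data if region_prompt is None else data[data["region"] == region_prompt]
    view = view.assign(margin=view["revenue"] - view["cost"])  # calculated measure
    return view.sort_values("margin", ascending=False)         # sort / rank

print(run_report(sales, region_prompt="East"))
```

In an actual BI tool, the semantic layer, prompts, and wizards listed above would expose these same steps through point-and-click or search-like interaction rather than code.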

#2. Self-service for advanced, power users

  • What do these users need to do?
    • Perform what-if scenarios (this often requires write back, which very few BI vendors allow)
    • Add metrics, measures, and hierarchies not supported by the underlying data model (typically one needs some kind of in-memory analytics capability for this)
    • Explore based on new (not previously defined) entity relationships (typically one needs some kind of in-memory analytics capability for this)
    • Explore when they don't know exactly what they're looking for (typically one needs a search-like UI for this)
Read more

The Definition of Complexity Is A Complex Matter

Jost Hoppermann

Recently, I discussed complexity with a banker working on measuring and managing complexity in a North American bank. His approach is very interesting: He found a way to operationalize complexity measurement and thus to provide concrete data to manage it. While I’m not in a position to disclose any more details, we also talked about the nature of complexity. In the absence of any other definition of complexity, I offered a draft definition that I have assembled over time based on a number of “official” definitions. Complexity is the condition of:

Read more

Is Risk-Based Testing Part of Your Test Planning?

Margo Visitacion

Recently, I’ve been getting more inquiries about risk-based testing. In addition to agile test methods and test estimation, test teams turning their eyes to risk-based testing is another positive step in integrating quality throughout the SDLC. Yes, I still see QA engineers having to put on their evangelist hats to educate their developer brothers and sisters that quality is more than just testing (don’t get me wrong, consistent unit and integration testing is a beautiful thing). However, any time business and technology partners think about impact and dependencies in their approach to a solid, workable application, they elevate quality to the next level.
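
The mechanics behind those inquiries boil down to a simple idea: score each candidate test area by likelihood of failure and business impact, and spend test effort on the highest scores first. The sketch below is purely illustrative; the 1-5 scales and the area names are assumptions, not a prescribed method.

```python
# Illustrative sketch of risk-based test prioritization: rank candidate test
# areas by likelihood-of-failure x business-impact and test the riskiest first.
# The 1-5 scales and the area names are assumptions for the example.
test_areas = [
    # (area,                    likelihood 1-5, impact 1-5)
    ("payment posting",         4,              5),
    ("nightly batch interface", 5,              4),
    ("user profile edits",      2,              3),
    ("report formatting",       3,              2),
]

prioritized = sorted(test_areas, key=lambda a: a[1] * a[2], reverse=True)

for area, likelihood, impact in prioritized:
    print(f"{area:25s} risk score = {likelihood * impact}")
```

The value is less in the arithmetic than in the conversation it forces between business and technology partners about impact and dependencies.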

Keep asking those questions about risk-based testing – and make sure that you’re covering all of the angles. Make sure that you’re covering:

Read more

BI on BI

Boris Evelson

How do you know if your BI application has high, low, or no ROI? How do you know that what the business users requested last month, and what you spent countless hours and sleepless nights working on, is actually being used? How do you know if your BI applications are efficient and effective? I don't have all the answers, but here's what I recommend.

Start with collecting basic data about your BI environment. The data model (hint: it's a classical multidimensional model exercise) should have the following components (a minimal schema sketch follows the list):

  • Requests (these should be available from your help desk and project/portfolio management applications), such as
    • User provisioning
    • New applications
    • New data sources
    • Data model changes
    • New/changed metrics
    • New/changed reports
    • New report delivery options
  • Usage (these should be available from your DBMS and BI application log files, or from www.appfluent.com or www.teleran.com) by
    • Person
    • Time of day
    • Database
    • BI application
    • Report
    • Index
    • Aggregate
    • KPI/KPM
  • Track additional events like
    • Application usage vs. using application/report just to download or export data
    • Incomplete/cancelled queries
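
As a rough illustration of the "classical multidimensional model exercise" above, the usage side of this model could be captured as one fact table keyed by dimensions such as person, report, and time. The sketch below uses SQLite from Python; every table and column name is an assumption for illustration, not a prescribed schema.

```python
# Hypothetical star schema for "BI on BI" usage tracking (names are illustrative).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_person (person_id INTEGER PRIMARY KEY, name TEXT, department TEXT);
    CREATE TABLE dim_report (report_id INTEGER PRIMARY KEY, report_name TEXT,
                             bi_application TEXT, database_name TEXT);
    CREATE TABLE dim_date   (date_id INTEGER PRIMARY KEY, day TEXT, hour_of_day INTEGER);

    -- One row per usage event harvested from DBMS / BI application log files.
    CREATE TABLE fact_bi_usage (
        person_id        INTEGER REFERENCES dim_person(person_id),
        report_id        INTEGER REFERENCES dim_report(report_id),
        date_id          INTEGER REFERENCES dim_date(date_id),
        event_type       TEXT,     -- e.g. 'view', 'export_only', 'cancelled_query'
        query_runtime_ms INTEGER
    );
""")

# One question the model can answer: which reports are only used to export data?
rows = conn.execute("""
    SELECT r.report_name, COUNT(*) AS export_only_events
    FROM fact_bi_usage f JOIN dim_report r USING (report_id)
    WHERE f.event_type = 'export_only'
    GROUP BY r.report_name
    ORDER BY export_only_events DESC
""").fetchall()
print(rows)
```

The requests side -- user provisioning, new applications, data model changes, and so on from the help desk and project/portfolio tools -- would become a second fact table sharing the same dimensions.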
Read more

2008 Wasn’t Great, 2009 Was Worse: Global Banking Platform Deals 2009

Jost Hoppermann

Next week, I will present the first results of Forrester’s 2009 global banking platform deals survey. A total of 17 banking platform vendors submitted their 2009 deals for evaluation. One year ago, the same set of deals would have represented at least 19 vendors: In the 2009 survey, FIS’s deals include those of acquired US-based Metavante, and Temenos’ deals include those of acquired French vendor Viveo. These theoretically 19 participating vendors submitted a total of 1,068 banking platform deals for evaluation, a steep increase compared with the roughly 870 deals submitted for 2008.

We had to classify a large share of these 1,068 banking platform deals as extended business or even as simple renewed licenses if the vendors had not already submitted them with the corresponding tag. Forrester’s “rules of the game” did not allow us to recognize further deals, for example because a non-financial-services firm signed the deal. Overall, Forrester counted 269 of the submitted deals as 2009 new named customers, compared with 290 for 2008. In the past, Forrester sorted the vendors into four buckets: Global Power Sellers, Global Challengers, Pursuers, and Base Players. The Pursuers and in particular the Global Challengers saw only minor changes in previous years. 2009 has shaken this stable structure, and we will see many vendors in groups they haven’t been in before.

Read more

Third-party Database Tools Still Matter

Noel Yuhanna

Over the past year, I have received numerous inquiries asking me whether third-party database tools that focus on performance and tuning, backup and recovery, replication, upgrade, troubleshooting, and migration capabilities still matter now that leading DBMS providers such as Oracle, IBM, and Microsoft are offering improved automation and broader coverage.

I find that third-party tools complement native database tools well in assisting DBAs, developers, and operational staff in their day-to-day activities. Last year, I had the opportunity to speak to dozens of enterprises that support hundreds or even thousands of databases across various DBMSes. Most enterprises reported at least a 20 percent IT staff productivity gain when using a third-party database tool.

Third-party vendor tools remain equally important because they support:

Read more

Future App Servers -- Radically Different

John R. Rymer

I was lucky enough last week [22 March 2010] to moderate a panel at EclipseCon on the future of application servers. The panelists did a great job, but I thought they were far too conservative in their views. I agree with them that many customers want evolutionary change from today's app servers to future ones, but I see requirements driving app servers toward radical change. Inevitably.

The changes I see (an illustrative sketch of the stateless-architecture point follows the table):

 

  • Requirement: Get more value from servers; get responsive, agile, and flexible.
    Response: Virtualized everything, dynamic provisioning, automated change management.
  • Requirement: Govern rising application stack complexity.
    Response: Lean, fit-to-purpose app servers; profiles and other standard configurations; modeling and metadata-based development and deployment.
  • Requirement: Provide “Internet scale.”
    Response: Scale-out app servers, data tiers, and network capacity; modular/layered designs; stateless architectures.
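
As a small, hypothetical illustration of the "stateless architectures" response in the last row (not something presented on the panel), the sketch below keeps all session state outside the app-server instance, so any instance can serve any request and the tier can scale out. The store and the request shapes are assumptions.

```python
# Illustrative sketch of a stateless request handler: the instance holds no
# session state of its own, so any app-server instance can serve any request.
# The dict stands in for an external store (e.g., a shared cache tier).

shared_session_store = {}  # stand-in for state kept outside the app server

def handle_request(session_id, action, payload=None):
    """All state is loaded from and written back to the shared store."""
    state = shared_session_store.get(session_id, {"cart": []})
    if action == "add_to_cart":
        state["cart"].append(payload)
    shared_session_store[session_id] = state  # state lives outside the instance
    return {"session": session_id, "cart": state["cart"]}

print(handle_request("abc123", "add_to_cart", "book-42"))
print(handle_request("abc123", "add_to_cart", "pen-7"))  # could hit a different instance
```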

 

Read more