Many large organizations have finally “seen the light” and are trying to figure out how best to treat their critical data as the trusted asset it should be. As a result, master data management (MDM) strategies and the enabling architectures, organizational and governance models, methodologies, and technologies that support the delivery of MDM capabilities are, in a word, HOT! But the concept of MDM -- and the homegrown or vendor-enabled technologies that attempt to deliver that elusive “single version of truth,” “golden record,” or “360-degree view” -- has been around in one form or another for decades: data warehousing, BI, data quality, EII, CRM, and ERP have all at some point promised to deliver that single version of truth.
The market view of MDM has matured significantly over the past five years, and today many organizations are well on their way to delivering multi-domain/multi-form master data solutions across various physical and federated architectural approaches. But the long-term evolution of the MDM concept is far from over. There remains a tremendous gap between the limited business value most MDM efforts deliver today and what MDM and data management evangelists believe MDM can deliver in terms of business optimization, risk mitigation, and competitive differentiation.
What will the next evolution of the MDM concept look like over the next 3, 5, and 10 years? Will the next breakthrough come from technology enablement? Information architecture? Data governance and stewardship? Alignment with other enterprise IT and business strategies?
My colleague Margo Visitacion and I are finishing up a new report, Seven Pragmatic Practices To Improve Software Quality, that will publish in a few weeks. In writing it, we realized that not everyone has the same definition of quality. More often than not, application development professionals define software quality simply as fewer bugs -- but it means a whole lot more than that.
Forrester defines software quality as:
Software that meets business requirements, provides a satisfying user experience, and has fewer defects.
What It Means: Quality Is A Team Sport
Quality must move beyond the purview of QA professionals alone and become an integrated part of the entire software development life cycle (SDLC). Doing so reduces schedule-killing rework when business requirements are misunderstood, improves user satisfaction, and reduces the risk of untested nonfunctional requirements such as security and performance.
The “smart money” seems to be betting against SAP. I hear all the time about the company’s bleak prospects. But a client conversation last week reminded me of how strong SAP’s position is, despite its many issues.
This client, a worldwide manufacturer, is investing hundreds of millions of dollars in SAP software for its worldwide supply chain, financial management and reporting, inventory and order management, etc. The new SAP environment will replace hundreds of disparate applications and, ideally, result in far more efficient operations, far better visibility into operations, and far more uniform products around the world. The members of this client’s SAP implementation team have finished SAP implementation marathons before (at other employers). They know the good, the bad, the ugly.
For this manufacturer, SAP is sticky for four reasons.
We all know that the war against the proliferation of spreadsheets (as BI or as any other application) in enterprises has been fought and lost. Gone are the days when BI and performance management vendors’ web sites carried a “let us come in and help you get rid of your spreadsheets” message in big bold letters on their front pages. In my personal experience -- implementing hundreds of BI platforms and solutions -- the more BI apps you deliver, the more spreadsheets you end up with. Rolling out a BI application often just gives someone an easier way to access data and export it to a spreadsheet. Even though some in-memory analytics tools are beginning to chip away at the main reasons spreadsheets are so ubiquitous in BI (self-service BI with no modeling or analysis constraints, and little to no reliance on IT), spreadsheets for BI are here to stay for a long, long, long time.
With that in mind, let me offer a few best practices for controlling and managing (not getting rid of!) spreadsheets as a BI tool:
Create a spreadsheet governance policy. Make it flexible – if it’s not, people will fight it. Here are a few examples of such policies:
- Spreadsheets can be used for reporting and analysis that support processes limited to individuals or small work groups, as opposed to cross-functional, cross-enterprise processes
- Spreadsheets can be used for reporting and analysis that are not part of mission-critical processes
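Governance policies like the two above are easier to enforce when they can be checked mechanically. Here is a minimal Python sketch of such a check; the `SpreadsheetUse` descriptor, its attribute names, and the scope values are illustrative assumptions, not part of any actual governance product:

```python
from dataclasses import dataclass

@dataclass
class SpreadsheetUse:
    """Hypothetical descriptor for a proposed spreadsheet-based BI report."""
    scope: str             # assumed values: "individual", "workgroup", "cross_functional", "enterprise"
    mission_critical: bool # is the report part of a mission-critical process?

def policy_allows(use: SpreadsheetUse) -> bool:
    """Apply the two example rules: spreadsheets are fine for individual or
    small-workgroup reporting, but not for cross-functional, cross-enterprise,
    or mission-critical processes."""
    if use.mission_critical:
        return False
    return use.scope in ("individual", "workgroup")

# A personal sales analysis passes the policy...
print(policy_allows(SpreadsheetUse(scope="individual", mission_critical=False)))  # True
# ...but a mission-critical, enterprise-wide report does not.
print(policy_allows(SpreadsheetUse(scope="enterprise", mission_critical=True)))   # False
```

The point is less the code than the discipline: writing the rules down precisely enough to automate them is itself a governance exercise.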
Whoever said the BI market is commoditizing, consolidating, and maturing? Nothing could be further from the truth. On the buy side, Forrester still sees tons of less-than-successful BI environments, applications, and implementations, as demonstrated by Forrester's recent BI Maturity survey. On the vendor/sell side, Forrester also sees a flurry of activity, with startups, small vendors, and the large, leading BI vendors constantly leapfrogging each other with every major and minor release.
In terms of the amount of BI activity that Forrester sees from our clients (inquiries, advisories, and consulting), there’s no question that SAP BusinessObjects and IBM Cognos continue to dominate client interest. Over the past couple of years, Microsoft has typically taken third place, SAS fourth, and Oracle a distant fifth. But ever since the Siebel and Hyperion acquisitions, the landscape has been changing, and we now often see Oracle jumping into third place, sometimes leapfrogging even Microsoft in levels of monthly interest from Forrester clients.
As I discuss with clients the developing notions of Forrester's Business Capability Architecture (see blog post #1 and blog post #2), I have found it important to distinguish between different levels of scope for technology strategy. The primary distinctions have to do with (a) the degree to which a strategy (and the architecture it promulgates) aims to account for future change and (b) the breadth of business and technology scenarios addressed by the strategy.
Thus, I define a four-part technology strategy taxonomy along a two-dimensional continuum with (a) and (b) as its axes (in other words, the four parts are archetypes that will occur in varying degrees of purity), to wit:
Type 1: project strategy for successful solution delivery. With Type 1 strategy, the goal is simply to achieve successful project delivery. It is beyond the strategy’s scope to consider anything not necessary to deliver a solution that operates according to immediate business requirements. Future changes to the business and future changes in technology are not considered by the strategy (unless explicitly documented in the requirements). The classic case for a Type 1 strategy is when an organization definitively knows that the business scenario addressed by the solution is short-lived and will not encounter significant business or technology change during the solution’s lifetime (history argues that this is a risky assumption, yet sometimes it is valid).
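The two axes can be read as a quadrant model, which a short Python sketch makes concrete. Note the hedges: the taxonomy is described as a continuum, so collapsing each axis to a boolean is a simplification, and only Type 1 is named in the text above -- the numbering of the other three quadrants is an assumption for illustration:

```python
def strategy_type(accounts_for_future_change: bool, broad_scenario_scope: bool) -> int:
    """Map the two axes -- (a) accounting for future change and (b) breadth of
    business/technology scenarios -- to one of four archetypes.

    Type 1 (low on both axes) is the project strategy for successful solution
    delivery described above; the numbering of Types 2-4 is assumed here purely
    for illustration, since only Type 1 appears in this excerpt."""
    if not accounts_for_future_change and not broad_scenario_scope:
        return 1  # project strategy: deliver to immediate requirements only
    if accounts_for_future_change and not broad_scenario_scope:
        return 2  # assumed: future-aware, but narrow scenario scope
    if not accounts_for_future_change and broad_scenario_scope:
        return 3  # assumed: broad scenario scope, present-focused
    return 4      # assumed: broad scope and future-aware

# The Type 1 archetype from the text: no future change considered, narrow scope.
print(strategy_type(False, False))  # 1
```

In practice a real strategy sits somewhere along both continua rather than squarely in one quadrant, which is exactly why the text calls the four parts archetypes "in varying degrees of purity."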