We all know that the war against the proliferation of spreadsheets (as BI or as any other application) in enterprises has been fought and lost. Gone are the days when BI and performance management vendors' websites carried a "let us come in and help you get rid of your spreadsheets" message in big bold letters on their front pages. In my personal experience – implementing hundreds of BI platforms and solutions – the more BI apps you deliver, the more spreadsheets you end up with. Rolling out a BI application often just gives someone an easier way to access data and export it to a spreadsheet. Even though some of the in-memory analytics tools are beginning to chip away at the main reasons spreadsheets are so ubiquitous in BI (self-service BI with no modeling or analysis constraints, and little to no reliance on IT), spreadsheets for BI are here to stay for a long, long, long time.
With that in mind, let me offer a few best practices for controlling and managing (not getting rid of!) spreadsheets as a BI tool:
Create a spreadsheet governance policy. Make it flexible – if it’s not, people will fight it. Here are a few examples of such policies:
- Spreadsheets can be used for reporting and analysis that support processes confined to individuals or small work groups, rather than cross-functional or cross-enterprise processes
- Spreadsheets can be used for reporting and analysis that are not part of mission critical processes
Whoever said the BI market is commoditizing, consolidating, and getting very mature? Nothing could be further from the truth. On the buy side, Forrester still sees tons of less-than-successful BI environments, applications, and implementations, as demonstrated by Forrester's recent BI Maturity survey. On the vendor/sell side, Forrester also sees a flurry of activity, with startups, small vendors, and large, leading BI vendors constantly leapfrogging each other with every major and minor release.
In terms of the amount of BI activity that Forrester sees from our clients (from inquiries, advisories, and consulting), there's no question that SAP BusinessObjects and IBM Cognos continue to dominate client interest. Over the past couple of years, Microsoft has typically taken third place, SAS fourth, and Oracle a distant fifth. But ever since the Siebel and Hyperion acquisitions, the landscape has been changing, and we now often see Oracle jumping into third place, sometimes leapfrogging even Microsoft in the level of monthly interest from Forrester clients.
Broadens the definition of metadata beyond “data on data” to include business rules, process models, application parameters, application rights, and policies.
Provides guidance to help evangelize the importance of metadata to the business, not by talking about metadata itself but by pointing out the value it provides in mitigating risks.
Recommends demonstrating to IT how metadata cuts across IT's internal siloed systems.
Advocates extending data governance to include metadata. The main impact of data governance should be to build a life cycle for metadata, but data governance evangelists pay little attention to metadata at this point.
I will co-author the next document on metadata with Gene Leganza; it will develop a next-practice metadata architecture based partly, but not solely, on a metadata exchange infrastructure. For a lot of people, metadata architecture is a Holy Grail. The upcoming document will demonstrate that metadata architecture will become an important step toward the trend called "industrialization of IT," sometimes also called "ERP for IT" or "Lean IT."
In preparation for this upcoming document, please share with us your own experiences in bringing more attention to metadata.
Google announced yesterday that it is buying ITA Software for $700 million. ITA does two main things: airline eCommerce and reservations management solutions and a cross-airline flight comparison tool called QPX, used by most of the major travel comparison Web sites including Kayak, Orbitz, and Microsoft Bing.
Google purchased it for the QPX product in a classic example of buying technology instead of either building it in-house or licensing it.
Today, Bing, Microsoft's search offering, provides a solution based on QPX to help customers search for flight information on the Bing website. Google has nothing comparable; instead, it directs customers to other travel-specific sites (see the screenshots below).
Google is focused on the goal of staying at least half a step ahead of Microsoft in all aspects of search technology; in order to stay ahead of Microsoft in this area, Google had three major options: 1) License the technology; 2) Build it themselves; 3) Buy ITA.
Licensing the technology would mean that Google would end up with a solution equivalent to Microsoft’s and not as robust as specialized Web sites like Kayak. Building the technology would take several years, allowing Microsoft and other competitors to continue to differentiate themselves and pull ahead.
This left the acquisition as the only viable path to regaining leadership in this area, while at the same time placing Microsoft in the awkward position of relying on Google-owned technology as the backend for one of their major search features.
Yesterday I attended Computacenter's Analyst Event. Computacenter is a major independent provider of IT infrastructure services in Europe, ranging from reselling hardware and software to managing data centers and providing outsourced desktop management. My main interest was how it manages the potential conflict between properly advising clients and maximizing revenue from selling software. For instance, clients often ask me whether it's dangerous to have their value-added reseller (VAR) advise them on license management, in case the reseller tips off its vendors about a potential source of license revenue.
An excellent customer case study at the event provided another example. A UK water company engaged Computacenter to implement a new desktop strategy involving 90% fully virtualized thin clients. Such a project creates major licensing challenges on both the desktop and server sides, because the software companies haven’t enhanced their models to properly cope with this scenario. The VAR’s dilemma is whether to design a solution that will be cheapest for the customer or one that will be most lucrative for itself.
As we said in our recent report “Refresher Course: Hiring VARs,” sourcing managers should decide whether they want their VARs to provide design and integration services like these or merely process orders at a minimum margin.
Computacenter will do either, but they clearly want to do more of the VA part and less (proportionately) of the R. So, according to their executives, they have no hesitation doing what is best for the customer even if it reduces their commission in the short term. But they didn’t think many of their competitors would take the same view.
So why is PC power management important to IBM customers?
While IBM already offers its customers energy-efficient servers and its "Tivoli Monitoring for Energy Management" software for the data center, bigger opportunities for savings exist across distributed IT assets like PCs, monitors, phones, and printers. In fact, Forrester finds that distributed IT assets consume 55% of IT's total energy footprint, versus only 45% in the data center. And these savings can add up: for example, BigFix cites a large US public school district with 80,000 PCs that saved $2.1 million in annual energy costs (or about $26 per PC per year) using BigFix's Power Management software.
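As a quick sanity check on the cited figures (a trivial sketch; the PC count and savings total come from the case study above, and the per-PC number is simply their quotient):

```python
# Back-of-the-envelope check of the BigFix case study numbers:
# 80,000 PCs saving $2.1 million per year in energy costs.
pc_count = 80_000
annual_savings_usd = 2_100_000

savings_per_pc = annual_savings_usd / pc_count
print(f"Savings per PC per year: ${savings_per_pc:.2f}")
# prints: Savings per PC per year: $26.25
```

Rounded to the nearest dollar, that matches the roughly $26 per PC per year the case study cites.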
Russian president Dmitry Medvedev toured Silicon Valley last week, meeting with tech vendor executives and local entrepreneurs. At Cisco, Medvedev participated in a telepresence session and used a Flip video camera for the first time. He met with representatives of public organizations and academic and business circles at Stanford University. And, more informally, AmBAR, the American Business Association of Russian-Speaking Professionals, hosted a session in a café in Palo Alto with local students and entrepreneurs. In each setting, the Russian president hoped to gain an understanding of what makes Silicon Valley tick and glean some of the best practices developed in the region to take home and apply to his new Skolkovo initiative. He has been talking about diversifying the economy for some time, but with this initiative he plans to develop a Russian "Silicon Valley" just outside of Moscow. This new "inno-grad" (from "innovation" and the Russian word for city – think Leningrad) will advance Medvedev's new modernization priorities, including advancements in IT and telecommunications as well as biomedical and nuclear technologies. He aims to attract local and foreign high-tech companies with infrastructure, tax incentives, and other government support.
As I discuss with clients the developing notions of Forrester's Business Capability Architecture (see blog post #1 and blog post #2), I have found it important to distinguish between different levels of scope for technology strategy. The primary distinctions have to do with (a) the degree to which a strategy (and the architecture it promulgates) aims to account for future change and (b) the breadth of business and technology scenarios addressed by the strategy.
Thus, I define a four-part technology strategy taxonomy along a two-dimensional continuum with (a) and (b) as axes (in other words, the four parts are archetypes that will occur in varying degrees of purity), to wit:
Type 1: project strategy for successful solution delivery. With Type 1 strategy, the goal is simply to achieve successful project delivery. It is beyond the strategy’s scope to consider anything not necessary to deliver a solution that operates according to immediate business requirements. Future changes to the business and future changes in technology are not considered by the strategy (unless explicitly documented in the requirements). The classic case for a Type 1 strategy is when an organization definitively knows that the business scenario addressed by the solution is short-lived and will not encounter significant business or technology change during the solution’s lifetime (history argues that this is a risky assumption, yet sometimes it is valid).
With Microsoft's fiscal year coming to a close today, I wanted to spend some time focusing on its future licensing direction. Windows Intune is a significant offering from Microsoft that blends cloud-based management, on-premises tools from the Microsoft Desktop Optimization Pack (MDOP), and Windows itself – as a subscription service. Let's put Intune aside for a moment.
Like all software vendors, Microsoft is keen on pulling customers into an annuity relationship for its offerings – a dependable revenue stream that isn't as vulnerable to economic downturns or anything else that might delay a purchase. When Microsoft first introduced the Software Assurance (SA) program, it primarily offered just upgrade rights – while a license was covered under SA, you had the right to deploy any new versions released during that time. Over the years, Microsoft has refined the program, adding different benefits to incent customers to join – but the primary source of value has remained upgrade rights. Unlike other vendors, Microsoft has always included security patches and updates as part of a license, so its "software maintenance" program has always been something a little different.
I recently finished the draft of my report on the ecosystem of innovation services providers. This report, to be published in July, explores the landscape of companies that are unified by a single purpose: they are dedicated to helping their clients unleash their own innovation potential. These are not companies that simply use "innovation" as a marketing buzzword. Rather, they are dedicated to the discipline of innovation – and bring unique innovation expertise to clients in a wide variety of corporate roles. This report builds on much of Forrester's previous work related to Innovation Networks and Innovation Management, but expands the "ecosystem" to consider all of the companies I interact with that have a distinct innovation focus. In the report, I explore the offerings of:
Strategy consulting organizations
Technology service providers
Product management firms
Outsourced product development firms
Idea management/solution generation companies
Other niche service providers (including training programs, design firms, and others)
I argue that this ecosystem of providers will be an increasingly important part of a comprehensive innovation strategy. However, it will be up to very knowledgeable and "connected" individuals within companies to help manage the diverse players and connect suppliers to the right role at the right point in the innovation process. I also argue that this is an opportunity for SVM professionals who want to play a more strategic role in their organizations.