Last year, Web inventor Tim Berners-Lee called for access to raw data as the next step in the evolution of the Web. Apparently Transport for London (TFL, UK) was listening: it has recently opened its doors to commercial use of a large number of primary data sets and live feeds. The newly available data includes tube and train traffic data, feeds from live traffic cameras, Oyster card top-up locations, pier and station locations, cycle hire locations, and riverboat timetables. Following this up, TFL has announced plans to release further information on bus stops, routes, timetables, and schedules. Access to this data represents an opportunity for developers to create travel applications based on real-time information; in one such example, a web-based mash-up plots the approximate position of every underground train. While this is interesting to Londoners, who may be able to navigate their morning commute a little better (there's still no escaping the inevitable squeeze on the Central Line), we see it as a compelling move by TFL to give developers access to the same data it uses to power its own information boards.
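To make the opportunity concrete, here is a minimal Python sketch of the kind of mash-up described above: it polls a live train prediction feed and prints approximate train locations. The feed URL, line code, and XML element and attribute names are assumptions for illustration only; consult TFL's developer documentation for the actual endpoints, registration requirements, and schema.

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical prediction feed for a single line ("C" assumed to mean Central Line).
# The real endpoint, parameters, and schema are defined by TFL's developer terms.
FEED_URL = "http://api.example-tfl.gov.uk/trackernet/PredictionSummary/C"

def fetch_train_positions(url=FEED_URL):
    """Poll the feed and return (station, platform, location) tuples for each train."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    root = ET.fromstring(response.content)

    positions = []
    # Element and attribute names (S/P/T, N, Location) are assumed for illustration.
    for station in root.iter("S"):
        for platform in station.iter("P"):
            for train in platform.iter("T"):
                positions.append((station.get("N", "?"),
                                  platform.get("N", "?"),
                                  train.get("Location", "unknown")))
    return positions

if __name__ == "__main__":
    for station, platform, location in fetch_train_positions():
        print(f"{station:<30} {platform:<25} {location}")
```

A real application would refresh this on a timer and plot the results on a map, which is essentially what the mash-up mentioned above does with TFL's live data.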
The "smart money" seems to be betting against SAP. I hear all the time about the company's bleak prospects for the future. A client conversation last week reminded me of how strong SAP’s position is, despite its many issues.
This client, a worldwide manufacturer, is investing hundreds of millions of dollars in SAP software for its worldwide supply chain, financial management and reporting, inventory and order management, etc. The new SAP environment will replace hundreds of disparate applications and, ideally, result in far more efficient operations, far better visibility into operations, and far more uniform products around the world. The members of this client’s SAP implementation team have finished SAP implementation marathons before (at other employers). They know the good, the bad, the ugly.
At this manufacturer, SAP is sticky for four reasons.
We all know that the war against the proliferation of spreadsheets (as BI or any other application) in enterprises has been fought and lost. Gone are the days when BI and performance management vendors' Web sites carried a "let us come in and help you get rid of your spreadsheets" message in big bold letters on their front pages. In my personal experience (implementing hundreds of BI platforms and solutions), the more BI apps you deliver, the more spreadsheets you end up with. Rolling out a BI application often just gives someone an easier way to access data and export it to a spreadsheet. Even though some in-memory analytics tools are beginning to chip away at the main reasons spreadsheets are so ubiquitous in BI (self-service BI with no modeling or analysis constraints, and little to no reliance on IT), spreadsheets for BI are here to stay for a long, long, long time.
With that in mind, let me offer a few best practices for controlling and managing (not getting rid of!) spreadsheets as a BI tool:
Create a spreadsheet governance policy. Make it flexible – if it’s not, people will fight it. Here are a few examples of such policies:
- Spreadsheets can be used for reporting and analysis that support processes confined to individuals or small workgroups, rather than cross-functional or cross-enterprise processes
- Spreadsheets can be used for reporting and analysis that are not part of mission-critical processes
Whoever said the BI market is commoditizing, consolidating, and getting very mature? Nothing could be further from the truth. On the buy side, Forrester still sees tons of less-than-successful BI environments, applications, and implementations, as demonstrated by Forrester's recent BI Maturity survey. On the vendor/sell side, Forrester also sees a flurry of activity, with startups, small vendors, and large, leading BI vendors constantly leapfrogging each other with every major and minor release.
In terms of the amount of BI activity Forrester sees from our clients (inquiries, advisories, and consulting), there's no question that SAP BusinessObjects and IBM Cognos continue to dominate client interest. Over the past couple of years, Microsoft has typically taken third place, SAS fourth, and Oracle a distant fifth. But ever since the Siebel and Hyperion acquisitions, the landscape has been changing, and we now often see Oracle jumping into third place, sometimes leapfrogging even Microsoft in the level of monthly interest from Forrester clients.
- Broadens the definition of metadata beyond "data about data" to include business rules, process models, application parameters, application rights, and policies.
- Provides guidance to help evangelize the importance of metadata to the business, not by talking about metadata itself but by pointing out the value it provides in mitigating risks.
- Recommends demonstrating to IT how metadata cuts across IT's internal siloed systems.
- Advocates extending data governance to include metadata. The main impact of data governance should be to build the life cycle for metadata, but data governance evangelists pay little attention to metadata at this point.
I will co-author the next document on metadata with Gene Leganza; it will develop a next-practice metadata architecture based partially, but not exclusively, on a metadata exchange infrastructure. For a lot of people, metadata architecture is a Holy Grail. The upcoming document will demonstrate that metadata architecture is an important step in enabling the trend called "industrialization of IT," sometimes also called "ERP for IT" or "Lean IT."
In preparation for this upcoming document, please share with us your own experiences in bringing more attention to metadata.
Google announced yesterday that it is buying ITA Software for $700 million. ITA does two main things: it provides airline eCommerce and reservations management solutions, and it offers QPX, a cross-airline flight comparison tool used by most of the major travel comparison Web sites, including Kayak, Orbitz, and Microsoft Bing.
Google purchased it for the QPX product in a classic example of buying technology instead of either building it in-house or licensing it.
Today, Bing, Microsoft's search offering, provides a flight search solution based on QPX to help customers find flight information on the Bing Web site. Google has nothing comparable; instead, it directs customers to other travel-specific sites (see the screenshots below).
Google is focused on staying at least half a step ahead of Microsoft in all aspects of search technology, and in this area it had three major options: 1) license the technology; 2) build it in-house; 3) buy ITA.
Licensing the technology would mean that Google would end up with a solution equivalent to Microsoft’s and not as robust as specialized Web sites like Kayak. Building the technology would take several years, allowing Microsoft and other competitors to continue to differentiate themselves and pull ahead.
This left the acquisition as the only viable path to taking the lead in this area, while at the same time placing Microsoft in the awkward position of relying on Google-owned technology as the back end for one of its major search features.
Yesterday I attended Computacenter's analyst event. Computacenter is a major independent provider of IT infrastructure services in Europe, with offerings ranging from reselling hardware and software to managing data centers and providing outsourced desktop management. My main interest was how it manages the potential conflict between properly advising the client and maximizing revenue from selling software. For instance, clients often ask me whether it's dangerous to have their value-added reseller (VAR) advise them on license management, in case the reseller tips off its vendors about a potential source of license revenue.
An excellent customer case study at the event provided another example. A UK water company engaged Computacenter to implement a new desktop strategy involving 90% fully virtualized thin clients. Such a project creates major licensing challenges on both the desktop and server sides, because the software companies haven’t enhanced their models to properly cope with this scenario. The VAR’s dilemma is whether to design a solution that will be cheapest for the customer or one that will be most lucrative for itself.
As we said in our recent report “Refresher Course: Hiring VARs,” sourcing managers should decide whether they want their VARs to provide design and integration services like these or merely process orders at a minimum margin.
Computacenter will do either, but they clearly want to do more of the VA part and less (proportionately) of the R. So, according to their executives, they have no hesitation doing what is best for the customer even if it reduces their commission in the short term. But they didn’t think many of their competitors would take the same view.
So why is PC power management important to IBM customers?
While IBM already offers its customers energy-efficient servers and its "Tivoli Monitoring for Energy Management" software for the data center, bigger opportunities for savings exist across distributed IT assets like PCs, monitors, phones, and printers. In fact, Forrester finds that distributed IT assets consume 55% of IT's total energy footprint, versus only 45% in the data center. And these savings can add up: for example, BigFix cites a large US public school district with 80,000 PCs that is saving $2.1 million in annual energy costs (about $26 per PC per year) using BigFix's Power Management software.
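As a quick back-of-the-envelope check of those case-study figures (the numbers are the ones BigFix cites; the calculation itself is purely illustrative):

```python
# Sanity check of the reported BigFix case-study savings.
pcs = 80_000                    # PCs in the school district
annual_savings_usd = 2_100_000  # reported annual energy savings

per_pc = annual_savings_usd / pcs
print(f"${per_pc:.2f} per PC per year")  # ~$26.25, consistent with the ~$26 figure
```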
Russian president Dmitry Medvedev toured Silicon Valley last week, meeting with tech vendor executives and local entrepreneurs. At Cisco, Medvedev participated in a telepresence session and used a Flip video camera for the first time. He met with representatives of public organizations and academic and business circles at Stanford University. And, more informally, AmBAR, the American Business Association of Russian-Speaking Professionals, hosted a session in a Palo Alto café with local students and entrepreneurs. In each setting, the Russian president hoped to understand what makes Silicon Valley tick and glean some of the best practices developed in the region to take home and apply to his new Skolkovo initiative. He has been talking about diversifying the economy for some time, but with this initiative he plans to develop a Russian "Silicon Valley" just outside of Moscow. This new "inno-grad" (from "innovation" and the Russian word for city; think Leningrad) will promote Medvedev's modernization priorities, including advances in IT, telecommunications, biomedical, and nuclear technologies. He aims to attract local and foreign high-tech companies with infrastructure, tax incentives, and other government support.
As I discuss with clients the developing notions of Forrester's Business Capability Architecture (see blog post #1 and blog post #2), I have found it important to distinguish between different levels of scope for technology strategy. The primary distinctions have to do with (a) the degree to which a strategy (and the architecture it promulgates) aims to account for future change and (b) the breadth of business and technology scenarios addressed by the strategy.
Thus, I define a four-part technology strategy taxonomy along a two-dimensional continuum with (a) and (b) as axes (in other words, the four parts are archetypes that will occur in varying degrees of purity), to wit:
Type 1: project strategy for successful solution delivery. With Type 1 strategy, the goal is simply to achieve successful project delivery. It is beyond the strategy’s scope to consider anything not necessary to deliver a solution that operates according to immediate business requirements. Future changes to the business and future changes in technology are not considered by the strategy (unless explicitly documented in the requirements). The classic case for a Type 1 strategy is when an organization definitively knows that the business scenario addressed by the solution is short-lived and will not encounter significant business or technology change during the solution’s lifetime (history argues that this is a risky assumption, yet sometimes it is valid).