There has been a great deal of talk over the past few years about what acronym will replace WCM (web content management). Web experience management? Web site management? Web engagement management? Web experience optimization? The list goes on and on.
Certainly, the evolution of the WCM term makes sense on paper, since traditional content management functionality now makes up only a portion of what WCM vendors offer. WCM vendors are also in the content delivery/engagement business and are even dipping their toes into web intelligence. However, Forrester clients still overwhelmingly ask about “WCM,” and that term isn’t going away any time soon.
But even without changing the acronym, it is time to start thinking about WCM beyond just managing content or siloed websites or experiences. Instead, we need to think of how WCM will interact and integrate with other solutions – like search, recommendations, eCommerce, and analytics – in the customer experience management (CXM) ecosystem in order to enable businesses to manage experiences across customer touchpoints.
How are we handling this convergence at Forrester? Several of us who cover various CXM products – like Brian Walker (commerce), Bill Band (CRM), Joe Stanhope (web analytics), and myself (WCM) – teamed up to outline what our vision of CXM looks like, including process-based tools, delivery platforms, and customer intelligence. We've created two versions of the report: one written for Content & Collaboration professionals and one for eBusiness & Channel Strategy professionals.
I need your help. I am conducting research into business intelligence (BI) software prices: averages, differences between license and subscription deals, differences between small and large vendor offerings, etc. To help our clients look beyond just the software prices and consider the fully loaded total cost of ownership, I also want to factor in service and hardware costs (I already have data on annual maintenance and initial training costs). I’ve been in this market long enough to understand that the only correct answer is “It depends” — on the levels of data complexity, data cleanliness, use cases, and many other factors. But if I could pin you down to a ballpark formula for budgeting and estimation purposes, what would it be? Here are my initial thoughts, based on experience, other relevant research, etc.:
Initial hardware as a percentage of software cost = 33% to 50%
Ongoing hardware maintenance = 20% of the initial hardware cost
Initial design, build, and implementation services. Our rule of thumb has always been 300% to 700% of the software cost, but that obviously varies by deal size. So here’s what I came up with:
Less than $100,000 in software = 100% in services
$100,000 to $500,000 in software = 300% in services
$500,000 to $2 million in software = 200% in services
$2 million to $10 million in software = 50% in services
More than $10 million in software = 25% in services
Then 20% of the initial software cost for ongoing maintenance, enhancements, and support
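Pulling the rules of thumb above into one place, here is a minimal sketch of a fully loaded TCO calculator. The `estimate_bi_tco` helper is hypothetical, the 40% hardware figure is an assumed midpoint of the 33%-to-50% range, and the service tiers are the proposed planning numbers above, not validated benchmarks:

```python
def estimate_bi_tco(software_cost, years=3,
                    hw_pct=0.40,         # initial hardware: 33%-50% of software (40% assumed)
                    hw_maint_pct=0.20,   # ongoing hardware maintenance: 20% of initial hardware
                    sw_maint_pct=0.20):  # ongoing maintenance/enhancements: 20% of software
    """Ballpark fully loaded BI TCO using the rules of thumb above."""
    # Services multiplier tiers by software deal size, as proposed above.
    if software_cost < 100_000:
        services_pct = 1.00
    elif software_cost < 500_000:
        services_pct = 3.00
    elif software_cost < 2_000_000:
        services_pct = 2.00
    elif software_cost < 10_000_000:
        services_pct = 0.50
    else:
        services_pct = 0.25

    hardware = software_cost * hw_pct
    services = software_cost * services_pct
    annual_ongoing = hardware * hw_maint_pct + software_cost * sw_maint_pct
    return software_cost + hardware + services + annual_ongoing * years

# Example: a $250K software deal, estimated over three years.
print(f"${estimate_bi_tco(250_000):,.0f}")  # prints "$1,310,000"
```

Even in this toy form, the tiers make the point that services, not software, dominate the fully loaded cost of mid-sized deals.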
Thoughts? Again, I am not looking for “it depends” answers, but rather for some numbers and ranges based on your experience.
Forrester is in the middle of a major research effort on various Big Data-related topics. As part of this research, we’ll be kicking off a client survey shortly. I’d like to solicit everyone’s input on the survey questions and answer options. Here’s the first draft. What am I missing?
Scope. What is the scope of your Big Data initiative?
Status. What is the status of your Big Data initiative?
Industry. Are the questions you are trying to address with your Big Data initiative general or industry-specific?
Domains. What enterprise areas does your Big Data initiative address?
Why Big Data? What are the main business requirements, or inadequacies of earlier-generation BI/DW/ETL technologies, applications, and architectures, that are causing you to consider or implement Big Data?
Velocity of change and scope/requirements unpredictability
Analysis-driven requirements (Big Data) vs. requirements-driven analysis (traditional BI/DW)
Cost. Big Data solutions are less expensive than traditional ETL/DW/BI solutions
The SAP BusinessObjects (BO) 4.0 suite is here. It’s been in the ramp-up phase since last fall; according to our sources, SAP plans to announce its general availability sometime in May, possibly at Sapphire. It’s about a year late (SAP first told Forrester that it planned to roll the suite out in the spring of 2010, which is why I wanted to include it in the latest edition of the Forrester Wave™ for enterprise BI platforms but couldn’t), and the big question is: Was it worth the wait? In my humble opinion, yes, it was! Here are seven major reasons to upgrade, or to consider SAP BI if you haven’t done so before:
BO Universe (semantic layer) can now be sourced from multiple databases, overcoming a major obstacle of previous versions.
Universe can now access MOLAP (cubes from Microsoft Analysis Services, Essbase, Mondrian, etc.) data sources directly via MDX without having to “flatten them out” first. In prior versions, Universe could only access SQL sources.
There’s now a more common look and feel across individual BI products, including Crystal, WebI, Explorer, and Analysis (formerly BEx). This is another step in the right direction toward unifying SAP BI products, but it’s still not a complete solution: it will be a while before all SAP BI products are fully and seamlessly integrated with each other and with the BI tools/platforms that did not grow up organically within the suite.
All SAP BI tools that previously lacked access to the BO Universe, including Xcelsius (renamed Dashboards in 4.0), now have it.
There’s now a tighter integration with BW via direct exposure of BW metadata (BEx queries and InfoProviders) to all BO tools.
Forrester continues to see ever-increasing levels of interest in and adoption of business intelligence (BI) platforms, applications, and processes. But while BI maturity in enterprises continues to grow, and BI tools have become more function-rich and robust, the promise of efficient and effective BI solutions remains challenging at best and elusive at worst. Why? Two main reasons: First, BI is all about best practices and lessons learned, which only come with years of experience; and second, earlier-generation BI approaches cannot easily keep up with ever-changing business and regulatory requirements. In the attached research document, Forrester reviews the top best practices for BI and predicts what the next-generation BI technologies will be. We summarize all of this in a single über-trend and best practice: agility. IT and business pros should adopt Agile BI processes, technologies, and architectures to improve their chances of delivering successful BI initiatives.
Business intelligence (BI) software has emerged as a hot topic in the past few years; in 2011, most companies will again focus their software investment plans on BI. More than 49% of the companies that responded to our most recent Forrsights Software Survey have concrete plans to implement or expand their use of BI software within the next 24 months. But being interested in BI software and spending money to adopt BI tools and processes do not necessarily translate into successful implementations: Forrester’s most recent BI maturity survey indicated that enterprise BI maturity levels are still below average (2.75 on a scale of 5, a modest 6% increase over 2009). Why are BI maturity levels so low, given the amount of money firms spend on BI? Three factors contribute to this gap and can lead to less-than-successful BI initiatives:
Implementing BI requires using best practices and building upon lessons learned.
Mobile devices and mobile Internet are everywhere. Over the past few years, Forrester has tracked continuously increasing levels of adoption and maturity for mobile business applications, but not so for mobile business intelligence (BI) applications. The adoption and maturity of mobile BI fall behind other mobile enterprise applications for multiple reasons, mainly the lack of specific business use cases and tangible ROI, as well as inadequate smartphone screen and keyboard form factors. However, larger form factor devices such as tablets and innovative approaches to online/offline BI technical architecture will boost mobile BI adoption and maturity in the near future. BP professionals must start evaluating and prototyping mobile BI platforms and applications to make sure that all key business processes and relevant information are available to knowledge workers wherever they are.
But mobile BI adoption levels are still low. Why? We see three major reasons.
Smartphones still lack the form factor appropriate for BI
The business case for mobile BI remains tough to build
Mobile device security is still a concern
Now, mobile tablet devices are a different story. Just like Baby Bear's porridge in the "Goldilocks and the Three Bears" fairy tale, tablet PCs are "just right" for mobile BI end users. So what can you do with mobile BI? Plenty!
Improve customer and partner engagement
Deliver BI in the right place, at the right time
Introduce BI to workers without access to traditional BI applications
Improve BI efficiency via query relevance
Improve "elevator pitch" effectiveness
Give away mobile devices as an incentive to cross-sell and upsell analytic applications
I get many inquiries about the differences between, and the pros and cons of, MOLAP versus ROLAP architectures for analytics and BI. In the old days, the differences between MOLAP, DOLAP, HOLAP, and ROLAP were pretty clear. Today, given modern scalability requirements, DOLAP has all but disappeared, and the lines between MOLAP, ROLAP, and HOLAP are getting murkier and murkier. Here are some of the reasons:
Some RDBMSes (Oracle, DB2, Microsoft) offer built-in OLAP engines, often eliminating a need to have a separate OLAP engine in BI tools.
Some of the DW-optimized DBMSes, like Teradata, Sybase IQ, and Netezza, partially eliminate the need for an OLAP engine via aggregate indexes, columnar architecture, or brute-force table scans.
MOLAP engines like Microsoft SSAS and Oracle Essbase can do drill-throughs to detailed transactions.
Semantic layers like SAP BusinessObjects Universe have some OLAP-like functionality.
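To make the architectural distinction concrete, here is a toy sketch, purely illustrative and not modeled on any vendor's engine: a MOLAP-style engine pre-aggregates measures into cube cells at load time, while a ROLAP-style engine computes the same aggregates against relational rows at query time.

```python
from collections import defaultdict

# Toy fact table: (region, product, revenue)
FACTS = [
    ("East", "Widgets", 100), ("East", "Gadgets", 150),
    ("West", "Widgets", 200), ("West", "Gadgets", 250),
]

def build_molap_cube(facts):
    """MOLAP-style: pre-aggregate every (region, product) cell at load time."""
    cube = defaultdict(int)
    for region, product, revenue in facts:
        cube[(region, product)] += revenue   # leaf cell
        cube[(region, None)] += revenue      # rollup over product
        cube[(None, product)] += revenue     # rollup over region
        cube[(None, None)] += revenue        # grand total
    return dict(cube)

def rolap_query(facts, region=None, product=None):
    """ROLAP-style: scan and aggregate the relational rows at query time."""
    return sum(rev for r, p, rev in facts
               if (region is None or r == region)
               and (product is None or p == product))

cube = build_molap_cube(FACTS)
# Both architectures answer the same question; MOLAP trades load-time work
# and storage for fast lookups, ROLAP trades query-time compute for freshness.
assert cube[("East", None)] == rolap_query(FACTS, region="East") == 250
```

The blurring described above happens exactly here: once the database itself maintains the rollups (materialized views, aggregate indexes) or scans fast enough to skip them, the separate MOLAP engine loses its main advantage.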
Are you interested in business intelligence? Do you wonder about the future of the analytics market or have a question about advanced analytics technologies?
Then join Forrester analysts Rob Karel, Boris Evelson, Clay Richardson, Gene Leganza, Noel Yuhanna, Leslie Owens, Suresh Vittal, William Frascarelli, David Frankland, Joe Stanhope, Zach Hofer-Shall, Henry Peyret, and me for an interactive TweetJam on Twitter about the state of advanced analytics on Wednesday, December 15, 2010, from 12:00 p.m. to 1:00 p.m. EST (18:00 – 19:00 CET), using the Twitter hashtag #dmjam. We’ll share the results of our recent research on the analytics market and discuss how it will change as new technologies enter the scene and mature over time.
Business intelligence is the fastest-growing software market today, as companies drive business results based on deeper insights and better planning, and advanced analytics is the spearhead of BI technologies that can unlock new dimensions of business performance. But what exactly is ‘advanced’ analytics, what technologies are available, and how can you use them efficiently?
Much more detailed information can be found on the blog of Forrester analyst James Kobielus, who will lead us through the discussion during the TweetJam. The overview graphic above, taken from his blog, lists the different elements of advanced analytics today.
Here are some of the questions we want to debate during our TweetJam discussion:
What exactly is and isn’t advanced analytics?
What are the chief business applications of advanced analytics?
A number of clients ask me, "How many people do you think use BI?" It's not an easy question to answer, it will never be an exact science, and any answer comes with many caveats. But here we go:
First, let's assume that we are only talking about what we all consider "traditional BI" apps. Let's exclude homegrown apps built using spreadsheets and desktop databases. Let's also exclude operational reporting apps that are embedded in ERP, CRM, and other applications.
Then, let's cut out everyone who only gets the results of a BI report/analysis in static form, such as a hard copy or a non-interactive PDF file. If you're not creating, modifying, viewing via a portal, sorting, filtering, ranking, drilling, etc., you probably do not require a BI product license, and I am not counting you.
I'll just attempt to do this for the US for now. If the approach works, we'll try it for other major regions and countries.
The number of businesses with over 100 employees (a reasonable cutoff for a business size that would consider using what we define as traditional BI) in the US in 2004 was 107,119.
The US Dept. of Labor provides employment ranges, as in "firms with 500-749 employees." For each range I take the midpoint; for the last range, "firms with over 10,000 employees," I use an average of 15,000 employees.
This gives us 66 million (66,595,553) workers employed by US firms that could potentially use BI.
Next, we take the data from our latest BDS numbers on BI, which tell us that 54% of firms are using BI. That gives us 35 million (35,961,598) workers employed by US firms that use BI.
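The arithmetic above can be reproduced in a few lines, strictly as a sanity check on the figures quoted; the inputs are the Dept. of Labor midpoint estimate and the BDS adoption rate cited above, not new data:

```python
# Back-of-the-envelope US BI user estimate, reproducing the steps above.
WORKERS_AT_FIRMS_OVER_100 = 66_595_553  # midpoint method over Dept. of Labor size ranges
BI_ADOPTION_RATE = 0.54                 # share of firms using BI, per the BDS data cited

workers_at_bi_firms = int(WORKERS_AT_FIRMS_OVER_100 * BI_ADOPTION_RATE)
print(f"{workers_at_bi_firms:,}")  # prints "35,961,598"
```

Note the caveat baked into this step: 54% is a firm-level adoption rate applied to a worker count, so it assumes BI-adopting firms have roughly average headcount.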