Marketing mix modeling solutions have been around for quite some time, providing marketers in several key categories with complex statistical models that aim to find the correlation between past marketing activities and business outcomes, like sales or market share.
However, this space has recently seen significant change, driven by a few specific dynamics:
The proliferation of digital and social media, which play an increasingly important role in the marketing mix.
Marketers' increased demand for tools that are not only able to deliver insights on past campaigns but also able to give forward-looking recommendations on how to improve marketing return on investment (ROI) in the future.
The rising role that sophisticated software plays in integrating the ever-growing number of data streams and in enabling complex analysis to be navigated and customized via powerful graphic user interfaces.
To help navigate this complex and highly relevant space for senior marketers, our research team has published the first Forrester Wave™ for vendors in the marketing mix modeling space. We screened more than 30 vendors, shortlisted six that we consider to be the key players in this very fragmented market, and ranked them according to more than 40 different criteria. The evaluation uncovered a market in which:
MarketShare, Marketing Management Analytics, and ThinkVine lead the pack.
SymphonyIRI is a Leader but lacks collaborative functionality.
Marketing Analytics and Ninah are competitive Strong Performers.
Yesterday, HP agreed to buy UK software firm Autonomy Corp. for $10 billion to move into the enterprise information management (EIM) software business. HP wants to add IP to its portfolio, build next-generation information platforms, and create a vehicle for services. It is following IBM’s strategy of acquiring software to sell alongside its hardware and services. With Autonomy under its wing, HP plans to help enterprises with a big, complicated problem – how to manage unstructured information for competitive advantage. Here’s the wrinkle – Autonomy hasn’t solved that problem. In fact, it’s not a pure technology problem, because content is so different from data. It’s a people and process problem, too.
Here is the Autonomy overview that HP gave investors yesterday:
Of course, this diagram doesn’t look like the heterogeneous environment of a typical multinational enterprise. Autonomy has acquired many companies to fill in the boxes here, but the reality is that companies have products from a smorgasbord of content management vendors but no incentive to stick with any one of them.
There has been a great deal of talk over the past few years about what acronym will replace WCM (web content management). Web experience management? Web site management? Web engagement management? Web experience optimization? The list goes on and on.
Certainly, the evolution of the WCM term makes sense on paper, since traditional content management functionality now makes up only a portion of what WCM vendors offer. WCM vendors are also in the content delivery/engagement business, and they are even dipping their toes into web intelligence. However, Forrester clients still overwhelmingly ask about “WCM,” and that term isn’t going away any time soon.
But even without changing the acronym, it is time to start thinking about WCM beyond just managing content or siloed websites or experiences. Instead, we need to think of how WCM will interact and integrate with other solutions – like search, recommendations, eCommerce, and analytics – in the customer experience management (CXM) ecosystem in order to enable businesses to manage experiences across customer touchpoints.
How are we handling this convergence at Forrester? Several of us who cover various CXM products – like Brian Walker (commerce), Bill Band (CRM), Joe Stanhope (web analytics), and myself (WCM) – teamed up to outline what our vision of CXM looks like, including process-based tools, delivery platforms, and customer intelligence. We've created two versions of the report: one written for Content & Collaboration professionals and one for eBusiness & Channel Strategy professionals.
I need your help. I am conducting research into business intelligence (BI) software prices: averages, differences between license and subscription deals, differences between small and large vendor offerings, etc. In order to help our clients look beyond just the software prices and consider the fully loaded total cost of ownership, I also want to throw in service and hardware costs (I already have data on annual maintenance and initial training costs). I’ve been in this market long enough to understand that the only correct answer is “It depends” — on the levels of data complexity, data cleanliness, use cases, and many other factors. But, if I could pin you down to a ballpark formula for budgeting and estimation purposes, what would that be? Here are my initial thoughts — based on experience, other relevant research, etc.
Initial hardware as a percentage of software cost = 33% to 50%
Ongoing hardware maintenance = 20% of the initial hardware cost
Initial design, build, and implementation services. Our rule of thumb has always been 300% to 700% of software costs, but that obviously varies by deal size. So here’s what I came up with:
Less than $100,000 in software = 100% in services
$100,000 to $500,000 in software = 300% in services
$500,000 to $2 million in software = 200% in services
$2 million to $10 million in software = 50% in services
More than $10 million in software = 25% in services
Then 20% of the initial software cost for ongoing maintenance, enhancements, and support
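Pulled together, the rules of thumb above imply a rough TCO calculator along these lines. Note the assumptions I had to make for the sketch: half-open tier boundaries, a 40% hardware midpoint from the 33%-50% range, and a three-year horizon. None of these are settled numbers.

```python
def bi_tco_estimate(software_cost, years=3, hw_pct=0.40):
    """Ballpark BI total cost of ownership from the rules of thumb above.

    hw_pct: initial hardware as a share of software cost (the post gives a
    33%-50% range; 0.40 is an arbitrary midpoint chosen for illustration).
    """
    # Tiered services multiple from the draft formula; half-open boundaries
    # are my assumption, since the tiers in the post overlap at the edges.
    if software_cost < 100_000:
        services_pct = 1.00
    elif software_cost < 500_000:
        services_pct = 3.00
    elif software_cost < 2_000_000:
        services_pct = 2.00
    elif software_cost < 10_000_000:
        services_pct = 0.50
    else:
        services_pct = 0.25

    hardware = software_cost * hw_pct
    services = software_cost * services_pct
    # Ongoing: 20% of initial hardware (hardware maintenance) plus 20% of
    # initial software (maintenance, enhancements, and support), per year.
    ongoing = years * (0.20 * hardware + 0.20 * software_cost)
    return software_cost + hardware + services + ongoing
```

For example, under these assumptions a $300K software deal lands at roughly $1.57M over three years, with services dominating the bill, which is exactly why I want to push clients past sticker-price comparisons.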
Thoughts? Again, I am not looking for “it depends” answers, but rather for some numbers and ranges based on your experience.
Forrester is in the middle of a major research effort on various Big Data-related topics. As part of this research, we’ll be kicking off a client survey shortly. I’d like to solicit everyone’s input on the survey questions and answer options. Here’s the first draft. What am I missing?
Scope. What is the scope of your Big Data initiative?
Status. What is the status of your Big Data initiative?
Industry. Are the questions you are trying to address with your Big Data initiative general or industry-specific?
Domains. What enterprise areas does your Big Data initiative address?
Why Big Data? What are the main business requirements or inadequacies of earlier-generation BI/DW/ETL technologies, applications, and architecture that are causing you to consider or implement Big Data?
Velocity of change and scope/requirements unpredictability
Analysis-driven requirements (Big Data) vs. requirements-driven analysis (traditional BI/DW)
Cost. Big Data solutions are less expensive than traditional ETL/DW/BI solutions
SAP BusinessObjects (BO) 4.0 suite is here. It’s been in the ramp-up phase since last fall; according to our sources, SAP plans to announce its general availability sometime in May, possibly at Sapphire. It’s about a year late (SAP first told Forrester that it planned to roll it out in the spring of 2010, so I wanted to include it in the latest edition of the Forrester Wave™ for enterprise BI platforms but couldn’t), and the big question is: Was it worth the wait? In my humble opinion, yes, it was! Here are seven major reasons to upgrade or to consider SAP BI if you haven’t done so before:
BO Universe (semantic layer) can now be sourced from multiple databases, overcoming a major obstacle of previous versions.
Universe can now access MOLAP (cubes from Microsoft Analysis Services, Essbase, Mondrian, etc.) data sources directly via MDX without having to “flatten them out” first. In prior versions, Universe could only access SQL sources.
There’s now a more common look and feel across individual BI products, including Crystal, WebI, Explorer, and Analysis (formerly BEx). This is another step in the right direction toward unifying SAP BI products, but it’s still not a complete solution. It will be a while before all SAP BI products, along with the other BI tools/platforms that grew up outside the suite, are fully and seamlessly integrated.
All the SAP BI tools that previously lacked access to the BO Universe, including Xcelsius (renamed Dashboards in 4.0), now have it.
There’s now a tighter integration with BW via direct exposure of BW metadata (BEx queries and InfoProviders) to all BO tools.
Forrester continues to see ever-increasing levels of interest in and adoption of business intelligence (BI) platforms, applications, and processes. But while BI maturity in enterprises continues to grow, and BI tools have become more function-rich and robust, the promise of efficient and effective BI solutions remains challenging at best and elusive at worst. Why? Two main reasons: First, BI is all about best practices and lessons learned, which only come with years of experience; and second, earlier-generation BI approaches cannot easily keep up with ever-changing business and regulatory requirements. In the attached research document, Forrester reviews the top best practices for BI and predicts what the next-generation BI technologies will be. We summarize all of this in a single über-trend and best practice: agility. IT and business pros should adopt Agile BI processes, technologies, and architectures to improve their chances of delivering successful BI initiatives.
Business intelligence (BI) software has emerged as a hot topic in the past few years; in 2011, most companies will again focus their software investment plans on BI. More than 49% of the companies that responded to our most recent Forrsights Software Survey have concrete plans to implement or expand their use of BI software within the next 24 months. But being interested in BI software and spending money to adopt BI tools and processes do not necessarily translate into successful implementations: Forrester’s most recent BI maturity survey indicated that enterprise BI maturity levels are still below average (2.75 on a scale of 5, a modest 6% increase over 2009). Why are BI maturity levels so low, given the amount of money firms spend on BI? Three factors contribute to this gap and can lead to less-than-successful BI initiatives:
Implementing BI requires using best practices and building upon lessons learned.
Mobile devices and the mobile Internet are everywhere. Over the past few years, Forrester has tracked continuously increasing levels of adoption and maturity for mobile business applications, but not so for mobile business intelligence (BI) applications. The adoption and maturity of mobile BI fall behind other mobile enterprise applications for multiple reasons, mainly the lack of specific business use cases and tangible ROI, as well as inadequate smartphone screen and keyboard form factors. However, larger form factor devices such as tablets and innovative approaches to online/offline BI technical architecture will boost mobile BI adoption and maturity in the near future. Business process (BP) professionals must start evaluating and prototyping mobile BI platforms and applications to make sure that all key business processes and relevant information are available to knowledge workers wherever they are.
But mobile BI adoption levels are still low. Why? We see three major reasons.
Smartphones still lack the form factor appropriate for BI
The business case for mobile BI remains tough to build
Mobile device security is still a concern
Now, mobile tablet devices are a different story. Just like Baby Bear's porridge in the "Goldilocks and the Three Bears" fairy tale, tablets are "just right" for mobile BI end users. So what can you do with mobile BI? Plenty!
Improve customer and partner engagement
Deliver BI in the right place, at the right time
Introduce BI for the workers without access to traditional BI applications
Improve BI efficiency via query relevance
Improve "elevator pitch" effectiveness
Give away mobile devices as an incentive to cross-sell and upsell analytic applications
I get many inquiries on the differences and pros and cons of MOLAP versus ROLAP architectures for analytics and BI. In the old days, the differences between MOLAP, DOLAP, HOLAP, and ROLAP were pretty clear. Today, given modern scalability requirements, DOLAP has all but disappeared, and the lines between MOLAP, ROLAP, and HOLAP are getting murkier and murkier. Here are some of the reasons:
Some RDBMSes (Oracle, DB2, Microsoft) offer built-in OLAP engines, often eliminating a need to have a separate OLAP engine in BI tools.
Some of the DW-optimized DBMSes like Teradata, Sybase IQ, and Netezza partially eliminate the need for an OLAP engine with aggregate indexes, columnar architecture, or brute-force table scans.
MOLAP engines like Microsoft SSAS and Oracle Essbase can do drill-throughs to detailed transactions.
Semantic layers like SAP BusinessObjects Universe have some OLAP-like functionality.
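To make the underlying distinction concrete, here is a toy sketch (with made-up data) of the two approaches answering the same "revenue by region" question: ROLAP-style aggregation happens at query time against detail rows in a relational store, while MOLAP-style pre-aggregation builds cube cells up front so queries become lookups. This is a conceptual illustration, not how any particular vendor engine is implemented.

```python
import sqlite3

# Tiny fact table: (region, product, revenue). Illustrative data only.
rows = [
    ("east", "widget", 100), ("east", "gadget", 150),
    ("west", "widget", 200), ("west", "gadget", 250),
]

# ROLAP style: keep detail rows relational and aggregate at query time with
# SQL -- the pattern that DW-optimized DBMSes accelerate with aggregate
# indexes, columnar storage, or brute-force scans.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (region TEXT, product TEXT, revenue INT)")
db.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
rolap = {region: total for region, total in db.execute(
    "SELECT region, SUM(revenue) FROM sales GROUP BY region")}

# MOLAP style: pre-aggregate every (region, product) cell into a cube up
# front, then answer queries by rolling up stored cells instead of scanning
# detail rows.
cube = {}
for region, product, revenue in rows:
    cube[(region, product)] = cube.get((region, product), 0) + revenue
molap = {}
for (region, _product), revenue in cube.items():
    molap[region] = molap.get(region, 0) + revenue

print(rolap == molap)  # both approaches yield the same answer
```

The blurring I describe above is visible even in this sketch: once an RDBMS builds and maintains the pre-aggregated cells itself (materialized views, built-in OLAP engines), the ROLAP/MOLAP line is mostly a question of where the cube lives.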