Forrester continues to see ever-increasing interest in and adoption of business intelligence (BI) platforms, applications, and processes. But while enterprise BI maturity continues to grow, and BI tools have become more function-rich and robust, delivering efficient and effective BI solutions remains challenging at best and elusive at worst. Why? Two main reasons: First, BI success depends on best practices and lessons learned, which come only with years of experience; and second, earlier-generation BI approaches cannot easily keep up with ever-changing business and regulatory requirements. In the attached research document, Forrester reviews the top best practices for BI and predicts what the next-generation BI technologies will be. We summarize all of this in a single über-trend and best practice: agility. IT and business pros should adopt Agile BI processes, technologies, and architectures to improve their chances of delivering successful BI initiatives.
Business intelligence (BI) software has emerged as a hot topic in the past few years; in 2011, most companies will again focus their software investment plans on BI. More than 49% of the companies that responded to our most recent Forrsights Software Survey have concrete plans to implement or expand their use of BI software within the next 24 months. But being interested in BI software and spending money to adopt BI tools and processes do not necessarily translate into successful implementations: Forrester’s most recent BI maturity survey indicated that enterprise BI maturity levels are still below average (2.75 on a scale of 5, a modest 6% increase over 2009). Why are BI maturity levels so low, given the amount of money firms spend on BI? Three factors contribute to this gap and can lead to less-than-successful BI initiatives:
Implementing BI requires using best practices and building upon lessons learned.
I recently received a client inquiry of a kind I see a few times per quarter. The client said they were exploring ways to establish the value of information within their enterprise. If people so often frame data and information as an asset, shouldn’t we be able to establish its value?
What I share with my clients is that trying to place a monetary value on data and information itself is a red herring, an effort I highly recommend avoiding, unless you enjoy philosophical exercises that don’t translate into actual business value. (Apologies to those who fit in this camp — have fun!)
The “data is an asset” rhetoric doesn’t translate to putting a monetary value on a customer record, as an example, because data in and of itself has no value! The only value data/information has to offer — and the reason I do still consider it an “asset” at all — is in the context of the business processes, decisions, customer experiences, and competitive differentiators it can enable.
For example, a customer record doesn’t have value unless you can sell, market, or service that customer. So for each customer record, many customer intelligence analysts calculate lifetime value scores, the potential share of wallet available, the customer’s propensity to buy certain products and services, and even the cost of servicing the customer. But that doesn’t put a value on the customer record itself: It places the value based on the sales, marketing, and service processes the data supports. And that’s where the data value should live: in the consuming processes.
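As a minimal sketch of this idea, here is how a lifetime-value score might be computed from the processes a customer record supports. All figures, field names, and the discount rate below are invented for illustration; none come from the text, and this is not any vendor's actual scoring method:

```python
# Hypothetical sketch: a customer record's "value" is derived from the
# sales, marketing, and service processes it feeds, not from the record
# itself. Every number and field name here is an assumption.

def lifetime_value(annual_margin, retention_rate, discount_rate, years=5):
    """Discounted sum of the expected yearly margin from the processes the record supports."""
    return sum(
        annual_margin * (retention_rate ** t) / ((1 + discount_rate) ** t)
        for t in range(years)
    )

customer = {"annual_margin": 1200.0, "retention_rate": 0.8, "discount_rate": 0.10}
ltv = lifetime_value(**customer)
cost_to_serve = 150.0 * 5          # assumed flat yearly cost of servicing this customer
net_process_value = ltv - cost_to_serve
```

The point of the sketch is that every input (margin, retention, cost to serve) describes a business process, not the record; delete the processes and the same record computes to zero value.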
Why, oh, why is it that every time I hear about a BI project from an IT person, or from a business stakeholder describing how IT delivered it, with few exceptions, these are the stories plagued with multiple challenges? And why is it that when I hear a BI story about an application that was installed, built, and used by a business user, with little or no support from IT, it’s almost always a success story?
I think we all know the answer to that question. It’s all about IT/business misalignment. A business user wants flexibility, while an IT person is charged with keeping order and controlling data, applications, scope, and projects. A business user wants to react to ever-changing requirements, but an IT person needs to have a formal planning process. A businessperson wants to have a tool best-suited for the business requirements, and an IT person wants to leverage enterprise standard platforms.
Who’s right and who’s wrong? Both. The only real answer lies somewhere in the middle. There’s also an emerging alternative, especially when applied to specific domains like customer analytics. As I have repeatedly written in multiple research documents, front-office processes are especially poorly suited for traditional analytics. Front-office processes like sales and marketing need to be infinitely more agile and reactive than their back-office cousins in finance and HR, for obvious reasons.
March 6, 2011, marked my five-year anniversary with Forrester Research — and as an industry analyst. Back in 2006, I made a pivotal career decision and departed the end user world, where I had spent almost 15 years across multiple organizations fighting the good fight: building data management solutions that delivered trusted data to the people, processes, and systems that needed it. It’s amazing to me that I graduated college as a finance major with no IT experience; at the time, I wouldn’t have known a database if it crashed right in front of me.
For my first job out of school, I worked in Newark, NJ, for six years with Thomson Financial Services (TFS) on its Global Mergers & Acquisitions database product, eventually taking on the role of research manager for that product. In that time I foolishly thought I was gaining M&A expertise, but in reality I was learning how to ensure that the data about these M&A transactions was of the highest quality, because data at TFS is not just an asset — it’s the product. To accomplish this, I had to become very proficient with a random programming language called SQL, which I assumed was some niche thing, most likely irrelevant beyond TFS (remember: finance major!). These were still the days of green-screen dumb terminals, before the PC and GUI revolution, and learning SQL was an incredible eye-opener.
First of all, congratulations to the SAS AR team for one of the most efficiently and effectively run analyst events I have attended.
SAS needs to make up its mind about whether it wants to be in the BI game. Despite SAS senior executives’ occasional pronouncements that “BI is dead,” SAS is not quite done with BI. After all, BI makes up 11% of SAS’s very impressive $2.4 billion annual revenue (with 35 years of uninterrupted growth!). Additionally, BI contributed 22% to SAS’s 2010 growth, just below analytics at 26%.
Even though some organizations are looking at and implementing advanced analytics such as statistical analysis, predictive modeling, and — most important — model-based decisions, they remain only a handful. As our BI maturity survey shows year after year, BI maturity — even for basic BI — is still below average in most enterprises. Add these numbers to the abysmal enterprise BI application penetration levels in most large organizations, and you get a continued, huge, and ever-expanding opportunity that no vendor in its right mind, especially a vendor with leading BI tools, should miss.
Mobile devices and the mobile Internet are everywhere. Over the past few years, Forrester has tracked continuously increasing levels of adoption and maturity for mobile business applications, but not so for mobile business intelligence (BI) applications. The adoption and maturity of mobile BI fall behind those of other mobile enterprise applications for multiple reasons, mainly the lack of specific business use cases and tangible ROI, as well as inadequate smartphone screen and keyboard form factors. However, larger form factor devices such as tablets and innovative approaches to online/offline BI technical architecture will boost mobile BI adoption and maturity in the near future. BP professionals must start evaluating and prototyping mobile BI platforms and applications to make sure that all key business processes and relevant information are available to knowledge workers wherever they are.
But mobile BI adoption levels are still low. Why? We see three major reasons.
Smartphones still lack the form factor appropriate for BI
The business case for mobile BI remains tough to build
Mobile device security is still a concern
Now, mobile tablet devices are a different story. Just like Baby Bear's porridge in the "Goldilocks and the Three Bears" fairy tale, tablet PCs are "just right" for mobile BI end users. So what can you do with mobile BI? Plenty!
Improve customer and partner engagement
Deliver BI in the right place, at the right time
Introduce BI for the workers without access to traditional BI applications
Improve BI efficiency via query relevance
Improve "elevator pitch" effectiveness
Give away mobile devices as an incentive to cross-sell and upsell analytic applications
Companies are in a unique position today, as they have an unprecedented ability to collect information about consumers through various channels and thus create rich and deep profiles of their target customers. However, what is considered a goldmine of information has actually highlighted many pain points, including:
Consumers are being bombarded with multiple surveys across different channels by different departments. As a result, consumers feel more and more that they are being badgered for information about themselves.
A siloed department structure creates little incentive to collaborate across departments. As a result, different departments repeat similar projects, contradictory results get communicated internally, and lessons from one department’s past successes and failures are not shared across departments.
I get tons of questions about "how much it costs to develop an analytical application." Alas, as most of us unfortunately know, the only real answer to that question is “it depends.” It depends on the scope, requirements, technology used, corporate culture, and at least a few dozen more dimensions. However, at the risk of huge oversimplification, in many cases we can apply the good old 80/20 rule as follows:
~20% for software, hardware, and other data center and communications infrastructure