It seems that every week another vendor slaps “big data” into its marketing material – and it’s going to get worse. Should you look beyond the vendor hype and pay attention? Absolutely yes! Why? Because big data has the potential to shape your market’s next winners and losers.
At Forrester, we think clients must develop an intuitive understanding of big data by learning: 1) what is new about it; 2) what it is; and 3) how it will influence their market.
What is new about big data? We estimate that firms effectively utilize less than 5% of available data. Why so little? The rest is simply too expensive to deal with. Big data is new because it lets firms affordably dip into that other 95%. If two companies use data with the same effectiveness but one can handle 15% of available data and the other is stuck at 5%, who do you think will win? The catch, however, is that big data is not like your traditional BI tools; it will require new processes and may totally redefine your approach to data governance.
Perhaps no one understands better than Dan Ranta, Director of Knowledge Sharing at ConocoPhillips, that the challenge of sharing knowledge is very real — while the potential payoff can be large. Seven years ago, ConocoPhillips launched a large initiative to create internal communities of practice that would enhance knowledge sharing within the firm. With operations in more than 30 countries, encompassing job sites often in remote locations, the international energy company knew that to continue on its success trajectory, it needed to rapidly and effectively harness the knowledge of its highly skilled but geographically distributed workforce.
Today, ConocoPhillips' knowledge-sharing program — built upon 150 global "networks of excellence" — is ranked as best-in-class across industries and has documented hundreds of millions of dollars in estimated cash flow from its start in 2004 to the present. To learn more about how firms can drive business excellence with formal, global networks, I spoke with Dan in preparation for his keynote this week at Forrester's Content & Collaboration Forum.
1) Can you explain the reasoning behind the proactive and reactive components of your networks of excellence?
Enterprise Architecture is a challenged role in IT. While more than 50% of all IT shops – and all large IT shops (greater than $100M budget) – have an EA practice in some form, most EA teams struggle with defining a mission that is relevant to their business and executing on this mission to produce the benefits their business needs. This struggle leads to frequent reorganizations, fights for credibility and influence, and often an EA focus on the low-hanging fruit of technology standardization.
But this is changing.
Last year, Forrester teamed up with InfoWorld to select five EA programs that were having a measurable impact on their businesses. Our purpose for this awards program was to spotlight highly effective programs that embodied practices we could all learn from. We found EA programs producing results ranging from saving millions of dollars per year in IT expenditures, to guiding IT's transformation into a business partner, to guiding business planning.
We are seeing firms migrating away from traditional IT-centric approaches. Why? They have to: Customers are now empowered — and companies are not. So you have to ask yourself a few questions:
How long will packaged apps survive?
How long will it take before customer engagement wins out over the desire to control?
How do enterprises prepare to move to a federated deployment approach to meet process goals?
Federated deployments grab best-of-breed functions from the app Internet and SaaS-based solutions and use emerging technologies like dynamic case management, analytics, and collaboration. Figuring out how to implement these types of deployments to meet the business process needs companies will have in 2020 requires Big Thinking — and we will be addressing this as one of the main topics of discussion at Forrester’s Business Process Forum next week in Boston.
We at Forrester have written a lot about the "empowered era" in the past year. We're talking about the empowerment of customers and employees, the consumerization of technology, and grass-roots, tech-enabled innovation. There are lots of great case studies illustrating these forces and how they can benefit the enterprise, but those success stories are only part of the picture. Behind the scenes, there is disruption and confusion about who's planning the road ahead for the technology in our organizations' future. It used to be that the CIO owned that planning, making it the exclusive domain of strategic planners and enterprise architects. But isn't centralized — and IT-based — tech planning the opposite of empowerment? Wouldn't sticking with the old approach mean missing out on all this employee innovation that's supposed to be so powerful? Should the CIO no longer establish the technology the enterprise will use? Does the empowerment era mean the end of tech planning as we know it?
What's a customer-obsessed company? One that is deeply committed to knowing and engaging with its customers. The three winners of our 2011 Voice of the Customer award -- Adobe, Fidelity, and JetBlue -- don't just train employees to deliver great customer experiences; they monitor service satisfaction and systematically act on what they learn. My colleague Zach Hofer-Shall calls this management and analysis of customer-generated information "Social Intelligence."
I think the Voice of the Employee should share the spotlight with the Voice of the Customer.
Few clients I talk to analyze employee-generated information the way that they do customer-generated information. It's now mainstream to listen to customer opinions regarding your product's or service's shortfalls or what competitors do better. But it's cutting-edge to listen to employees as part of a consistent, automated, scalable, strategic initiative. I am not talking about reading private emails or sending an annual employee survey. Instead I mean mining solicited sources like open-ended feedback requests and unsolicited sources like wikis, content archives and public internal social profile pages.
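To make the idea of mining unsolicited employee text concrete, here is a minimal Python sketch (the feedback strings, stopword list, and function name are all hypothetical, invented purely for illustration): it surfaces the most frequent terms across open-ended comments, the kind of simple signal a scalable listening program might start from.

```python
import re
from collections import Counter

# Tiny illustrative stopword list; a real program would use a fuller one.
STOPWORDS = {"the", "a", "an", "is", "are", "to", "of", "and", "in", "our", "we", "it"}

def top_terms(comments, n=5):
    """Return the n most frequent non-stopword terms across free-text comments."""
    counts = Counter()
    for comment in comments:
        for token in re.findall(r"[a-z']+", comment.lower()):
            if token not in STOPWORDS:
                counts[token] += 1
    return counts.most_common(n)

# Hypothetical open-ended feedback from an internal survey.
feedback = [
    "Onboarding is confusing and the wiki is out of date",
    "The wiki search never finds anything",
    "Expense tool is slow; onboarding docs helped though",
]
print(top_terms(feedback, 3))  # 'onboarding' and 'wiki' surface with count 2
```

A production effort would add sentiment scoring, deduplication, and source-level weighting, but the core loop — collect, normalize, count, rank — is the same.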
Today, Tibco Software — best known for its SOA integration, complex event processing, and business process management suite — announced its acquisition of Nimbus Partners, a privately held business process analysis (BPA) vendor based in the United Kingdom. Nimbus Partners is smaller and less well known than the vendors behind more mature, full-featured BPA solutions such as ARIS, ProVision, and MEGA. Nimbus, which employs more than 100 people, sold process discovery and authoring tools along with its homegrown methodology for quickly capturing and managing detailed information on business processes. Nimbus' features and ease of use appealed mostly to process architects, process analysts, and business stakeholders who wanted an environment more robust than Microsoft Visio but not as technical — or requiring as much training — as other BPA environments.
Whenever I think about big data, I can't help but think of beer – I have Dr. Eric Brewer to thank for that. Let me explain.
I've been doing a lot of big data inquiries and advisory consulting recently. For the most part, folks are just trying to figure out what it is. As I said in a previous post, the name is a misnomer – it is not just about big volume. In my upcoming report for CIOs, Expand Your Digital Horizon With Big Data, Boris Evelson and I present a definition of big data:
Big data: techniques and technologies that make handling data at extreme scale economical.
You may be less than impressed with this overly simplistic definition, but there is more to it than meets the eye. In the figure, Boris and I illustrate the four V's of extreme scale: volume, velocity, variety, and variability.
The point of this graphic is that if you have only high volume or only high velocity, big data may not be appropriate. As characteristics accumulate, however, big data becomes attractive on cost grounds. The two main drivers are volume and velocity, while variety and variability shift the curve. In other words, extreme scale becomes more economical, more economical means more firms do it, more adoption leads to more solutions, and so on.
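The cost argument can be made concrete with a deliberately simple toy model (every number and function here is hypothetical, chosen only to illustrate the shape of the curve): a traditional scale-up platform whose cost grows super-linearly with volume versus a scale-out platform with a fixed overhead but near-linear growth. At small scale the traditional platform wins; as volume accumulates, scale-out becomes the cheaper option.

```python
def traditional_cost(tb):
    # Toy assumption: scale-up cost grows super-linearly with data volume (TB).
    return 1000 * tb ** 1.5

def scale_out_cost(tb):
    # Toy assumption: scale-out adds fixed cluster overhead but grows linearly.
    return 50_000 + 2000 * tb

for tb in (1, 10, 100, 1000):
    cheaper = "scale-out" if scale_out_cost(tb) < traditional_cost(tb) else "traditional"
    print(f"{tb:>5} TB: {cheaper} is cheaper")
```

The crossover point is what makes "extreme scale economical": below it, big data tooling is overkill; above it, the traditional approach is what gets expensive.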
So what does this have to do with beer? I've given my four V's spiel to lots of people, but a few aren't satisfied, so I've been resorting to the CAP Theorem, which Dr. Brewer presented at a conference back in 2000. I'll let you read the link for the details, but the theorem (later formally proven by researchers at MIT) goes something like this: a distributed system can deliver at most two of the three guarantees of consistency, availability, and partition tolerance at the same time.
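The trade-off is easiest to see in a toy two-replica store (all class and mode names below are illustrative, not any real product's API): during a network partition, a "CP" configuration refuses writes to stay consistent, while an "AP" configuration keeps accepting writes and risks serving stale reads.

```python
class Replica:
    def __init__(self):
        self.data = {}

class TwoNodeStore:
    """Toy two-replica store illustrating the CAP trade-off.

    mode='CP': refuse writes during a partition (consistent, but unavailable).
    mode='AP': accept writes on the reachable replica (available, but stale reads).
    """
    def __init__(self, mode):
        self.mode = mode
        self.a, self.b = Replica(), Replica()
        self.partitioned = False

    def write(self, key, value):
        if self.partitioned:
            if self.mode == "CP":
                raise RuntimeError("unavailable: cannot replicate during partition")
            self.a.data[key] = value           # AP: only one side sees the write
        else:
            self.a.data[key] = value
            self.b.data[key] = value

    def read_from_b(self, key):
        return self.b.data.get(key)

store = TwoNodeStore(mode="AP")
store.write("x", 1)
store.partitioned = True
store.write("x", 2)                 # accepted, but replica b is now stale
print(store.read_from_b("x"))       # -> 1 (stale read under AP)
```

Swap `mode="AP"` for `mode="CP"` and the partitioned write raises instead: same partition, opposite choice of which guarantee to sacrifice.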
Earlier this year, I was invited to participate in an internal debate across the Forrester team serving the business process professional role on “The Future of Business Process: Packaged Apps vs BPM.” Our key takeaway: Organizations need to move away from siloed views of the business process domain and develop a more holistic view of business processes across both packaged applications and BPM disciplines. In short, we agreed that business process pros should embrace “big process thinking,” as we’re beginning to call it, to deal with increasingly splintered and fragmented processes that span across packaged applications, BPM suites, on-premises solutions, cloud-based solutions, mobile platforms, and social environments.
Following this debate, key Forrester business process analysts embarked on new research to flesh out exactly how business processes — and the business process discipline — will need to evolve in the face of continuous disruption and competitive threats. Over the past three months, we interviewed firms with leading business transformation programs, industry thought leaders, and technology vendors to paint a picture of what business processes will look like in 2020. Based on these interviews, business process will evolve over the next decade to become:
Yesterday, HP agreed to buy UK software firm Autonomy Corp. for $10 billion to move into the enterprise information management (EIM) software business. HP wants to add IP to its portfolio, build next-generation information platforms, and create a vehicle for services. It is following IBM's strategy of acquiring software to sell alongside its hardware and services. With Autonomy under its wing, HP plans to help enterprises with a big, complicated problem – how to manage unstructured information for competitive advantage. Here's the wrinkle – Autonomy hasn't solved that problem. In fact, it's not a pure technology problem, because content is so different from data. It's a people and process problem, too.
Here is the Autonomy overview that HP gave investors yesterday:
Of course, this diagram doesn’t look like the heterogeneous environment of a typical multinational enterprise. Autonomy has acquired many companies to fill in the boxes here, but the reality is that companies have products from a smorgasbord of content management vendors but no incentive to stick with any one of them.