Analytics are the steering wheel that humanity uses to drive the world — or at least that portion of the planet over which we have some influence. Without the sensors, the correlators, the aggregators, the visualizers, the solvers, and the rest of what analytic applications depend on, we would be only a passenger, not a copilot, on this, our only home.
If you’ve spent any time around advanced mathematics and analytics, you’re bound to run into the phrase “global optimization.” Despite the name, this has little to do with optimizing the globe we live on; instead, it refers to techniques for finding the best solution to a set of equations under various constraints. Nevertheless, I love the phrase’s evocative ring, in that it suggests the Gaia Hypothesis, a controversial conjecture that the Earth is a sort of super-organism. Specifically, it models the Earth as a closed, self-regulating, virtuous feedback loop of organic and inorganic processes that, considered holistically, maintains life-sustaining homeostasis. This hypothesis suggests that the planet as a whole is continuously optimizing the conditions for our ongoing existence — and that the biosphere may perish, just like any organism, if it falls into a vicious feedback loop of its own undoing.
Today, the gap between a customer’s expectations and the customer experience they receive is huge. In our latest customer experience survey, we found that just over one-third of US brands deliver a good experience. What is even more surprising is that, in the five years that Forrester has been collecting this data, this number has not significantly changed.
Delivering good customer service is a cornerstone of delivering a good end-to-end customer experience. Yet few companies undertake efforts to follow best practices. This lack of attention to customer service has significant consequences for companies: escalating service costs, customer satisfaction numbers at rock-bottom levels, and anecdotes of poor service experiences amplified over social channels, leading to brand erosion.
Mastering the customer service experience is hard to do. Here is a recap of my 10-step program. I’ve reordered the steps a little, but the message is still the same:
Big data is an ecosystem in which open source approaches have the greatest momentum: the most widespread adoption and the most feverish innovation. Open source platforms are expanding their footprint in advanced analytics.
As the enterprise Hadoop market continues to mature and many companies deploy their clusters for the most demanding analytical challenges, data scientists will begin to migrate toward this new, open source-centric platform. At the same time, enterprise adoption of the open source R language will grow in 2012 and beyond, and we’ll see greater industry convergence between Hadoop and R, especially as analytics tool vendors integrate both technologies tightly into their offerings. We will also see increasing adoption of open source data integration tools, such as those commercialized by Talend and others, and of open source BI tools from Pentaho, Jaspersoft, and others.
This is happening for the following reasons:
Open source initiatives are transforming all platforms and tools. Open source infrastructure, platforms, tools, and applications — such as Linux, Apache, Eclipse, Python, Mozilla, and Android — have gained widespread adoption in many sectors of the IT world, due to advantages such as no-cost licensing, extensibility, and vibrant communities.
Open source communities are where the fresh action is. Open source communities have fostered innovative new approaches and ecosystems, increasingly getting a jump on the incumbent providers of proprietary, closed source — albeit feature-rich and robust — offerings in advanced analytics, data warehousing, and integration tools.
One of the predictions I made at the start of this year was that real-world experiments will become the new development paradigm for next best action in multichannel customer relationship management (CRM). If we consider that multichannel CRM applications are driving big data initiatives, it’s clear that real-world experiments are infusing data management and advanced analytics development best practices more broadly. Increasingly, my big data customer engagements are focusing on CRM next best action, with a keen customer interest in life-cycle management of the analytic applications needed for real-world experiments in marketing campaign and customer experience optimization.
This year and beyond, we will see enterprises place greater emphasis on real-world experiments as a fundamental best practice to be cultivated and enforced within their data science centers of excellence. In a next best action program, real-world experiments involve iterative changes to the analytics, rules, orchestrations, and other process and decision logic embedded in operational applications. You should monitor the performance of these iterations to gauge which collections of business logic deliver the intended outcomes, such as improved customer retention or reduced fulfillment time on high-priority orders.
The key use case of next best action infrastructure — aka decision automation — is to allow companies to rapidly engage in real-world experiments in production applications and, if they’re bold, in their operational business model as a whole. In a CRM context, you can implement different predictive propensity models in different channels, at different interaction points, using different call-center scripts and message contents, with different customer segments, and with other variables.
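To make the experimentation pattern concrete, here is a minimal sketch of the kind of real-world experiment described above: customers are split across two hypothetical propensity-model variants, outcomes are tallied per variant, and the conversion rates are compared. The variant names, split logic, and response rates are all illustrative assumptions, not details of any specific next-best-action product.

```python
import random

random.seed(42)

# Hypothetical propensity-model variants under test; the names are
# illustrative, not drawn from any particular vendor's offering.
VARIANTS = ("model_a", "model_b")


def assign_variant(customer_id: int) -> str:
    """Deterministically split customers across variants (simple modulo split)."""
    return VARIANTS[customer_id % len(VARIANTS)]


def run_experiment(n_customers: int) -> dict:
    """Simulate offer responses and compute a conversion rate per variant.

    The true response rates below are assumed purely for the simulation;
    in a real experiment they are the unknowns being measured.
    """
    true_rates = {"model_a": 0.10, "model_b": 0.12}
    counts = {v: {"shown": 0, "converted": 0} for v in VARIANTS}
    for cid in range(n_customers):
        v = assign_variant(cid)
        counts[v]["shown"] += 1
        if random.random() < true_rates[v]:
            counts[v]["converted"] += 1
    return {v: c["converted"] / c["shown"] for v, c in counts.items()}


rates = run_experiment(10_000)
for variant, rate in sorted(rates.items()):
    print(f"{variant}: conversion rate {rate:.3f}")
```

In production, the "outcome" would be an actual business metric (retention, fulfillment time) logged by the operational application, and the winning collection of business logic would be promoted through the life-cycle management process the program enforces.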